
U of T computer scientists' MOOClet framework helps personalize and improve online learning experiences

Instructors, students and scientists have ideas every day for how to improve instruction — from which explanations of concepts make things click, to which messages motivate students to start homework early.

Left to right: Pan Chen, Assistant Professor Joseph Jay Williams, Mohi Reza, Harsh Kumar and Ilya Musabirov. (Photo: Matt Hintsa)

With an eye to improving students’ learning experiences in higher education, computer scientists at the University of Toronto are democratizing access to software tools that support educational experimentation at scale.

The team is known as the Adaptive Experimentation Accelerator and is led by Assistant Professor Joseph Jay Williams. He is working alongside U of T CS graduate students Ilya Musabirov, Mohi Reza, Pan Chen and Harsh Kumar and researchers from Carnegie Mellon University and North Carolina State University. Together, they have developed cross-platform infrastructure that supports both traditional and adaptive experiments in online instructional settings.

The aim is to make it easier to present personalized ideas and learning concepts to students and, in turn, collect data about what might be working, for whom and when, allowing instructors to adapt their curricula as needed.

For this collaborative work, they have been named one of the three finalists in the XPRIZE Digital Learning Challenge, a global competition to modernize, accelerate and improve the ways effective learning tools and processes are identified.

The Adaptive Experimentation Accelerator team is competing for the US$500,000 grand prize.

Using MOOClet, a software architecture previously developed by Williams and collaborators, the team transformed online course components into collaborative “micro-laboratories,” where instructors and researchers can continuously experiment with, improve and personalize their content.

The team’s adaptive experimentation approach pairs machine learning with A/B testing to tailor students’ experiences based on data collected from their interactions with digital learning platforms, such as their answers to homework problems.

In A/B testing, two or more versions of a component (e.g., a webpage, the definition of a concept, an email) are shown to different groups of users to determine which version has the greatest impact. From there, researchers can adjust the student learning experience in real time based on students’ feedback and use this data to help other students more quickly. This approach allows instructors to improve an online course even after it has been deployed to students and to deliver better versions of course content over time.
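To make the mechanics concrete, here is a minimal sketch of how a machine-learning policy such as Thompson sampling can drive an adaptive A/B test. This is not the team’s actual implementation; the email variant names and engagement rates are hypothetical, chosen only to illustrate how the better-performing version gets shown more often as data accumulates.

```python
import random

# Hypothetical email variants and simulated engagement rates;
# these are illustrative, not data from the MOOClet project.
VARIANTS = ["encouraging_email", "deadline_focused_email"]
TRUE_ENGAGEMENT = {"encouraging_email": 0.30, "deadline_focused_email": 0.45}

# One Beta(successes + 1, failures + 1) posterior per variant.
successes = {v: 0 for v in VARIANTS}
failures = {v: 0 for v in VARIANTS}

def choose_variant():
    """Thompson sampling: pick the variant whose sampled engagement rate is highest."""
    samples = {v: random.betavariate(successes[v] + 1, failures[v] + 1)
               for v in VARIANTS}
    return max(samples, key=samples.get)

def record_outcome(variant, engaged):
    """Update the chosen variant's posterior with the observed outcome."""
    if engaged:
        successes[variant] += 1
    else:
        failures[variant] += 1

# Simulate 500 students receiving an email; over time the more effective
# variant is sent more often, which is the "adaptive" part of adaptive A/B testing.
for _ in range(500):
    variant = choose_variant()
    engaged = random.random() < TRUE_ENGAGEMENT[variant]
    record_outcome(variant, engaged)

for v in VARIANTS:
    n = successes[v] + failures[v]
    print(f"{v}: sent {n} times, observed engagement {successes[v] / max(n, 1):.2f}")
```

In a real deployment, the engagement signal would come from students’ actual behaviour, such as whether they started their homework early after receiving the email.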

Williams explains that MOOClet could be used by an instructor who posts the definition of a concept on an online lesson page. The instructor could then revise and improve upon the definition based on students’ questions and comments on the platform’s forum.

MOOClet could also be used to test which messages from an instructor motivate students to start homework early, he says.

“When A/B testing the emails to students, the MOOClet infrastructure uses machine learning algorithms that automatically analyze data to discover which emails are most effective and send them more often to future students,” he adds.

Williams says that by combining A/B testing and machine learning, the team is striving to make education compelling and to continually improve it so that every student benefits.

“Just as YouTube lowered barriers for anyone to create content and share, MOOClet can lower barriers for anyone to collaborate with others to test out different ideas about what’s going to help students, and access data and machine learning algorithms that analyze that data and make rapid changes,” he says.

“We want to help everyone conduct adaptive experiments,” says first-year PhD student Pan Chen. “Even if they don’t know how to code.”

Williams and Chen agree the potential uses of the MOOClet framework are as varied as they are numerous — from customizing news articles on websites, to menu options at restaurants — but right now, their focus is squarely on education.

“We’re driven to provide these tools because education is such a hard and important problem,” Williams says. “We want everyone to be able to contribute ideas. And to do that, we wanted to provide an ecosystem where these ideas can be tested to see what works, for whom, and in what context. Just as the medical field uses adaptive designs in clinical trials to continually improve their treatments, we need education to be perpetually innovating and improving.”

To learn more about the Adaptive Experimentation Accelerator and the MOOClet framework, visit www.adaptexptinfra.org.