In theory, education technology could redesign school from a factory-like assembly line into an individualized experience. Computers, powered by algorithms and AI, could deliver custom-tailored lessons for each child. Advocates call this concept “personalized learning,” but this sci-fi idyll (or dystopia, depending on your point of view) has been slow to catch on in American classrooms.

This story also appeared in Mind/Shift

Meanwhile, one piece of ed tech, called ASSISTments, takes the opposite approach. Instead of personalizing instruction, this homework website for middle schoolers encourages teachers to assign the exact same set of math problems to the entire class. One size fits all.

Unlike other popular math practice sites, such as Khan Academy, IXL or ALEKS, in which a computer controls the content, ASSISTments keeps the control levers with the teachers, who pick the questions they like from a library of 200,000. Many teachers assign the same familiar homework questions from textbooks and curricula they are already using.

ASSISTments encourages teachers to project anonymized homework results on a whiteboard and review the ones that many students got wrong. Credit: Screenshot provided by ASSISTments.

And this deceptively simple – and free – tool has built an impressive evidence base and a following among middle school math teachers. Roughly 3,000 teachers and 130,000 students were using it during the 2022-23 school year, according to the husband-and-wife team of Neil and Cristina Heffernan, who run ASSISTments, a nonprofit based at Worcester Polytechnic Institute in Massachusetts, where Neil is a computer science professor.

After Neil built the platform in 2003, several early studies showed promising results, and then a large randomized controlled trial (RCT) in Maine, published in 2016, confirmed them. For 1,600 seventh-grade students whose classrooms were randomly selected to use ASSISTments for math homework, math achievement was significantly higher at the end of the year, equivalent to an extra three quarters of a year of schooling, according to one estimate. Both groups – treatment and control – were otherwise using the same textbooks and curriculum.

On the strength of those results, an MIT research organization singled out ASSISTments as one of the rare ed tech tools proven to help students. The Department of Education’s What Works Clearinghouse, which reviews education evidence, said the research behind ASSISTments was so strong that it received the highest stamp of approval: “without reservations.”

Still, Maine is an unusual state with a population that is more than 90 percent white and so small that everyone could fit inside the city limits of San Diego. It had distributed laptops to every middle school student years before the ASSISTments experiment. Would an online math platform work in conditions where computer access is uneven? 

The Department of Education commissioned a $3 million replication study in North Carolina, in which 3,000 seventh graders were randomly assigned to use ASSISTments. The study, set to test how well the students learned math in spring of 2020, was derailed by the pandemic. But a private foundation salvaged it. Before the pandemic, Arnold Ventures had agreed to fund an additional year of the North Carolina study, to see if students would continue to be better at math in eighth grade. (Arnold Ventures is among the many funders of The Hechinger Report.)

Those longer-term results were published in June 2023, and they were good. Even a year later, on year-end eighth grade math tests, the 3,000 students who had used ASSISTments in seventh grade outperformed 3,000 peers who hadn’t. The eighth graders had moved on to new math topics and were no longer using ASSISTments, but their practice time on the platform a year earlier was still paying dividends.

Researchers found that the lingering effect of practicing math on ASSISTments was similar in size to the long-term benefits of Saga Education’s intensive, in-person tutoring, which costs $3,200 to $4,800 per year for each student. The cost of ASSISTments is a tiny fraction of that, less than $100 per student. (That cost is covered by private foundations and federal grants. Schools use it free of charge.)

Another surprising result is that students, on average, benefited from solving the same problems – no one assigned easier ones to weaker students and harder ones to stronger students.

How is it that this rather simple piece of software is succeeding while more sophisticated ed tech has often shown mixed results and failed to gain traction?

The studies aren’t able to explain that exactly. ASSISTments, criticized for its “bland” design and for sometimes being “frustrating,” doesn’t appear to be luring kids to do enormous amounts of homework. In North Carolina, students typically used it for only 18 minutes a week, usually split across two or three sessions.

From a student’s perspective, the main feature is instant feedback. ASSISTments marks each problem immediately, like a robo-grader. A green check appears for getting it right on the first try; an orange check appears for solving it on a later attempt. Students can try as many times as they wish, or they can simply ask for the correct answer.
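To make that loop concrete, here is a minimal sketch in Python of how such first-try/retry feedback could work. The function name, labels and single-answer format are my own illustration, not ASSISTments’ actual code.

```python
# Illustrative sketch of ASSISTments-style instant feedback – not the
# platform's actual code. Each attempt is checked the moment it's submitted:
# a correct first try earns a green check, a correct retry an orange check,
# and a student can ask to see the answer at any point.

def grade_attempts(correct_answer, attempts, asked_for_answer=False):
    """Return a feedback label for one problem."""
    for i, answer in enumerate(attempts):
        if answer == correct_answer:
            return "green" if i == 0 else "orange"
    # No correct attempt yet; the student may request the answer instead.
    return "revealed" if asked_for_answer else "keep trying"

print(grade_attempts("3/4", ["3/4"]))            # green: right on the first try
print(grade_attempts("3/4", ["1/2", "3/4"]))     # orange: right on a retry
print(grade_attempts("3/4", ["1/2"], True))      # revealed: asked for the answer
```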

Nearly every online math platform gives instant feedback. It’s a well-established principle of cognitive science that students learn better when they can see and sort out their mistakes immediately, rather than waiting days for the teacher to grade their work and return it.

The secret sauce might be the easy-to-digest feedback that teachers are getting: a simple data report showing them which problems students are getting right and wrong.

ASSISTments encourages teachers to project anonymized homework results on a whiteboard and review the ones that many students got wrong. Not every teacher does that. On the teacher’s back end, the system also highlights common mistakes that students are making. In surveys, teachers said it changes how they review homework.

Other math platforms generate data reports too, and teachers ought to be able to use them to inform their instruction. But when 30 students are each working on 20 different, customized problems, it’s a lot harder to figure out which of those 600 problems should be reviewed in class. 
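When every student gets the same assignment, that report boils down to a simple tally. Here is a hedged sketch in Python of the underlying arithmetic; the data layout is an assumption for illustration, not ASSISTments’ real report format.

```python
# Illustrative only: tally, per problem, how many students missed it on the
# first try, so a teacher can see at a glance what to review in class.
# The data layout is an assumption, not ASSISTments' actual report format.
from collections import Counter

# results[student][problem_id] -> True if correct on the first attempt
results = {
    "student_a": {"p1": True,  "p2": False, "p3": False},
    "student_b": {"p1": True,  "p2": False, "p3": True},
    "student_c": {"p1": False, "p2": False, "p3": True},
}

misses = Counter()
for answers in results.values():
    for problem, first_try_correct in answers.items():
        if not first_try_correct:
            misses[problem] += 1

# Flag problems that at least half the class got wrong, most-missed first.
threshold = len(results) / 2
for problem, n in misses.most_common():
    if n >= threshold:
        print(f"Review {problem}: {n} of {len(results)} students missed it")
```

With a common assignment, the counts pile up on a handful of problems; spread across 600 customized ones, the same tally is too thin to point anywhere.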

There are other advantages to having a class work on a common set of problems. It allows kids to work together, something that motivates many extroverted tweens and teens to do their homework. It can also trigger worthwhile class discussions, in which students explain how they solved the same problem differently.

ASSISTments has drawbacks. Many students don’t have good internet connections at home and many teachers don’t want to devote precious minutes of class time to screen time. In the North Carolina study, some teachers had students do the homework in school. 

Teachers are restricted to the math problems that Heffernan’s team has uploaded to the ASSISTments library. It currently includes problems from three middle school math curricula: Illustrative Mathematics, Open Up Resources and Eureka Math (also known as EngageNY). For the Maine and North Carolina studies, the ASSISTments team uploaded math questions that teachers were familiar with from their textbooks and binders. But outside of a study, if teachers want to use their own math questions, they’ll have to wait until next year, when ASSISTments plans to allow teachers to build their own problems or edit existing ones.

Teachers can assign longer open-response questions, but ASSISTments doesn’t give instant feedback on them. Heffernan is currently testing how to use AI to evaluate students’ written explanations. 

There are other bells and whistles inside the ASSISTments system too. Many problems include “hints” to help students who are struggling and can show step-by-step worked examples. There are also optional “skill builders” for students to practice rudimentary skills, such as adding fractions with unlike denominators. It is unclear how important these extra features are. In the North Carolina study, students generally didn’t use them.

In theory, students should learn more from personalized instruction, but the research is mixed. Many students don’t spend as much practice time on the software as they should. Many teachers want more control over what the computer assigns to students. Researchers are starting to see good results from using differentiated practice work in combination with tutoring. That could make catching up a lot more cost-effective.

I rarely hear about “personalized learning” anymore in a classroom context. If the pandemic taught us anything, it’s that learning is a profoundly human give and take between student and teacher and among peers. One-size-fits-all instruction may not be perfect, but it keeps the humans in the picture.

This story about ASSISTments was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.
