Personalized education was already big pre-pandemic, but home schooling and digital instruction made more parents and teachers embrace the idea. With a shortage of human teachers, many schools jumped on the bandwagon of using technology that collects each child’s personal data and tailors content accordingly.
Researchers, however, warn of three dangerous pitfalls.
First, personalized education offered at the expense of standardized education exacerbates gaps between the privileged and the not-so-privileged: between children who grow up in opportunity-rich environments and those who do not.
Personalized education proponents rightly argue that teaching to the average is an outdated and impossible target given the complexity of learning. But schools need some form of standardized education in order to identify and narrow education gaps created by socioeconomic differences among students. Standardized instruction may not adequately nurture every child's talents, but it allows educators to compare individual to collective achievements, as well as local performance to international standards.
Unfortunately, current personalized platforms are not designed to tackle educational inequality. Personalized apps push well-performing students to the top of the pile by supplying them with more advanced content. In addition, personalized algorithms demonstrate what is known as the Matthew effect: Those who start behind are left behind with their own data, without group power to lift them up.
Schools cannot offer personalized education alone, and partnerships with community organizations are vital for tackling inequalities among students. It is important to remember that personalized education technology can exacerbate the gaps between those with digital access and those without, as well as between those who are digitally literate and technologically savvy and those who are not.
Second, although education technology is designed around the convenience of individual users, its content lags behind. A recent review showed that the 100 most-downloaded apps designed for children’s learning are of low educational and scientific quality. The bestselling children’s apps are developmentally inappropriate, and many popular digital books contain distracting features that limit children’s story comprehension and vocabulary learning.
If the material is of low quality, children’s heightened motivation for content presented as crafted uniquely for them is wasted.
Third, there are many concerns about personal data and the business models of digital learning. Some personalized educational technologies misuse children’s data for commercial purposes. Some repeatedly expose children to the same or similar content, decreasing their agency by intensifying information bubbles in a cycle that mirrors commercial personalization.
Personalized monitoring tools can easily turn into surveillance tools trained on individual families, and as digital learning increases, so do concerns about the privatization of schools by technology giants.
Rethinking the use of personal data for children’s learning must be a priority for post-pandemic education. Over the last few decades, democratic nations have developed rules and regulations against the commercialization, politicization and militarization of children’s learning. Personalized learning technologies should supplement, not substitute, these safeguards.
Even crude personalization that recognizes young users and blocks inappropriate sites could address some basic internet safety gaps. In early September, the UK introduced the Age-Appropriate Design Code (a new set of regulations referred to as the “Children’s Code”) to protect children’s rights online. The regulations apply to all online services including education technology and include rules about data minimization, data sharing and detrimental use — which are at the heart of the negative impacts of personalized education technology.
A major driver of a fundamental rethinking of the current personalized learning model is a simple principle: Socially just education is both personalized and standardized. If education leans toward the pole of personalization, a competitive “me-first” attitude will dominate. If it falls toward the extreme of standardization, the collective will be rewarded at the expense of celebrating individual achievements. It follows that optimal education will adapt the process of learning to individual learners while keeping core curricula and tests the same for all.
The digital solution therefore does not lie in swinging the pendulum toward one or the other extreme. Education technology creates an opportunity for teachers and children and their families to co-design curricula.
This does not mean shifting the coding responsibility to schools or homes. It means giving children choices in how their data is collected and used to adapt content for them. It means giving teachers choices in how much the technologies support instruction and how much they automate it. It means designing new types of technology that do not misuse data for commercial purposes, but instead use algorithms to advance the knowledge of all.
Natalia Kucirkova is a professor of early childhood development in Norway, with a special interest in personalized learning and education technology.
This story about personalized learning and education technology was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.