
This is an edition of our Future of Learning newsletter. Sign up today to get it delivered straight to your inbox.

There’s little doubt that artificial intelligence will fundamentally alter how classrooms operate. But just how much bot-fueled instruction is too much? I chatted with Hechinger contributor Chris Berdik about his recent story, co-published with Wired, that explores these themes and how some schools are deploying AI assistants in the classroom. 

Q: Why did you want to explore this question of how, and in what ways, AI can replace teachers? 

From the first exploratory interviews I did on this topic, I was surprised to learn how far back the history of trying to use AI to teach goes. There’s so much hype (and doom) currently around generative AI, and it really is remarkable and powerful, but it puts things into perspective to learn that people have tried to harness AI to teach for many decades, with pretty limited results in the end. Then I heard the story of Watson, an AI engine that quickly and easily dispatched Jeopardy! champions but couldn’t hack it as a tutor. It clearly had the necessary knowledge at its beck and call. What didn’t it have? If we got beyond the hype of the latest generative AI, could it muster that critical pedagogic component that Watson lacked? And, finally, if the answer was no, what then was its best classroom use? Those were my starting points.

Q: How hesitant or eager were teachers like Daniel Thompson, whose classroom you visited, to use AI assistants? 

Thompson was cautiously optimistic. In fact, he was pretty eager to use the tool, precisely because it could make the other apps and multimedia he used less cumbersome and less onerous to navigate. But Thompson ran a few quick stress tests of the assistant: he asked it to answer questions about local Atlanta sports teams that had nothing to do with his curriculum, and he checked its guardrails by asking it to compose a fake message firing a colleague. The assistant declined those requests, showing once again that occasionally the most useful thing AI can do is gently tell us we’ve asked too much of it.

Q: You wrote that students weren’t interested in engaging with IBM Watson. Why not? 

As more than one source explained to me, the process of learning includes moments of challenge and friction, which can feel like “busy work” drudgery but are oftentimes at the heart of what it means to learn, to puzzle over ideas, to truly create, to find one’s own way through to understanding. And I think that students (like a lot of us) see AI as a tool that can take care of some onerous, time-consuming, or tedious task on our behalf. So, it’s going to take a lot more for AI to engage students when its job is to guide them through the friction of learning rather than to be an escape hatch from it.

Q: As more of these tools enter the classroom this school year, what will you be watching for? 

I may have a somewhat esoteric interest in what’s next with AI in classrooms. I’m personally really interested in how schools will handle critical AI literacy, where both students and educators devote the time and resources to think critically about what AI is, the wonderful things it can do, and, just as importantly, what it can’t, or shouldn’t, do on our behalf.

Here are a few key stories to bring you up to speed:

PROOF POINTS: Teens are looking to AI for information and answers, two surveys show

My colleague Jill Barshay wrote about two recent surveys on how teens are using AI to brainstorm ideas, study for tests or get answers to questions they might be too embarrassed to ask their parents or friends. Barshay pointed out that both surveys indicate that Black, Hispanic and Asian American youth are often quick to adopt this new technology.

How AI could transform the way schools test kids

Back in the spring, my colleague Caroline Preston and I explored what AI advancements mean for the future of assessments and standardized testing. Many experts believe that AI has the potential to better evaluate a student’s true knowledge and to personalize tests to individual students. However, experts also warn that schools and test designers should proceed cautiously, keeping in mind disparities in access to AI and technology and concerns about biases embedded in these tools.

AI might disrupt math and computer science classes – in a good way

Last year, as part of a series on math instruction, we published a story by Seattle Times writer Claire Bryan about how some math and computer science teachers are embracing AI. Teachers say that AI can help them plan math lessons and write a variety of math problems geared toward different levels of instruction. The story was produced in partnership with The Education Reporting Collaborative, an eight-newsroom effort.

More on the Future of Learning

“PROOF POINTS: Asian American students lose more points in an AI essay grading study — but researchers don’t know why,” The Hechinger Report

“An education chatbot company collapsed. Where did the student data go?,” EdSurge

“More than 378,000 students have experienced gun violence at school since Columbine,” The Washington Post

“It takes a village: A Brooklyn high school and NYC nonprofits team up to enroll older immigrants,” Chalkbeat

This story about AI in education was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education.
