The ‘one chatbot per child’ model for AI in classrooms conflicts with what research shows: Learning is social

AI tutors are often held up as an ideal, but prioritizing individualized teaching can detract from the benefits of learning in social environments.

Author: Niral Shah on Dec 15, 2025
 
Source: The Conversation
Yes, AI tutors can provide individualized feedback, but learning is inherently social. Maskot via Getty Images

In the Star Trek universe, the audience occasionally gets a glimpse inside schools on the planet Vulcan. Young children stand alone in pods surrounded by 360-degree digital screens. Adults wander among the pods but do not talk to the students. Instead, each child interacts only with a sophisticated artificial intelligence, which peppers them with questions about everything from mathematics to philosophy.

This is not the reality in today’s classrooms on Earth. For many technology leaders building modern AI, however, a vision of AI-driven personalized learning holds considerable appeal. Outspoken venture capitalist Marc Andreessen, for example, imagines that “the AI tutor will be by each child’s side every step of their development.”

Years ago, I studied computer science and interned in Silicon Valley. Later, as a public school teacher, I was often the first to bring technology into my classroom. I was dazzled by the promise of a digital future in education.

Now, as a social scientist who studies how people learn, I believe K-12 schools need to question predominant visions of AI for education.

Individualized learning has its place. But decades of educational research also make clear that learning is a social endeavor at its core. Classrooms that privilege personalized AI chatbots overlook that fact.

School districts under pressure

Generative AI is coming to K-12 classrooms. Some of the largest school districts in the country, such as Houston and Miami, have signed expensive contracts to bring AI to thousands of students. Amid declining enrollment, perhaps AI offers a way for districts to both cut costs and seem cutting edge.

Pressure is also coming from both industry and the federal government. Tech companies have spent billions of dollars building generative AI and see a potential market in public schools. Republican and Democratic administrations have been enthusiastic about AI’s potential for education.

Decades ago, educators promoted the benefits of “One Laptop per Child.” Today it seems we may be on the cusp of “one chatbot per child.” What does educational research tell us about what this model could mean for children’s learning and well-being?

Learning is a social process

During much of the 20th century, learning was understood mainly as a matter of individual cognition. In contrast, the latest science on learning paints a more multidimensional picture.

Scientists now understand that seemingly individual processes – such as building new knowledge – are actually deeply rooted in social interactions with the world around us.

Neuroscience research has shown that even from a young age, our social relationships influence which of our genes turn on and off. This matters because gene expression affects how our brains develop and our capacity to learn.

In classrooms, this suggests that opportunities for social interaction – for instance, children listening to their classmates’ ideas and haggling over what is true and why – can support brain health and academic learning.

Research in the social sciences has long demonstrated the value of high-quality classroom discourse. For example, in a well-cited 1991 study involving over 1,000 middle school students across more than 50 English classrooms, researchers Martin Nystrand and Adam Gamoran found that children performed significantly better in classes “exhibiting more uptake, more authenticity of questions, more contiguity of reading, and more discussion time.”

In short, research tells us that rich learning happens when students have opportunities to interact with other people in meaningful ways.

AI in classrooms lacks research evidence

What does all of this mean for AI in education?

Introducing any new technology into a classroom, especially one as alien as generative AI, is a major change. It seems reasonable that high-stakes decisions should be based on solid research evidence.

But there’s one problem: The studies that school leaders need just aren’t there yet. No one really knows how generative AI in K-12 classrooms will affect children’s learning and social development.

Current research on generative AI’s impact on student learning is limited, inconclusive and tends to focus on older students – not K-12 children. Studies of AI use thus far have tended to focus on either learning outcomes or individual cognitive activity.

Although standardized test scores and critical thinking skills matter, they represent a small piece of the educational experience. It is also important to understand generative AI’s real-life impact on students.

For example: How does it feel to learn from a chatbot, day after day? What is the longer-term impact on children’s mental health? How does AI use affect children’s relationships with each other and with their teachers? What kinds of relationships might children form with the chatbots themselves? What will AI mean for educational inequities related to social forces such as race and disability?

More broadly, I think now is the time to ask: What is the purpose of K-12 education? What do we, as a society, actually want children to learn?

Of course, every child should learn how to write essays and do basic arithmetic. But beyond academic outcomes, I believe schools can also teach students how to become thoughtful citizens in their communities.

To prepare young people to grapple with complex societal issues, the National Academy of Education has called for classrooms where students learn to engage in civic discourse across subject areas. That kind of learning happens best through messy discussions with people who don’t think alike.

To be clear, not everything in a classroom needs to involve discussions among classmates. And research does indicate that individualized instruction can also enhance social forms of learning.

So I don’t want to rule out the possibility that classroom-based generative AI might augment learning or the quality of students’ social interactions. However, the tech industry’s deep investments in individualized forms of AI – as well as the disappointing history of technology in classrooms – should give schools pause.

Good teaching blends social and individual processes. My concern about personalized AI tutors is how they might crowd out already infrequent opportunities for social interaction, further isolating children in classrooms.

Center children’s learning and development

Education is a relational enterprise. Technology may play a role, but as students spend more and more class time on laptops and tablets, I don’t think screens should displace the human-to-human interactions at the heart of education.

I see the beneficial application of any new technology in the classroom – AI or otherwise – as a way to build upon the social fabric of human learning. At its best it facilitates, rather than impedes, children’s development as people. As schools consider how and whether to use generative AI, the years of research on how children learn offer a way to move forward.

Niral Shah does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.
