AI’s incipient invasion of the classroom, from personal robo-tutors to AI-powered tools, is a triumph of progress to industry boosters like Sam Altman — and a worrisome new trend to many wary teachers.
For McGraw Hill, the publisher of educational textbooks, finding the balance between those two diverging views is nothing new. The tension dates back to the invention of the calculator, says Dylan Arena, the company’s chief data science and AI officer.
Arena has long thought about data and technology in education. He was a software engineer, then a learning sciences PhD, then a startup founder before joining McGraw Hill in 2021, after the company acquired Kidaptive, his edtech startup for tracking early childhood development.
His current job is an increasingly important one at a company whose roots in traditional printing date back to nineteenth-century trade publishing.
“It’s a long journey from that company that made those textbooks that we grew up with to the sort of ed tech leader that we are today,” Arena said. Digital data has increasingly been part of the McGraw Hill product offering: For the past 25 years, its online math support program, ALEKS, has given teachers and parents insight into a student’s progress.
Upheaval in the classroom
It’s the events of the last few years, however, that have really underscored the importance of getting it right when it comes to technology and education.
The COVID-19 pandemic spurred the uptake of more digital teaching tools. But it was also a reminder that the relationships students have with their classmates and their teachers are essential to learning.
Then, just as things seemed set to return to normal, OpenAI released ChatGPT and sent schools into a spiral.
AI technology has made it easier than ever for students to plagiarize homework and, some fear, to sabotage their own learning in the process. On the other hand, educators recognize that AI is an important technology that pupils will one day need to use in the workplace.
Some edtech companies have been quick to respond with attempts to rein in the technology in the form of school-sanctioned tools. The nonprofit Khan Academy partnered with OpenAI to create Khanmigo, an AI teaching app that it has rolled out to school districts across the U.S. Magic School, an AI startup, has also created an AI tutoring app for kids, while education technology company Eduaide has geared its AI offerings more toward supporting teachers in planning lessons and producing quizzes and homework worksheets.
Some countries are pushing educational publishers to integrate AI rapidly into their product offerings. South Korea, Singapore, and Rwanda are among the nations that have asked companies bidding on multimillion-dollar contracts to supply textbooks and other educational materials to their school systems to include AI tutoring elements as part of those bids. The U.K. government has also put out requests for proposals to fund AI education tools. But in the U.S., where primary and secondary school textbook choices are often made on a state-by-state or even district-by-district basis, the pace of AI integration into learning is more uneven and halting.
For McGraw Hill, a multibillion-dollar enterprise, the concern isn’t about keeping up with the (once rapid, now slowing) pace of AI scaling. It echoes the mantra of Apple CEO Tim Cook: not first, but best.
“Our levels of brand trust are so high that the greatest risk for us is not moving too slowly on AI, it’s moving too fast on AI,” as Arena puts it.
The limits of virtual tutors
Arena is particularly skeptical of any claims that generative AI-powered personal tutors will ever replace human teachers. It’s not just that he thinks the tech isn’t capable enough. He worries that AI chatbots won’t challenge students properly, and that they will discourage students from learning key social skills that are also a vital part of what schools teach. “We should absolutely not be doing that,” he says of turning education over completely to AI tutoring apps.
He also avoids tools that encourage users to personify a chatbot. He fears a future where every child has a Teddy Ruxpin bear powered by an LLM — a stuffed animal that remembers its child owner’s favorite stories and what makes them laugh or cry. “That just might set them onto a track where they go into kindergarten, and they look at the real kids, and they’re like, ‘Oh, you guys aren’t anywhere near as cool as my Teddy Ruxpin.’”
Those fears aren’t unfounded. In Sam Altman’s manifesto on generative artificial intelligence, the OpenAI CEO envisioned a prosperous future in which every child has a personal virtual tutor for every subject, in any language, working at whatever pace the student needs. (Meanwhile, a family’s lawsuit against Character.AI following their teen’s suicide points to the many unknowns and potential risks of exposing young people to AI personalities.)
There is potential for AI as a personal learning assistant, but only with the right tools and appropriate guardrails, says Peter Atherton, a lecturer in the school of education at Liverpool John Moores University in the U.K. His research covers edtech, from the classroom quiz game Kahoot to social media and AI as educational tools. His forthcoming book is about how to use genAI in the learning process.
“We’re on the precipice of a paradigm shift with what learning really is, and we’re not quite there yet,” Atherton said. Like Arena, he sees AI working as a tool to enhance the dialogue between students and their human teachers, rather than a way to keep kids tied to computer screens.
As the AI edtech space develops, Atherton fears that Big Tech names could use their resources to dominate education without considering what’s best for teachers and students as end users. One drawback he noted is that those general-purpose tools tend to open up access to the unfiltered internet, whereas specialized edtech tools are engineered to draw information only from approved, verified sources and come with other guardrails to, for instance, ensure they aren’t generating racist or sexist content. There’s also the question of teaching both teachers and students how to use the new tools, and of ensuring there isn’t an overflow of new programs taking over the classroom.
Proponents of AI’s use in education often frame it as a tool that could help address inequality, giving personalized learning assistance to students whose parents might not be around after school or on weekends to help with homework and whose families can’t afford human tutoring. It could also help overburdened teachers in poorer school districts work more productively. But, of course, whether AI actually functions this way depends greatly on access, which still comes down to money. Districts need funds to purchase the new AI-enabled software and digital textbooks, and students and teachers need laptops, tablets, or smartphones and reliable broadband connections to use the new resources.
“Not a technology thing, it’s a learning thing”
This year, McGraw Hill rolled out its first two genAI tools for the classroom, and neither comes with a flashy interface or a friendly name. There’s an AI Reader for higher ed and a Writing Assistant for middle and high schoolers. Both launched in August and were created with input from teachers, Arena notes, and both are built right into existing McGraw Hill tools that students are already familiar with.
The AI Reader is enabled on 150 of McGraw Hill’s digital titles. A student can prompt it to analyze a passage of text that they’re stuck on, and the AI Reader will rephrase the highlighted section.
It was created after instructors told McGraw Hill researchers that students often come to office hours to ask for a simplification or reframing of what was covered in class readings.
“The feature is primarily not a technology thing, it’s a learning thing,” Arena said. He ties it to the theory of constructivism in learning, or the idea that learners don’t just absorb knowledge, they build it by interacting with it.
“When students are doing active reading, if they can’t access the text, if they can’t quite get this key concept or this weird phrase or whatever, they can’t build that knowledge structure,” he said.
More than 28,000 students have used the tool since it launched, and early data shows that students who use the AI Reader tend to spend 50% more time with their reading. That’s generally a good sign, although it’s too soon to tell if there’s a causal relationship, Arena notes.
Luke Williams, a lecturer at Central Washington University, used the tool in his class. He sees genAI as a game changer for higher education, from supporting student readings to helping instructors design courses. Still, he notes that academic institutions tend to be later adopters of flashy new tech.
And while some professors might be putting off AI adoption, their students are already using it, Williams notes. “It’s important to take a measured approach, but also an approach that is curious and open-minded to what’s coming,” he said.
On top of the AI Reader, McGraw Hill is testing a tool known as Writing Assistant with a limited group of sixth- through twelfth-grade classrooms. It does what its name suggests: it gives budding writers feedback on their assignments as they write. It’s embedded into two of McGraw Hill’s existing digital tools, the Actively Learn and Achieve3000 Literacy programs.
It’s meant to save teachers time on the tough task of giving every student in their class individual support as they all work on an assignment. The Writing Assistant helps students sort out the building blocks of writing, like grammar and punctuation, in real time, so the teacher’s feedback can focus on a student’s ideas, Arena explains.
But Arena doesn’t pretend it’s an antidote to the plethora of chatbots out there.
“If the kids want ChatGPT, they can just open a tab and use ChatGPT,” Arena adds.