The “Both / And” of AI in Education


Written by: Christian Talbot, MSA President (posted Tuesday May 27th, 2025 | 8:05 am)

If you listen to nothing else before the end of the school year, let it be “We Have to Really Rethink the Purpose of Education,” a conversation between Ezra Klein from the New York Times and Rebecca Winthrop, the director of the Center for Universal Education at the Brookings Institution and co-author of The Disengaged Teen.

The conversation will give you enough to think about for the entire summer, but Winthrop’s comments about AI in education especially struck me:

“What should [students] learn? What’s the content? What are the skills?

People always think of education as a transactional transmission of knowledge, which is one important piece of it. But it is actually so much more than that: learning to live with other people, to know yourself and for developing the flexible competencies to be able to navigate a world of uncertainty. Those are the ‘whys’ for me.”

Note that Winthrop acknowledges that school is partly about the transmission of knowledge. Whether or not we like it, the leading AI models can transmit knowledge quite well. If you know any teenagers, ask them whether they use AI to explain topics from Physics, Biology, 19th-century European history, or whatever class they’re struggling with, as if the AI were talking to a 5th grader. It’s a surprisingly effective use case.

We should not repeat the mistakes of 25 years ago, when the advent of Google-powered search led some educators to proclaim that kids no longer needed to learn facts because they could “look it up.” But we know from the Scholarship of Teaching & Learning that students must develop durable knowledge in order to create and secure new knowledge. In an age of increasingly abundant AI, we should think about how, when, and where that transmission and consolidation of knowledge happens.

And it is also true that…

School is mostly about learning through social experiences: Students watch a teacher solve a Math problem, then mimic the teacher’s moves and consolidate their learning through practice with feedback; students join peer study groups to learn from one another; students work in teams on a project and absorb all sorts of lessons about how to frame problems, search for answers, pressure-test ideas, and more.

Dan Meyer, who is something of an AI Cassandra, has consistently said that students want to learn with and from people who show them that their thinking matters. AI cannot show a student that their thinking matters. AI can only simulate that kind of feedback.

An adjacent insight shows up in Michael Horn’s From Reopen to Reinvent, in which he shares his Jobs To Be Done research: Students “want to feel successful and make progress, and they want to have fun with friends.” And they usually want to do these two things together.

Since early 2023, the Middle States strategic plan has declared that we are committed to being “human-driven and AI-informed.” We put the words in that order because many of us are former school leaders and know from experience that education is a profoundly humanistic endeavor. And we cannot afford to pretend that we can put the AI genie back into the bottle. Both of those things are true.

One of the great dangers of this moment of exploding AI is that we will underestimate and underappreciate the extent to which learning is both transactional and social.

And this moment also presents us with a once-in-a-generation opportunity: As Middle States AI Advisory Team member Tom Vander Ark has said, “Young people that understand artificial intelligence are going to help solve many of the big problems that we as a society face. They are going to produce tremendous wealth. They are going to solve many of the health issues that we face. They are going to attack climate change. So monster contributions will come from young people harnessing AI. And I want schools to be alive with that sense of possibility.”

At Middle States, we see that sense of possibility in the 75 schools using our comprehensive implementation framework in “AI Literacy, Safety, and Ethics.” We see that sense of possibility in the 10 schools piloting another comprehensive implementation framework in “Essential Learning Experience with AI” and in the 3 schools piloting our AI Fellows program; in both groups, teachers are conducting action research on how they can create powerful learning experiences with AI. We see that sense of possibility in the teachers and school leaders who have completed AI 101 and AI 201 through the Evolution Academy.

Do you also see that sense of possibility? Does it include both the hard work of knowledge transmission and the inspiring dynamics of social learning?

The best time to start working on that sense of possibility was in late 2022. The second-best time is now.
