by Tanishia Lavette Williams, The Hechinger Report
November 4, 2025
The year I co-taught world history and English with two colleagues, we were tasked with telling the story of the world in 180 days to approximately 120 ninth graders. We invited students to think about how texts and stories speak to each other: “The Analects” as imperial governance, “Sundiata” as the political memory of Mali, “Julius Caesar” as a window on the collapse of a republic.
In winter, our students gave us nicknames. Some days we were a triumvirate. Some days we were Cerberus, the three-headed dog of Hades. It was a joke, but it had a deeper meaning. Our students were learning to make connections by integrating us into the stories they were studying. They constructed a vision of the world and they saw themselves in it.
This kind of teaching, designed to promote critical thinking, was profoundly human. It meant scouring texts for missing voices, adapting lessons to reflect the interests of the students in front of us, and trusting that learning, like understanding, happens slowly. This work cannot be optimized for efficiency.
Today, however, there is a growing push to teach faster. Thousands of New York teachers are being trained to use AI tools for lesson planning as part of a $23 million initiative supported by OpenAI, Microsoft and Anthropic. The program promises to reduce teacher burnout and streamline planning. At the same time, a new private school in Manhattan touts an AI-based model that “rapidly teaches” core subjects in just two hours of instruction each day while deliberately avoiding politically controversial issues.
Touted as an innovation, this simplified vision of education treats learning as a technical output rather than a human process in which students ask difficult questions and teachers cultivate the critical thinking that fuels curiosity. A recent analysis found that AI-generated civics lesson plans consistently lack multicultural content and critical thinking prompts. These AI tools are fast but superficial. They fail to grasp the nuance, care, and complexity that deep learning requires.
When I was a teacher, I often revised lesson plans to help my colleagues refine their practice. Later, as a principal in Washington, D.C., and New York, I realized that lesson plans, the documents connecting curriculum to outcomes, were among the few consistent artifacts of classroom practice. Despite their importance, they were rarely evaluated for effectiveness.
When I wrote my thesis, after 20 years of working in schools, I put the analysis of lesson plans at the heart of my research. Analyzing plans from several schools, I found that the activities and tasks they included were reliable indicators of the depth of knowledge teachers required of students and, by extension, the limits of what students were expected to learn.
Reviewing hundreds of plans made it clear that most lessons offered little more than one dominant voice, limiting both what counted as knowledge and what was considered achievement. Moving plans toward deeper, more inclusive learning required deliberate effort: incorporating primary sources, weaving in multiple narratives, and designing tasks that push students beyond simple recall.
I also found that creating the conditions for such learning takes time. Nothing can replace that. Where this work took root, students made meaning, saw patterns, asked why, and found themselves in the story.
This is the transformation that AI cannot achieve. When educational tools are trained on the same data that has long omitted marginalized perspectives, they do not correct bias; they reproduce it. The developers of ChatGPT acknowledge that the model is “biased towards Western views and works better in English” and warn educators to examine its content carefully for stereotypes and biases. The same distortions appear at the systems level: a 2025 study in the World Journal of Advanced Research and Reviews found that biased educational algorithms can shape students’ educational pathways and create new structural barriers.
Ask an AI tool for a lesson on westward expansion and you’ll get a tidy narrative about pioneers and Manifest Destiny. Ask for a unit on the civil rights movement and you might get a few lines about Martin Luther King Jr., but barely a word about Ella Baker, Fannie Lou Hamer or the grassroots organizers who made the movement possible. Indigenous nations, meanwhile, are reduced to footnotes or omitted altogether.
Curricular erasure, the systematic exclusion or minimization of histories, perspectives, and entire communities, has been ingrained in educational materials for generations. So what happens when “efficiency” becomes the goal? Which stories are deemed too complex, too political, or too difficult to remember?
Related: What aspects of teaching must remain humane?
None of this is theoretical. It is already happening in classrooms across the country. Educators are forced to teach more with less: less time, fewer resources, tighter guardrails. AI promises relief but sidesteps the deeper ethical questions.
Students do not benefit from automatically generated worksheets. They benefit from lessons that challenge them, invite them to grapple with complexity, and help them connect learning to the world around them. That requires deliberate planning and the professional judgment of a human being who views education as a means of eliciting inquiry.
Recently, I asked my students at Brandeis University to use AI to generate a list of individuals who embody concepts such as beauty, knowledge, and leadership. The results, predominantly white, male and Western, reflected what is ubiquitous in textbooks.
My students responded with pointed analysis. One created color palettes to demonstrate the narrow range of skin tones the AI generated. Another developed a “Missing Gender” summary to highlight omissions. It was a clear reminder that students are ready to think critically; they need opportunities to do so.
AI can only do what it is programmed to do, which means it relies on existing, stratified information and lags behind new paradigms. That makes it both backward-looking and prone to replicating bias.
Teaching humanely, on the other hand, requires judgment, attention, and cultural knowledge. These are qualities no algorithm can automate. When we cede lesson planning to AI, we don’t just lose stories; we lose the ability to be in dialogue with them. We lose the critical habits of inquiry and connection that teaching is supposed to foster.
Tanishia Lavette Williams is the inaugural Postdoctoral Fellow in Educational Stratification at the Institute on Race, Power, and Political Economy, a Kay Fellow at Brandeis University, and a Visiting Scholar at Harvard University.
Contact the opinion editor at [email protected].
This story about AI and teaching was produced by The Hechinger Report, an independent, nonprofit news organization focused on inequality and innovation in education. Sign up for the Hechinger weekly newsletter.