
An Educator's Final Verdict on AI in the Classroom

2025-10-19 · 5 minute read
Education
Artificial Intelligence
Critical Thinking

A Change of Heart on Classroom AI

by David Cutler, teacher

I’ve never considered myself anti-technology. I have even written about constructive ways AI can help students when used with proper oversight. But my first experience with the latest generation of generative AI stopped me in my tracks. This technology is different. It's dangerous, and I feel compelled to sound the alarm.

Since the debut of large language models, I've relied on a system to ensure academic integrity. Early in the semester, I would get to know each student’s unique writing voice through smaller, in-class assignments. Only after establishing this baseline would I assign longer take-home papers. This allowed me to recognize the specific rhythm of their sentences, their methods for transitioning between ideas, and even their characteristic detours. If an essay came in that didn't match their established style, I would immediately notice.


When AI Can Perfectly Mimic a Student's Voice

That essential safeguard is now gone. Advanced AI can mimic a student's writing with such precision that I can no longer reliably tell the difference. Given a few of a student's past essays, it can flawlessly reproduce their unique quirks, pacing, and diction, then generate a brand-new paper in that same voice, complete with accurate citations and polished to appear as if it took hours of dedicated work. You can see for yourself in these papers I had ChatGPT-5 produce for my assignments, which I annotated as if a real student had written them.

Even obscure texts or hard-to-find PDFs pose no obstacle. A student can simply upload a source, and the AI will read, analyze, and seamlessly weave it into the assignment. This isn't a secret; students and teachers are becoming increasingly aware of these capabilities.

Why Generative AI Is Different From a Calculator

While many educators are working to integrate AI into their classrooms, I am now convinced that it takes far more than it gives when it comes to teaching writing and critical thinking. What AI offers in efficiency, it erodes in depth. It allows students to lean on a tool before they have developed the habits, patience, and skills that authentic writing demands.

The slow, sometimes frustrating process of wrestling with a sentence, choosing the evidence that best supports an argument, or discovering a new insight mid-draft: these are the moments when thinking happens.

When calculators first appeared in classrooms in the 1970s, many feared they would destroy math education. The concern was that basic numeracy would collapse. While the disruption was real and required new approaches to teaching and testing, there was a key difference: calculators only automated one part of the process—computation. The student still had to understand the problem, choose the correct formula, and interpret the result.

Generative AI is an entirely different kind of disruption. It doesn't just automate one step; it can take over the entire process, from brainstorming and outlining to writing, revising, and citing sources. That isn't replacing one part of the work; it's replacing the thinking itself.

The Solution: Bringing Writing Back into the Classroom

At this point, I see only one viable solution: more in-class writing. I am done assigning large take-home papers. The technology is too advanced, too easy to access, and too aggressively marketed to students. Companies openly promise to help them bypass AI detectors and deceive their teachers, and the latest AI models roll all that functionality into one seamless package.

AI companies sell their technology as a form of empowerment, glamorizing “working smarter,” which in this context often means not doing the work at all. This messaging is particularly tempting for overextended teenagers balancing school with sports, jobs, and college applications.

When a free or inexpensive tool can fully replace the thinking process, it stops being a support and becomes a substitute brain.

Not every student will cheat, but it’s naive to believe that the temptation won't grow. I used to agree with the idea of “ethical AI use,” but my position has changed. So, what is the path forward?

In my high school history classes, every significant writing assignment will now happen in the classroom. We will use either old-school blue books or a secure digital platform that allows me to monitor work in real time. Students will draft, brainstorm, and revise alongside their peers, with large assignments broken into smaller, visible stages. This may mean I cover less content, but the trade-off is necessary.

The Real Cost of AI: Intellectual Independence

The goal isn’t to ban tools but to ensure students can think without them. A computer can produce an essay in seconds, but what truly matters is whether a student can explain its ideas, adapt them in a debate, and defend them when challenged. When students outsource their thinking to a machine, they lose more than academic integrity—they lose intellectual independence. We risk raising a generation that trusts algorithms more than their own judgment.

Photo courtesy of David Cutler

This piece was originally featured on Medium and has been edited for length.

About the author

David Cutler teaches American history, government and journalism at Brimmer and May, an independent school in Chestnut Hill, Mass. His writing has appeared in the National Association of Independent Schools, PBS News Hour, Edutopia, The Atlantic and Independent School Magazine. David created the Private School Journalism Association in 2019 and is its executive director. Follow David on Twitter @spinedu.
