
Education's New Challenge: AI and Academic Honesty

2025-09-13 · Aaron Payne · 5 minute read
AI
Education
Academic Integrity

The traditional book report is a thing of the past. In the age of artificial intelligence, take-home tests and essays are rapidly becoming obsolete.

Educators in high schools and colleges report that student use of artificial intelligence is so widespread that assigning writing outside the classroom is practically an invitation to cheat. Casey Cuny, an English teacher with 23 years of experience, states, “The cheating is off the charts. It’s the worst I’ve seen in my entire career.” He adds, “Anything you send home, you have to assume is being AI’ed.”

The challenge for schools now is adaptation. Generations-old teaching and assessment methods are losing their effectiveness as AI technology advances and integrates into our daily lives. This shift is transforming education and creating new confusion over what academic dishonesty even means.

“We have to ask ourselves, what is cheating?” says Cuny, a 2024 California Teacher of the Year recipient. “Because I think the lines are getting blurred.”

To combat this, Cuny's students at Valencia High School now complete most of their writing in the classroom under his supervision. He uses software to monitor their laptop screens, allowing him to lock screens or block specific websites. He is also actively integrating AI into his lessons, teaching students to use it as a study aid rather than a tool for cheating.

Similarly, high school teacher Kelly Gibson in rural Oregon has switched to in-class writing and is using more verbal assessments to gauge students' understanding. “I used to give a writing prompt and say, ‘In two weeks, I want a five-paragraph essay,’” Gibson notes. “These days, I can’t do that. That’s almost begging teenagers to cheat.”

Consider a classic assignment: an essay on the relevance of social class in “The Great Gatsby.” Today, many students’ first step is to ask ChatGPT for brainstorming help. The chatbot can instantly generate essay ideas, examples, and quotes, and even offer to draft entire sections.

Students Navigate a Blurry Line

Many students turn to AI with good intentions, using it for research, editing, or to understand difficult texts. However, the technology presents an unprecedented temptation, making it hard to know where to draw the line.

Lily Brown, a college sophomore, uses ChatGPT to outline essays and to summarize complex philosophical texts. “Sometimes I feel bad using ChatGPT to summarize reading, because I wonder, is this cheating?” she asks. “If I write an essay in my own words and ask how to improve it, or when it starts to edit my essay, is that cheating?”

She notes that class syllabi are often vague, leaving a significant grey area, and that students avoid asking for clarification for fear of being labeled cheaters.

AI policies often vary from teacher to teacher within the same school. For instance, some educators encourage using the AI-powered assistant Grammarly, while others forbid it because it can rewrite sentences. Jolie Lahey, an 11th grader, learned valuable AI study skills in Cuny's class but finds her current teachers have strict “No AI” policies. “It’s such a helpful tool. And if we’re not allowed to use it that just doesn’t make sense,” she says. “It feels outdated.”

A screen displays guidelines for using artificial intelligence above a portrait of Ernest Hemingway in Casey Cuny’s English class at Valencia High School in Santa Clarita, Calif. [Jae C. Hong | AP]

How Schools Are Adapting to the AI Era

After ChatGPT's launch in late 2022, many schools initially banned AI. Now, the perspective has shifted dramatically. “AI literacy” has become a new educational focus, emphasizing a balance between AI's benefits and its risks.

Over the summer, numerous colleges and universities created task forces to develop more detailed AI guidelines. The University of California, Berkeley, for instance, instructed faculty to include a clear AI policy in their syllabi, providing templates for courses that require, ban, or permit some AI use. The guidance noted that without a clear statement, “students may be more likely to use these technologies inappropriately.”

Carnegie Mellon University has seen a significant increase in academic responsibility violations related to AI, often from students who were unaware they were breaking rules. Rebekah Fitzsimmons, an AI faculty advisor at the university, shared an example of a non-native English speaker who used an AI tool to translate his work, not realizing it also altered his phrasing, which was then flagged by an AI detector.

Redefining the Classroom and Academic Integrity

Fitzsimmons notes that enforcing academic integrity has become more complex, as AI use is difficult to detect and prove. This has made faculty more hesitant to accuse students, while students worry about being falsely accused with no way to prove their innocence.

In response, Carnegie Mellon has drafted new guidelines for students and faculty. A blanket ban on AI is considered unviable unless instructors fundamentally change their assessment methods. Consequently, many are abandoning take-home exams, returning to in-class pen-and-paper tests, or adopting “flipped classrooms” where homework is completed during class time.

Emily DeJeu, a communications instructor at the university, has replaced homework writing assignments with in-class quizzes on a “lockdown browser” to prevent students from accessing other sites. “To expect an 18-year-old to exercise great discipline is unreasonable,” she says. “That’s why it’s up to instructors to put up guardrails.”
