
AI Cheating Epidemic Hits UK Universities

2025-06-15 · Michael Goodier · 6 minute read
AI · Education · Academic Integrity

The Shifting Landscape of Academic Integrity

Thousands of university students across the UK have been caught misusing ChatGPT and other artificial intelligence tools in their academic work in recent years. A Guardian investigation reveals that this growing trend coincides with a significant decline in traditional forms of plagiarism.

The data highlights a rapidly evolving challenge for universities as they attempt to adapt assessment methods to the rise of technologies like ChatGPT and other AI-powered writing aids. Before the widespread availability of generative AI, around 2019-20, plagiarism accounted for nearly two-thirds of all academic misconduct. This intensified during the pandemic when many assessments moved online. However, as AI tools have become more sophisticated and accessible, the nature of cheating has transformed.

AI Cheating By The Numbers

A survey of academic integrity violations uncovered almost 7,000 proven cases of cheating using AI tools in the 2023-24 academic year, equivalent to 5.1 cases for every 1,000 students. This figure is up sharply from 1.6 cases per 1,000 students in 2022-23.

Early figures up to May of the current academic year suggest this number will increase again, potentially reaching about 7.5 proven cases per 1,000 students. However, experts caution that these recorded cases represent only the tip of the iceberg.

Concurrently, confirmed cases of traditional plagiarism fell from 19 per 1,000 students in 2022-23 to 15.2 in 2023-24. This is expected to fall further to approximately 8.5 per 1,000, according to initial data from the current academic year.

[Chart: proven misconduct cases per 1,000 students. Plagiarism rises from 2019-20 to 2022-23, then drops back, while AI-related misconduct rises from 2022-23 to almost the same level as plagiarism. Other misconduct remains fairly stable.]

Universities Playing Catch Up With AI Misuse

The Guardian contacted 155 universities under the Freedom of Information Act, requesting figures for proven cases of academic misconduct, plagiarism, and AI misconduct over the last five years. Of these, 131 provided some data, though not every university had records for each year or category.

Significantly, more than 27% of responding universities did not yet record AI misuse as a separate category of misconduct in 2023-24, indicating that the sector is still coming to terms with the issue.

The Challenge of Catching AI Cheats

Many more cases of AI cheating may be going undetected. A survey by the Higher Education Policy Institute in February found that 88% of students used AI for assessments. Last year, researchers at the University of Reading tested their own assessment systems and successfully submitted AI-generated work without detection 94% of the time.

Dr Peter Scarfe, an associate professor of psychology at the University of Reading and co-author of that study, stated that while there have always been methods for cheating, AI poses a fundamentally different problem for the education sector. He commented, “I would imagine those caught represent the tip of the iceberg. AI detection is very unlike plagiarism, where you can confirm the copied text. As a result, in a situation where you suspect the use of AI, it is near impossible to prove, regardless of the percentage AI that your AI detector says (if you use one). This is coupled with not wanting to falsely accuse students.”

Dr Scarfe added, “It is unfeasible to simply move every single assessment a student takes to in-person. Yet at the same time the sector has to acknowledge that students will be using AI even if asked not to and go undetected.”

Student Perspectives and Evasion Tactics

Students seeking to use generative AI undetected have ample online resources. The Guardian found dozens of videos on TikTok advertising AI paraphrasing and essay writing tools. These tools claim to help students bypass common university AI detectors by “humanising” text generated by tools like ChatGPT.

Dr Thomas Lancaster, an academic integrity researcher at Imperial College London, noted, “When used well and by a student who knows how to edit the output, AI misuse is very hard to prove. My hope is that students are still learning through this process.”

Harvey*, who recently completed his final year of a business management degree, said he used AI to generate ideas, structure assignments, and suggest references. He believes most of his peers use AI to some extent. “ChatGPT kind of came along when I first joined uni, and so it’s always been present for me,” he said. “I don’t think many people use AI and then would then copy it word for word, I think it’s more just generally to help brainstorm and create ideas. Anything that I would take from it, I would then rework completely in my own ways.” He also said he knew someone who used AI to produce work and then used other AI tools to make the text sound human-written.

Amelia*, who just finished her first year of a music business degree, said she used AI for summarising and brainstorming. She highlighted that AI tools had been particularly useful for students with learning difficulties. “One of my friends uses it, not to write any of her essays for her or research anything, but to put in her own points and structure them. She has dyslexia – she said she really benefits from it.”

AI as an Accessibility Tool

The potential benefits of AI for students with learning difficulties were also noted by the science and technology secretary, Peter Kyle. He told the Guardian recently that AI should be deployed to “level up” opportunities for dyslexic children.

Tech Giants Target the Student Market

Technology companies appear to be targeting students as a key demographic for their AI tools. For instance, Google offers university students a free upgrade of its Gemini tool for 15 months, and OpenAI provides discounts to college students in the US and Canada.

Rethinking Assessments in the Age of AI

Dr Lancaster commented on the student perspective: “University-level assessment can sometimes seem pointless to students, even if we as educators have good reason for setting this. This all comes down to helping students to understand why they are required to complete certain tasks and engaging them more actively in the assessment design process.”

He added, “There’s often a suggestion that we should use more exams in place of written assessments, but the value of rote learning and retained knowledge continues to decrease every year. I think it’s important that we focus on skills that can’t easily be replaced by AI, such as communication skills, people skills, and giving students the confidence to engage with emerging technology and to succeed in the workplace.”

Government Navigates AI in Education

A government spokesperson said the government was investing more than £187m in national skills programmes and had published guidance on the use of AI in schools.

They said: “Generative AI has great potential to transform education and provides exciting opportunities for growth through our plan for change. However, integrating AI into teaching, learning and assessment will require careful consideration and universities must determine how to harness the benefits and mitigate the risks to prepare students for the jobs of the future.”

Names have been changed.
