AI Boosts Student Grades But Risks Critical Thinking
Co-authored by Xiaoyan Dong, Hannah Farrell, and Michael Hogan.
Artificial intelligence (AI) is fundamentally reshaping the landscape of education and skill development. In higher education, a growing number of students are turning to AI tools like ChatGPT to help with their assignments. This trend raises important questions: How are students and AI truly collaborating, and what interaction patterns are shaping modern learning?
The Performance Paradox
A recent meta-analysis by Deng et al. (2025) reviewed 69 studies and concluded that ChatGPT improves academic performance, emotional states, and higher-order thinking tendencies, all while reducing the mental effort required. However, this rosy picture was challenged by Weidlich et al. (2025), who pointed out significant flaws in the analysis.
Critics noted that the meta-analysis improperly pooled highly diverse studies, ranging from memorizing medical terminology to writing essays. Furthermore, the control groups were inconsistent, and most studies measured performance during AI use rather than actual long-term learning or knowledge retention. The finding of reduced mental effort is particularly alarming. As Roman (2025) warns, over-reliance on AI could erode students' critical thinking skills by reducing cognitive engagement. A study by Luther and colleagues (2024) offers a closer look at these concerns.
How Students Really Use ChatGPT
Thinking and writing are deeply intertwined activities central to learning. The Luther et al. (2024) study analyzed how 135 students used ChatGPT during a timed essay-writing task: composing a 600–1,000-word essay on alcohol prohibition in public spaces. All screen activity, including every prompt and AI response, was recorded.
The findings revealed a concerning pattern of dependency and surface-level engagement. While the number of prompts varied widely, the vast majority (95.4%) were content-related, with students primarily using ChatGPT as an information source for data and facts. Shockingly, 40.5% of prompts were requests for complete texts, a behavior strongly correlated with copy-pasting. The study also found that students with a higher affinity for technology and existing ChatGPT users were more likely to ask the AI to write entire sections for them.
While students expressed high levels of trust in ChatGPT's competence and reliability, they rated it low on warmth as a collaborator, highlighting a functional rather than interactive relationship.
Implications for Modern Educators
The study by Luther and colleagues mirrors a common scenario in higher education. It shows that familiarity and trust in AI do not prevent over-reliance or plagiarism. ChatGPT's outputs are generated from vast datasets and require critical cross-verification, not immediate acceptance.
Educators must actively teach students to maintain academic skepticism and scrutinize AI-generated content through iterative questioning. Instead of simply prompting for facts, students can learn to use ChatGPT as a thinking partner to foster critical cognition. Well-designed learning activities can encourage students to question, refine, and build upon AI responses, transforming the tool from a simple information provider into a catalyst for deeper learning.
The Path Forward Is in the Design
The key to leveraging generative AI effectively lies not in the tool itself, but in how we design the tasks it is used for. A forthcoming study by Zhang and colleagues (2025) illustrates this well. In their experiment, only one member of each five-person debate group could access ChatGPT. This created a 'gatekeeper' effect: the group held that student accountable for filtering and questioning the AI's suggestions before sharing them. This social pressure fostered critical engagement and positioned the students as orchestrators of the AI, not just consumers of its output.
Ultimately, adopting a design-focused mindset is crucial. By carefully considering our pedagogical goals, we can create learning experiences where AI augments, rather than replaces, human intellectual engagement.