AI on Campus: The Duke University Experiment
The rapid rise of generative artificial intelligence tools like ChatGPT has sent ripples across the academic world, forcing both faculty and students to navigate a new and uncertain landscape. Since OpenAI's ChatGPT debuted in 2022, universities have been grappling with how to adapt. While some educators rushed to establish guidelines against academic dishonesty, others began to see the potential of AI as a powerful learning aid.
Duke University Launches Campus-Wide AI Initiative
Stepping directly into this debate, Duke University has launched a significant pilot project with OpenAI. As of June 2, all Duke undergraduate students, along with staff, faculty, and professional school students, have been granted free, unlimited access to ChatGPT-4o. In a move to address privacy concerns, the university also introduced DukeGPT, a university-managed AI interface designed to provide robust data protection while connecting users to learning and research resources.
To formally examine the technology's impact, Duke established a Provost’s Initiative on May 23. This group is tasked with fostering a campus-wide conversation about the opportunities and challenges AI presents to student life, with a comprehensive report expected by the end of the fall 2025 semester.
A Divided Faculty: The Educator's Dilemma
The response from Duke's faculty has been anything but uniform. While some professors are actively embracing AI in their classrooms, others have instituted strict bans, citing concerns over its effect on students' problem-solving and critical thinking skills.
David Carlson, an associate professor of civil and environmental engineering, represents the more lenient approach. In his machine learning course, he permits students to use generative AI, provided they are transparent about it. "You take credit for all of (ChatGPT’s) mistakes, and you can use it to support whatever you do," Carlson stated, noting that while imperfect, AI can offer valuable secondary explanations of complex topics.
Similarly, Matthew Engelhard, an assistant professor of biostatistics and bioinformatics, encourages the interactive use of AI tools. He likens them to calculators, warning that over-reliance can be detrimental. "My approach is not to say you can’t use these different tools," Engelhard explained. "It’s actually to encourage it, but to make sure that you’re working with these tools interactively, such that you understand the content." He cautioned that using AI as a shortcut might be "short-circuiting the learning process for yourself."
From STEM to Humanities: A Tale of Two Disciplines
The perspective shifts dramatically in the humanities. Thomas Pfau, a distinguished professor of English, argues that delegating learning to AI means students lose the ability to evaluate information critically. "If you want to be a good athlete, you would surely not try to have someone else do the working out for you," Pfau said. He believes that while AI has a role in STEM, it has no place in the humanities, where interpretation and the development of a personal voice are paramount. Using AI to finish an essay, he argues, defeats the core purpose of a university education: cultivating one's personhood.
Henry Pickford, a professor of German studies and philosophy, echoed this sentiment. He believes writing in the humanities is a tool for "self-discovery" and "self-expression." With AI, he fears students will view writing as merely "discharging a duty" rather than engaging in an intellectual challenge. He added that the technology has broadened opportunities for plagiarism, leading him to adopt a stringent AI policy.
Adapting Assessments for the AI Era
Many professors admit they struggle to detect whether a student has used AI on a standard assignment. This challenge has pushed them to explore alternative assessment methods.
Professor Carlson has introduced oral presentations to his class projects, describing them as "very hard to fake." Professor Pickford has also incorporated more oral assignments, such as the spoken defense of an argument, and has added in-class exams to courses that previously relied only on papers.
"I have deemphasized the use of the kind of writing assignments that invite using ChatGPT because I don’t want to spend my time policing," Pickford noted. He did concede, however, that AI could be useful for generating feedback on a paper's structure.
The Student Perspective: A Powerful but Perilous Tool
Students generally view AI chatbots as a helpful supplement to their learning but are also wary of becoming over-reliant on them.
Junior Keshav Varadarajan uses ChatGPT to outline his writing and generate code. "It’s very helpful in that it can explain concepts that are filled with jargon in a way that you can understand very well," he said. However, he admitted it can be hard to internalize concepts because the tool jumps from problem to solution without showing the process.
Conrad Qu, another junior, described ChatGPT as a "tutor that’s next to you every single second," crediting it with improving his productivity. Both students agreed that AI is useful for tasks they are less interested in or when facing a time crunch. But for subjects they genuinely care about, they prefer to do the work themselves. "If it is something I care about, I will go back and really try to understand everything (and) relearn myself," Qu said.
An Unsettled Future: Is AI a Net Positive for Learning?
As generative AI continues to evolve, a consensus on its role in higher education remains elusive. The debate over whether its benefits outweigh its costs is ongoing at Duke and beyond.
"To me, it’s very clear that it’s a net positive," said Professor Carlson. "Students are able to do more... It makes a lot of things like coding and writing less frustrating."
Professor Pfau is far less optimistic. He worries that future students will arrive at college already too accustomed to chatbots, putting those who don't use them at a competitive disadvantage. Ultimately, he believes the responsibility lies with the students to use these tools in a way that fosters genuine intellectual growth.
"My hope remains that students will have enough self-respect and enough curiosity about discovering who they are," Pfau concluded, "...something we can only discover if we apply ourselves and not some AI system to the tasks that are given to us."