Chapman University Debuts New PantherAI Platform
Chapman's Answer to the University AI Dilemma
Universities are navigating the complex world of artificial intelligence, balancing fears over academic integrity with the need to stay technologically current. As concerns have reached a boiling point, Chapman University has introduced its own solution: PantherAI.
Available to all students and faculty, PantherAI is a centralized hub built on the open-source platform LibreChat. This system provides users with access to multiple powerful AI models, including GPT-4o, Claude, and Gemini, all in one place.
The Strategy Behind PantherAI: Cost, Security and Flexibility
A key driver for this initiative was financial prudence. Unlike the California State University system, which recently spent $17 million on a ChatGPT license, Chapman's approach is more sustainable. According to Interim Vice President and Chief Information Officer Phillip Lyle, the university pays a small amount based on consumption.
"If the campus doesn’t really need it, we’re not really paying anything for it," Lyle explained. "Our budget is a tiny, tiny fraction of what it would cost for an equivalent site license with ChatGPT."
Lyle, who spearheaded the project, emphasized that the decision was based on flexibility, security, and cost-efficiency. Since there is no contract with the LibreChat system, Chapman can pivot to other solutions if better alternatives emerge. Furthermore, the platform is designed to mitigate the data privacy risks associated with creating accounts on commercial AI sites.
"If you’re not careful in reviewing those agreements… it is possible that the data you are putting in will be used to train (the AI model)," Lyle warned. He assured that nothing entered into PantherAI by students or faculty will be used to train any language models. The fact that Harvard University is a partner with LibreChat further solidified Chapman's confidence in the system.
Student Skepticism: A Question of Trust
Despite the university's intentions, some students remain wary. Edsel Louis Tinoco, a freshman creative writing major and an early adopter of ChatGPT, agrees that universities should embrace AI. However, he has no plans to use PantherAI.
"I would be worried that Chapman would use my data to accuse me of cheating, because nothing is stopping them from doing so," Tinoco stated.
Reading the Fine Print: What the Terms of Service Reveal
Tinoco's concerns are not unfounded. The PantherAI terms of service contain language that could raise red flags for users. It explicitly states: “Usage, including prompt information and responses, are recorded and stored within PantherAI. As is the case with all Chapman-owned systems — it is possible that authorized Chapman Information Systems and Technology (IS&T) staff members may see prompt or response details as part of a troubleshooting or debugging process.”
Furthermore, PantherAI is subject to the university's Computer and Network Acceptable Use policy. This policy reserves the university's right to retrieve data from university-owned systems to comply with investigations or respond to subpoenas. The potential for administrative review, even for troubleshooting, is a significant deterrent for students.
The University Responds to Privacy Concerns
Phillip Lyle acknowledged the potential for ambiguity in the policy language and suggested it may be revised. He stressed that IS&T will only review data to fix system errors or when legally compelled. He also voiced strong opposition to using the platform for disciplinary purposes.
"I would have significant concerns if our department was asked to use the system data to help justify an accusation of cheating," Lyle said. He clarified that cheating detection was never a factor in the decision to deploy PantherAI.
This comes as the Chapman administration, led by Dean of Students Jerry Price, is increasing its focus on academic integrity amid a rise in cases of unethical AI use. That crackdown fuels the skepticism of students like Tinoco, who feel it is safer to assume their data could be used against them.
Lyle stated he could not definitively say that student data would never be requested for an academic integrity case, but he reiterated his personal and departmental opposition to such a use. "At no point was detection of cheating ever discussed as a reason for the deployment of PantherAI," he affirmed.
The Path Forward for AI at Chapman
Currently, there's a gap between the tool's availability and faculty expertise. While workshops have been offered, several professors reported not having enough knowledge to comment on the new system. This highlights a need for more robust training and guidance.
Ultimately, the success of PantherAI may hinge on whether the university can build trust. For now, students face a choice: use the university-provided tool, with its privacy assurances and lingering policy questions, or turn to external services. Lyle maintains that PantherAI is the safer option. Chapman plans to continue updating its AI guidelines, which may bring more clarity and reassurance to its community.