As students at Suffolk University sit through their beginning-of-semester syllabus speeches, many have noticed a different attitude toward the use of artificial intelligence, specifically large language models, in the classroom. The most notable change: some professors now allow students to use AI so long as they cite exactly how and where in their work they used the program.
The number of students admitting to using ChatGPT for their coursework is not small. Abba Connally, a senior history and literature major, said she has “seen students from all departments using AI,” and noticed “a lot of students who admit to using it, [even] for creative writing.”
“The increasing lack of academic integrity, and the lack of care and effort to complete assignments [is discouraging],” said Tessa Carlucci, a sophomore law major at Suffolk.
Carlucci said this discouragement stems from seeing people use AI and get away with it while other students work and study hard without it.
It isn’t only students who are noticing a change in the university’s attitude.
In the College of Arts and Sciences, Dean Edie Sparks is requiring each department to appoint an AI Fellow. According to Jason Tucker, a writing professor and one of two AI Fellows in the English department, the position is meant to inform faculty of “the ways in which they might incorporate some AI use into their classes and think about the ways in which their current assignments might be vulnerable to some bad uses of AI by students in ways that are counterproductive to their learning.”
Jonas Kaiser, assistant professor in the Communication, Journalism & Media Department, said that he and his colleagues have noticed students turning in work made primarily with the use of AI.
Kaiser said that AI has been “suggested to students” and recalled that during the finals period of the last academic year, AI companies offered students deals for access to their programs. Kaiser also said that despite the marketing of AI as something that could create from scratch, it was never meant to be used that way.
Moby Asch, a junior psychology major, said that students he’s interacted with are often unaware of the environmental threat AI poses and said that “it’s a tricky situation.”
Kaiser said it was the responsibility of the companies themselves, rather than students, to be wary of the effects the technology has on the planet.
Asch said that he believes large companies’ “overuse of natural resources” is typically monitored more by the average person than by the companies themselves, and that it is still up to the general public to be conscious of how they interact with these programs.
In 2024, Dean of Sawyer Business School Amy Zeng shared in a University feature story that “students must learn to use AI strategically while honing vital human qualities, such as judgment, critical thinking and ethical reasoning” in reference to the business school’s Artificial Intelligence Leadership Collaborative. The program’s mission is to integrate “artificial intelligence into business education in practice,” according to the program’s website.
Relying on a chatbot to complete or generate assignments such as essays goes against the very intention behind the assignment, in Tucker’s view.
“An essay, by definition, comes from an old French word, a verb, meaning ‘to try, to test, to weigh, to determine, to measure,’” said Tucker. “An essay is not an argument, it is, ‘I’m going to think this through on paper, and you, the reader, are going to participate in my thought process.’”
That thought process, which is designed to build a student’s skills, is cut short when a chatbot offers to do the thinking instead, according to Tucker.
Jeremy Levine, a professor of production in the Media and Film department, recalled an idea from media theorist Marshall McLuhan: when a new technology is developed and people start relying on it, they lose the very skill that technology is helping them with.
AI is such a recent development within our society that we cannot be entirely certain what this amputation of skills is, or what it means for us. Levine said that within the film industry, there is the threat of displacing workers such as writers and storyboard artists, whose work is being absorbed by AI.
“The smart thing to do is to slow down, to play with things, to see their potential. And think about its impact on you and society,” Levine said.
Scott Votel, the second AI Fellow for Suffolk’s English department, said that while AI has been used in schools over the last couple of years, there has been more movement recently toward a “sophisticated use” of the technology.
That being said, Votel said he is “not sure how we continue to charge the tuition that we charge” if “cheating is this easy and [undetected].” He questioned the validity of embracing programs that students paying that price have used to outsource the very work higher education is meant to foster over the course of several years.
Both Votel and Tucker warned of a development they referred to as “outsourced cognition,” which occurs when a person relies on AI to think for them. Instead of using the natural, human tools they have to build up their learning skills, a person pours that potential into an LLM rather than themselves; this is where the “outsourcing” begins, they said.
Votel and Tucker said that by asking a chatbot to analyze something, a student denies themselves the opportunity to grow, robbing themselves of a process that often begins in environments like Suffolk, places designed to nurture minds, not hand them shortcuts that will cost them later in life.
Kaiser used the analogy of a person going to the gym but asking a bodybuilder, one prone to making mistakes at that, to work out for them. This person would be handing someone else their own potential and likely wasting their own time. The same is true of someone relying on or blindly trusting large language models, according to Kaiser.
For Votel and Tucker, the conflict between embracing technological advances safely and still prioritizing students and their education needs to be tackled first by taking in student input.