Education

College students are cheating. Professors are struggling to stop them.

More students are using AI to complete assignments.
(Photo Illustration: Philip Robibero / AJC | Source: Getty)

Last fall, professor Catherine Nickerson went old school. For the first time since 1999, she gave her students at Emory University a blue book exam.

In the age of artificial intelligence, the blue book — sheets of lined paper stapled into a blue booklet — seems an ancient relic. Yet Nickerson and university professors across Georgia are embracing them, not out of nostalgia but necessity.

For years, Nickerson gave her students weekly online quizzes. “But that started to become a problem when everybody was getting 100% on every quiz,” she said. “It led me to believe there was not fully honest test taking going on. So I decided to go to this.”

Faculty at many of Georgia’s largest universities are having similar experiences. They say more college students are cheating. They’re using AI to do it. And for now, reverting to the in-class, handwritten exam is one of the few ways professors can stop them.

“Students at all universities are using generative AI to complete assignments,” said Amy Bruckman, a computer science professor at Georgia Tech. “I don’t think the general public understands how desperate the situation is.” Bruckman and other professors interviewed for this article spoke on their own behalf and not for their universities.

It’s hard to quantify exactly how much AI cheating is happening, in part because, to the consternation of many professors, it’s so hard to catch. But an academic honesty report from the University of Georgia shows that last school year, the number of cheating accusations nearly doubled. And for the first time, the annual report included an AI datapoint: 51% of the incidents were “alleged to be related to the use of artificial intelligence.”

While Georgia Tech says it has not seen an increase in what it calls “student integrity referrals,” referrals alleging AI use have more than tripled since the 2023-24 school year.

The issues go far beyond Georgia schools. A February report from College Board found that 92% of faculty nationally are concerned about AI being used for plagiarism and academic dishonesty.

While AI detection programs exist, a memo from Emory’s honor council says there is no solid consensus about their accuracy. It notes that Emory does not currently license a program and advises faculty to treat their results with caution.

It also lists some common indicators of AI use, including a “student’s inability to discuss their own work.”

Cheating the old-fashioned way, like writing answers on your hand or glancing at a friend’s test, at least required some effort. “Now with AI, once the test is on your laptop, you can just turn your brain off,” said Andrew Wang, a graduate student studying computer science at Georgia Tech. He said he avoids the temptation because he fears getting caught. But as he’s seen from classmates, the temptation can be strong, especially in a time crunch.

“You check your watch, and you think, there’s no way I’m going to get this done,” said John Knox, a meteorology professor at UGA. “And meanwhile, the supercomputer that is in your hand or sitting on your desk is just begging you and saying, ‘Oh, I can do that so easily.’”

In one of Knox’s classes, the best students “did not have anywhere near the highest grades because they were honest.” They would earn a 95% on a test honestly, while students who knew the material less well would score 100%.

“When grades start to invert that way,” he said, “it’s time to blow the whistles, stop the presses, throw the parking brake and figure out a new way to do this.”

In fall 2023, one year after the release of ChatGPT, Tony Brozowski sat near the back of an intro course at Kennesaw State University and saw many of his classmates had AI programs on their computer screens.

“I think that class is when I noticed, like, this is definitely a problem,” he said, adding that more than two years later, some faculty are still struggling to adapt.

“I think there’s a sense of helplessness engulfing certain professors,” said Isaac Smith, a junior at Emory. Instead of trying to police AI cheating, he thinks faculty should change their class structure by including more discussions and peer-to-peer interactions. If faculty don’t somehow evolve their assignments, he suspects students will keep using AI to complete them.

Bruckman, the Georgia Tech professor, spent last summer reimagining her curriculum, making it more project-based and having students do more field work. While it’s been successful, it also took her a lot of time, and Bruckman said some classes are easier to redesign than others.

Josh Beck, a Ph.D. student, had to rethink how he runs the undergraduate classes he teaches at Georgia State University. For the first time, he added in-class, written exams to his curriculum.

It appears to be a popular strategy. Emory’s bookstore said it’s seen a “noticeable increase” in blue book purchases. At UGA, blue book sales are up approximately 125% from last year.

But Beck says they’re “wildly inconvenient” to grade. For professors teaching many classes, grading hundreds of tests by hand may not be feasible.

Forbidding AI entirely likely isn’t a good option, either. Not only would the policy be difficult to enforce, but many expect students will be asked to use AI when they enter the workforce.

“When we get out into the real world, my boss doesn’t care if I use ChatGPT to generate an email. My boss is going to care how quickly I can do my job,” said Smith. When he uses AI for assignments, he meticulously cites how he used it so he avoids violating Emory’s honor code.

AI can be a tremendous learning tool, said Georgia Tech senior lecturer Jeff Epstein. But if students use it to cheat, it’s difficult for faculty to assess if their students are actually learning. In the short to medium term, he expects faculty will need to do more in-person exams.

For long-term solutions, Bruckman believes there needs to be federally funded research to address what she believes is a crisis. “Individual faculty can make little patches, but we don’t have time or resources to really fix it,” she said. “What we need is for someone to throw a half a billion dollars at it and say, ‘We need real research to solve this problem and to tell people how to teach in the presence of AI.’”

Considering the College Board found 84% of high school students report using GenAI tools for schoolwork, it’s a problem higher education will continue to face. And Epstein says it raises much bigger issues.

“Which is not how we do university, but why we do university? In other words, what is the point of teaching students these things if a machine can do it for them faster and better?” he said. “And that’s going to be also a very difficult conversation.”

About the Author

Jason Armesto is the higher education reporter for the AJC.
