Gone are the days when complaining about an essay on Twitter meant a bot replied offering to sell you a ghostwritten paper. Now the plagiarism game can be much simpler: students can just ask OpenAI’s ChatGPT to write their essays for them. Or so they think.

Last year, OpenAI, an artificial intelligence company, launched ChatGPT, a tool that takes a submitted prompt and answers it to the best of its ability. While such a tool might be helpful and ethical in some business and personal contexts, students across the nation are using ChatGPT to write their papers for them.

The GPT in ChatGPT stands for Generative Pre-trained Transformer: the model draws on all the human language and data it can collect from the internet to write a response the way a human would. As users give feedback on its responses, it learns and adapts to answer prompts better. However, it frequently gets things wrong in sophomoric style, and professors can tell.

“Now, for those students who ask, ‘Oh, well they won’t notice,’ yeah, we do,” said Dr. Zachary Koppelmann. “It doesn’t write that well. Yes, it answers the question, but oftentimes the content is wrong. And it’s really pretty obvious, especially if you’re trying to do anything long.”

Like professors and higher-education leaders across America, Wabash faculty are worried about students’ potential use of AI on their papers.


“Papers are important because they ask students to delve deeper into a topic than they would in class discussion or shorter questions on exams,” said Dr. Joyce Burnette, Chair of Division III. “ChatGPT could potentially make it more difficult to accomplish that goal if students can turn in papers without investing much thought. This would mean that students wouldn’t get the quality education that we advertise.”

Dr. Brian Tucker, Chair of Division II, echoed that sentiment. “My humanities perspective is that learning to write well is about learning to read well and think well,” said Tucker. “I would hope that Wabash students wouldn’t want to short circuit that process by having AI produce papers for them.”

Some professors have discussed tools they could use to catch students who use ChatGPT, as well as ways to stump the tool in the first place.

Registrar and Dean Jon Jump has advocated for a checker that claims to identify AI-generated writing, but some professors are skeptical, saying it is too inconsistent to rely on.


Dr. Shamira Gelbman, Chair of the Political Science department, has advocated that, rather than relying on a tool to catch students, professors write their questions in ways that are impossible for AI to answer. She has found that questions explicitly asking students to reflect on their personal experience or on specific material from class stump the AI. Because ChatGPT is a language model that simply aggregates data and regurgitates a plausible response, it cannot answer from experience, and it cannot generate new, genuine reflections on material.

At the same time, even as professors across the nation act as if the sky is falling, there seem to be some potential benefits to technology like this.

“The most legitimate uses are for students for whom English is not their first language, or for students who came from very underprivileged schools,” said Dr. Koppelmann. “Academic writing at a college level has certain intrinsic expectations. And that privileges students who go to schools who have more money.”

Koppelmann thinks that ChatGPT can serve to teach students from underprivileged backgrounds how to write in certain ways, such as demonstrating argument structure or the ins and outs of English that are usually learned only through experience.

Still, ChatGPT is incapable of writing well enough to pass classes (even setting aside that using it is cheating) because it cannot come up with arguments of its own. It may serve a purpose similar to Wikipedia, giving someone a basic overview of a subject and an idea of where to start with a paper, but its inability to form arguments renders it unusable for the paper itself.

“Rather than using something like this, they should use the Writing Center,” said Koppelmann.

Koppelmann also made sure to condemn a new tool that Microsoft has rolled out in Word, called the Word Grade Level Checker, which claims to assign your writing a grade based simply on its complexity.

For as long as students have had assignments, they have attempted to cheat on them, whether by looking over each other’s shoulders or paying someone online to write their essays. If anything is clear from the emergence of ChatGPT as a tool for lazy students, it is that they will be caught, and that professors will find new ways to test them.