ChatGPT has schools playing defense. This artificial-intelligence language bot was developed by OpenAI (“GPT” stands for “Generative Pre-trained Transformer”). Microsoft, Google, and other Big Tech companies are developing their own versions of this technology. ChatGPT is able to take a few bullet points and expand them into a rough essay. Students can then polish the AI-generated draft, or simply turn in the raw text. Students who would once have received a gentleman’s C can now use AI to cheat their way to a better grade.
To fend off cheating via chatbots, universities are expanding the surveillance panopticon. They are rehiring the private cheating-detection companies they first turned to during Covid, when students left the classroom. These companies use machine-learning models to count how many times per minute a student blinks or shifts his gaze while taking an online exam, then compare that figure to the rate expected of a student doing his own work. In this way, they hope to detect whether the student is using ChatGPT to complete the exam.
Ultimately, the potential for AI-assisted cheating is a symptom of a deeper problem that predates the technology. Students cheat because they believe that the grade they receive in a class, and the degree they receive at the end of four years, is more valuable than the material they’re allegedly there to learn. This mindset has been encouraged by bloated administrations that focus on amenities and credentialing and tout a customer-oriented approach. They hire an army of adjuncts to administer class-like procedures, checklists of requirements to be fulfilled, with no real reflection on what is being taught or how the students are being formed.
The cheating began with university administrators, when they started to substitute a credentialing process for an actual commitment to the formation of a particular kind of student. It’s no surprise that college administrators have already been caught using ChatGPT themselves—after all, many university mission statements are bland enough to be algorithmically generated from a few bullet points and buzzwords. They don’t exist to animate the mission of the university but to fill up enough column inches on a website.
At Vanderbilt University's Peabody College, administrators were even caught turning to ChatGPT to write a condolence note to the student body in the wake of a mass shooting at another school. The administrators apologized, but upon examining the note, it’s easy to see why they turned to AI. The blandness of “we must reflect on the impact of such an event and take steps to ensure that we are doing our best to create a safe and inclusive environment for all” is the work of a human trying to erase any trace of humanity.
Sadly, that is the kind of formation that many schools are trying to offer. If schools are primarily dedicated to producing workers, rather than holistic human beings steeped in the liberal arts, then this is the right kind of formation for those who want what David Graeber termed “BS jobs.” These jobs are more about maintaining the illusion of productivity than producing anything of value. They often appear to be run according to the principles of the Simple Sabotage Field Manual, a handbook prepared by the Office of Strategic Services to help ordinary citizens obstruct the Nazis in case of occupation. (Sample tip: “Attempt to make committees as large as possible—never less than five.”)
The work at BS jobs can’t bear too much scrutiny. Up close, workers will admit to the governing principle: “We pretend to work, and they pretend to pay us.” At BS schools, students, teachers, and administrators are engaged in the same farrago of false industry. They can keep up the appearance of a classroom, but it’s hollow inside, just as ChatGPT can write your essay or your love letters without any sentiment.
When I taught statistics at The King’s College, I always gave the requisite anti-cheating speech, and then I added my own addendum: “If you cheat in this class, the best-case scenario for you is that you aren’t caught, and that your grade goes up a little in this class—a class that for most of you isn’t part of your major. That will shift your GPA a couple tenths higher than it would have been, and, after your first job, no one will ever ask you for your GPA again. To buy those temporary tenths, you’ll have practiced running roughshod over your conscience, making it easier to ignore that still small voice the next time.”
I knew that my students weren’t necessarily math enthusiasts, so I also made the case for why they needed to master the material I would teach them. Statistics allows us to help a friend or a parent in a medical crisis. It can be easy for the firehose of medical babble to become the “WAH WAH WAH” of the adults in the Charlie Brown cartoons. It takes a little fluency with statistics to be able to listen, ask questions, and care for the person you love. I wanted my students to desire that capacity, and to question me until they were confident they’d achieved it.
Professors can point cameras at students and pat them down for contraband in closed-book tests. But they won’t beat ChatGPT until they ask for something that only the students can give. Professors need to persuade students that the content of their classes has real value. Administrators need to commit to transforming students into whole human beings, not credentialed cogs in a machine.