346: I Used to Teach Students. Now I Catch ChatGPT Cheats.
Plus, regulators in Australia ask whether security issues require online degrees to come with a warning label. Plus, the video I tried to share at ICAI. Plus, jobs.
Subscribe below to join 4,414 (+57) other smart people who get “The Cheat Sheet.” New Issues every Tuesday and Thursday.
The Cheat Sheet is free, although patronage through paid subscriptions is what makes this newsletter possible.
I Used to Teach Students. Now I Catch ChatGPT Cheats.
Troy Jollimore is an award-winning author and professor — usually, he says, in philosophy and ethics. This week he has a long and vital piece in The Walrus, a Canadian magazine.
Were I a teacher, it would be required reading. I highly, highly recommend it and feel as though I could have written it. If teaching and learning and integrity are important to you, please click and read it. The portions I share here are inadequate; short of simply reprinting it, I can do it little justice.
The headline up top — I Used to Teach Students. Now I Catch ChatGPT Cheats. — is the title of Jollimore’s essay. But you’ll get more context from the subheadline:
I once believed university was a shared intellectual pursuit. That faith has been obliterated
Why? Yes, cheating — cheating with AI in particular.
I know how he feels. You may, too.
Part of his description of new AI includes:
No fuss, no muss. If your instructor doesn’t know what to look for, or if the AI you are using is good enough, you can convince them you have mastered the topic without needing to learn anything about it at all.
And:
Sometimes they will provide quotations, giving page numbers that, as often as not, do not seem to correspond to anything in the actual world. These, again, are the ones that are easiest to pick out. I have little doubt that there are others that are more sophisticated, and that some get past me.
He continues:
To judge by the number of papers I read last semester that were clearly AI generated, a lot of students are enthusiastic about this latest innovation. It turns out, too, this enthusiasm is hardly dampened by, say, a clear statement in one’s syllabus prohibiting the use of AI. Or by frequent reminders of this policy, accompanied by heartfelt pleas that students author the work they submit.
Preach.
Statements and reminders and pleas are wasted breath. I’m sorry — they are.
Then there is this, emphasis original:
I once believed my students and I were in this together, engaged in a shared intellectual pursuit. That faith has been obliterated over the past few semesters. It’s not just the sheer volume of assignments that appear to be entirely generated by AI—papers that show no sign the student has listened to a lecture, done any of the assigned reading, or even briefly entertained a single concept from the course.
It’s other things too. It’s the students who say: I did write the paper, but I just used AI for a little editing and polishing. Or: I just used it to help with the research. (More and more, I have been forbidding outside research in these papers for this very reason. But this, of course, has its own costs. And I hate discouraging students who genuinely want to explore their topics further.) It’s the students who, after making such protestations, are unable to answer the most basic questions about the topic or about the paper they allegedly wrote. The students who beg you to reconsider the zero you gave them in order not to lose their scholarship. (I want to say to them: Shouldn’t that scholarship be going to ChatGPT? )
If you read The Cheat Sheet or work in academic integrity, you know exactly what he’s talking about. Exactly.
Please read the whole piece.
I’m going to decouple these next four paragraphs, because I think it’s easier to read them and for me to comment, as briefly as I can. Here is the first:
I could, of course, turn a blind eye to it and give every paper the benefit of the doubt, no matter how unlikely it might be that it was written by a human hand. I could say, as some of my colleagues do, that I trust all of my students, and that it is better to trust and sometimes be wrong than to be suspicious and right. I could go along with all of the people who have said to me: It’s not your job to force people to be virtuous. Let cheaters cheat. In the long run, they only hurt themselves, denying themselves the real education they could be gaining (and for which, after all, they or their parents are paying a fair bit of money).
Yes, teachers actually suggest ignoring cheating. They feel good, I suspect, about using ideas such as trust to camouflage their lack of rigor. Or integrity.
Continuing:
This is a lovely set of thoughts, and it would make my life a good deal easier if I could bring myself to embrace it. But I can’t. (I’m an ethics professor, for crying out loud.) For one thing, while it is true that the cheaters are cheating themselves out of an education, it does not follow that they are harming only themselves. There is no getting around the fact that a central part of my job is deciding who passes and who fails. Some of my students still do write their own papers. They still do the readings. They want to learn. (And then there are those who don’t but have enough integrity that they follow the rules nonetheless.) My papers are hard to write, and the readings can be hard to understand; it’s philosophy, after all. These students are working. The effort they put in is real, and I appreciate and admire it.
As I — and many others — have written, cheaters harm considerably more people than themselves. Honest students are easy to pinpoint as examples. But the circle of harm around academic misconduct is wide and deep.
Paragraph three:
It would be an egregious wrong to reward both sets of students equally, to force the honest students to compete for jobs, for scholarships, for admission to graduate schools on an equal footing with those who have “earned” their grades by submitting work that is not theirs.
Agreed. An egregious wrong.
And:
Of course, should such cheating continue to be widespread, it seems inevitable that all college degrees will be worth a good deal less and will perhaps cease to offer any advantage at all. But this outcome is surely of little comfort to those willing to work for their degrees, those who want those degrees to continue to be worth something and to mean something.
Yes, all college degrees will be worth a good deal less and perhaps cease to offer any advantage. That is what we’re talking about.
As we watch it happen in slow motion — right now — we can blame the students for taking shortcuts and avoiding the work. But there are adults in the room, at least in theory. When those teachers and schools shrug cheating away, they are causing the demise of college every bit as much as those doing the cheating, blowing up the entire value proposition of organized education in the process.
In my view.
But whatever. Ignore me. But perhaps hear what Jollimore has to say. I do think it’s worth it.
Australian Regulator Asks Whether Online Degrees Should Come with a “Warning Label”
TEQSA, the higher education regulator in Australia, reportedly asked if, due to concerns about assessment security and AI use, college degrees obtained online should be required to indicate the mode of attainment.
To set the stage: TEQSA is likely the most engaged and aware education quality regulator anywhere, especially when it comes to issues of integrity. The agency has erased cheating websites from the Internet, actively engaged and educated teachers about threats, and even sued Chegg (see Issue 314). TEQSA is, in my view, where most education quality assurance entities should be.
With that, the coverage is not very detailed, as it seems reporters were not present at the meeting. What was shared includes that TEQSA:
reportedly stated that the integrity of assessment of online degrees could no longer be guaranteed and that degrees studied online would therefore need to be tagged with what is effectively a warning label.
It’s clear that the query was not a policy proposal — at least not yet. The regulator raised the question to spark debate and conversation.
Still, the coverage reports:
the fact that the national regulator could raise the inability of institutions to prevent cheating in exams for online courses as an issue has raised eyebrows.
Because:
any suggestion that assessment of online degrees is perceived (or measured) to be inferior to that of face-to-face or blended models has huge implications for both staff and students.
No question — any mark or designation that segregates an online degree from one earned in the traditional way would have massive implications. To undersell it by 20,000 leagues, those are implications that schools do not want, having invested so much in, and become so reliant on, online programs.
Still, the topic of whether online assessments, and therefore online programs, can be secured, is timely. Recently, I have come to believe that online assessments are unsecurable.
Mitigations can, and should, be made. Lockdown browsers, assessment proctoring, secure writing environments — they all help, but are still insufficient. Even with these interventions, misconduct rates in online classes remain higher than they are in in-person settings (see Issue 36). And unfortunately, too many schools view mitigation and security efforts as luxuries or, at best, case-by-case deployments. Some schools don’t use them at all.
Schools will have to decide, and fast, whether they have the capacity to invest in securing their online assessments, classes, and programs or whether they want to sell education credentials with no value. Going to the wall to secure online assessments does not guarantee their value or validity. But not going to the wall does guarantee the lack thereof.
I sense we are past the point of pretending otherwise. That reality must be acknowledged, and the hard questions — such as the one TEQSA posed — must be asked.
The Video I Tried to Share at ICAI
When I spoke at the ICAI conference last week, I tried to share a video in my presentation.
I didn’t tell anyone that I planned to play a video; I just forgot to mention it. So, it did not work. That was my goof.
So, here it is.
It’s worth the two and a half minutes. In my opinion.
I’ll share more on the conference itself in a future issue or two.
Until then, I’m sorry. This video may be depressing.
Jobs
Caveon
Caveon is advertising for in-person exam security officer positions in Florida and Louisiana.
Georgia Tech
Georgia Tech is looking for a Student Integrity Coordinator (Hybrid). The posting says the person in the position will:
Coordinate actions associated with allegations of misconduct as defined through the Student Code of Conduct and Academic Honor Code.
Colorado State University
CSU is looking for an Executive Director of the Student Resolution Center, which:
is responsible for overseeing the university’s student conduct process and a myriad of conflict resolution processes, focusing on student success and retention.