Giving Fake Answers to a Cheating Site
Plus, despite half of Cambridge students using ChatGPT and the school saying it's misconduct, the school has zero cases of misuse. Plus, cheating on nursing exams in India.
Issue 214
To join the 3,303 smart people who subscribe to “The Cheat Sheet,” enter your e-mail address below. New Issues every Tuesday and Thursday. It’s free:
If you enjoy “The Cheat Sheet,” please consider joining the 13 amazing people who are chipping in a few bucks a month via Patreon. Or joining the 12 outstanding citizens who are now paid subscribers. Thank you!
A Professor Asks “Am I the Unethical One?” I Answer: No.
Recently, an online outlet called Daily Nous ran a story featuring a professor of philosophy at Sacramento State University who, the report says:
recently caught 40 of the 96 students in his online Introduction to Ethics course cheating on a take-home final exam.
First, an ethics course. Ethics.
Then, in actuality, it’s certain that more than 40 of these 96 students cheated. I’ll get to that in a beat.
But before that, the biggest reason I’m sharing this article is because the incident stirred up quite the little dust storm of foot stomping and pearl clutching on social media. The furor is not about the cheating, of course - we never seem to care about that. Instead, the comical condemnation is over how this particular professor caught his cheating students.
Finding copies of a previous final exam on the online cheating site Quizlet, he had them removed and uploaded his own version of the exam, complete with obviously wrong answers. When students submitted those same fake answers on the exam, bingo. Nabbed.
A few in social media orbit called the professor’s stratagem of seeding known cheating sites with false information “entrapment.”
To be as clear as I know how to be, that's absurd. If "entrapment" just means being caught in a trap, sure - the students were truly and accurately caught. But I suspect that those throwing the word around mean it in its more common sense, which involves enticing someone to commit an illicit act they would not otherwise have committed. That simply did not happen here.
No one was enticed to visit known cheating sites such as Quizlet and search for answers. No one was pressured to download or review what they found. No one was cajoled to use those answers on a test. Every single one of those steps was taken voluntarily and with intent - especially the last one.
What the professor did is not entrapment; it is smart.
Seeding cheating sites with false information sounds like a very worthy endeavor, if you ask me. Armed with the knowledge that someone was profiting from his intellectual and academic property, this professor not only had it removed, he also acted to devalue future theft.
In the process, he taught a valuable lesson on doing your own work, not believing junk you find online, and integrity. And teaching is what teachers are supposed to do, I think. Maybe especially in an ethics class.
Moreover, as word of fake, teacher-planted answers on Quizlet gets around, maybe fewer students will be tempted by it. I cannot see how that’s anything but good.
To further the point, had the same students found the exact same wrong answers from the exact same website, and we didn’t know how they got there, we would not be having this conversation.
In fact, we should not be having this conversation because this pedantic kerfuffle over giving fake answers to a cheating profiteer obscures the genuinely illicit acts - the brazen cheating. Dozens of students sought out and likely paid for test answers and used them. End of paragraph.
What else is there to talk about?
Quickly, since I teased it already, the 40 students busted for using Quizlet answers on the final were almost certainly not all of the cheaters. For one, they are only the ones caught using one particular cheating site. The same misconduct services are very likely available at Chegg and Course Hero and dozens of other sites. A student who used anything other than Quizlet to submit cheated answers probably was not caught.
For another, the professor set a very high statistical bar before accusing students of cheating - the odds of a student submitting that many identical wrong answers by coincidence had to be less than 1 in 100. To be swept up in this net, a student had to submit at least 19 of the 45 incorrect, seeded answers. In other words, a student who used just a few of the false Quizlet answers, rather than most of them, would not have been caught. Cheated, yes. Caught, no. Counted in the 40, no.
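To get a feel for how conservative that 19-of-45 bar is, here is a minimal back-of-the-envelope sketch - my assumptions and my arithmetic, not the professor's actual calculation. If we suppose a non-cheating student had, say, a one-in-four chance of stumbling onto any given seeded wrong answer on their own, the binomial tail probability of matching 19 or more of the 45 by coincidence is tiny:

```python
# Back-of-the-envelope check (illustrative numbers only): how likely is it
# that a non-cheating student matches 19 or more of the 45 seeded wrong
# answers purely by coincidence?
from math import comb

n = 45      # seeded wrong answers planted on Quizlet
k = 19      # matches the professor required before accusing anyone
p = 0.25    # assumed chance of hitting any one seeded answer by accident

# Binomial upper tail: P(X >= k) when X ~ Binomial(n, p)
tail = sum(comb(n, i) * (p ** i) * ((1 - p) ** (n - i)) for i in range(k, n + 1))
print(f"Chance of {k}+ coincidental matches out of {n}: {tail:.5f}")
```

Under those assumed inputs the chance comes out at a fraction of a percent, comfortably below the 1-in-100 threshold - which is the point: nobody lands on 19 planted answers by accident.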
So the 40 students who were caught - 41 percent of the 96 - were absolutely not all who cheated. Which means it's likely that most of this class committed academic fraud.
Maybe that’s the conversation we ought to be having.
Do I think what our professor did - seeding a known cheating site with obviously fake information - was ethical?
Absolutely.
In my view, he should never have given a take-home final exam, especially in an online class. But to curtail the inevitable cheating that goes with it, what he did is not just justified, it should be obligatory. Frankly, I wish more teachers would do it.
In response to the ridiculous charge that he was entrapping his students, the professor told Daily Nous:
These charges don’t make sense to me. I did not encourage or nudge my students to cheat, I did not do anything to make such cheating more likely or easier. Quite the opposite: I tell all my students what will happen if I catch them cheating, and I gave them a comprehensive study guide for the final.
There’s simply nothing more I can say.
Half the Students at University of Cambridge Use ChatGPT for Schoolwork, University Reports Zero GPT Misconduct Cases, None
The Tab, a student-aligned paper at the University of Cambridge, has a story this week with a dispiriting, though entirely predictable, headline:
Zero Cambridge students investigated for ChatGPT cheating despite half admitting using it
49 per cent of students said they used it to help complete work for their degree
Yup.
The article continues:
The Office of Student Conduct, Complaint and Appeals (OSCCA) confirmed there have been no investigations using the Student Discipline Procedure.
However, a survey of 540 Cambridge students conducted by The Cambridge Tab found almost half (49 per cent) had admitted using the AI chatbot to help complete work for their degree. A similar poll by Varsity, equally found almost half of Cambridge students said they had used chatbots for similar means.
I left that link in because I’m not sure we covered the Varsity survey on its own. In any case, that’s two surveys showing GPT usage by Cambridge students at about 50%. Both use the phrasing, “to complete work for their degree.”
Meanwhile, according to the Tab, the University:
wrote to students reminding them of the “strict guidelines on student conduct and academic integrity” in which “students must be the authors of their own work.”
Good job, Cambridge. That will do it. Nicely done.
Most students, the paper reported, acknowledged using the AI bot for outlining or refining topics, which is not surprising. Few people will ever admit to the most blatant forms of cheating and fraud.
It’s “useful to condense information to aid understanding of complicated concepts,” claimed a first year computer science student.
I’m sure it is. Though, silly me, I thought that condensing information to understand complicated concepts was one of the reasons someone went to a college such as Cambridge. Any college, actually.
The article contained the usual caveats about ChatGPT's information being "useless" or "unreliable." Again, half the students at Cambridge admit to using it. And again, great job, Cambridge, for doing, it appears, nothing at all.
The Tab piece also includes this odd non sequitur:
It seems that ChatGPT may be one of the things that Cambridge students actually moderate their use of, employing it more as a tool for learning, at their discretion, rather than for cheating.
Based on what, I am not exactly sure - but OK. I’m also not exactly sure that’s not cheating. And that’s despite the article also including this statement from the University:
Academic misconduct, by its nature, includes students using AI in summative assessments, unless explicitly permitted by the assessment.
If that’s true, the paragraph above makes no sense. And the lack of even a single inquiry into the use of AI-writing in academic work is inexplicable.
Another Cheating Scandal Brewing in India - This One on Nursing Exams
I’ve said before, if I really wrote about academic cheating in India, I’d never be able to write about anything else.
Part of that is, I sense, that authorities in India talk about cheating more often, and that the scale of both education and education misconduct there is simply enormous. Scale invites cheating.
So, with that, India is facing a new, big cheating scandal, according to news reports.
This time it’s alleged that the country’s national nursing recruitment exam fell to rampant cheating of all varieties. The news says answers were circulated online before the exam, impersonators were hired, the whole thing.
The exam is an entrance exam for which candidates had to be eligible. If students cheat to get into nursing programs, it’s rhetorical to ask how ethical they will be once they get in. So, cheating. In nursing. Nice.