The 2023 List of Shame - Educators Who Stood with Course Hero
Plus, EdSurge on "false positives" in AI similarity detection. Plus, an AI job in Australia, the good kind of AI. Plus, corrections.
Issue 226
To join the 3,383 smart people who subscribe to “The Cheat Sheet,” enter your e-mail address below. New Issues every Tuesday and Thursday. It’s free:
If you enjoy “The Cheat Sheet,” please consider joining the 14 amazing people who are chipping in a few bucks a month via Patreon. Or joining the 14 outstanding citizens who are now paid subscribers. Thank you!
Course Hero List of Shame - 2023
As you probably know, in June, major cheating provider Course Hero held its annual event, at which it tries to convince people it's a legitimate education company.
As covered frequently, the company attempts to do that by getting educators to participate. And while it’s not clear whether those who join are paid (some are), educators are, it seems, hoodwinked into compromising their integrity by appearing with Course Hero. Or, alternatively, they just don’t care. Whether the institutions that employ them care is an open question. There is no evidence they do.
Also, as mentioned before, I don’t contact these teachers to ensure they know what they’re doing and who they’re doing it with. I think it’s best that, when those notes do come, they come from colleagues. Or administrators. Though now I am clearly dreaming.
Without further delay, here is the list of educators who loaned or sold their credibility to Course Hero this year, based on the final published online agenda. The 2022 list is here, in Issue 143.
The 2023 Course Hero List of Shame:
Michael London - Muhlenberg College
Stephanie Malmberg - SUNY Broome (Course Hero says she’s a Dean, by the way)
Stephen M. Kosslyn - Foundry College
Brian Steinberg - SUNY Geneseo
Stephanie Speicher - Weber State University
Bonnie Slavych - Missouri State University
Gabriella McBride - New York University
Cynthia van Golen - Delaware State University
Crystal White - The University of Memphis
Bridget Turner Kelly - University of Maryland
Jeremy Logsdon - Western Kentucky University
Jen Newton - Ohio University
Yolanda Sealey-Ruiz - Teachers College, Columbia University
Michael Horn - Harvard University (lecturer)
Jeremy Caplan - CUNY Newmark
Matthew Farber - University of Northern Colorado
Joab Corey - University of California, Riverside
Siovahn Williams - Carthage (that’s all it says)
Detra Price-Dennis - Ohio State University
Michael G. Strawser - University of Central Florida
Teresa Cusumano - Lehigh University
William Hardaway - California State University, Fresno
Shanina Sanders Johnson - Spelman College
Anne Arendt - Utah Valley University
Renée Cummings - University of Virginia
Also of very special note here is that it looks as though Edward Tian, the founder of the dubious AI similarity detector GPTZero, did officially join Course Hero at the event (see Issue 218).
And Course Hero does now employ Shea Swauger - the guy who compared online exam supervision with “anti-immigration policies, selective breeding programs, marriage restrictions, forced sterilization, murder, and genocide.” Oh, and Nazis. Almost forgot. He also wrote an entire blog post titled “Cheating Doesn’t Matter” (see Issue 219).
Educators from the schools listed above appeared with him. And with Course Hero. Shame on them.
EdSurge Reports on “False Positives”
A few weeks ago, EdSurge ran a story on “false positives” in the systems that detect AI similarity in written work.
I put “false positives” in quotes because, in nearly all cases, the results are not false. The detection systems look for material that reads like AI - not material that is AI.
If you’re a really average writer, a really predictable wordsmith, these systems may flag your work. They’re right. You write like a bot. Which is why every single detection-maker advises that the results should not be used on their own to determine authenticity.
Anyway, the story is exceptionally important because it cites some pundit on something that matters:
Some chafe at the term false positive — arguing that, because flags raised by these detectors are meant to be used as a start of a conversation, not proof, the term can give the wrong impression. Academic integrity watchdogs also point out that a dismissal of cheating charges does not mean no misconduct occurred, only that it wasn’t proven. Google Docs may be an important tool in establishing authorship for students accused of plagiarism in the future, argues Derek Newton, author of the academic integrity newsletter The Cheat Sheet.
Obviously, I do chafe at the term.
And also, true - the dismissal of a charge of misconduct does not mean there was no misconduct. And Google Docs or other tamper-proof, work-tracking systems are going to be more and more important in this new era.
That Newton guy seems pretty insightful.
An AI Job in AUS
There’s a job open in the good AI - academic integrity. In Australia.
Charles Sturt University is looking for a “Manager, Academic Misconduct and Integrity,” according to the job posting.
It’s a new office, new post. And the pay looks nice. All of which - pretty good. Good for the school for investing in integrity.
My resume is in the mail.
Department of Corrections and Class Notes
The last issue of “The Cheat Sheet” had a few errors in it. Nothing factual, as far as I know. Mostly a typo or two and the occasional “account” instead of “accountant.” That kind of thing.
I do try to make this newsletter perfect. Or at least as good as it can be. So when errors sneak past me, I am embarrassed and I do apologize.
I also need some help. If you’re able and available to proofread or copy edit “The Cheat Sheet” - able to return comments and corrections in a few hours twice a week - please let me know. I would be grateful. A reply e-mail reaches me.
While I have you, if you have news or information related to academic integrity that you’d like to share with readers, please let me know that too. Research, insights, product information, policy changes, even comments - I am pretty flexible. And we now have a readership/subscribership of nearly 3,400 with per-issue readers of more than 2,000 - people active and interested in academic integrity. So, if you care to reach them, reach out.