Cheating More than Doubles at Penn
Plus, the educators signing up to help Course Hero. And Bloomberg Opinion dishes on AI exam proctoring.
Issue 128
To subscribe to “The Cheat Sheet,” just enter your e-mail address below:
To share “The Cheat Sheet:”
If you enjoy or support my work, please consider chipping in a few bucks via Patreon:
Misconduct Cases More than Double at Ivy’s Penn
It did not appear to be publicized much, but the University of Pennsylvania (Penn) released an annual report on academic and other misconduct. It covers the 2020-21 year and shows that cases of academic misconduct more than doubled last year.
Penn is nowhere near alone in this. I have not run the exact numbers, but something along the lines of 95% of schools that have released data on academic misconduct have watched incidents jump - often to two or three times historic norms.
Before getting to the numbers, good for Penn.
Releasing data about misconduct is essential to an informed conversation and reasoned solutions. It’s also key because, as noted previously (see Issue 127), there are those who either don’t believe misconduct is increasing or are literally invested in dismissing it.
According to the report, cases of cited academic misconduct at Penn went from an average of 221 for the three years before 2020-21, to 465.
Penn’s report is good in that it not only cites totals, it breaks down the types of conduct. Some of those categories are a little fuzzy - “cheating” vs “misconduct during an exam” for example.
Here are some of those breakdowns, with the previous three-year average followed by the reported number for 2020-21.
Cheating, from 55 to 222
Unauthorized collaboration, from 47 to 117
Plagiarism, from 76 to 39
Unfair advantage over fellow students, from 4 to 31
Facilitating academic dishonesty, 25 cases (no change)
Misconduct during an exam, from 20 to 24
I don’t know how much weight I’d put on these individual groupings because, as mentioned, I’m not sure what the difference is between “unfair advantage” and “misconduct” or “cheating.” Further, Penn says one student - a respondent - can be implicated in more than one type of incident.
And, for the record, cases of reported plagiarism have declined four years in a row, so this year’s drop looks more like the continuation of an existing trend than anything new. I’m not sure exactly why it’s declining.
Nonetheless, when you see numbers move from 47 to 117 or from 55 to 222 - or when you watch the top-line number of student respondents go from 221 to 465 - there is a trend there.
The Penn report is also good in that it breaks down the sanctions or outcomes of the cases too. If academic integrity is your thing, it’s definitely worth reviewing. Noteworthy to me from the sanctions section are:
Probation: 93
Reprimand: 53
Warning: 27
Suspension: 5
Expulsion: 0
Notation on transcript: 0
Make of that what you will.
For a note on last year’s misconduct conditions at Penn, see Issue 66.
To repeat though, good for Penn.
If you know of an institution that publishes academic integrity reports such as this one, I’d love to know. Please drop me a note.
Bloomberg Columnist Says AI Proctoring Needs a Babysitter
The recent column from Bloomberg writer Parmy Olson was about AI more generally, though it did touch on proctoring in some detail.
Olson wrote specifically about ProctorU, one of the larger exam proctoring providers, noting that it:
recently conducted an investigation into how colleges were using its AI software. It found that just 11% of test sessions tagged by its AI as suspicious were double-checked by the school or testing authority.
Lack of review of “flagged” exam incidents has been an issue for a long time. By not reviewing incidents, institutions usually either stamp the flags as cheating - which is not great - or leave actual exam misconduct unaddressed - which may be worse.
The column continues that the AI can be wrong. That’s true. But the column is wrong when it says:
In February, one teenager taking a remote exam was wrongly accused of cheating by a competing provider, because she looked down to think during her exam, according to a New York Times report.
As examined in Issue 122, that was not true. The student in question was not wrongly accused, she just said she didn’t cheat. Though she probably did. It was not just the AI that flagged her; her teacher and the school’s dean reviewed the test session and agreed misconduct occurred. In other words, though she denies it, she was found to have cheated. That’s not “wrongly accused.”
But that’s not the point here.
After reviewing how other sectors and organizations use AI, Bloomberg’s Olson says:
The answer isn’t to stop using AI systems but rather to hire more humans with special expertise to babysit them. In other words, put some of the excess trust we’ve put in AI back on humans, and reorient our focus toward a hybrid of humans and automation.
The column returns to ProctorU:
To its credit, Alabama-based ProctorU made a dramatic pivot toward human babysitters. After it carried out its internal analysis, the company said it would stop selling AI-only products and only offer monitored services, which rely on roughly 1,300 contractors to double-check the software’s decisions.
That part is true, see Issue 28. And as mentioned at the time, that was a pretty big deal. A year later, someone at Bloomberg thinks so too.
More Educators Sign On to Back Cheating Provider
As we’ve shared before, cheating provider Course Hero loves to share a stage with teachers. It’s a consistent part of their public relations image, however confusing and disingenuous it may be (see Issue 44 or Issue 99).
What continues to amaze me is that educators continue to return the love - standing with, standing up for Course Hero. It amazes me because it’s no secret what Course Hero is and what they do. I mean, the jury has been back on this issue for a long time now (see Issue 97 or Issue 42).
Be that as it may, Course Hero is assembling yet another grotesque online event with teachers, this one in late July. Scheduled to appear with cheating provider Course Hero are:
Michael Sorrell, President of Paul Quinn College
J. Luke Wood, San Diego State University
Mary McNaughton-Cassill, University of Texas, San Antonio
Kelly Pope, DePaul University
Catherine Ross, University of Texas, Tyler
Laura Summers, University of Colorado, Denver
Gaye Theresa Johnson, University of California, Los Angeles
Ben Wiggins, Shoreline Community College
Stephanie Speicher, Weber State University
Fabiola Torres, Glendale Community College
Jonathan Chin, Medgar Evers College, CUNY
A few of these are, shall we say, repeat offenders, having apparently enjoyed hanging out with Course Hero enough to come back for more.
Every time Course Hero does this I say some version of the same thing - that I feel really terrible for the students who see their institution listed or their professor’s photo under the Course Hero logo and think using Course Hero is legitimate or allowed. Those students are going to get into trouble. Led there by cynical marketing and educators who should know better.