Study: More than Half of College Students Say They Cheated in Online Exams During Covid
Plus, an error corrected. Plus, Chegg loses more money, stock surges. Plus, firefighters in Georgia busted for cheating.
Issue 230
To join the 3,441 smart people who subscribe to “The Cheat Sheet,” enter your e-mail address below. New Issues every Tuesday and Thursday. It’s free:
If you enjoy “The Cheat Sheet,” please consider joining the 14 amazing people who are chipping in a few bucks a month via Patreon. Or joining the 15 outstanding citizens who are now paid subscribers. Thank you!
New Research: 54% of Students Admitted Cheating on Online Exams During Covid
Newly published research from Philip M. Newton (no relation) and Keioni Essex of Swansea University (UK) reviewed published studies and surveys of self-reported academic misconduct during online assessments among college and university students. The numbers they report ought to end the discussion as to whether cheating is common and whether it increased during online, pandemic-mandated assessments.
Here is the headline finding:
Online exam cheating was self-reported by a substantial minority (44.7%) of students in total. Pre-COVID this was 29.9%, but during COVID cheating jumped to 54.7%.
Those numbers represent:
25 samples were identified from 19 studies, including 4672 participants, going back to 2012.
Keeping in mind that self-reporting of illicit conduct underestimates actual rates, and that many students don’t consider their actions “cheating” even when they clearly violate integrity policies, these numbers should be the loudest alarm bell anyone could ring in academic settings.
So much so that they are worth repeating: nearly three in ten students reported cheating on online tests and exams before the Covid-19 disruptions, and nearly 55% said they cheated during them. Fifty-five percent.
The study did not include any data on post-Covid misconduct rates, so we have no idea whether the confessed rates of cheating have receded. Based on numbers I’ve seen, they have not.
But again, the 55% baseline is probably low. The authors write:
Most samples were collected using designs which makes it likely that online exam cheating is under-reported
The new paper includes several other findings that are, in my view, well worth sharing.
When trying to ascertain why students were cheating in online assessments, the authors reported:
the most common reason students reported for cheating was simply that there was an opportunity to do so.
Yup. Students said they cheated because they could.
I know many people like to console themselves and others that academic misconduct is the product of stress or competition or panic. But the truth is simple. A sizeable set of those who engage in misconduct do so because they can. The process is high reward and very low risk.
Related, the new study also concludes:
Remote proctoring appeared to reduce the occurrence of cheating, although data were limited.
And, citing one piece of research specifically:
The most common form of cheating was the same in both groups (‘web search during an exam’) and was reported by 39.8% of students in the unproctored group but by only 8.5% in the proctored group
We already knew this, but maybe it’s beneficial to see it in writing again.
The study also found that individual cheating on online exams was more common than group misconduct. They found that honor codes don’t work, which I agree with. They found that warnings of discovery and penalty may work, which I think makes sense. They also found merit in open-book tests as a foil for misconduct, about which I am skeptical.
Nonetheless, the numbers - that more than half of college students admit to cheating on online tests during Covid - are the story. So concerned are the authors about this level of misconduct that they say, bluntly:
Future approaches to online exams should consider how the basic validity of examinations can be maintained, considering the substantial numbers of students who appear to be willing to admit engaging in misconduct.
Chegg Revenue Continues to Fall, Stock Surges Anyway
Cheating giant Chegg released its quarterly earnings report yesterday, telling investors that it continues to lose money and lose subscribers. On the news, Chegg stock immediately jumped more than 20% because the company did not falter as much as was expected.
For those who may be new to Chegg, the company is traded on the New York Stock Exchange. Its shares are held by a number of high-finance institutions, including teacher pension funds. No, I am not kidding (see Issue 142).
Anyway, because Chegg is public, we have insight into how profitable cheating has been and how the market dynamics are changing. In this newest earnings release, we learned a few things.
According to the press release, Chegg:
Had total net revenues of $182.9 million for the quarter - a decrease of 6% year-over-year
Saw subscription services revenues decrease 5% year-over-year to $165.9 million, or 91% of total net revenues, compared to 90% in Q2 2022
Had 4.8 million subscription services subscribers, a decrease of 9% year-over-year
Translated - Chegg is making 6% less money than it did a year ago and their subscriber base has eroded 9%. And they’re becoming even more dependent on their answer-on-demand subscription service. It’s now 91% of their revenue.
Chegg also suggested their third quarter revenue would be “in the range of $151 million to $153 million.” In other words, less. Some of that is the summer slump, when fewer students need to cheat on tests they’re not taking. But the trend line is not subtle.
For more on Chegg’s stock issues, see Issue 206.
But let’s not be quick to gloss over the fact that Chegg collected $183 million in just three months. Cheating is big business. And Chegg is big business. They say they have 4.8 million subscribers.
Two other nuggets from Chegg’s release. One, Chegg says:
students see ChatGPT and Chegg as complementary with very different use cases.
Chegg cites a survey saying “Gen Z students” are “not comfortable with the exact information ChatGPT puts out” and adds:
it’s become clear to us that a simple, high quality, accurate, personal learning assistant, is needed and we feel we are uniquely positioned to deliver a world class personal learning assistant.
Two tells here. One, Chegg essentially concedes that their service and ChatGPT are similar, and they’re right. Both will just tell you the answer with little to no work or comprehension required. The only difference is that ChatGPT is free, though it lies. Two, and importantly, Chegg feels that if they can fact-check or fact-correct GPT, they have a viable business proposition.
For what it’s worth, I think they’re right. I think this is the niche that essay mills will eventually accept as well.
Finally, according to their report, Chegg says:
we are building our own large language models which gives us the ability to train them specifically for education. Our LLMs will be trained with our unique data sets, and with the help of our 150,000 subject matter experts.
This means Chegg’s AI will probably not only be more accurate than GPT, but it will also be trained to write answers in forms and formats that fit with test questions. Every exam or homework question that Chegg has answered once will be on regurgitation rotation, ready to assist another cheater - and at a fraction of the cost. If they pull it off, that’s good news for Chegg, bad news for educators. After all, there are only so many questions you can ask about the War of 1812 and Chegg is training its own AI to answer them. For a fee, of course.
The bottom line is that Chegg is dying. But not quickly. And at $150 million to $180 million every quarter, they have time to pivot - to find a new, more lucrative way to profit from academic fraud.
Iowa State Corrects Headline on Online Exam Study
In the last Issue of “The Cheat Sheet” we shared the findings of a study showing that, as expected, test scores were higher when exams were given online and unsupervised.
The study concluded that assessments given this way nonetheless had value because, by and large, the rank order of individual students did not change much - good students did well even if cheating was rampant. I pointed out at the time that, if cheating is widespread, I think we’re collectively missing the point of an assessment. And, for that matter, teaching at all.
I also mentioned that press coverage of the research totally blew it. Iowa State University, where the researchers are, headlined the report findings as:
Researchers find little evidence of cheating with online, unsupervised exams
Which was amazingly wrong.
I wrote to the editors at the Iowa State publication to be sure they were aware of the error, and, to their credit, they changed the headline. It now reads:
Study finds unsupervised, online exams can provide valid assessments
True. Make of the results what you will, at least that is what they were.
Good for Iowa State. The record is important. And what was probably an honest mistake was certain to be repeated elsewhere.
Seven Firefighters in Georgia Demoted, Suspended for Cheating Certification Exam
Seven firefighters in the Atlanta, Georgia area have been suspended without pay and/or demoted for their roles in cheating on a state certification exam.
From the coverage:
Four lieutenants were demoted to engineers, and two engineers and one firefighter were suspended without pay after violating the department’s code of conduct.
The firefighters participated in a training course and exam to become certified instructors on specific fire apparatus. While in the exam room, the lead instructor and the test-takers discussed the answers to the questions, officials found.
Quoting the Fire Chief:
The punishment is severe but should send a message that this department will not tolerate any breach of ethical behavior.
Messages matter because risk and reward calculations matter.
This also isn’t the first time we’ve seen cheating among firefighters and cadets - see Issue 124, Issue 213, or Issue 4.