Research: More than Half of Engineering Sections Compromised on Chegg
Plus, coverage of foreign students in Australia. Plus, class notes.
Issue 309
Subscribe below to join 3,972 other smart people who get “The Cheat Sheet.” New Issues every Tuesday and Thursday.
If you enjoy “The Cheat Sheet,” please consider joining the 18 amazing people who are chipping in a few bucks via Patreon. Or joining the 39 outstanding citizens who are now paid subscribers. Paid subscriptions start at $8 a month. Thank you!
New Research Examines Chegg Cheating
New research from Edmund Pickering and Clancy Schuller of Queensland University of Technology (AUS) examines the use of Chegg to cheat and is a valuable contribution to the academic integrity conversation.
The duo examined Chegg’s growth as well as the depths to which use of Chegg had compromised the integrity of the school’s engineering units. Short answers: Chegg has grown a ton, especially during the pandemic. And, the level of academic misconduct was substantial.
Let’s start with the meat:
Of the respondent units, a majority (56%) had assessment content found on Chegg. Through this process, 1180 solutions were found on Chegg which directly matched assessment content, with the largest unit having 394 matches identified. We reiterate that the uploading of these 1180 assessment items to Chegg constituted academic misconduct as these were efforts to subvert assessment.
Eleven hundred and eighty solutions were found on Chegg. That’s across just 41 units from just one program at just one university. A unit, I believe, is synonymous with a section. And yes, those uploads were misconduct.
Then there is this:
Upon first submission, questions were answered rapidly with 25% answered within 42 min, 50% within 1.5 h, and 75% within 4.5 h.
This is odd to me because my sense is that, in the United States, Chegg’s response rates are considerably faster. I think they even advertise that they give answers to questions in less than 30 minutes. Still, even these results are a problem because, as the authors note:
This makes Chegg a very effective means of cheating on assessment items. Students can post a question, with large certainty that they will have a solution the following day. More alarmingly, the short wait for a solution makes Chegg an effective option to cheat on timed online assessment (e.g. online quizzes or exams). In this study 40% of questions were answered within 1 h, and 58% within 2 h. As common exam durations are 1 or 2 h, students can upload assessment questions to homework-helper sites like Chegg, with the expectation that many questions will be answered during the exam duration.
Ah, and there it is - Chegg’s business model.
Not sure that’s true? Consider then the next two snippets from the literature review of this recent paper:
Broemer and Recktenwald (2021) presented a detailed investigation of Chegg usage in a 2-h online mechanical engineering exam, identifying 129 unique posts to Chegg, with 71% of posts answered during the exam (50% answered within 1 h)
And:
Manoharan and Speidel (2020) uploaded assignment questions to Chegg … [and they] ensured the questions they uploaded were clearly identifiable as formative assessment, which violates Chegg’s policy and which Chegg tutors are required to report. None of these questions were identified as attempts to cheat.
So, even when you tell Chegg that the questions are for an exam or a graded assignment — even when that is “clearly identifiable” — Chegg answers and nothing happens. Obviously, Chegg’s policies are — let’s just call them optional.
Also interesting from this research is that Pickering and Schuller were able to track the growth of questions asked to Chegg over a number of years, finding:
Between the years of 2016–2019, the question rate remained relatively stagnant at between 7.5–9.5 × 10⁶ per year. This dramatically increased to 22 × 10⁶ following the COVID-19 pandemic.
In graphic form:
Boy, would you look at that jump in March 2020. Interesting.
I found it also noteworthy that the authors say:
While these sites purport to be for legitimate study, they are highly vulnerable to misuse, and there are limited mechanisms to prevent students using these Q&A services to cheat. Broemer and Recktenwald (2021) proposed that Chegg’s Q&A service is primarily used for cheating; this is further supported by Lancaster and Cotarlan (2021a).
I’d say that using Chegg to cheat is not misuse; it’s the intended use. But maybe that’s irrelevant, since the paper cites two other research efforts that found Chegg “is primarily used for cheating.”
As an aside, the Lancaster cited above and elsewhere in the paper is the same Lancaster who has joined Chegg’s advisory board (see Issue 298).
Another quick note: though the research team does not spend much time on it, they clearly see Chegg and other answer-for-money sites as a special threat to integrity when exams are not proctored:
the data presented in this work does demonstrate risks inherent in non-proctored online exams.
Yup.
Which gives me reason again to say that if you’re offering any kind of assessment that is online, unsupervised, and open for any amount of time at all — you’re wasting everyone’s time.
Anyway, the pair of authors have some advice:
unless action is taken, tertiary education institutions can expect increased misconduct through the use of Chegg and similar sites.
Fact check: true.
And:
These findings paint a picture where units are highly vulnerable to cheating through Chegg. A majority of the units investigated in this study had their integrity compromised through the use of Chegg. While this study only presents analysis from a single engineering school, it is reasonable to assume these results can be generalised to other institutions and subject areas. Other institutions should take these results as a warning bell, and should consider how the integrity of their degrees might be vulnerable to misconduct through Chegg and similar services
Also true.
And I like how the authors did not just talk about the integrity of exams or grades but correctly went right to the “integrity of their degrees.”
Continuing:
We show that Chegg was used for misconduct across a majority of audited mechanical engineering units. We further show that the rapid rate of Chegg solutions makes it an effective tool to cheat on both assessment and online exams.
Chegg is an effective tool to cheat. Hmm. Absolute shocker.
And finally, because the speech coaches say to always end with a laugh, the paper says:
[we] implore homework helper websites, if they wish to be considered legitimate tools, to significantly increase efforts to reduce misconduct and improve engagement with university integrity processes.
Since Chegg has actively dismissed calls to make its services less prone to cheating and actively stopped cooperating with academic integrity investigations, that one is a knee-slapper.
Chegg and its sister sites do want to be considered legitimate. They just want to continue to sell as much cheating as they possibly can while doing it.
Press: Foreign Students with Poor English Skills Have “Hollowed Out Academic Integrity”
This coverage from The Guardian isn’t directly about cheating. But it is about academic integrity — the awarding of academic credentials without demonstrated learning.
The article is primarily about the accusation that higher education institutions in Australia are admitting foreign students who lack basic English skills and allowing them to graduate, because they generate outsized tuition revenue. If you’ve been around higher ed enough, it’s an implication you hear from time to time about every country that admits any significant share of foreign students.
The article reads:
universities’ financial reliance on foreign students over many years had hollowed out academic integrity and threatened the international credibility of the sector.
I note the link between a lack of academic integrity and credibility. Whatever form the integrity question takes, the credibility of the educational offering is what’s at stake.
The article also says the situation has become worse since generative AI:
Many said the rise of artificial intelligence was accelerating the crisis to the point where the only way to fail a course would be to hand nothing in
An anonymous professor told The Guardian:
“It breaks my heart reading essay after essay with a strong suspicion students couldn’t have written it,” they said. “The writing is on par with mine but when I ask [students] what a citation and a reference is, they have no idea.
“I’ve interviewed students after grading with suspicions and they could not tell me a single thing about the entire semester, yet wrote beautiful posts online and a beautiful essay.”
Another professor said:
he frequently graded essays that software – and his intuition – suggested were plagiarised. He said he would fail the student, they would appeal and the outcome was that they would pass.
“You’d gauge the language proficiency [of students] and they would produce something extremely precise, it was common,” he said.
There’s more to it than this, and you can check it out. But I’ll end with this, from the piece:
Bien, an international student nearing the end of a postgraduate course at a Melbourne university, who asked for her full name to be withheld, said she had to defend herself in front of an academic misconduct committee twice for using artificial intelligence to complete assignments. Both cases were later dropped.
“My first experience with ChatGPT was due to a group assignment where no one else contributed, so I had no choice but to get inspiration from genAI,” she said.
About 60% of her international student friends admitted to AI use, she said. Some paid for copies of previously submitted high marked papers and used layers of genAI to mask the plagiarism before submitting, while others hired ghostwriters to complete their work and run it through detectors to pick up anomalies.
She said she didn’t blame them.
Class Notes:
This week I will be in beautiful Oklahoma, speaking to the Oklahoma Association of Testing Professionals about academic misconduct. I am honored to be invited and privileged to be able to share what I know.
Also this week, and also on Thursday, I’m pleased to join a virtual panel with several respected academic integrity experts from Australia: International Perspectives on Academic Integrity.