Tutors Are Being Solicited to Give Answers, Take Tests for Students
Plus, more on that research failure by The New Yorker. Plus, tips to deal with increased misconduct. Plus, Quick Bites.
Issue 31
British Papers: Tutors Are Solicited to Give Answers, Take Tests for Students
Two big British papers - The Times and The Telegraph (subscriptions required) - have similar stories this week on essentially the same news: that private tutors report being solicited by students to take their online tests for them. The headline in The Telegraph is:
University students enlist private tutors to take their online exams during pandemic
One quoted tutor, Naomi Wilson, told The Times that she declines the requests but said, “having principles is quite expensive.” She also said she reports the requests to the schools. Both stories quote schools saying how seriously they take the accusations.
Another unnamed tutor interviewed by The Telegraph said,
Everyone involved in education knew this was going to happen, a lot of people have their heads in the sand about it.
I would be astonished if anyone genuinely believed that they have robust exams systems that are safe against cheating - they must be aware there are huge quantities of cheating happening.
The New Yorker Got That Cheating Research Completely Wrong
In Issue 29 of “The Cheat Sheet” I pointed out a few of the errors in the recent coverage of online cheating and proctoring from The New Yorker.
One error I noted was this section from the article:
Although most educators assume that cheating is more common when exams are online, research has suggested that the prevalence may not vary much from in-person exams.
As mentioned in Issue 29, the research cited did not measure the prevalence of online cheating. But now that I’ve had a chance to read the full research paper, I can say The New Yorker not only has it wrong, they have it completely, embarrassingly backwards.
Not only does the research paper they cited show that those testing online and without a proctor had a significant advantage due to cheating, it also directly calls on instructors and school leaders to intervene and invest in techniques to stop online cheating - the very things The New Yorker was trying to criticize.
On the first point, the research paper says:
It is therefore concluded that this extraordinary change in the online test scores … is likely the result of cheating in the online class.
And
the online group scored significantly above the in-class group on the final exam, thus suggesting a cheating effect
The paper’s three authors continue:
to the extent that these results showing that online testing does facilitate student cheating are representative, then the growing reliance of distance learning on online testing does suggest that professors and deans must take affirmative steps to suppress student cheating in those courses relying on online testing.
Further, they say:
additional research findings similar to those presented here would implicitly impose a moral burden that would fall on professors and institutions of higher education to assure students, potential employers, graduate admissions departments and other consumers of grade information that the grades that students are receiving are truly reflective of the learning that has been accomplished.
In other words, in trying to argue that proctoring of online tests was unnecessary because cheating online was not necessarily more common than cheating in face-to-face classes, The New Yorker actually cited a paper that calls for “affirmative steps to suppress student cheating in those courses relying on online testing” and that describes “a moral burden that would fall on professors and institutions of higher education” to do so.
It’s clear that no one at The New Yorker read the research they were citing, or even its summary. That’s incredible - as in not credible.
Association of American Colleges & Universities on Academic Integrity
David Harris, editor-in-chief of OER leader OpenStax, and Maureen O’Brien, vice president of evaluation operations at Western Governors University, co-wrote a post on academic integrity for the Association of American Colleges & Universities.
Harris and O’Brien correctly peg the increasing threats from companies such as Chegg. Students, they write:
are often lured by aggressive marketing practices and promises of quick fixes. The popularity of services like Chegg, which provides answers to questions in “less than 30 minutes,” has created a multibillion-dollar industry and even coined a new verb, “chegging.”
The writers suggest approaches for dealing with the new integrity dynamics, notably removing or reducing “curve” grading, which can incentivize and reward cheating:
Many institutions hold students to standards that are determined by the learning and performance of their peers. As misconduct becomes more prevalent, this incentivizes students to approach it more defensively—if their peers are doing it, they must do the same in order to compete.
For academic integrity, it’s a good suggestion. The other suggestions in the article are as well. It’s worth a read.
“The Cheat Sheet” Quick Bites
A story from Scotland on cheating says “Teaching experts at the University of Edinburgh have expressed concerns about the issue of ‘academic misconduct’ following the move from traditional in-person exams to open book online exams.”
It also says that cheating has sparked a push for “authentic assessments” but, importantly, it quotes Dr. Joe Arton, an academic developer in the Institute for Academic Development at the University of Edinburgh, saying those assessments come with a “price tag” - an increase in appraisal opportunities for students often means an increase in marking workload for academic staff.
Western University in London, Ontario says it’s switching test proctoring companies - from Proctortrack to Proctorio.
Daily Mail TV (UK) had a recent segment on cheating. It says, “Campaigners are predicting a surge in cheating this summer because universities are running unsupervised online exams …” It quotes Chris McGovern, chairman of the Campaign for Real Education, saying “Unsupervised DIY exams are an open invitation to cheats and it is honest students who will be penalised.”
The74 has a story about some disappointing test results in Miami-Dade County showing that “43 percent of students who took January diagnostic tests in grades pre-K-3 tested below grade level in reading. And 54 percent tested below grade level in math.”
The actual learning deficits may be worse than that, though. A representative of the test provider said she doesn’t suspect “widespread cheating” from students who took tests at home, adding that even if students received help, “I think it was very innocent helping.”
In the next “The Cheat Sheet” - finally getting around to the cheating at Stanford. Plus, the European Conference on Academic Integrity and Plagiarism 2021.
To share or subscribe, links below.