Meet Cheating Conglomerate Learneo
Plus, a college in Canada says it will turn off exam security. Plus, a commission in The Netherlands says proctoring software may have discriminated - and why it's probably wrong.
Issue 173
To join the 2,716 smart people who subscribe to “The Cheat Sheet,” enter your e-mail address below:
If you enjoy “The Cheat Sheet,” please consider joining the nine amazing people who are chipping in a few bucks a month via Patreon. And thank you to those who are.
Meet the Cheat: New Conglomerate Learneo
Last week, shameless cheating provider Course Hero announced it had formed a parent company - Learneo - which will house it and other elements of its academic fraud empire, such as the AI-assisted paraphrase engine QuillBot. The founder of Course Hero will run the new umbrella company.
There are a few interesting bits from this announcement.
One is that Course - err, Learneo - has used this press release to brag about its success, saying it has:
over 100 million monthly active users and a combined revenue of more than $200 million across its six businesses
That’s good money. But the 100 million monthly users is more interesting. First, I am just not sure I buy it. Even so, assuming it’s accurate, that’s an awful lot of “studying” and “learning” going on every month. Is it not?
It continues:
The company has surpassed 3 million subscriptions per year and is targeting 50 million subscriptions per year by the end of 2030.
The release also says:
Last December, Learneo, formerly known as Course Hero, Inc. raised $395 million in series C funding, marking the company's value at $3.6 billion.
I just don’t think many people realize how big and well-funded these cheating companies are.
The announcement also quotes Matt Witheiler, who’s with the private investment group Wellington Management. Matt says great things about the budding cheating empire that is/was Course Hero. It does not say explicitly, but the implication is that Wellington is an investor.
What’s funny - or sickening, depending on your view - about Wellington being in bed with Course Hero is that on Wellington’s “our culture” page, the very first item is:
We value integrity
We hold ourselves to the highest ethical standards. Trust is at the core of our relationships with clients and with each other. We work hard to earn and sustain that trust.
Wow.
A company that says that is backing a company that flatly refuses to cooperate with academic integrity inquiries from schools - going as far as to tell professors that if they want information about cheating, they need to get a court order. (See Issue 102).
My only guess as it relates to Wellington, and the others who seek profit from academic fraud, is that they must have some definition of “highest ethical standards” that I simply don’t know.
Anyway, I may have buried the real news again. I do that. Bad habit.
When you read the press release, it’s pretty clear that this new company wants to shift its focus from just cheating to a:
new platform of businesses that will develop and acquire productivity and learning technologies
And:
Learneo aims to attract entrepreneurs and builders eager to empower anyone to achieve their fullest potential.
Whatever that means.
I think it means that LearnEE-eye, EE-eye, OH is about to go on a buying spree. Cheating providers with promising technology but no real market yet, companies such as PhotoMath and EduBirdie, may find capital to grow, courtesy of Course Hero and, I assume, Wellington Management.
I hope I’m wrong.
One Canadian School Faces 300+ Cases of Organized Cheating While Another Moves to Turn Off Remote Test Security
Last Issue featured coverage of the more than 300 people who cheated on their real estate exams at Humber College, a public college in Toronto - and of the college's response to the mass cheating: requiring two cameras for remote test proctoring.
With that as stage-setting, it's odd news that this month another public institution in Canada - the University of Guelph, also in Ontario - is intentionally turning the cameras off.
Maybe the two schools should talk or something.
Guelph, it’s reported, will:
step away from the use of exam proctoring software Respondus Monitor.
The given reason is that the school says, essentially, that it’s racist and discriminatory:
The recommendation comes from the President's Advisory Committee on Anti-Racism (PACAR), which said the proctoring software “does not align with U of G’s values.”
A Provost at the school told the paper:
there have been reports of facial detection issues with students of colour and transgender students, and that students with disabilities get “unfairly flagged” for movement.
It’s clear that the Provost doesn’t understand how remote proctoring works, which is kind of amazing. Especially since it’s reported in the very same news story that:
Respondus Monitor … utilizes webcams to record students during online exams, flagging any suspicious movement. The footage is then reviewed for possible academic misconduct.
In other words, a flag does not mean anything other than that a review is appropriate. The local newspaper in Guelph, Canada, understands this. The university, it seems, does not.
Let’s review this together. A flag does not mean you fail. It does not mean someone cheated. It does not mean someone is even accused of cheating. In best uses, the test-taker doesn’t even know there is a flag. And so if a flag is erroneously generated because someone has different abilities - which I seriously doubt - the professor or trained proctor will see that and dismiss it. Nothing happens.
As a related aside, I’m not aware of any research showing that “false flags” for disability happen. At least I’ve not seen any. Even in this terrible research in which students were told to try to generate false flags on remote proctoring software, they didn’t (see Issue 153). But again, even if that happens, the remedy is easy and inconsequential - if a flag is caused by disability, it will certainly be dismissed during review.
I’ll get to the idea that remote proctoring software is racially biased down below. That too is misunderstood at best and complete nonsense at worst.
In any case, that a school would simply turn off its remote proctoring is malpractice. Even the Provost who spoke about turning the cameras off told the same newspaper in this same article that:
a case study showed academic misconduct increased threefold during the pandemic.
No kidding. But Guelph is now going to somewhat literally shut its eyes. I confess, I don’t get it.
But what really frosts me about this is that nearly all of it is pointless - except for making future misconduct more likely. That’s because, according to the reporting:
faculty members were “never required” to use Respondus in their courses or exams, and that its use was left up to each individual instructor.
And, starting this year, students can already "opt out" of remote proctoring if they have "human rights concerns." No, I am not kidding. Human rights.
And - there’s more. According to the story:
Although it has been said the software doesn’t align with U of G’s values, its use won’t be completely eliminated, however.
That’s right. Even though students will be able to have the cameras off for “human rights,” the school will still use it:
for distance education courses where students are unable to come to campus.
So, for remote exams. It’s racist and discriminatory and we can’t have it. But for cases in which someone can’t get to campus, it’s fine. Which means that in every one of the maybe four cases in which a school has said they’re ending remote proctoring, they’re not really doing that. Not even here.
Though, with the opt-out, what’s the point?
Sure, I’ll take your remote education classes. Please turn the cameras off. I’ll see you at graduation. In the meantime, you may want to give Humber College a call - just to check in.
Commission in The Netherlands (Probably Incorrectly) Says Remote Proctoring Software Racially Discriminated
News this past week is that The Netherlands Institute for Human Rights has preliminarily found that an anti-cheating, remote proctoring algorithm may have "discriminated against a person" based on race.
That sounds bad. But it’s actually quite instructive. And probably not right.
To catch you up:
A student at the Vrije Universiteit in Amsterdam (VU) filed a complaint alleging that the software used to proctor an exam that was given remotely did not recognize her because of her skin color when she attempted to log in and complete the exam.
She also said she was kicked out of the exam several times.
Those probably aren’t related.
That’s because every remote proctoring company I know uses off-the-shelf face recognition technology built by one company and used by literally thousands of other companies in every imaginable industry. I won’t say what company. But it does rhyme with Blamazon.
Amazon. It’s Amazon.
The point is that the facial recognition part of the test proctoring process is not designed or run by the testing company. That does not excuse it. There is credible research showing that Amazon's ID software can and does struggle to identify darker faces, especially in low light or with substandard technology such as old cameras or bad Internet connectivity.
And it’s entirely possible that this grad student had difficulty accessing her test because the recognition software had issues, perhaps even unfairly. Importantly, where this may have happened is the doorway to the exam proctoring, not the exam proctoring. If Amazon’s tech failed or was biased, it does not mean the exam proctoring process was. They are different. One asks who you are. The other asks what you’re doing.
In other words, this headline is not correct:
Anti-cheating webcam software may have discriminated based on skin color
The software in question is not anti-cheating or proctoring software - probably.
And because the facial recognition, ID-verification part of the exam onboarding is different from the delivery and proctoring of the assessment, it’s almost certain that whatever troubles this student had logging into the exam are not related to her being disconnected from it once she got in - which it seems she did, by the way.
Reading the tea leaves here, the student likely had a bad Internet connection. That would cause or exacerbate both issues. The University said the same thing.
The bigger point is that whatever issues, even racial issues, the exam entry process may have based on a universal technology problem, they only impact accessing the exam. They could not impact how the exam is delivered, or who gets “flagged,” or for what.
Even in this case, the student does not allege discrimination during her test or in any way related to academic conduct. She says onboarding and connection issues caused her stress. I am sure she’s right. That sounds stressful. I’m just not sure it’s related to the exam proctoring software.
Nonetheless, under the rules in The Netherlands, the school now has to prove the software did not discriminate. And I have no idea how that's possible - to prove a negative. We will see.
The bigger point here is that if you look closely, every - or nearly every - assertion that remote proctoring is racially biased or problematic is related to the ID and exam entry process - not the test administration or integrity measures. I’m not saying that’s OK. I am saying it’s different.
The take-away is that this institute in The Netherlands found possible discrimination based on race in ID-check technology being used by a proctoring company. Not in test proctoring. It does not mean that remote exam proctoring is racist. It means, at most, that there are embedded issues in the third-party identity check process, which we've known for some time. I do not understand why people do not understand the difference.
ICAI Calls for Award Nominees
The International Center for Academic Integrity has called for nominees for its annual awards. Details in the link.
The ICAI Conference is in March, in Indianapolis.