More on Cheating at Harvard, Leaving Me Speechless
Plus, NCAA cites Northern Arizona University for academic misconduct. Plus, Google and Photomath? Plus, what Harvard could learn from a Florida High School.
To join the 3,051 smart people who subscribe to “The Cheat Sheet,” enter your e-mail address below. It’s free:
If you enjoy “The Cheat Sheet,” please consider joining the 12 amazing people who are chipping in a few bucks a month via Patreon. Or joining the three outstanding citizens who are now paid subscribers! I don’t know how that works, to be honest. But I thank you!
“The Cheat Sheet” won’t come out this Thursday, as I’ll be in Austin, hosting a panel at SxSW EDU.
Unrelated: for a review of why SxSW is a major disappointment on academic integrity, see Issue 187.
If you’re at SxSW and want to say hello, please drop me a note. As always, a reply e-mail reaches me.
On Cheating, GPT at Harvard
What’s happening at Harvard is not entirely unusual. But it’s Harvard, so people pay attention. And by any standard, it’s also kind of unbelievable.
Recently, the student paper at Harvard ran a lengthy piece on students cheating - using ChatGPT to complete assignments - and on what the school is doing about it, which is not much of anything.
The article begins with the story of a student who:
With some revision, she turned ChatGPT’s sentences into her essay. Feeling guilty but relieved, she submitted it
At Harvard, ChatGPT quickly found its way onto students’ browsers. In the midst of finals week, we encountered someone whose computer screen was split between two windows: on the left, an open-internet exam for a statistics class, and on the right, ChatGPT outputting answers to his questions. He admitted that he was also bouncing ideas for a philosophy paper off the AI. Another anonymous source we talked to used the chatbot to complete his open-internet Life Sciences exam.
I realize that some schools, Ivy League schools in particular, cling to the idea that their honor codes are effective. That trusting their students not to cheat is enough. If anything has been laid bare in the past two years, it’s how outdated and borderline crazy that idea is now, especially with “open-internet” exams.
The article goes on to say that Harvard:
had no official policy prohibiting the use of ChatGPT.
While noting that using ChatGPT to create and submit work seems to violate the school’s Honor Code, school leaders also made it clear that they prefer to leave professors with “substantial flexibility” related to the use of ChatGPT. This is, quite obviously, a disaster. The very next line in the article says, perfectly:
Some students are already making the most of this gray area.
Then, extraordinarily, the piece offers a student perspective on what is, essentially, cheating - comparing an essay that took six hours to research and write with AI doing the same work in minutes. It says, I kid you not:
In less than five minutes, ChatGPT did what originally took six hours. And it did it well enough.
Condensing hours into minutes is no small feat, and an enticing prospect for many. Students have many different demands on their time, and not everyone puts academics first. Some pour their time into intense extracurriculars, pre-professional goals, or jobs. (People also want to party.)
Harvard, ladies and gentlemen - where students’ demands for party time entice people to cut academic corners. I’m speechless. Not that this is true - we all know it is. But that someone actually wrote it.
But it gets better. Or worse, depending on your view.
After pointing out that Harvard’s mission is the “value of intellectual transformation,” and that such a transformation is “necessarily difficult” and “an arduous process,” the author says:
That’s the theory. In practice, many students feel they don’t always have the time to do the difficult work of intellectual transformation. But they still care about their grades, so they cut corners.
Students don’t feel they have time to do the actual work of learning. But they want good grades. So, they cheat. That checks out.
This is Harvard. Though it’s not a Harvard-only problem.
In closing, I’ll share these as well:
Out of more than a dozen professors we’ve spoken with, none currently plan to use an AI detector.
That is a Harvard-only problem.
A Harvard professor told the writer that he:
trusts that his students see the value in learning without cutting corners.
Just move up a few graphs to the pull-quote that starts, “That’s the theory.”
And another professor said:
“Generally, I trust students. I personally don't go super out of my way to try to detect student cheating,” he says. “And I am not going to start.”
I am not speechless very often, but this makes me so. I just can’t.
It’s pretty simple. If professors show they don’t care about academic integrity, their students won’t either. Just like their professors, students don’t want to go super out of their way to do the actual work of teaching or learning. That’s not trust. That’s not caring.
Again, this is Harvard, which said it expelled 27 students for cheating last year (see Issue 166). And where some teachers aren’t going super out of their way to look for cheating.
Nonetheless, the article is good and long and touches on AI and ChatGPT pretty thoroughly and rather well.
NCAA Cites Northern Arizona University for Academic Misconduct
The NCAA - the governing body of college athletics in the United States - has cited Northern Arizona University for academic misconduct, saying:
the violation occurred when the administrator provided a student-athlete with a significant number of answers to a math placement exam.
Of special note, the NCAA also says:
The school's online proctoring service flagged the exam for irregularities, and after reviewing audio and video recordings of the exam, its results were discarded and the matter was referred to the school's academic integrity hearing board to determine further university-level sanctions.
Which ought to answer whether online exam proctoring catches cheating. It does - in addition to deterring misconduct. I personally never understood the argument that it did not.
As a penalty, the school’s women’s basketball team received one year of probation and a $5,000 fine, and the former administrator is essentially banned from engaging with student-athletes - even at other schools.
I’ve said before - and it continues to absolutely baffle me - schools are actually sanctioned and fined when academic cheating impacts sports, yet when the cheating just impacts their core mission of teaching and learning and the value of a certification or degree, nothing happens. There is no oversight, no management, no incentive to get things right. Accreditors don’t care at all. When misconduct is caught and made public - which is exceptionally rare already - schools say how seriously they take integrity and everyone moves on.
It’s not a solution I advocate, but I do believe that if schools were fined $5,000 every time a school policy led to academic misconduct, schools would miraculously find the moral fortitude to actually try to prevent it.
Anyway, good for the NCAA for being on the job. I’m glad someone is.
News Nuggets About Our Cheating Friends
These few blips came to me from another Substack newsletter - “EdTech Thoughts” - this edition in particular. It covers the business of EdTech - M&A and VC and whatnot. These items relate to cheating providers and I had not seen them reported elsewhere.
Related aside, I do wish that education outlets would not pretend that the likes of Quizlet and Chegg are education companies. They are not, though they pretend to be. Nonetheless, here are some notes:
Photomath is a cheating weapon of mass destruction. As of October 2021, it had been downloaded 250 million times and was answering 2 billion math questions a month. Two billion. And that was a year and a half ago.
The news is that Photomath raised $23 million in funding in early 2022 and was acquired shortly after by - wait for it - Google. In other words, Google now owns a major, major cheating technology. Nothing about that is good.
Major cheating profiteer Quizlet has announced that it’s used ChatGPT to build an AI-powered “tutor.” And when cheating companies say “tutor,” they mean selling answers. So, Quizlet now has an even faster, even easier way for students to buy answers to homework or assessments while doing no work whatsoever. Literally everyone saw this coming. Still, yuck.
Dan Rosensweig, the CEO of cheating giant Chegg, has joined the Board of Directors of UpGrad - an India-based company that sells online college programs. It advertises partnerships with the University of Maryland and Duke - the words “corporate education” appear in small print after the Duke University logo.
Putting the CEO of what is probably the world’s largest cheating provider on the Board of an online education provider advertising affiliations with American universities - what could go wrong?
A “Cheating Scandal” at a Florida High School
A news report says students at an elite high school academic program have been caught using ChatGPT to cheat. I’m shocked.
From the news:
Students in a Florida high school’s elite academic program have been accused of using ChatGPT and artificial intelligence to write their essays, according to a report.
The head of Cape Coral High School’s prestigious International Baccalaureate Program (IB) flagged the suspected misconduct to staff in a flurry of internal emails
School officials, the report says, were reviewing student laptops for evidence of ChatGPT use as well as regularly meeting personally with students to confirm their content mastery. An official said:
“Our teachers must authenticate all student work prior to submission to IB,” she wrote. “If they are unable to authenticate a student’s work then the student will not have successfully completed the IB program.”
Teachers having to authenticate student work - imagine that. Which automatically makes the IB program at Cape Coral High more academically disciplined than Harvard.