342: "I'd Say There Are Maybe Five People in My Class Who Don't Use ChatGPT"
Plus, widespread cheating on required legal exams in Australia. Plus, cheating nuggets from press coverage in Maryland.
Subscribe below to join 4,337 (+13) other smart people who get “The Cheat Sheet.” New Issues every Tuesday and Thursday.
If you enjoy “The Cheat Sheet,” please consider joining the 16 amazing people who are chipping in a few bucks via Patreon. Or joining the 47 (+1) outstanding citizens who are now paid subscribers. Paid subscriptions start at $8 a month or $80 a year, and corporate or institutional subscriptions are $240 a year. Thank you!
ChatGPT and Cheating, in France
French news outlet Le Monde has a good article (subscription required) on early and prolific student cheating with generative AI. The article subhead is, in part:
The widespread use of OpenAI's chatbot by students, some as early as junior high school, has teachers worried.
Well, at least they’re worried. That’s good, I guess. Because I feel as though, so often, I see teachers who have either just given up or embraced other forms of surrender such as, “we need to teach them to use it ethically” or “they will need it in their jobs, so we should teach it.” Maybe that’s just in the United States.
I’m distracted. Back on task, the article starts with this:
"I'd say there are maybe five people in my class who don't use ChatGPT," said Alice, a ninth grade student from Paris, without hesitation. A universal sport as old as school itself, cheating has found a way to reinvent itself with generative artificial intelligence (AI), particularly ChatGPT, OpenAI's chatbot. There's a free version of this particularly intuitive tool, which can be installed on a smartphone. Evidently, few teenagers have not taken this initiative. "Those who don't use it at all are the best students," observed the schoolgirl. "Those who use it to do their homework for them, on the other hand, have pretty poor results. They figure that they might as well get good grades without working too hard."
They want good grades without working too hard. I can’t add much.
Please read the piece. It’s pretty stark, and I always feel bad about copying too much of it.
It quotes another student:
"When I'm tired or don't have time, ChatGPT does my German homework," said Elliot, a 16-year-old high school student from Cergy (northwest of Paris), who admits that Goethe's language isn't exactly his strong point. The artificial intelligence, an excellent translator, is capable of producing a German text "at my level," said the young man, in such a way that it isn't spotted too quickly. All you have to do is ask it.
Further:
A bluffing assistant for research or translation, but also for writing structured texts, generative AI is of particular concern to humanities teachers. "Over the past year [2024], the use of AI has become massive," said Marie Perret, a philosophy teacher and board member of the philosophy teachers' association APPEP. "It's exponential, and we're very worried." For writing an essay, ChatGPT is "formidable," added the teacher. "The AI can generate a complete paper. You can ask it to find an example, develop an argument and even look up a philosophical reference," she explained.
A bluffing assistant — love it.
But also, a teacher is using the words “massive,” and “exponential,” and “very worried.” Yet we saw away on our collective fiddle while it all burns to the ground.
Also:
"I'm under no illusions: as soon as you give people something to write, everyone uses AI," stressed [a different] teacher. "Hence the need to think about only offering them things that make them use their brains. I asked them, for example, to choose and compare two poems, with the idea of showing them that it was the same theme, but not the same form." But even then, the teacher suspected some of them went to ChatGPT for advice.
Ah — yes, they did. That’s easy work for ChatGPT or the other AI tools.
We also get this, very common, very destructive theme:
"The institution doesn't always support us, even when it's obvious that the student has cheated," reported Perret. "Parents also quibble: one father explained to me that he himself used AI in his work and didn't see the problem."
Indeed — too often, schools don’t want to deal with this, and parents don’t want their kids in trouble.
The fiddle sounds nice. I hope the blaze is keeping you warm.
Widespread Cheating on Required Legal Courses in Australia
An interesting story from Financial Review looks at a “mandatory post-university course required for admission to legal practice” in Australia. The course, according to the coverage, aims to teach “practical legal training.”
The story says that the course is kind of a joke, which is pretty shocking considering that it’s required and costs a reported $9,200 — and that’s after a newly announced fee reduction. But that’s not my point here. My point is the alleged cheating on said course.
To jump in, the course is described as a:
15-week practical legal training course, which requires only five days of in-person attendance and is taught mostly online
OK, so — right there, the game is over. I know already that this is not a serious thing. I see why people are upset.
Quoting people in and around the legal field in Australia, the article says:
They acknowledged a normalisation of cheating by sharing past answers to recycled exam questions and deploying ChatGPT to generate responses.
No surprise whatsoever.
One lawyer said:
“Written assessments follow the same formula for every subject … but no one puts any effort into them. People either copy someone else’s [answers] or use ChatGPT.
“You can upload the course documents to ChatGPT and ask it to write a letter of advice. I did that and passed everything.”
“I’ve given all my past answers to paralegals,” they said.
And:
Other recent students working at major corporate firms described intra-firm group chats and email chains with past answers, with minimal difference in assessments year-on-year.
The course and test provider said, predictably:
“we have developed strategies to address this, including relying strongly on oral assessments, coupled with the redesign of coursework activities to ensure that when students use AI they do so responsibly and critically.”
“Any suspected plagiarism or unauthorised use of AI by a student in their coursework is investigated and may result in a finding of academic misconduct.”
Sure. Whatever.
Quoting another source:
one student recently told him that use of ChatGPT in the course was widespread, and “had just become what’s expected … The academic rigour is almost non-existent”.
The press coverage also says that the test provider is expanding in Great Britain and:
also offers courses in Malaysia and Singapore, and has an agreement to provide legal training to South Korean police officers.
Nice.
Even so, this is a great example of what will inevitably happen if you deliver a course online and recycle your assessments while also not, it seems, doing much to actually secure the examination process. The courses and programs become a joke, at least partly due to the routine cheating. If you don’t care, examinees won’t care.
The saddest part is how easy it is to not do that — to make courses and certifications rigorous and authentic instead. But too many test providers don’t, preferring to rely on pixie dust and strong words such as “investigated and may result in a finding of academic misconduct.”
Please. Just stop. This entire thing is embarrassing.
Two Snippets From Press in Maryland
Baltimore Magazine ran an article recently about AI use in education. It’s unremarkable, save for two items.
The first is:
In a Spring 2024 University of Maryland campus-wide survey, 41 percent of students said they have used generative AI for academic purposes—and usage has almost certainly grown since the survey was conducted.
Yes, 41% is very, very likely to be a low estimate given that the survey is approximately a year old now and people don’t admit to doing things they know are disfavored or not allowed. Also, the phrase “campus-wide” makes me think this survey was conducted at the University of Maryland proper and may not have included their “global campus” online program, where cheating is sure to be more common.
In other words, if you looked at 41% and thought, “that’s not so bad,” the actual use rate of generative AI for “academic purposes” is probably well in excess of that figure.
Also, please keep in mind that this 41% covers generative AI only, not other forms of misconduct.
The second item worth sharing is this quote:
“AI detectors are unreliable and the implicit bias within them will often flag work of multilingual learners and other students as being AI-generated even when it is not,” says Nattrass. “And we are now faced with this environment of distrust sometimes in our schools with teachers asking, was this generated by AI [or] was it not?”
Nattrass, the story says, is Tara Nattrass, who is:
managing director of innovation strategy at ISTE+ASCD, a nonprofit that seeks to help educators in K-12 and higher education use technology to improve education
So, of course, what she said is not true.
It would be one thing for a random teacher or Maryland advisor to have those things backwards, but this is ISTE, which ought to know better. It’s embarrassing that they do not and, more importantly, it’s evidence of the big misinformation problem in academic integrity. The people who really ought to know better just don’t.
Just noting that the news article upon which you were drawing about the requirement to do mandatory Practical Legal Training in Australia to be admitted to legal practice was about the College of Law, which is one provider of PLT. It is the largest provider across the country, to be sure.
However, there are other PLT providers, including universities (such as my own Curtin University in a Graduate Diploma of Legal Practice course), other business providers, and even the not-for-profit Piddington Society (https://www.piddingtonsociety.org/) which teaches PLT and does tremendously good work getting graduates into their first graduate positions.
I can't speak for every PLT provider, but Curtin University does not teach PLT online; it is in person, and full-time study can still take some months to complete. The flaws that you identify in your discussion of PLT do not necessarily extend to every PLT provider in Australia.
You should, however, have a good look at the College of Law's website and their Centre for Legal Innovation, which has jumped on the AI booster bandwagon as a part of their business model.