331: Rant Warning: Professor Says Student Responsibilities for Integrity Can Be "Burdensome"
Plus, Grammarly executive says the company is a "ubiquitous AI assistant." Plus, nominations for best and worst of 2024.
Issue 331
Subscribe below to join 4,243 (+8) other smart people who get “The Cheat Sheet.” New Issues every Tuesday and Thursday.
If you enjoy “The Cheat Sheet,” please consider joining the 16 amazing people who are chipping in a few bucks via Patreon. Or joining the 43 outstanding citizens who are now paid subscribers. Paid subscriptions start at $8 a month or $80 a year, and corporate or institutional subscriptions are $240 for a year. Thank you!
Rant Warning: A Professor Rationalizes, Excuses Cheating
Recently, Inside Higher Ed ran an op-ed piece from Daniel Cryer, an associate professor of English at Johnson County Community College in Kansas.
Of course it’s in Inside Higher Ed. More on this in a bit.
As for the piece itself, it’s a doozy. And I cannot do anything but rant. I am sorry.
It’s one of the finest examples of rationalizing student misconduct that I’ve read in some time. Not even the cheating companies themselves have done a better job selling the responsibility and consequence-free life of easy cheating.
It begins by asking readers to imagine they’re 19 and a college student:
Your professors have AI policies, but they all seem a little naïve.
And here we go — blame the professors for being fools. Integrity, effort, and learning are not your responsibility if you think your professor is a buffoon.
It continues:
One says you can use it for brainstorming and researching but not actual writing. But when you ask ChatGPT, Perplexity or any other tool to jump-start your research, it gives you ready-made content for your paper. Are you supposed to ignore it?
Yes.
Do you think students don’t know that these AI bots spit out full text? Like it’s a surprise? Oh no, I just went into the bank with my gun, and they handed me bags of money. Am I supposed to give it back? If you did not intend to be a bank robber, yes — give it back.
And:
Another professor says to use AI however you want, as long as you describe what you did in a short paragraph at the end of your paper—an “acknowledgments section,” he calls it. But that still leaves you with the decision of what to do and how much to disclose.
I’m sorry — what?
This has nothing to do with AI. If you’re using a primary source, a book for example, you have to answer the same questions — how much do I use and what do I disclose? This is not some new, bizarre, and cruel imposition on students. I mean, we used to just call this the writing process. But here — those poor students. Making decisions about what to include and how to cite your writing. How unreasonable.
There’s more:
Another professor forbids AI entirely, but how are you supposed to avoid technology that’s embedded in every writing platform?
I have an idea.
I bet you can guess what it is. OK, fine. I’ll say.
How about don’t use an AI writing platform? How about that?
But seriously — how are you supposed to avoid it? In any other context, this would be absurd. If you work in a bank, how are you supposed to avoid pocketing a few hundred bucks? The money is everywhere. Training for the Olympics? How are you supposed to avoid using steroids when they are all around?
But we’re not done with this excuse parade for making destructive decisions:
Besides, you’ve done enough sleuthing to know that AI detectors aren’t reliable anyway. You doubt professors can even enforce these policies, so they’re not a big factor in your decision-making.
Wow.
So, professors are naive when it comes to technology. Writing choices are hard. Avoiding bad things is impossible. And they can’t catch you anyway.
This is a manifesto of rationalization, which multiple research efforts have identified as essential to engaging in academic misconduct.
But. We. Are. Not. Done.
The litany of cheating excuses continues:
Instead, you think about how much you like the class the paper is assigned in, or whether it’s in your major. These are papers you might actually enjoy writing, or ones that could help you develop a skill you’ll need later.
Then again, you’re not sold on your major, and you’re not totally sure what profession you’ll land in, much less what knowledge and skills you’ll need once you get there. So it’s possible you could choose wrong and finish your classes without skills you’ll need. It’s also possible you could buy an extra three hours that would really help relieve all the stress you’re feeling.
The “this class is not important” or “I won’t need to do this later” rationalization. Can’t believe we missed that one until now. And not for nothing, we also get the “help relieve stress” excuse because, remember, doing work is hard.
Then:
At the end of the day, after all the lectures and policies, it seems like the weight of upholding them falls almost entirely on you. It’s empowering in a way, but you can’t help wondering if this is something students should be responsible for.
I’m going to absolutely lose it.
Did a professor just ask — in writing — whether actually doing work in school was “something students should be responsible for”?
I’m being punked, right?
And lest I leave the question unanswered — yes. Yes! “Lectures and policies” are things for which a college student should be responsible. And also, yes, I did notice that our writing professor ended a sentence with a preposition.
Indulging this absurdist construct, if we are supposed to question whether not cheating should be the responsibility of students, who should be responsible instead? The naive professors who cannot enforce their own rules?
Our professor ties his points to responsibility offloading — shifting ownership and control over a thing from one party to another. He says that when responsibility is shifted “without resources,” that can be bad. I agree.
And, he writes:
I’m going to argue that this is one effect of generative AI: Where upholding the academic integrity of college classes has always been the shared responsibility of teachers and students, students now bear the lion’s share of it. Expecting them to bear this burden is neither reasonable nor fair.
I disagree with the premise in that I fail to see how any of the burden for “upholding academic integrity” has shifted from teachers (or schools, I’d add) to students. Were it true, it might indeed be unfair. But I do not see any evidence that the fundamental balance of responsibility for integrity and academic rigor has changed. It still is, as far as I can tell, as it was — schools and teachers set expectations and policies, and students are responsible for knowing and following them.
We can debate whether schools and teachers are doing their share. I may argue they are not. But that does not mean the responsibility apportionment is any different.
Professor Cryer presents his evidence for a shift in responsibility, for offloading:
There are three big reasons for this new reality: the easy availability of AI writing tools, the marketing of these tools to students as a way of reducing or avoiding coursework and the inability of teachers or detectors to reliably discern whether text is student-authored or AI-generated. To be clear, I’m not suggesting that all uses of AI amount to cheating, that it is never beneficial or that professors’ AI policies are pointless. We need such policies to clarify best practices for students, and there are many good resources for “AI-proofing” assignments. But all such strategies, short of in-person, pen-and-paper writing, arrive at the same place: The final decision on whether to complete one’s coursework or substantially outsource it rests with the student.
I’ll be quick.
Easy availability of AI tools and marketing — check. No one has written more about these developments than I have.
The inability of teachers or detectors to discern AI writing — no check. Simply not true. Teachers and AI detectors reliably spot AI all the time, literally every hour of every day.
But even if this were true, it does not mean that teachers are declining responsibility for the integrity and fairness of their assessments and grades. I’ve never heard a teacher say, “Well, I can’t do this very well, so it’s no longer my responsibility.” There are profound differences between not doing something, not being able to do that thing, and not being responsible for doing that thing.
Also, I deeply doubt you can “AI-proof” an assignment by design.
But this — “The final decision on whether to complete one’s coursework or substantially outsource it rests with the student” — is obvious. It is also no change. It has always been this way.
In fact, the very next line of this article is:
Some might say this has always been so.
I say it has always been so. Please tell me when this decision did not rest with the student.
I will have to take the next paragraph in pieces, sentence by sentence, because it’s a bit of a pretzel. The first sentence was above — the “some might say” bit. The second sentence is:
But no previous generation has been faced with the ever-present option to offload their work, at no cost, with a low likelihood of immediate negative consequences.
OK. Sure. I agree. But this does not mean the balance of responsibility for integrity has shifted.
Just because there are more guns, gun stores, and gun shows does not mean anyone’s responsibility for their own actions has changed. In fact, if anything, I would argue that the ubiquity of bad items and bad actors has increased the responsibilities of teachers, schools, and even government. No single student can reasonably do anything about Chegg, except not use it themselves.
Next sentence:
Some might say, as I have, that students are responsible for their own learning.
I’m confused. I thought the professor’s point was that expecting students to be responsible for this was “neither reasonable nor fair.”
Next:
But responsibilities can be burdensome and must be understood in contexts beyond the classroom.
Yes. Responsibilities can be a burden.
Millennials coined it “adulting.” Responsibilities suck. We would all rather not do the work and still get the reward. Just as surely, an adult college student should be learning this reality, not being excused from it because it’s hard and temptations are everywhere.
I really cannot believe a person of responsibility and authority — a teacher — wrote that.
Not too long after, we get this:
What generative AI does, then, is intensify the responsibilization of college students, who do not have the resources necessary to exercise these responsibilities in a way likely to benefit them (and society) more broadly. As they try to decide which assignments are worth doing and which can be safely outsourced, they are essentially tasked with discerning what knowledge and which skills they will need in their future roles as employees and citizens. These are not questions we should expect them to know the answers to.
I agree. Students usually do not know, or cannot well judge, what things in college today will benefit them years from now. That’s why we hire, trust, and pay expert teachers.
But here’s an idea. Maybe schools and teachers should provide “the resources necessary to exercise these responsibilities.” Or, alternatively, to at least erect conditions, policies, and consequences to make sure they do them anyway, precisely because they do not know how to make good choices in these, and many other, areas.
On top of which, I am still bewildered by the idea that accepting that students cannot make good choices about what learning is important somehow makes professors less responsible for learning outcomes or basic fairness.
But we march on:
And even if [our fictional student from the introduction] could sort through all this, the new combination of AI saturation and unenforceable AI policies still makes her the ultimate arbiter of academic integrity in a way no previous generation of students has been.
False. Students have cheated for a thousand years. Every single one was “the ultimate arbiter” of their own academic choices. What was it the good professor said earlier? I think it was something pretty close to “students are responsible for their own learning.”
And:
At the end of the day, though, when it’s just her, her computer and a deadline, all those things [that teachers and schools say] aren’t much more than suggestions. It’s really up to her which skills she hones and which ones she outsources. She is responsible for her own learning.
Oh — he said it again.
True. But, again, what’s new about this?
I am genuinely confused as to what point is being made here.
And also, no. Academic integrity policies are not suggestions. At least they are not supposed to be. If a school’s policies on doing authentic work while having credible and verifiable assessments “aren’t much more than suggestions,” you have a major problem. If you think integrity policies are suggestions, you may be a problem.
The professor suggests more in-person, handwritten assignments and more focus on the process of writing instead of on polished final products. I have no issue with either recommendation. Though, in an online class with 300 students, both are wishful thinking.
He also suggests empathy for students trying to navigate this new buffet of easy cheating opportunities and the billion-dollar, investor-backed companies slinging academic poison. My words, not his. Nothing wrong with empathy.
But I think we can get a solid sense of Professor Cryer’s worldview in this sentence:
And even as we understandably fret over academic integrity, we can do so with empathy for students forced into a new reality fraught with blurry lines and impossible choices.
First, fret? Come on.
Blurry lines and impossible choices? No. Just no.
This has never been as complicated as some apologists want it to be. Students do the work, or they don’t. Their work represents their honest ability, or it does not. They misrepresent it, or they do not. They cheat themselves, their classmates, teachers, their schools, and the general public, or they do not. The lines are not blurry, and the choices absolutely are not impossible.
But of course — of course! — Inside Higher Ed published a piece arguing that they are. Of course we’re reading about how the responsibility for acting with integrity is burdensome and unfair. Of course.
Grammarly Executive: Grammarly Is a “Ubiquitous AI Assistant”
Recently, a LinkedIn post from Shishir Mehrotra caught my attention.
His announcement is typical:
Big exciting announcement: I’m excited to share that I’ve signed the agreement for Grammarly to acquire Coda! And I’m honored to lead the combined companies as CEO.
OK, my mistake. His announcement is big and exciting.
I share it only because, in it, he writes:
I realized that both companies have arrived at remarkably similar views of the future. One where AI will redefine every business application and workflow, reinventing productivity as we know it today into a place where humans and AI work together everywhere you get work done.
Grammarly is an amazing platform — with over 40 million daily active users, I see Grammarly as one of the world’s most ubiquitous AI assistants. We have a broad roadmap of how we’ll bring the products together. First, we’ll be working on making Grammarly’s ubiquitous AI assistant even “smarter” and “more helpful” by adding the context of Coda Brain
For the education community, please note that Grammarly is described as “one of the world’s most ubiquitous AI assistants” — a phrase he used twice.
Grammarly is not a grammar helper anymore. It’s a full “AI assistant,” which means it writes for you to reinvent productivity everywhere you get work done, whatever that means. But it’s clear that one of those places is at school.
You know, just FYI.
Please Send Nominations or Suggestions for End of Year Issue
I’ve been working on the 2024 End of Year Issue for a week now. These take a long time.
If you have nominees or suggestions for best/worst of the past year in academic integrity, please send them along. Categories include, but are not limited to:
Quote of the Year
Person of the Year
Best/Worst Research
Best/Worst Writing or Reporting
Biggest Story
Most Overlooked Story
Here is the one we did last year, for reference.
And thank you.