If you want more proof, you can take the essay, give it to ChatGPT, and say, "Please give me a report showing how this essay was written by AI."
People treat AI like it's an omniscient god.
For an assignment completed at home, on a student's device using software of the student's choice, there can essentially be no proof. If the situation you describe becomes common, it might make sense for a school to invest in a web-based text editor that captures keystrokes and user state, and require students to use it for at-home text-based assignments.
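The core of such an editor would just be an append-only log of edit events whose replay reconstructs the submitted text. A minimal sketch of that idea, with all names and the event shape being my own assumptions rather than any real product's API:

```python
import time
from dataclasses import dataclass, field


@dataclass
class KeyEvent:
    timestamp: float  # seconds since an arbitrary session origin
    position: int     # caret index where the edit applied
    inserted: str     # text inserted ("" for a pure deletion)
    deleted: int      # number of characters removed at position


@dataclass
class SessionLog:
    """Append-only edit log; replaying it rebuilds the document."""
    events: list = field(default_factory=list)

    def record(self, position: int, inserted: str = "", deleted: int = 0):
        self.events.append(KeyEvent(time.monotonic(), position, inserted, deleted))

    def replay(self) -> str:
        text = ""
        for e in self.events:
            # Each event deletes, then inserts, at its recorded position.
            text = text[:e.position] + e.inserted + text[e.position + e.deleted:]
        return text


log = SessionLog()
log.record(0, "helo")   # typo while drafting
log.record(3, "l")      # correction -> "hello"
print(log.replay())     # prints "hello"
```

In practice the editor would stream these events to a school server; a log whose replay matches the submission, and whose inter-key timings look human-scale rather than one giant paste, would at least be consistent with the student having typed the text themselves.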
That, or eliminate take-home writing assignments; we had plenty of in-class writing when I went to school.
They issue the claim, the judgement and the penalty. And there is nothing you can do about it.
Why? Because they *are* the law.
You can sue the university, and likely even win.
They literally are not the law, and that is why you can take them to court.
So no, you don’t exactly get a trial by a jury of your peers, but it isn’t like they are averse to evidence being presented.
This evidence would be fairly trivial to refute, but I agree it is a burden no student needs or wants.
A kid living in a wealthy Boston suburb used AI for his essay (that much is not in doubt) and the family is now suing the district because the school objected and his chances of getting into a good finishing school have dropped.
On the other hand, you have students attending abusive online universities who are flagged by a plagiarism detector and would never think of availing themselves of the law. US law is for the rich; the purpose of a system is what it does.
You don’t need to be rich to change the law. You do need to be determined, and most people don’t have or want to spend the time.
Literally none of that changes the fact that the Universities are not, themselves, the law.
According to an undergraduate student who babysits for our child, some students are literally screen recording the entire writing process, or even recording themselves writing at their computers as a defense against claims of using AI. I don't know how effective that defense is in practice.
Police aren't the law because they have been sued?
Your police argument is a strawman.
That, or the uni should give me a separate machine to write on, used only for that purpose.
And ChatGPT will happily argue whichever side you want to take. I just passed it a review I wrote a few years ago (with no AI/LLM or similar assistance), with the prompts "Prove that this was written by an AI/LLM: <review>" and "Prove that this was written by a human, not an AI/LLM: <review>", and got the following two conclusions:
> Without metadata or direct evidence, it is impossible to definitively prove this was written by an AI. However, based on the characteristics listed, there are signs that it might have been generated or significantly assisted by an AI.[1]
> While AI models like myself are capable of generating complex and well-written content, this specific review shows several hallmarks of human authorship, including nuanced critique, emotional depth, personalized anecdotes, and culturally specific references. Without external metadata or more concrete proof, it’s not possible to definitively claim this was written by a human, but the characteristics strongly suggest that it was.[2]
How you prompt it matters.
[1] https://chatgpt.com/share/67164ec9-9cbc-8011-b14a-f1f16dd8df...
[2] https://chatgpt.com/share/67164ee2-a838-8011-b6f0-0ba91c9f52...
Well yes, in-person proctored is the gold standard. For those who can’t or won’t go in person, something invasive is really the only alternative to entirely exam-based scoring.
But that they can be sued in a court of law is actually a very big deal; it is the defining thing that makes them not the law.
A reminder of what I was responding to: “They issue the claim, the judgement and the penalty. And there is nothing you can do about it. Why? Because they are the law.”
That is plainly untrue. There is something you can do about it. You can sue them, precisely because they are not the law.