
427 points JumpCrisscross | 1 comment
rowanG077 ◴[] No.41897344[source]
This has nothing to do with AI; it's about proof. If a teacher tells a student they cheated, the student disputes it, and the teacher can produce no proof in front of the dean, then of course the student would be absolved. So why is some random tool (AI or not) saying they cheated, without proof, suddenly taken as truth?
replies(4): >>41897406 #>>41897434 #>>41897477 #>>41897586 #
deckiedan ◴[] No.41897406[source]
The AI tool's report shown to the dean with "85% match" will be used as "proof".

If you want more proof, you can take the essay, give it to ChatGPT, and say, "Please give me a report showing how this essay was written by AI."

People treat AI like it's an omniscient god.

replies(2): >>41897645 #>>41903651 #
1. deepsquirrelnet ◴[] No.41897645[source]
I think what you pointed out is exactly the problem. Administrators apparently don’t understand statistics and therefore can’t be trusted to utilize the outputs of statistical tools correctly.
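The misreading has a concrete shape: an "85% match" is not an 85% probability of cheating. A minimal Bayes' rule sketch makes the gap visible (the base rate and error rates below are illustrative assumptions, not measured figures for any real detector):

```python
def posterior_cheated(prior, true_positive_rate, false_positive_rate):
    """P(essay is AI-written | detector flagged it), via Bayes' rule."""
    p_flag = (true_positive_rate * prior
              + false_positive_rate * (1 - prior))
    return true_positive_rate * prior / p_flag

# Assume (hypothetically) 5% of essays are AI-written, the detector
# flags 85% of those, and it also falsely flags 10% of human-written
# essays.
p = posterior_cheated(prior=0.05, true_positive_rate=0.85,
                      false_positive_rate=0.10)
print(f"{p:.2f}")  # ~0.31: under these assumptions, most flagged
                   # essays are still human-written
```

Even with an "85%" detector, a modest false-positive rate against the much larger pool of honest essays means a flag alone is weak evidence.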