
461 points | JumpCrisscross | 1 comment
mrweasel (No.41901883):
The part that annoys me is that students apparently have no right to be told why the AI flagged their work. For any process where a computer is allowed to judge people, there should be a rule demanding that the algorithm be able to explain EXACTLY why it flagged this person.

Now, this would effectively kill off the current AI-powered solutions, because they have no way of explaining, or even understanding, why a paper may or may not be plagiarized. But I'm okay with that.

GJim (No.41913643):
> For any process where an computer is allowed to judge people....

GDPR to the rescue!

https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-re...

You must identify whether any of your processing falls under Article 22 [automated decision making, including AI] and, if so, make sure that you:

* give individuals information about the processing;

* introduce simple ways for them to request human intervention or challenge a decision;

* carry out regular checks to make sure that your systems are working as intended.
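
The three obligations above can be sketched as a compliance layer around an automated decision pipeline. This is a minimal, hypothetical illustration (the class and field names are invented, not from the ICO guidance): every decision must carry a human-readable explanation, the data subject has a simple way to request human review, and a periodic audit checks that the system is working as intended.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AutomatedDecision:
    subject_id: str
    outcome: str
    explanation: str  # "information about the processing" given to the individual
    made_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    human_review_requested: bool = False

class DecisionLog:
    """Hypothetical wrapper enforcing the three Article 22-style obligations."""

    def __init__(self):
        self.decisions: list[AutomatedDecision] = []

    def record(self, subject_id: str, outcome: str, explanation: str) -> AutomatedDecision:
        # Obligation 1: refuse to store a decision with no explanation.
        if not explanation:
            raise ValueError("automated decision recorded without an explanation")
        decision = AutomatedDecision(subject_id, outcome, explanation)
        self.decisions.append(decision)
        return decision

    def request_human_review(self, subject_id: str) -> list[AutomatedDecision]:
        # Obligation 2: a simple way to challenge a decision.
        affected = [d for d in self.decisions if d.subject_id == subject_id]
        for d in affected:
            d.human_review_requested = True
        return affected

    def audit(self) -> dict:
        # Obligation 3: a regular check that the system behaves as intended.
        return {
            "total_decisions": len(self.decisions),
            "pending_reviews": sum(d.human_review_requested for d in self.decisions),
        }

log = DecisionLog()
log.record("student-42", "flagged", "high token overlap with a published source")
log.request_human_review("student-42")
print(log.audit())
```

The point of the sketch is that explainability is a precondition, not an afterthought: a detector that cannot produce the `explanation` string at decision time simply cannot be recorded, which mirrors the parent comment's argument that such a rule would rule out current black-box flaggers.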

Why in God's name has the USA not adopted similar common-sense legislation?