
427 points JumpCrisscross | 3 comments
greatartiste ◴[] No.41901335[source]
For a human who deals with student work or reads job applications, spotting AI-generated text quickly becomes trivially easy. The text tends to follow the same general framework (with words swapped around), and we also see what I call the 'word of the week': whichever 'AI' engine it is gets hung up on a particular English word, often an unusual one, and uses it at every opportunity. It isn't long before you realise that the adage that this is just autocomplete on steroids is true.

However, programming a computer to do this isn't easy. In a previous job I dealt with plagiarism detectors and soon realised how poor they were (and also how easily fooled they are, but that is another story). The staff soon realised what garbage these tools are, so if a student accused of plagiarism decided to argue back, the accusation would be quietly dropped.

replies(14): >>41901440 #>>41901484 #>>41901662 #>>41901851 #>>41901926 #>>41901937 #>>41902038 #>>41902121 #>>41902132 #>>41902248 #>>41902627 #>>41902658 #>>41903988 #>>41906183 #
aleph_minus_one ◴[] No.41901851[source]
> The staff soon realised what garbage these tools are so if a student accused of plagiarism decided to argue back then the accusation would be quietly dropped.

I wonder when the time will come that some student accuses the staff of libel or slander because of false AI plagiarism accusations.

replies(1): >>41902481 #
red_admiral ◴[] No.41902481[source]
Or of racism. During the pandemic, automated proctoring tools couldn't cope with skin tones darker than those they were trained on; I imagine the first properly verified, scientifically valid examples of AI-detection racism will be found soon.
replies(2): >>41902556 #>>41904671 #
1. jjmarr ◴[] No.41904671[source]
https://www.nature.com/articles/s41586-024-07856-5

LLMs already discriminate against African-American English. You could argue a human grader would as well, but all tested models were more consistent in assigning negative adjectives to hypothetical speakers of that dialect.

replies(1): >>41905451 #
2. kayodelycaon ◴[] No.41905451[source]
This is entirely unsurprising to me. As it was taught to me, written English (in the US) has a much stricter structure and vocabulary. African-American English was used as the primary example of incorrect and unprofessional writing.
replies(1): >>41908779 #
3. selimthegrim ◴[] No.41908779[source]
I think it’s a little more complicated than that, as the comment from Brad Daniels at this link shows: https://www.takeourword.com/TOW145/page4.html

NB: I am not African-American, nor did I grow up in an African-American community, and I performed very well on all sorts of verbal tests. Yet even I made the "all intensive purposes" mistake until well into adulthood. Probably a Midwestern thing.