greatartiste ◴[] No.41901335[source]
For a human who deals with student work or reads job applications, spotting AI-generated work quickly becomes trivially easy. The text tends to use the same general framework (although words are swapped around), and we also see what I call the 'word of the week', where the 'AI' engine in question gets hung up on a particular English word, often an unusual one, and uses it at every opportunity. It isn't long before you realise the adage is true: this is just autocomplete on steroids.

However, programming a computer to do this isn't easy. In a previous job I dealt with plagiarism detectors and soon realised how garbage they were (and also how easily fooled they are, but that is another story). The staff soon realised what garbage these tools are, so if a student accused of plagiarism decided to argue back, the accusation would be quietly dropped.
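Part of why automated detection is hard: the obvious signal, the 'word of the week' effect, only yields a crude frequency heuristic. A minimal sketch of that heuristic in Python (the baseline-frequency table, thresholds, and function names are my own assumptions for illustration, not any real detector's method):

    from collections import Counter
    import re

    def overused_words(text, baseline_freq, top_n=10):
        # Rank words that appear far more often in `text` than in
        # ordinary prose -- the "word of the week" effect.
        # `baseline_freq` maps word -> relative frequency in a
        # reference corpus (assumed to be supplied by the caller).
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return []
        counts = Counter(words)
        total = len(words)
        scores = {
            w: (c / total) / baseline_freq.get(w, 1e-7)
            for w, c in counts.items()
            if c >= 3  # one-off words aren't habits
        }
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

A heuristic like this flags human writers with distinctive vocabularies just as readily, which is one reason the false-accusation problem discussed below is so hard to avoid.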

replies(14): >>41901440 #>>41901484 #>>41901662 #>>41901851 #>>41901926 #>>41901937 #>>41902038 #>>41902121 #>>41902132 #>>41902248 #>>41902627 #>>41902658 #>>41903988 #>>41906183 #
aleph_minus_one ◴[] No.41901851[source]
> The staff soon realised what garbage these tools are, so if a student accused of plagiarism decided to argue back, the accusation would be quietly dropped.

I ask myself when the time will come that some student accuses the staff of libel or slander because of false AI plagiarism accusations.

replies(1): >>41902481 #
red_admiral ◴[] No.41902481[source]
Or of racism. During the pandemic, automated proctoring tools couldn't cope with people whose skin was darker than that of the people they were trained on; I imagine the first properly verified and scientifically valid examples of AI-detection racism will be found soon.
replies(2): >>41902556 #>>41904671 #
1. Iulioh ◴[] No.41902556[source]
The "dark skin problem" is mostly the camera sensors, not only the training...

Low light scenarios are just a thing, you would need more expensive hardware do deal with it.

replies(1): >>41902726 #
2. 15155 ◴[] No.41902726[source]
> mostly the camera sensors

Could it mostly just be... reality? More expensive hardware doesn't somehow make a darker surface reflect more energy in the visible spectrum. "Low light" is not the same condition as "dark surface in a well-lit environment."

Leaving the visible spectrum is one possible solution, but it's substantially more error-prone and costly. This is still not the same solution as classical CV with "more expensive hardware."
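Some rough arithmetic makes the point; the reflectance numbers here are illustrative assumptions, not measurements:

    import math

    # Assumed diffuse reflectance in the visible spectrum:
    light_skin = 0.50  # illustrative value for very light skin
    dark_skin = 0.05   # illustrative value for very dark skin

    # Under identical illumination, the darker surface returns
    # proportionally less light to the sensor, no matter how good
    # the sensor is.
    stops = math.log2(light_skin / dark_skin)
    print(f"~{stops:.1f} stops less light")  # ~3.3 stops

A gap of a few stops has to be recovered somewhere: longer exposure, wider aperture, more gain (and so more noise), or more light on the subject.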

replies(1): >>41903142 #
3. red_admiral ◴[] No.41903142[source]
If you're building a system to proctor students, then part of your job is to get it to work under all the reasonable real-world conditions you might encounter: low light, students with standard webcams or just the camera built into their laptop, students with darker skin, and so on. Reality might make some cases harder, but solving that is what you are being paid for.

Also, this could have been handled much better in the cases that came up in the media if there had been proper human review of all cases before prosecuting the students.

replies(1): >>41904027 #
4. VancouverMan ◴[] No.41904027{3}[source]
The last time I got an ID photo taken, I got to wait and watch as the dark-skinned Indian photographer repeatedly struggled to take a suitable passport photo of the light-skinned white woman who was in line directly ahead of me.

This was at a long-established mall shop that specialized in photography products and services. The same photographer had taken suitable photos of some other people in line ahead of us rather quickly.

The studio area was professional enough, with a backdrop, with dedicated photography lighting, with ample lighting in the shop beyond that, and with an adjustable stool for the subject to sit on.

The camera appeared to be a DSLR with a lens and a lens hood, similar enough to what I've seen professional wedding photographers use. It was initially on a tripod, although the photographer eventually removed it during later attempts.

Despite being in a highly-controlled purpose-built environment, and using photography equipment much better than that of a typical laptop or phone camera, the photographer still couldn't take a suitable photo of this particular woman, even after repeated attempts and adjustments to the camera's settings and to the environment.

Was the photographer "racist"? I would guess not, given the effort he put in, and the frustration he was exhibiting at the lack of success.

Was the camera "racist"? No, obviously not.

Sometimes it can just be difficult to take a suitable photo, even when using higher-end equipment in a rather ideal environment.

It has nothing to do with "racism".

replies(3): >>41904168 #>>41905360 #>>41906955 #
5. red_admiral ◴[] No.41904168{4}[source]
I think this comes down to there being different definitions of racism, some of which are flat-out contradictory.

I don't think anyone is saying that the universities or the software companies have some kind of secret agenda to keep black people out. As far as I can tell there's good evidence they're mostly trying to get more black people in (and in some cases to keep Asians out, but that's another story). I also don't think anyone here was acting out of fear or hatred of black people.

What I am claiming is that the universities in question ended up with a proctoring product that was more likely to produce false positives for students with darker skin, and did not apply enough human review, or give people enough benefit of the doubt, to cancel out that effect. Quite likely, whatever model training and testing the software companies did was mostly on fair-skinned people in well-lit environments; otherwise they would have caught the problem earlier, since even a basic disaggregated check (sketched below) would show it. This is not super-woke Ibram X Kendi applied antiracism; this is doing your job properly to make sure your product works for all students, especially as the students have no way to opt out of the proctoring software beyond quitting their college.
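For what it's worth, that kind of disaggregated check is simple to run; a minimal sketch, with hypothetical record fields and group labels:

    from collections import defaultdict

    def false_positive_rates(records):
        # Per-group false positive rate among honest students.
        # Each record is a dict with hypothetical keys:
        #   'group'   - e.g. a skin-tone bucket
        #   'flagged' - True if the model raised a cheating flag
        #   'cheated' - ground truth from careful human review
        honest = defaultdict(int)
        flagged = defaultdict(int)
        for r in records:
            if not r["cheated"]:  # only honest students can be FPs
                honest[r["group"]] += 1
                if r["flagged"]:
                    flagged[r["group"]] += 1
        return {g: flagged[g] / n for g, n in honest.items()}

A large gap between groups in that output is exactly the failure mode at issue, and it's measurable before the product ever ships.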

To me it's on the same level as having a SQL injection vulnerability: maybe you didn't intend to expose your users' data - nearly 100% of the time, the company involved very much did not intend to have a data breach - but it happened anyway, you were incompetent at the job, and your users are now dealing with the consequences.

And to the extent that those consequences here fall disproportionately on skin colors (and so, by correlation, on ethnicities) that have historically been disadvantaged, calling this a type of racism seems appropriate. It's very much not the KKK type of racism, but it could very well still meet legal standards for discrimination.

replies(1): >>41905562 #
6. realitychx2020 ◴[] No.41905360{4}[source]
>> It has nothing to do with "racism".

Every major system in US academia is aimed at reducing the Asian population. It often comes in the guise of DEI, with a very wide definition of "Diversity" that rarely includes Asians.

These systems will use subtle features to black-box the racism. They may be overt and leak it through metadata, or get smarter and use writing styles.

7. zahlman ◴[] No.41905562{5}[source]
>What I am claiming is that the universities in question ended up with a proctoring product that was more likely to produce false positives for students with darker skin colors, and did not apply sufficient human review and/or giving people the benefit of the doubt to cancel out those effects.

The issue is that, for most people, the term "racism" connotes a moral failing comparable to the secret agendas, fear and hatred, etc. Specifically, an immoral act motivated by a deliberately applied, irrational prejudice.

Using it to refer to this sort of "disparate impact" is at best needlessly vague, and at worst a deliberate conflation known to be useful to (and used by) the "super-woke Ibram X Kendi" types - equivocating (per https://en.wikipedia.org/wiki/Motte-and-bailey_fallacy) in order to attach the spectre of moral outrage to a problem not caused by any kind of malice.

If you're interested in whether someone might have a legal case, you should be discussing that in an appropriate forum - not with lay language among laypeople.

8. dpkirchner ◴[] No.41906955{4}[source]
If the outcome of a system is biased against people with darker or lighter skin, it's obviously racist and should be adjusted or eliminated. It doesn't really matter what the cause of the problem is when making this determination -- we can't just say "lol sorry, some people can't get passport photos."

> Despite being in a highly-controlled purpose-built environment

Frankly it sounds like the environment was not purpose-built at all. It was built to meet insufficient standards, perhaps.