
574 points nh43215rgb | 7 comments
hexbin010 No.45781498
> “ICE officials have told us that an apparent biometric match by Mobile Fortify is a ‘definitive’ determination of a person’s status and that an ICE officer may ignore evidence of American citizenship—including a birth certificate—if the app says the person is an alien,”

This is "computer says no (not a citizen)", which is horrifying.

They've just created an app to justify what they were already doing, right? And the argument will be "well, it's a super complex app run by a very clever company, so it can't be wrong"?

rgsahTR No.45781662
> They've just created an app to justify what they were already doing right?

This was also one of the more advanced theories about the AI target-selection apps used in Gaza. I've only heard one journalist spell it out, because many journalists believe that AI works.

But the dissenter said that the operators know it does not work and use it mainly to shift blame for mistakes onto the AI.

bko No.45782878
It's better than the alternative, which is humans. Unless you think enforcing laws, or ever needing to establish identity, should never take place.
sennalen No.45782980
It's humans. This is like the TSA's fake bomb detectors with nothing inside the plastic shell.
bko No.45783387
You think the person at the TSA who gets paid $40k a year is better at facial recognition than a computer?
atmavatar No.45783831
It's likely the TSA employee's five-year-old child is better at facial recognition than a computer, too.
bko No.45783926
Please don't spread unscientific misinformation. You can say ICE is bad, or that you don't believe in borders, but saying computer facial recognition is inaccurate compared to humans is just factually incorrect.

https://pages.nist.gov/frvt/html/frvt11.html?utm_source=chat...

1. esseph No.45783993
https://abc7ny.com/post/man-falsely-jailed-nypds-facial-reco...

https://www.ftc.gov/news-events/news/press-releases/2023/12/...

https://www.theguardian.com/technology/2020/jan/24/met-polic...

https://link.springer.com/article/10.1007/s00146-023-01634-z

https://www.mozillafoundation.org/en/blog/facial-recognition...

https://surface.syr.edu/cgi/viewcontent.cgi?article=2479&con...

Yeah it's pretty fucking shit, actually.

Here's the science.
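Both sides here can be partly right, because benchmark accuracy and field reliability diverge over base rates: when genuine matches are rare, even a tiny false-match rate means most alerts are false alarms. A minimal sketch of that arithmetic, with illustrative numbers (the error rates and prevalence below are assumptions for the example, not NIST-measured values):

```python
# Base-rate sketch: why a matcher with excellent benchmark numbers can
# still be wrong a large share of the time in the field. All numbers
# below are illustrative assumptions, not measured values.

def false_alarm_fraction(fmr, fnmr, prevalence):
    """Fraction of reported matches that are false, via Bayes' rule.

    fmr        -- false match rate (impostor comparisons flagged as matches)
    fnmr       -- false non-match rate (genuine matches missed)
    prevalence -- fraction of queries that truly are the person sought
    """
    true_alarms = (1 - fnmr) * prevalence
    false_alarms = fmr * (1 - prevalence)
    return false_alarms / (true_alarms + false_alarms)

# A 1-in-10,000 false match rate and a 1% miss rate sound excellent,
# but if only 1 in 1,000 people queried is actually the person sought:
frac = false_alarm_fraction(fmr=1e-4, fnmr=0.01, prevalence=0.001)
print(f"{frac:.0%} of flagged matches are false")  # prints roughly 9%
```

At lower prevalence the false-alarm share climbs further, which is why a system that tests well on balanced verification benchmarks can still perform poorly when scanning the general public.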

2. verdverm No.45784134
Looks like GP is using ChatGPT (see the utm_source in their link) to find the first result that supports their viewpoint rather than doing broad discovery and analysis.
4. bko No.45786543
The horror! Someone using an LLM for basic information gathering like "is AI facial recognition accurate compared to humans?" rather than going off vibes or one-off sensationalized articles.
replies(3): >>45787198 #>>45788024 #>>45788516 #
5. verdverm ◴[] No.45787198{3}[source]
Apparently it has not given you broad coverage of the subject; others have provided references showing the opposite of your claim.

LLMs are sycophants; how you ask matters.

6. justinclift No.45788024
> Someone using an LLM for basic information gathering ...

While doing so can be ok, you should probably do some checking via non-LLM means as well.

Otherwise you'll end up misunderstanding things that you _think_ you've learned about. :(

7. queenkjuul No.45788516
LLM is equivalent to vibes, sorry