
693 points jsheard | 8 comments | | HN request time: 0.306s | source | bottom
tavavex ◴[] No.45093991[source]
The year is 2032. One of the big tech giants has introduced Employ AI, the premiere AI tool for combating fraud and helping recruiters sift through thousands of job applications. It is now used in over 70% of HR departments, for nearly all salaried positions, from senior developers to minimum wage workers.

You apply for a job, using your standardized Employ resume that you filled out. It comes bundled with your Employ ID, issued by the company to keep track of which applications have been submitted by specifically you.

When Employ AI does its internet background check on you, it discovers an article about a horrific attack. Seven dead, twenty-six injured. The article names no suspect, but it does have an expert chime in, one who happens to share your last name. Your first name also happens to appear elsewhere in the article.

With complete confidence that this is about you, Employ AI adds the article to its reference list. It condenses everything into a one-line summary: "Applicant is a murderer, unlikely to promote team values and social cohesion. Qualifications include..." After looking at your summary for 0.65 seconds, the recruiter rejects your application. Thanks to your Employ ID, this article has now been stapled to every application you'll ever submit through the system.
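The failure mode described above is easy to reproduce: a matcher that only checks whether both parts of a name occur somewhere in a document will fire on unrelated co-occurrences, like an expert's surname plus a first name appearing in a different context. A minimal sketch of that naive heuristic (the function, names, and article text are invented for illustration):

```python
def naive_match(first, last, article_text):
    """Flag an article as 'about' the applicant if both name parts
    occur anywhere in the text -- even in unrelated contexts."""
    text = article_text.lower()
    return first.lower() in text and last.lower() in text

article = (
    "Seven dead, twenty-six injured in attack. "
    "Expert Dana Kowalski commented on the case. "
    "A vigil was held on John Street."
)

# "John Kowalski" never appears as a person in the article, yet the
# heuristic fires: "kowalski" is the expert, "john" is a street name.
print(naive_match("John", "Kowalski", article))  # True: a false positive
```

A real system would use entity linking rather than substring checks, but the point stands: any matcher that conflates co-occurrence with identity will confidently misattribute articles to applicants.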

You've been nearly blacklisted from working. For some reason, none of your applications ever make it past the initial screening. You can't even learn that the article exists; no one will tell you this information. And even if you find out, what are you going to do about it? The company will never hear your pleas; it's too big to care about someone like you, and it's not in the business of making exceptions. Legally speaking, it's technically not the software making the final screening decisions, and it does say its summaries are experimental and may be inaccurate, in 8pt light gray text on a white background. You are an acceptable loss: statistically, fewer than 1% of applicants find themselves in this situation.

replies(7): >>45094022 #>>45094060 #>>45094116 #>>45094125 #>>45094163 #>>45094183 #>>45094232 #
1. tantalor ◴[] No.45094060[source]
What's to stop you from running the same check on yourself, so you can see what the employers are seeing?

If anything this scenario makes the hiring process more transparent.

replies(4): >>45094098 #>>45094217 #>>45094323 #>>45101053 #
2. fmbb ◴[] No.45094098[source]
Wrong question. What would enable you to run the same check?
replies(3): >>45094153 #>>45094168 #>>45094797 #
3. dexterdog ◴[] No.45094153[source]
Paying the company that sells the service of checking for you.
4. Schiendelman ◴[] No.45094168[source]
The FCRA would likely already require that you can receive a copy of the check.
5. tavavex ◴[] No.45094217[source]
You only have access to the applicant-facing side of the software, one that will dispense you an Employ ID, an application template, and will enable you to track the status of your application. To prevent people from abusing the system and finding workarounds, employers need to apply to be given an employer license that lets them use all the convenient filtering tools. Most tech companies have already bought one, as did all the large companies. Places like individual McDonald's franchises use their greater company's license. It's not a completely watertight system, but monitoring is just stringent enough to make your detailed application info inaccessible for nearly everyone. Maybe if you have the right credentials, or if you manage to fool the megacorp into believing that you're an actual employer, it's possible.
6. const_cast ◴[] No.45094323[source]
Why would you have access to the software?

Do you currently run the various automated resume parsing software that employers use? I mean - do you even know what the software is? Like even a name or something? No?

7. tgv ◴[] No.45094797[source]
Even if you could, how could you possibly correct the process? In the USA, it would probably take many years, possibly all the way to the Supreme Court, and the big bucks win anyway.

AI believers, pay attention and stop your downplaying and justifications. This can hit you too, or your healthcare. The machine doesn't give a damn.

8. pjc50 ◴[] No.45101053[source]
You're assuming the software gives the same response to every user. Or even gives the same response twice. And if it does... how do you correct it?

Worker blacklists have been a real problem in a few places: https://www.bbc.com/news/business-36242312