791 points by 317070 | 4 comments
Icedcool ◴[] No.15010185[source]
"In the name of diversity, when we fill quotas to check boxes, we fuck it up for the genuinely amazing women in tech."

Awesome. A plea for hiring based on quality rather than quotas.

Towards a group that is judged by the content and quality of their character, rather than by some variation of an attempt to combat discrimination through discrimination.

replies(6): >>15010295 #>>15010360 #>>15010754 #>>15010810 #>>15012567 #>>15012748 #
JimboOmega ◴[] No.15010360[source]
So quotas are terrible, yes.

But what if there are still biases in hiring? That someone sees a woman and assumes this or that about her based on gender alone?

My own experience as a transgender person is that there are people who, as my gender presentation has shifted, really seem to view me as less competent. Not in a "girls can't code" way, but more like steadily viewing me as more junior: needing more hand-holding, being given simpler tasks, that kind of thing.

It's subtle enough to make me constantly second guess myself, but it's noticeable.

It happens in interviews, too. It's very easy to rationalize biases within certain bounds. Those kinds of things - and toxic environments - are what need to be corrected most in today's tech workplace.

Of course, correcting toxic environments early in the pipeline would be best, because then the men who share those environments don't normalize them either! But it's not fair to ignore the adult realities of the current working world and just dump all the blame on the early part of the pipeline.

replies(6): >>15010417 #>>15010462 #>>15010516 #>>15010585 #>>15010642 #>>15012543 #
Danihan ◴[] No.15010462[source]
I believe that treating everyone as individuals, rather than as stereotypical groups, is the only way forward. It's the only truly fair approach.

Whatever happened to the notion of being color-blind when it comes to policy enforcement? That is, actually treating people equally, based on merit?

If biases are really that big of an issue (are there studies showing this is true in tech?), then what is wrong with "blind hiring" instead of the current "diversity-conscious" hiring? You don't have to get to know someone's personality at a deep level to make a hiring decision; you need to know their skill level and aptitude.

It worked to narrow the gender gap in orchestras. Why wouldn't it be good to use in tech?

http://gap.hks.harvard.edu/orchestrating-impartiality-impact....

replies(4): >>15010551 #>>15010744 #>>15011261 #>>15012902 #
iainmerrick ◴[] No.15010551[source]
> You don't have to get to know someone's personality at a deep level to make a hiring decision, you need to know their skill level and aptitude.

I agree, but how do you estimate their aptitude in an unbiased way? That mostly rules out face-to-face conversations, which is what most companies use.

Aptitude tests? I feel like those have a bad reputation, at least at Bay Area tech companies. Are there good tests we should be using? How do you customize the test to fit your own company? To the extent that "cultural fit" is important for effective teams (and isn't simply a way of excluding women, black people, etc.), how do you test for that?

replies(2): >>15010594 #>>15010681 #
Zyst ◴[] No.15010681[source]
How about an artificially anonymized process? HR does know, or can guess, your gender from receiving your CV, but from that point onward your identity is anonymous.

For instance, HR creates a throwaway email address that is used to coordinate the rest of the hiring process. The coding interviews could use one of the many platforms we have for shared-workspace coding, with an added chat box for talking your way through the problem, so to speak.

And so hiring decisions are made by interviewers without knowing the candidate's gender or appearance.

I think this would allow for a good level of blind testing, but it would have some downsides on the cultural-fit screening side. It's a lot easier to pretend you're not an asshole in asynchronous, text-only communication.

I guess everything carries a trade-off.
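To make this concrete, here's a rough sketch (in Python) of the HR-side alias bookkeeping; every name in it, from CandidateDirectory down to the hiring.example.com domain, is made up for illustration rather than taken from any real HR tool:

```python
# Rough sketch of the anonymized pipeline described above.
# All names here (CandidateDirectory, hiring.example.com) are hypothetical.
import secrets


class CandidateDirectory:
    """HR-only mapping from throwaway aliases to real identities.

    Interviewers coordinate with the candidate only through the alias
    address; HR keeps the mapping and reveals it only after the
    hire/no-hire decision has been recorded.
    """

    def __init__(self) -> None:
        self._alias_to_identity: dict[str, tuple[str, str]] = {}

    def register(self, real_name: str, real_email: str) -> str:
        """Create a throwaway address for a new candidate."""
        alias = f"candidate-{secrets.token_hex(4)}@hiring.example.com"
        self._alias_to_identity[alias] = (real_name, real_email)
        return alias

    def reveal(self, alias: str) -> tuple[str, str]:
        """HR-only: look up the real identity once a decision is made."""
        return self._alias_to_identity[alias]


directory = CandidateDirectory()
alias = directory.register("Jane Doe", "jane@example.com")
print(alias)  # interviewers only ever see this address
```

The shared coding session would work the same way: the interviewer's side of the platform shows only the alias.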

replies(2): >>15010696 #>>15010857 #
imron ◴[] No.15010857[source]
> And so hiring decisions are done by interviewers without knowing the gender or how the candidate looks.

It might make the situation worse: http://www.abc.net.au/news/2017-06-30/bilnd-recruitment-tria...

replies(1): >>15010957 #
pbhjpbhj ◴[] No.15010957[source]
It's very sad that "free from sexist, racist, ageist biases" is considered "worse", surely?

Do you agree?

replies(2): >>15011330 #>>15011428 #
imron ◴[] No.15011330[source]
Yes, I agree (check my comment history over the last few days to see which side I fall on).

I meant worse for the problem the GP was trying to solve.

mLuby ◴[] No.15011428[source]
The method (free from sexist…) is better, but the outcome (a workforce less similar to the general population) is worse.
replies(1): >>15012854 #
Caveman_Coder ◴[] No.15012854[source]
I guess it really comes down to the ethical framework you accept as valid, deontological or utilitarian...