
791 points by 317070 | 31 comments
Icedcool ◴[] No.15010185[source]
"In the name of diversity, when we fill quotas to check boxes, we fuck it up for the genuinely amazing women in tech."

Awesome. A plea towards hiring based on quality, rather than quotas.

Towards a group that is judged by the content and quality of their character, rather than by some variation of an attempt to combat discrimination through discrimination.

replies(6): >>15010295 #>>15010360 #>>15010754 #>>15010810 #>>15012567 #>>15012748 #
1. JimboOmega ◴[] No.15010360[source]
So quotas are terrible, yes.

But what if there are still biases in hiring? That someone sees a woman and assumes this or that about her based on gender alone?

My own experience as a transgender person is that there are people who, as my gender presentation has shifted, really seem to view me as less competent. Not in a "girls can't code" way, but more like steadily viewing me as more junior, needing more hand-holding, giving me simpler tasks, that kind of thing.

It's subtle enough to make me constantly second guess myself, but it's noticeable.

It happens in interviews, too. It's very easy to rationalize biases within certain bounds. Those kinds of things - and toxic environments - are what need to be corrected most in today's tech workplace.

Of course, correcting toxic environments early in the pipeline would be best, because then the men who share those environments don't normalize them, either! But it's not fair to ignore the adult realities of the current working world and just dump all the blame on the early part of the pipeline.

replies(6): >>15010417 #>>15010462 #>>15010516 #>>15010585 #>>15010642 #>>15012543 #
2. alexandercrohde ◴[] No.15010417[source]
So how would you counteract that stereotype/bias?

The author suggests that lowering the bar [for women] would only reinforce such stereotypes; do you agree or disagree?

replies(2): >>15010503 #>>15012614 #
3. Danihan ◴[] No.15010462[source]
I believe that treating everyone as individuals, rather than as stereotypical groups, is the only way forward. It's the only truly fair approach.

What ever happened to the notion of being color-blind when it comes to policy enforcement? AKA, actually treating people equally, based on merit?

If biases are really that big of an issue (are there studies that show this is true in tech?) then what is wrong with "blind-hiring," instead of the current "diversity-conscious" hiring? You don't have to get to know someone's personality at a deep level to make a hiring decision, you need to know their skill level and aptitude.

It worked to remove the gender gap in orchestras. Why wouldn't it be good to use in tech?

http://gap.hks.harvard.edu/orchestrating-impartiality-impact....

replies(4): >>15010551 #>>15010744 #>>15011261 #>>15012902 #
4. iainmerrick ◴[] No.15010503[source]
I keep thinking about orchestras, where simply auditioning performers behind a curtain completely fixes the bias problem.

Of course the trick is that you don't need to see the candidates or talk to them, just listen to their playing. In software, we would need to find some similarly effective way to measure anonymized performance.

In fact, completely aside from fixing gender and racial biases, that's something we could really use just to make good hiring decisions! I don't believe anyone really knows how to make consistently great hires in software.

For a start, the hiring decision could be based on gender-anonymized feedback from the interviewer(s), although that obviously wouldn't fix any underlying biases in the feedback itself.

replies(2): >>15010567 #>>15010685 #
5. Brakenshire ◴[] No.15010516[source]
> But what if there are still biases in hiring? That someone sees a woman and assumes this or that about her based on gender alone?

I think the issue is not so much about someone actually being outright blocked at the interview, although that may well happen also, especially if there are substantial numbers of people who think that women need to be accommodated because of fundamental biological differences.

But I would say the issue of bias is more systemic than that, it's more about the pipeline, that there are fewer female candidates at the interview stage because programming has been seen as a male activity. Women are discouraged from getting involved, through their own attitudes, and the attitudes of others projected onto them, and over time that winnows down the crowd of candidates. It's death by a thousand cuts, usually nothing dramatic, just a thousand subconscious decisions and comments.

That is not a problem of the same order, but it is still a problem, and assuming we accept that, the issue is what can be done about it. I don't think it's enough to say that we need programmes only targeted at teenagers or children, there should be something which happens at the end of the pipeline as well, so that company cultures are welcoming, there are female role models, and that it's clear that jobs are available if you buck the trend. The problem is made up of myriad small issues all along the pipeline, so that's also where you need to tackle it. Over time you will get to a point where the small changes are self-reinforcing, and no further action is required.

replies(1): >>15011525 #
6. iainmerrick ◴[] No.15010551[source]
You don't have to get to know someone's personality at a deep level to make a hiring decision, you need to know their skill level and aptitude.

I agree, but how do you estimate their aptitude in an unbiased way? That mostly rules out face-to-face conversations, which is what most companies use.

Aptitude tests? I feel like those have a bad reputation, at least in Bay Area tech companies. Are there good tests we should be using? How do you customize the test to fit your own company? To the extent that "cultural fit" is important for effective teams (and isn't simply a way of excluding women, black people, etc) how do you test for that?

replies(2): >>15010594 #>>15010681 #
7. alexandercrohde ◴[] No.15010567{3}[source]
I believe interviewing.io did something with voice-pitch-adjusting (to make girls sound like guys, or vice-versa) in phone interviews to try to study exactly this effect.

I think their results were confusing and uncertain, but the methodology seemed brilliant, and I think it should be the gold standard for tech interviews. Keeping it on the phone would remove other subliminal biases (attractiveness, physical disabilities) as an additional benefit.

8. sololipsist ◴[] No.15010585[source]
> But what if there are still biases in hiring? That someone sees a woman and assumes this or that about her based on gender alone?

Quotas are not the answer, period. It's not okay to deny someone a job to correct for some possible, unmeasured, inaccurate judgement call on another person based on a stereotype whether or not that stereotype is accurate across populations. All you're doing is transferring the injustice to another person. What you're doing is preferring certain victims of injustice based on race/gender/whatever, and transferring their injustice to some other race/gender/whatever.

Find another solution. Get creative. But subverting explicit meritocratic hiring to correct for unconfirmed but suspected implicit non-meritocratic hiring is unjust. And stupid.

replies(1): >>15012259 #
9. Danihan ◴[] No.15010594{3}[source]
I personally believe including "cultural fit" in hiring decisions is introducing massive amounts of bias, almost by definition.

Aptitude can be figured out by something as simple as SAT or GMAT scores. If universities use those test scores, why shouldn't employers?

Skill level is determined by doing tasks very similar to the ones that will be given at the job. You know, like reversing red / black binary trees in memory. ;p

replies(1): >>15010753 #
10. colordrops ◴[] No.15010642[source]
The author agrees that biases and toxic environments are a problem that should be addressed. Quotas are orthogonal.
11. Zyst ◴[] No.15010681{3}[source]
How about an artificially anonymized process? HR does know, or assumes, your gender from receiving your CV, but from that point onward your identity is anonymous.

For instance, HR creates a throwaway email which will be used during the hiring process to coordinate the rest of the tasks. The coding interviews could use one of the many platforms we have for shared/same-space coding, with an added chat box for talking your way through the problem, so to speak.

And so hiring decisions are done by interviewers without knowing the gender or how the candidate looks.

I think this would allow for a good level of blind testing, but would have some downsides on the cultural-fit screening side. It's a lot easier to pretend you're not an asshole in asynchronous, text-only communications.

I guess everything carries a trade off.
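As a rough sketch of what that anonymization step might look like (all names here, including the relay domain and the `Candidate` record, are hypothetical, and a real system would also scrub names and pronouns out of the resume text itself):

```python
import secrets
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    email: str
    resume_text: str

def anonymize(candidate: Candidate) -> dict:
    """Issue a throwaway handle and relay address, and pass only
    non-identifying material on to the interviewers."""
    handle = f"cand-{secrets.token_hex(4)}"
    return {
        "handle": handle,
        "contact": f"{handle}@hiring.example.com",  # HR forwards mail to the real inbox
        "resume_text": candidate.resume_text,
    }

packet = anonymize(Candidate("Ada Lovelace", "ada@example.com", "10 years of compiler work"))
# Interviewers see only the handle and relay address, never the name or real email.
```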

replies(2): >>15010696 #>>15010857 #
12. fapjacks ◴[] No.15010685{3}[source]
There appear to be problems with this approach that wouldn't be acceptable to the people driving the diversity effort [0]. To summarize the linked example: ElectronConf tried a gender-blind selection process for speakers, and when they lifted the veil on their selections, discovered they had only selected men to speak, so they canceled the conference.

[0] https://news.ycombinator.com/item?id=14480868

replies(1): >>15010786 #
13. ◴[] No.15010696{4}[source]
14. dsfyu404ed ◴[] No.15010744[source]
>What ever happened to the notion of being color-blind when it comes to policy enforcement?

A few people didn't have the self-control to pull it off: they hired a bunch of white dudes, fired a bunch of "diverse" people, and basically showed favoritism to people like themselves (i.e. mostly white men), which increased the disparity. After a while this pattern became obvious, and HR departments instituted quantitative policies, because doing it poorly in a legally defensible way is better than trying to do it well at the risk of doing it poorly enough to get sued.

15. muninn_ ◴[] No.15010753{4}[source]
Mainly because those tests aren't really a good measure of aptitude, and minorities and women tend to (at least historically, idk about now) score lower. So you would end up with a company full of white, Asian, and Indian guys. (Not making a judgement here, just pointing it out.)

You could also use the test as a filter mechanism, but as a candidate I'd just not take the test unless you, the recruiter, paid for it. Even then, these tests require months of serious study for most people. It just doesn't work well overall. Take-home "work samples" tend to be the preferred method right now, and they seem okay as long as they aren't abused.

16. ryandvm ◴[] No.15010786{4}[source]
Google Code Jam has the same problem. I'm honestly surprised that Google hasn't yet tried to intervene with some sort of diversity enhancement...
17. imron ◴[] No.15010857{4}[source]
> And so hiring decisions are done by interviewers without knowing the gender or how the candidate looks.

It might make the situation worse: http://www.abc.net.au/news/2017-06-30/bilnd-recruitment-tria...

replies(1): >>15010957 #
18. pbhjpbhj ◴[] No.15010957{5}[source]
It's very sad that "free from sexist, racist, ageist biases" is considered "worse", surely?

Do you agree?

replies(2): >>15011330 #>>15011428 #
19. jasonwatkinspdx ◴[] No.15011261[source]
Doing blind hiring for software is really, really, hard. It's unsurprising that most interviewing pipelines end up being conversations + some whiteboard coding, because coming up with something standardized, systematic, and as blind to biases as possible is something we really haven't figured out yet.
20. imron ◴[] No.15011330{6}[source]
Yes, I agree (check my comment history over the last few days to see which side I fall on).

I meant worse for the problem the GP was trying to solve.

21. mLuby ◴[] No.15011428{6}[source]
The method (free from sexist…) is better but the outcome (employee similarity to general population) is worse.
replies(1): >>15012854 #
22. JimboOmega ◴[] No.15011525[source]
> The problem is made up of myriad small issues all along the pipeline, so that's also where you need to tackle it.

Yup. The WHOLE pipeline, from how we treat girls who are interested in math to how we figure out who to promote. The system pushes female gendered people out at every step. In High School, College, at Interview, at Work, at Promotion time.

I think though it's a bit disingenuous to say if we just look at the beginning, then it will eventually all sort itself out.

I can promise you as a person coming to grips with a transgender identity, seeing that there are women in upper management at my workplace is really, really important to me. You need to see people further along the path than you to know it's a real option. I needed to see women succeeding in ways I wanted to before I could be comfortable accepting my identity.

I wish I had more transgender women as role models, but cis women make a huge difference, too. I know that me, personally, presenting in a feminine way at the workplace has inspired at least one other person to accept her gender identity, too.

Point is, it's not an abstract thing that affects hypothetical people. It's a concrete thing that very directly affects me, right now. I'm in a fragile place trying to rebuild my identity, and I need people to look up to.

Edit: Either you edited what you wrote or I missed half a paragraph, but either way, we're in agreement :)

23. JimboOmega ◴[] No.15012259[source]
Agree.

There is an argument I'm not sure I support that diversity has value on its own - that 5 people with different backgrounds is worth more than 5 with the same background.

But that being said - questioning biases, and correcting them at every point you spot them (not just interviewing!) is vital.

replies(1): >>15013619 #
24. softawre ◴[] No.15012543[source]
> course correcting toxic environments early in the pipeline would be the best, because then the men that share those environments don't normalize them, either

Women can be biased too, right?

I think I'm with you on the rest. Fixing early pipeline isn't enough.

replies(1): >>15012576 #
25. JimboOmega ◴[] No.15012576[source]
Oh, they can - and even against themselves! Internalized transphobia and misogyny is a really hard thing.
26. JimboOmega ◴[] No.15012614[source]
So hiring unqualified people is not useful. Giving someone the same benefit of the doubt you'd give another person with a different background would be great but is really, really hard in practice.

Other than doing thought experiments to try to correct for biases... It's hard.

One thing, and it's a small thing, is noticing if people in a group seem to have something to say, but seem unable to say it because they won't interrupt/talk over people (or that keeps happening)... Once you notice it, clearing a space for them to actually talk can help.

It requires being tuned in not just to the conversation, but the people in it, which is itself difficult. But I have seen it happen and it can be powerful.

There are a lot of little things like that which can be worked out, and I have no list of them or any magic wand solution.

Basically? Pay some attention to your biases and how you - and those around you - are treating others, especially those who might have internalized negative stereotypes and be struggling with imposter syndrome and all of that. Emotional awareness really helps.

27. Caveman_Coder ◴[] No.15012854{7}[source]
I guess it really comes down to the ethical framework you accept as valid, deontological or utilitarian...
28. JimboOmega ◴[] No.15012902[source]
> If biases are really that big of an issue (are there studies that show this is true in tech?)

I don't know if there are studies, but I absolutely know toxic and unfriendly environments exist. I don't know how you'd quantify the effect; if you made up some metric where you looked at how many women COULD be in tech, there's huge lost productivity, but that's not necessarily meaningful.

Even clearing the hiring hurdle is not nearly enough. Hiring someone who your culture treats like crap is not going to help you or them. If the person is actually very competent, but consistently treated as a newbie, their work will be sub-par and they will burn out and leave.

It turns out you need managers who can actually see people and how they interact, and manage them on a personal level: set up mentoring for those who need it, put people who like to work alone on tasks that can be handled alone, people who like to be on big teams on big teams, etc.

There's no way to exhaustively list the things you could do, and that's the point - it's a big, hard job that is a job, that I think SV too often wishes didn't exist.

29. sololipsist ◴[] No.15013619{3}[source]
Whether or not it has value on its own (which I'm not convinced of in coding), it doesn't matter.

Meritocratic hiring has value on its own. The best person getting the job has value on its own. Neither including nor excluding someone from a job based on their skin color or genitals has value on its own.

Insisting that having people with different skin color program a piece of software makes that software better is one thing, actually playing god with people's careers on the basis of their skin color or genitals because you, personally, value that more than meritocracy is an irresponsible and highly unethical course of action.

replies(1): >>15013857 #
30. JimboOmega ◴[] No.15013857{4}[source]
Well, right, but there are biases that are hard to account for; and saying "I know I am probably biased, so I will act on the margins to try to correct that" is in no way irresponsible.

Choosing the best team for the job is always the goal.

And once again, the hard work isn't just the interviewing - it's managing the ongoing workplace environment, day to day.

replies(1): >>15019033 #
31. sololipsist ◴[] No.15019033{5}[source]
> Saying "I know I am probably biased, so I will act on the margins to try to correct that" is in no way irresponsible.

Whether this is true depends on your methods of correction. That's what this whole conversation is about. If you don't KNOW you are biased (you don't) but only suspect you are; and further, if you are biased but don't know the extent of it, or what the outcome would have been if you weren't (you don't; shit, the outcome may even have been the same), then adding explicit discrimination that you do know is happening is stupid, unethical, and unjust.

Now, if your solution is instead to, say, "I'll do blind resume reviews, score candidates based on those, then do separate scores based on interviews, and compare them after a few dozen hires. If I see a significant and meaningful drop in acceptance rates for ANY group moving from blind resume reviews to in-person interviews, I'll do further review to see if there is good reason to believe the cause is, in fact, discrimination, rather than some benign issue like candidates of specific groups being legitimately worse at things that only come out in interviews. Then, if I don't find clear evidence of non-benign cause, I won't change anything, but if I do, I will attack those specific causes - even if we happen to find that it's actually white men that are being discriminated against, however unintuitive that is." then you'd be fine.

The response to the suspicion of injustice is key, here. Simply assuming your suspicions are correct and attacking perceived symptoms by doing exactly what you're trying to prevent, while qualifying as "trying to correct that," is not okay.
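For what it's worth, the staged comparison this comment proposes is easy to sketch. Everything below (the record layout, the group labels, the fixed threshold) is illustrative, and a real review would run a proper significance test on the counts rather than eyeballing a gap:

```python
from collections import defaultdict

def stage_pass_rates(records):
    """records: iterable of (group, passed_blind_review, passed_interview).
    Interview rates are computed only over candidates who passed blind review."""
    blind = defaultdict(lambda: [0, 0])      # group -> [passed, seen]
    interview = defaultdict(lambda: [0, 0])
    for group, passed_blind, passed_interview in records:
        blind[group][1] += 1
        if passed_blind:
            blind[group][0] += 1
            interview[group][1] += 1
            if passed_interview:
                interview[group][0] += 1
    return {
        g: (blind[g][0] / blind[g][1],
            interview[g][0] / interview[g][1] if interview[g][1] else None)
        for g in blind
    }

def flag_drops(rates, threshold=0.2):
    """Groups whose interview pass rate sits well below their blind-review rate.
    The fixed threshold stands in for a real significance test."""
    return sorted(g for g, (b, i) in rates.items() if i is not None and b - i > threshold)

# Toy data: group "A" passes blind review at the same rate as "B" but drops off
# sharply at interview, so it gets flagged for the closer review described above.
records = ([("A", True, False)] * 6 + [("A", True, True)] * 2 + [("A", False, False)] * 2
           + [("B", True, True)] * 7 + [("B", True, False)] + [("B", False, False)] * 2)
flags = flag_drops(stage_pass_rates(records))  # -> ["A"]
```

Note this only flags where to look; deciding whether a flagged gap reflects discrimination or a benign cause is exactly the manual review step the comment insists on.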