Reflections on Palantir

(nabeelqu.substack.com)
479 points by freditup | 6 comments
beedeebeedee ◴[] No.41869866[source]
I'm amazed there's no discussion in the article about Palantir's role in Gaza and their development of Lavender and "Where's Daddy". That goes beyond the gray areas that the author mentions.
replies(2): >>41870222 #>>41894515 #
talldayo[dead post] ◴[] No.41870222[source]
[flagged]
ocular-rockular ◴[] No.41870296[source]
Given that it's a site run by a startup incubator with opaque moderation and a strong American bias, I would imagine that most things that make "line go up" are ok.
replies(1): >>41870664 #
1. ryandrake ◴[] No.41870664[source]
I don't even think it's always about "line go up". I've talked to real-life engineers who evidently have no ethical bar whatsoever: if the technology is cool, complex, and an interesting technical problem, they'll work on it, regardless of the real-world application. "Whether my code is used ethically is someone else's problem. I just love technical challenges!" You absolutely see this mentality on HN as well.
replies(3): >>41870933 #>>41871244 #>>41871384 #
2. ToucanLoucan ◴[] No.41870933[source]
The nihilism you see in a lot of nerdy spaces in this vein is so upsetting. Tech workers are so unbelievably critical to the functioning of the modern world that if we collectivized and made demands, we could utterly bring the global system to a halt until those demands were met. But so many are so well sold on this "rockstar programmer" horseshit. Like by all means, get your bag, we all need to, but personally I sleep better knowing no software I've made is helping target guided rockets.
3. femiagbabiaka ◴[] No.41871471[source]
Generally speaking I think it’s unhelpful to develop patterns of thinking that suppose you are the main character in a sort of worldwide video game instead of just another one of billions of humans living on the planet who all have relatively similar wants and needs.
4. sofixa ◴[] No.41872238[source]
This reeks of "just following orders", combined with a healthy dose of dehumanising others.

Do you care if your code is used to kill literal children? Or to kidnap suspected terrorists (suspected based on detailed information such as their first name, their watch type, or driving a car at the wrong time in the wrong place) and torture them to death? Where is your moral line?

5. ocular-rockular ◴[] No.41872744[source]
Case in point. It's hard to value life when you are willing to contribute to the loss of it. I hope the price of your morals was at least a high one, to be sold so easily.
6. ethbr1 ◴[] No.41878840[source]
Ethical judgement becomes simple when you outsource the decision of who is and isn't a terrorist.

Is everyone in Gaza a terrorist, in your opinion?