
728 points squircle | 12 comments
herculity275 ◴[] No.41224826[source]
The author has also written a short horror story about simulated intelligence which I highly recommend: https://qntm.org/mmacevedo
replies(9): >>41224958 #>>41225143 #>>41225885 #>>41225929 #>>41226053 #>>41226153 #>>41226412 #>>41226845 #>>41227116 #
htk ◴[] No.41226153[source]
Reading mmacevedo was the only time that I actually felt dread related to AI. Excellent short story. Scarier in my opinion than the Roko's Basilisk theory that melted Yudkowsky's brain.
replies(1): >>41226777 #
digging ◴[] No.41226777[source]
> Scarier in my opinion than the Roko's Basilisk theory that melted Yudkowsky's brain.

Is that correct? I thought the Roko's Basilisk post was just seen as really stupid. Agreed that "Lena" is a great, chilling story though.

replies(2): >>41227181 #>>41228532 #
endtime ◴[] No.41227181[source]
It's not correct. IIRC, Eliezer was mad that someone who thought they'd discovered a memetic hazard would be foolish enough to share it, and then his response to this unintentionally invoked the Streisand Effect. He didn't think it was a serious hazard. (Something something precommit to not cooperating with acausal blackmail)
replies(4): >>41227683 #>>41228118 #>>41229694 #>>41230289 #
throwanem ◴[] No.41230289[source]
> precommit to not cooperating with acausal blackmail

He knows that can't possibly work, right? Implicitly it assumes perfect invulnerability to any method of coercion, exploitation, subversion, or suffering that can be invented by an intelligence sufficiently superhuman to have escaped its natal light cone.

There may exist forms of life in this universe for which such an assumption is safe. Humanity circa 2024 seems most unlikely to be among them.

replies(2): >>41230802 #>>41233063 #
1. endtime ◴[] No.41230802[source]
Eliezer once told me that he thinks people aren't vegetarian because they don't think animals are sapient. And I tried to explain to him that actually most people aren't vegetarian because they don't think about it very much, and don't try to be rigorously ethical in any case, and that by far the most common response to ethical arguments is not "cows aren't sapient" but "you might be right but meat is delicious so I am going to keep eating it". I think EY is so surrounded by bright nerds that he has a hard time modeling average people.

Though in this case, in his defense, average people will never hear about Roko's Basilisk.

replies(5): >>41230902 #>>41231294 #>>41232652 #>>41236655 #>>41237034 #
2. defrost ◴[] No.41230902[source]
Despite, perhaps, all your experience to the contrary, it's only a relatively recent change to a situation where "most people" have no association with the animals they eat for meat and can thus find themselves "not thinking about it very much".

It's only within the past decade or so that the bulk of the human population has lived in an urban setting. Until that point most people did not, and most people had gone fishing, seen a carcass hanging in a butcher's shop, killed for food at least once, or had a holiday on a farm if not worked on one or grown up farm-adjacent.

By most people, of course, I refer to globally.

Throughout history vegetarianism was relatively rare save in vegetarian cultures (Hindu, et al), and in those cultures where it was rare, people were all too aware of the animals they killed to eat. Many knew that pigs were smart and that dogs and cats interact with humans, etc.

Eliezer was correct to think that people who killed to eat thought about their food animals differently, but I suspect it had less to do with sapience and more to do with thinking animals to be of a lesser order, there to be eaten and to be nurtured so there would be more in the years to come.

This is most evident in, say, hunter societies, aboriginals and bushmen, who have extensive stories about animals: how they think, how they move and react, when they breed, how many can be taken, etc. They absolutely attribute a differing kind of thought to them, and they hunt them while trying not to overtax the populations.

replies(1): >>41230962 #
3. endtime ◴[] No.41230962[source]
That's all fair, but the context of the conversation was the present day, not the aggregate of all human history.
replies(1): >>41231094 #
4. defrost ◴[] No.41231094{3}[source]
People are or are not vegetarian mostly because of their parents and the culture in which they were raised.

People who are not vegetarian but have never cared for or killed a farm animal were very likely (in most parts of the world) raised by people that have.

Even in the USofA, much of the present generations are not far removed from grandparents who owned farms, worked farms, or hunted.

The present day is a continuum from yesterday. Change can happen, but the current conditions are shaped by the prior conditions.

5. tbrownaw ◴[] No.41231294[source]
There's a standard response to a particular PETA campaign: "Meat is murder. Delicious, delicious murder."

It's a bit odd that someone would like to argue on the topic, but also either not have heard that or not recognize the ha-ha-only-serious nature of it.

replies(1): >>41236985 #
6. Vecr ◴[] No.41232652[source]
Yudkowsky's not a vegetarian though, is he? Not ideologically at least, unless he changed since 2015.
replies(1): >>41237476 #
7. lupire ◴[] No.41236655[source]
This shows the difference between being "bright" and being "logical". Or being "wise" vs "intelligent".

Being very good at an arbitrary specific game isn't the same as being smart. Pretending that the universe is the same as your game is not wise.

replies(1): >>41236834 #
8. throwanem ◴[] No.41236834[source]
I usually find better results describing this as the orthogonality of cleverness and wisdom, and avoiding the false assumption that one is preferable in excess.
9. digging ◴[] No.41236985[source]
I believe most people would be fine with eating the meat of murdered humans, too, if it were sold on grocery store shelves for a few years. The power of normalization is immense. It sounds like Eliezer was stuck on a pretty wrong path in making that argument. But it's also an undated anecdote, and it may be that he never said such a thing.
10. throwanem ◴[] No.41237034[source]
> I think EY is so surrounded by bright nerds that he has a hard time modeling average people.

On reflection, I could've inferred that from his crowd's need for a concept of "typical mind fallacy." I suppose I hadn't thought it all the way through.

I'm in a weird spot on this, I think. I can follow most of the reasoning behind LW/EA/generally "Yudkowskyish" analysis and conclusions, but rarely find anything in them which I feel requires taking very seriously, due both to weak postulates too strongly favored, and to how those folks can't go to the corner store without building a moon rocket first.

I recognize the evident delight in complexity for its own sake, and I do share it. But I also recognize it as something I grew far enough out of to recognize when it's inapplicable and (mostly!) avoid indulging it then.

The thought can feel somewhat strange, because how I see those folks now palpably has much in common with how I myself was often seen in childhood, as the bright nerd I then was. (Both words were often used, not always with unequivocal approbation.) Given a different upbringing I might be solidly in the same cohort, if about as mediocre there as here. But from what I've seen of the results, there seems no substantive reason to regret the difference in outcome.

11. endtime ◴[] No.41237476[source]
Not AFAIK, and IIRC (at least as of this conversation, which was probably around 2010) he doesn't think cows are sapient either.
replies(1): >>41239871 #
12. throwanem ◴[] No.41239871{3}[source]
Has he met one? (I have and I still eat them, this isn't a loaded question; I would just be curious to know whether and what effect that would have on his personal ethic specifically.)