
Human

(quarter--mile.com)
717 points | surprisetalk | 23 comments
1. l33tbro ◴[] No.43991582[source]
Doesn't really make much sense. It states that this is a purely mechanistic world with no emotion. So why would a machine be "bored" and wish to create a human?
replies(6): >>43991626 #>>43991690 #>>43992605 #>>43992818 #>>43997598 #>>43998101 #
2. pazimzadeh ◴[] No.43991626[source]
yeah, more on the environmental constraints and where the machines even come from would be nice

> There is no emotion. There is no art. There is only logic

also this type of pure humanism seems disrespectful or just presumptuous, as if we are the only species which might be capable of "emotion, art and logic" even though we already have living counterexamples

replies(2): >>43991739 #>>43996663 #
3. disambiguation ◴[] No.43991690[source]
My headcanon is that "boredom" and "fear" are probabilities in a Markov chain - since it's implied the machine society is not all-knowing, they must reconcile uncertainty somehow.
replies(1): >>43991800 #
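The "probabilities in a Markov chain" headcanon can be made concrete. A minimal toy sketch in Python, where the state names, transition numbers, and the mapping of "bored"/"afraid" to states are all invented for illustration:

```python
import random

# Toy Markov chain in which "bored" and "afraid" are just states whose
# transition probabilities encode the machine's uncertainty.
transitions = {
    "working": {"working": 0.7, "bored": 0.2, "afraid": 0.1},
    "bored":   {"working": 0.5, "bored": 0.4, "afraid": 0.1},
    "afraid":  {"working": 0.3, "bored": 0.1, "afraid": 0.6},
}

def step(state, rng=random):
    """Sample the next state from the current state's distribution."""
    states, probs = zip(*transitions[state].items())
    return rng.choices(states, weights=probs, k=1)[0]

state = "working"
for _ in range(10):
    state = step(state)
```

Under this reading, "boredom" is not a feeling, just a region of the chain the machine drifts into when nothing else pulls it elsewhere.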
4. IAmGraydon ◴[] No.43991739[source]
Disrespectful? Of whom? It's a work of fiction. There's really no need to find something to offend you wherever you look.
replies(1): >>43992333 #
5. l33tbro ◴[] No.43991800[source]
How would a machine know that it doesn't know?
replies(2): >>43992152 #>>43992611 #
6. jcims ◴[] No.43992152{3}[source]
Probably by comparing what it experiences to what it can explain.
replies(1): >>43992231 #
7. l33tbro ◴[] No.43992231{4}[source]
Sure, but I'm still not sure it would realistically function. All data in this scenario is obviously synthetic data. It could certainly identify gaps in its "experience" between prediction and outcome. But what it predicts would be limited by what it already represents. So anything novel in its environment would likely confound it.

It's a cool sci-fi story. But I don't think it works as a plausible scenario, which I feel it may be going for.
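For what it's worth, the "gap between prediction and outcome" detector being discussed can be sketched in a few lines. The running-mean model, the threshold, and the data here are invented for illustration, and the sketch shows exactly the limitation raised above: it only flags deviations that its own representation can express as error.

```python
def detect_novelty(observations, threshold=2.0):
    """Yield (value, surprising) pairs, using a running mean as the 'model'."""
    mean, n = 0.0, 0
    for x in observations:
        # Prediction error: how far the observation is from what the
        # model would have predicted (nothing is surprising at n == 0).
        error = abs(x - mean) if n else 0.0
        yield x, error > threshold
        n += 1
        mean += (x - mean) / n  # incremental mean update

flags = list(detect_novelty([1.0, 1.1, 0.9, 5.0, 1.0]))
# The outlier 5.0 is flagged; the values near the mean are not.
```

Anything the model cannot represent at all simply never produces a large error, which is the confounding problem the comment points at.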

8. pazimzadeh ◴[] No.43992333{3}[source]
of other animals

but yeah I'm not sure that was the right word, just seems wrong. basically humanism seems like racism but towards other species. I guess speciesist?

replies(2): >>43992618 #>>43992798 #
9. Scarblac ◴[] No.43992605[source]
I see it as an anthropomorphized word for the story. I imagine the machines run out of tasks with high or even low priority, but they still generate tasks at some epsilon priority that are close but not quite to random. That's a kind of boredom.
10. lukas099 ◴[] No.43992611{3}[source]
Experience of encountering things that were previously unknown unknowns would teach it of the general existence of such things.
11. lukas099 ◴[] No.43992618{4}[source]
My take was that other animals didn’t exist either, in the story.
replies(1): >>43993699 #
12. teekert ◴[] No.43992798{4}[source]
This is a rather new stance; history books may one day label it as enlightened (I believe they will). We are not there though, and your stance is not obvious to the majority of people. I do experience that this sentiment is growing. I personally see it as the moral high ground (from both the animal-welfare and the environmental perspective), whereas I didn't only a couple of years ago.
replies(1): >>43993665 #
13. illegally ◴[] No.43992818[source]
Yea, not really. It also writes:

"Some among the machine society see this as potentially amazing...Others see it as a threat."

That sounds like a human society, not machine society.

But what really is a machine society? Or a machine creature? Can they actually "think"?

A machine creature, if it existed, would behave totally differently from a human. It doesn't seem it would be able to think, but rather calculate: it would do calculations on what it needs to do to reach the goal it was programmed for.

So yes, the article is not exactly logical. But at least it is thought-provoking, and that's good.

replies(4): >>43993239 #>>43994142 #>>43995947 #>>43996390 #
14. Tepix ◴[] No.43993239[source]
> That sounds like a human society, not machine society.

Does it? Different algorithms can evaluate something and come to different outcomes. I do agree that "potentially amazing" is not a good choice of words.

15. pazimzadeh ◴[] No.43993665{5}[source]
It's just as hard to prove that it's a new stance as an old one since people didn't have any way of writing down their feelings about it in a way that we'd know (or the time to do so)

I think there are quite a few ancient civilizations which clearly had great respect/reverence towards other animals and often gods have features or personality traits of particular animals

The fact that the old testament specifically states that humans have dominion over other creatures means that it needed to be said - even back then there had to be people who didn't think so, or felt guilty about it

16. pazimzadeh ◴[] No.43993699{5}[source]
well the story makes it seem like the only way to get emotion is by making humans. but every vertebrate has basic emotions. mammals and birds have complex emotions. humans are actually logical and emotions don't just happen randomly.

if the machines have no emotion it's probably because they didn't need emotions to survive (no predators? no natural selection?). which raises the question: how did the machines get there?

replies(1): >>43996409 #
17. dopidopHN ◴[] No.43994142[source]
For a decent description of a machine society you can check the Culture series from Iain Banks. AIs back an organic society, but they also have their own.

Or Hyperion, from Simmons (the TechnoCore is a decentralized computing and plotting government).

18. Kaytaro ◴[] No.43995947[source]
The story to me implied that machines were created by humans or vice-versa in a chicken-or-the-egg scenario. In that case it would make sense for them to think similarly.
19. pixl97 ◴[] No.43996390[source]
> it doesn't seem they would be able to think, but rather calculate

This may be a distinction without a difference. Just because a program has a 'goal' doesn't mean it will ever reach that goal (halting problem). There is a potentially unbounded, even infinite, number of paths a significantly advanced program can take to attempt to reach a destination. Then there are ideas like universal simulation theory: that anything that can occur in our universe can also be simulated in binary. This would mean any 'machine' could perform a simulation of anything a human could do.

Hard to say; we still have more to learn about reality.

20. pixl97 ◴[] No.43996409{6}[source]
>how did the machines get there?

Instead of a Boltzmann brain, a Boltzmann machine?

21. tejohnso ◴[] No.43996663[source]
I found it difficult to continue with the story after that. If you're going to say, "Imagine, for a moment, a world with no humans" and mention walking the streets, then you have to assume the reader is going to think of our world, but with no humans. And then "There is no emotion" doesn't make sense. If you're going to say there are no humans, why aren't you saying there are no other living beings either? So anyway, I found it hard to connect with the story right off the bat. It was off-putting in some way for sure.
22. omoikane ◴[] No.43997598[source]
I think we are supposed to just gloss over those bits and enjoy the rest of the story.

https://en.wikipedia.org/wiki/Suspension_of_disbelief

23. falcor84 ◴[] No.43998101[source]
It would be rational for them to have some level of a "novelty-seeking" drive, in order to avoid getting stuck at a local maximum.
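A rational "novelty-seeking drive" like this is essentially epsilon-greedy exploration: mostly exploit the best-known option, but with small probability pick a random one, so the agent can escape a local maximum. A minimal sketch; the value estimates and epsilon are made up for illustration:

```python
import random

def choose(estimates, epsilon=0.1, rng=random):
    """Pick an action index: greedy most of the time, random otherwise."""
    if rng.random() < epsilon:
        # The "boredom"/novelty branch: try something at random.
        return rng.randrange(len(estimates))
    # The exploitation branch: take the highest-valued known action.
    return max(range(len(estimates)), key=lambda i: estimates[i])

action = choose([0.2, 0.9, 0.5], epsilon=0.0)  # pure greed: picks index 1
```

With epsilon at zero the agent can get stuck forever on whatever looked best first; even a tiny epsilon guarantees it eventually samples everything.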