Human

(quarter--mile.com)
717 points by surprisetalk | 1 comment
dan-robertson No.43994459
Perhaps I am unimaginative about whatever AGI might be, but it so often feels to me like predictions are more based on sci-fi than observation. The theorized AI is some anthropomorphization of a 1960s mainframe: you tell it what to do and it executes that exactly with precise logic and no understanding of nuance or ambiguity. Maybe it is evil. The SOTA in AI at the moment is very good at nuance and ambiguity but sometimes does things that are nonsensical. I think there should be less planning around something super-logical.
mr_toad No.43994741
Sci-fi is quite ridiculous when it describes a cold, logical machine and then, on the next page, describes its malign intentions. Pick a lane.
vinceguidry No.44007061
I enjoyed the Culture / WH40K fanfic, unfinished as it is. Its take on the Culture / Necron negotiations was hilarious: essentially, the Necrons are machine intelligences with degraded introspective ability and are therefore unable to negotiate effectively. Every negotiation breaks down into demands and threats they clearly can't deliver on. The Culture eventually works around this limitation through hints and intimations and effects a technology trade.

https://archiveofourown.org/works/649448/chapters/1329953

Feelings, or some other way of understanding the self and what it wants, are apparently required to operate effectively as an agent.