63 points | tejonutella | 1 comment

borisk | No.43304113
"Change is inevitable. You can’t stop the railroad as they used to say. It’s going to kill some jobs but not all of them."

I personally don't worry about jobs. AI is progressing very fast (there are a lot of smart folks working on it, a ton of money invested, and a lot of demand from businesses, governments, and individuals). Human intelligence stays the same. I think it's likely that sometime soon AI models will become more intelligent than the average human. And then more intelligent than the smartest human. And then more intelligent than the whole human race combined.

Let's say some worms 600 million years ago could think. And they had to decide: should they kill all mutants and remain forever the pinnacle of evolution as they are, or allow some of them to evolve into fish, then mammals, and eventually intelligent humans? I think we are in a similar position. We are currently the pinnacle of "creation" in the known universe. Do we want to stay that way, by blocking AGI progress? Or do we want to allow minds far greater than ours to evolve from current LLMs, at the cost of probable human extinction eventually?