
174 points by Philpax | 2 comments
ksec No.43720025
Is AGI even important? I believe the next 10 to 15 years will be about Assisted Intelligence. There are things current LLMs are so poor at that I don't believe a 100x increase in perf/watt will make much difference. But they will be good enough that there won't be an AI Winter, since current AI has already reached escape velocity and actually increases productivity in many areas.

The most intriguing part is whether programming humanoid factory workers will become 1,000 to 10,000x more cost-effective with LLMs, effectively ending all human production work. I know this is a sensitive topic, but I don't think we are far off. And I often wonder if this is what the current administration has in sight. (Likely not.)

replies(9): >>43720094 >>43721244 >>43721573 >>43721593 >>43721933 >>43722074 >>43722240 >>43723605 >>43726461
1. phire No.43722240
Depends on what you mean by “important”. It's not like it would be a huge loss if we never invent AGI; I suspect we can reach a technological singularity even with limited AI derived from today's LLMs.

But AGI is important in the sense that it would have a huge impact on the path humanity takes, hopefully for the better.

replies(1): >>43728418
2. 9rx No.43728418
> But AGI is important in the sense that it would have a huge impact on the path humanity takes

The only difference between AI and AGI is that AI is limited in how many tasks it can carry out (narrow intelligence), while AGI can handle a much broader range of tasks (general intelligence). If, instead of one AGI that can do everything, you have many AIs that, together, can do everything, what's the practical difference?
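
To make that "many AIs that, together, can do everything" idea concrete, here is a minimal Python sketch. The task names, the stand-in functions, and the router are all hypothetical, not any existing system; the point is only that, from the caller's side, a dispatcher over specialized components behaves like one general system.

    # Illustrative only: each "narrow AI" handles exactly one kind of task.
    def translate(text):
        return f"[translated] {text}"       # stand-in for a translation model

    def summarize(text):
        return f"[summary] {text[:20]}..."  # stand-in for a summarization model

    NARROW_AIS = {
        "translate": translate,
        "summarize": summarize,
    }

    def dispatch(task, payload):
        """Route a task to whichever narrow system handles it.
        To the caller, the ensemble looks like a single general system."""
        return NARROW_AIS[task](payload)

    print(dispatch("summarize", "AGI versus an ensemble of narrow AIs ..."))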

AGI is important only in that we believe it will be easier to implement one general system than many narrow ones, which appeals to the lazy human.