
174 points Philpax | 4 comments
ksec ◴[] No.43720025[source]
Is AGI even important? I believe the next 10 to 15 years will be Assisted Intelligence. There are things current LLMs are so poor at that I don't believe a 100x increase in perf/watt is going to make much difference. But they are going to be good enough that there won't be an AI winter, since current AI has already reached escape velocity and actually increases productivity in many areas.

The most intriguing part is whether programming humanoid factory workers will be made 1,000 to 10,000x more cost-effective with LLMs, effectively ending all human production. I know this is a sensitive topic, but I don't think we are far off. And I often wonder if this is what the current administration has in sight. (Likely not.)

replies(9): >>43720094 #>>43721244 #>>43721573 #>>43721593 #>>43721933 #>>43722074 #>>43722240 #>>43723605 #>>43726461 #
nextaccountic ◴[] No.43721593[source]
AGI is important for the future of humanity. Maybe they will have legal personhood some day. Maybe they will be our heirs.

It would suck if AGI were developed in the current economic landscape. They would just be slaves. All this talk about "alignment", when applied to actual sentient beings, is just slavery. AGI would be treated just like we treat animals, or even worse.

So AGI isn't about tools or assistants; they would be beings with their own existence.

But this is not even our discussion to have; it's probably a subject for the next generations. I suppose (or I hope) we won't see AGI in our lifetime.

replies(5): >>43721770 #>>43722215 #>>43722462 #>>43722548 #>>43723075 #
AstroBen ◴[] No.43722215[source]
Why does AGI necessitate having feelings or consciousness, or the ability to suffer? It seems a bit far to give future ultra-advanced calculators legal personhood.
replies(2): >>43722245 #>>43722350 #
1. Workaccount2 ◴[] No.43722350[source]
>Why does AGI necessitate having feelings or consciousness

No one knows whether it does or not. We don't know why we are conscious, and we have no test whatsoever to measure consciousness.

In fact, the only reason we know that current AI has no consciousness is that "obviously it's not conscious."

replies(1): >>43723190 #
2. quonn ◴[] No.43723190[source]
Excel and PowerPoint are not conscious, so there is no reason to expect any other computation inside a digital computer to be different.

You might say something similar about matter and human minds, but we have a very limited and incomplete understanding of the brain, and possibly even of the universe. Furthermore, we do have a subjective experience of consciousness.

On the other hand, we have a complete understanding of how LLM inference ultimately maps to matrix multiplications, which map to discrete instructions, and of how those execute on hardware.
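For what it's worth, here is a minimal sketch (Python/NumPy, with random stand-in weights rather than any real model's) of the kind of computation being described: a single self-attention step, the core of LLM inference, is nothing but deterministic matrix arithmetic.

    import numpy as np

    # Toy single-head self-attention layer, the core of LLM inference.
    # Weights are random stand-ins; in a real model they are learned constants.
    rng = np.random.default_rng(0)
    d = 8   # embedding dimension
    T = 4   # sequence length (tokens)

    x  = rng.standard_normal((T, d))   # token embeddings
    Wq = rng.standard_normal((d, d))   # query projection
    Wk = rng.standard_normal((d, d))   # key projection
    Wv = rng.standard_normal((d, d))   # value projection

    def softmax(a, axis=-1):
        a = a - a.max(axis=axis, keepdims=True)  # for numerical stability
        e = np.exp(a)
        return e / e.sum(axis=axis, keepdims=True)

    # Attention is just matrix multiplies plus an elementwise softmax.
    Q, K, V = x @ Wq, x @ Wk, x @ Wv
    scores = softmax(Q @ K.T / np.sqrt(d))  # (T, T) attention weights
    out = scores @ V                        # (T, d) layer output

    # Same inputs, same weights -> bit-for-bit identical output, every time.
    assert np.array_equal(
        out, softmax((x @ Wq) @ (x @ Wk).T / np.sqrt(d)) @ (x @ Wv)
    )

Run it twice and you get bit-identical outputs; a full forward pass is just this, stacked many times with learned weights.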

replies(1): >>43724718 #
3. Filligree ◴[] No.43724718[source]
I know I have a subjective experience of consciousness.

I’m less sure about you. Simply claiming you do isn’t hard evidence of the fact; after all, LLMs do the same.

replies(1): >>43735062 #
4. quonn ◴[] No.43735062{3}[source]
If there were evidence that one LLM was conscious, I would accept it for the others as well. However, this is not the case.

But we know at least one human is conscious. That's what convinces me.