
765 points | MindBreaker2605 | 1 comment
lm28469 ◴[] No.45897524[source]
But wait they're just about to get AGI why would he leave???
replies(1): >>45897571 #
killerstorm ◴[] No.45897571[source]
LeCun always said that LLMs do not lead to AGI.
replies(2): >>45897613 #>>45897683 #
consumer451 ◴[] No.45897613[source]
Can anyone explain to me the non-$$ logic for one working towards AGI, aside from misanthropy?

The only other thing I can imagine is not very charitable: intellectual greed.

It can't just be that, can it? I genuinely don't understand. I would love to be educated.

replies(7): >>45897658 #>>45897906 #>>45898328 #>>45898355 #>>45898667 #>>45899342 #>>45899621 #
tedsanders ◴[] No.45897658[source]
I'm working toward AGI. I hope AGI can be used to automate work and make life easier for people.
replies(4): >>45897687 #>>45897826 #>>45897871 #>>45899287 #
consumer451 ◴[] No.45897687[source]
Who’s gonna pay for that inference?

It’s going to take money, what if your AGI has some tax policy ideas that are different from the inference owners?

Why would they let that AGI out into the wild?

Let’s say you create AGI. How long will it take for society to recover? How long will it take for people of a certain tax ideology to finally say oh OK, UBI maybe?

The last part is my main question. How long do you think it would take our civilization to recover from the introduction of AGI?

Edit: sama gets a lot of shit, but I have to admit at least he used to work on the UBI problem, orb and all. However, those days seem very long gone from the outside, at least.

replies(3): >>45898336 #>>45900951 #>>45905114 #
Arkhaine_kupo ◴[] No.45898336[source]
I am not someone working on AGI but I think a lot of people work backwards from the expected outcome.

Expected outcome is usually something like a Post-Scarcity society, this is a society where basic needs are all covered.

If we could all live in a future with a free house, a robot that does our chores, and food that is never scarce, then we should work towards that, they believe.

The intermediate steps aren't thought out, in the same way that, for example, the Communist Manifesto does little to explain the transition from capitalism to communism. It simply says there will be a need for things like forcing the bourgeoisie to join the common workers, and that there will be a transition phase, but it gives no clear steps between either system.

Similarly, many AGI proponents think in terms of "wouldn't it be cool if there was an AI that did all the bits of life we don't like doing", without the systemic analysis that many people do those bits because they need money to eat, for example.