
321 points | jhunter1016 | 1 comment
twoodfin No.41878632
Stay for the end and the hilarious idea that OpenAI’s board could declare one day that they’ve created AGI simply to weasel out of their contract with Microsoft.
replies(4): >>41878980 #>>41878982 #>>41880653 #>>41880775 #
fragmede No.41880653
The question is how rigorously AGI is defined in their contract. Given that AGI is such a nebulous concept of smartness, reasoning ability, and thinking, how are they going to declare when it has or hasn't been achieved? What stops Microsoft from weaseling out of the contract by saying they never reached it?
replies(2): >>41880701 #>>41880868 #
JacobThreeThree No.41880868
OpenAI's short definition of AGI is:

A highly autonomous system that outperforms humans at most economically valuable work.

replies(4): >>41881028 #>>41881206 #>>41881215 #>>41882567 #
JumbledHeap No.41881028
Will AGI be able to stock a grocery store shelf?
replies(2): >>41881208 #>>41881601 #
theptip No.41881208
Sometimes it is more narrowly scoped as “… economically valuable knowledge work”.

But sure, if you have an unembodied superhuman AGI, you should assume that it can figure out a superhuman shelf-stocking robot shortly thereafter. We have Atlas already.