Stay for the end and the hilarious idea that OpenAI’s board could declare one day that they’ve created AGI simply to weasel out of their contract with Microsoft.
replies(4):
I remember all the hype OpenAI generated before the release of GPT-2 or something, where they were so afraid, ooh so afraid, to release the model, and now it's a non-issue. It's all just marketing gimmicks.
Either the next tokens can include "this question can't be answered", "I don't know", and the like, in which case there is no omniscience.
Or the next tokens must contain answers that never go to the meta level, but only pick one of the potential direct answers to a question. Then the halting problem prevents finite-time omniscience (which, from the perspective of finite beings, is all omniscience there is).
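For anyone who hasn't seen it, the halting-problem obstacle mentioned above is just the classic diagonalization argument. Here's a minimal sketch of it in Python (the names `make_paradox`, `always_says_halts`, etc. are made up for illustration): any concrete "does this program halt?" guesser can be fed a program built from the guesser itself, and the guesser is guaranteed to be wrong about that program.

```python
def make_paradox(halts):
    """Build the diagonal program for a claimed halting oracle `halts`.

    `halts(f)` is supposed to answer: does calling f() ever return?
    The constructed program does the opposite of whatever `halts` predicts.
    """
    def paradox():
        if halts(paradox):
            while True:      # oracle said "halts", so loop forever
                pass
        # oracle said "loops forever", so halt immediately
    return paradox

# Any fixed guesser fails on its own diagonal:
def always_says_halts(f):
    return True

p = make_paradox(always_says_halts)
# always_says_halts(p) is True, but actually running p() would loop forever,
# so the oracle is wrong about p. (We don't call p() here, obviously.)

def always_says_loops(f):
    return False

q = make_paradox(always_says_loops)
q()  # returns immediately, contradicting the oracle's "loops forever" verdict
```

The same trap closes on any fancier guesser, including a next-token predictor forced to emit a direct yes/no answer: whatever it predicts, the diagonal program does the opposite.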