This is exactly the type of shit I see benevolent AGI doing for us
replies(5):
You're invoking LLMs, but the original comment invoked "benevolent AGI." Don't conflate a hypothetical AGI with an existing LLM. Anything at the scale required to create a hypothetical AGI is going to be expensive. Period.
Is grandma really going to use a hypothetical AGI any better than she's able to use Spotify? Come on.