Unfortunately, the physics of this work out so that every playlist generated is composed of Nickelback and “Sweet Caroline” covers.
You're invoking LLMs, but "benevolent AGI" was what got invoked originally. Don't conflate a hypothetical AGI with an existing LLM. Anything of the scale required to create a hypothetical AGI is going to be expensive. Period.
Is grandma really going to use a hypothetical AGI any better than she's able to use Spotify? Come on.
Conversely, it wouldn't make a lot of sense to predict that it will always be as expensive as it is today.
Well, I guess "pennies" is a radical prediction. Cheap, anyway.
1. Never made it as a poor man
2. Never made it as a blind man stealing
3. This is how I remind you, that I’m really MOND.
Sometimes the answer is just staring you in the face. A Canadian face that should have been from San Antonio, TX.
I'll give you an example: fabricating an ASIC is expensive. Using FPGAs is cheaper if the potential sales are low, but they're less performant.
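The trade-off is just a break-even calculation: high non-recurring engineering (NRE) cost plus low per-unit cost versus low NRE plus high per-unit cost. A minimal sketch, with made-up illustrative numbers (the $2M NRE and per-unit figures are hypothetical, not real fab quotes):

```python
import math

def break_even_units(nre_a: float, unit_a: float,
                     nre_b: float, unit_b: float) -> int:
    """Smallest sales volume at which option A (high NRE, cheap units)
    beats option B (low NRE, expensive units).

    Solves: nre_a + n * unit_a < nre_b + n * unit_b for n.
    Assumes unit_b > unit_a and nre_a > nre_b, else the answer is trivial.
    """
    return math.ceil((nre_a - nre_b) / (unit_b - unit_a))

# Hypothetical numbers: ASIC with $2M NRE at $5/unit
# vs. FPGA with no NRE at $50/unit.
n = break_even_units(2_000_000, 5, 0, 50)
print(n)  # below this volume, the FPGA is the cheaper choice
```

Below the break-even volume, the FPGA wins despite its worse per-unit economics, which is the whole point: expected sales volume, not raw capability, decides the build.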
If a hypothetical AGI a decade from now can do the radio gimmick, but it incurs an ongoing compute cost, and the feature has wide appeal, it makes more sense to build a simple utility instead.
Better yet, the simple utility already exists and doesn't need a hypothetical "benevolent AGI". It doesn't even need an LLM. It's here today.
This entire sub-thread went off on a tangent of trying to shoehorn AI into somewhere it has no place being, just like the fetishizing of blockchain, shoehorning it in everywhere a database would be cheaper, more flexible, and more performant.
A hypothetical "benevolent AGI" would be vastly larger in scale than an LLM, and thus much more expensive. You won't be running one on a laptop. We may not even have enough compute globally for a hypothetical "benevolent AGI".