
1901 points | l2silver | 1 comment

Maybe you've created your own AR program for wearables that shows the definition of a word when you highlight it IRL, or you've built a personal calendar app for your family to display on a monitor in the kitchen. Whatever it is, I'd love to hear it.
cptaj No.35740766
This is exactly the type of shit I see benevolent AGI doing for us
replies(5): >>35740864, >>35742060, >>35742293, >>35742410, >>35744076
hammyhavoc No.35740864
I'm not convinced the costs would make that a viable use of resources versus just making an appropriate product, or using something that already exists like Spotify playlists. Even an LLM is expensive to keep running.
replies(3): >>35741414, >>35741485, >>35742546
bbarnett No.35742546
> Even an LLM is expensive to keep running.

In a decade, it'll cost pennies a year.

replies(3): >>35742777, >>35743277, >>35743561
jacobr1 No.35743561
Isn't it already relatively cheap to run? Training is costly, but there are examples of running LLaMA on your laptop. It doesn't seem like it will take decades to commoditize this stuff; it's just that the cutting edge might keep moving ahead.
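
For anyone curious, here's roughly what "LLaMA on your laptop" looks like today. This is just a sketch assuming the llama-cpp-python bindings and a quantized GGUF model you've already downloaded; the model path and prompt are made up:

    # Rough sketch: local inference with llama-cpp-python (pip install llama-cpp-python).
    # The model path is hypothetical -- point it at whatever quantized model you have.
    from llama_cpp import Llama

    llm = Llama(model_path="./models/llama-7b.Q4_K_M.gguf", n_ctx=2048)

    out = llm(
        "Q: What song should I play for a rainy afternoon? A:",
        max_tokens=64,
        stop=["Q:"],
    )
    print(out["choices"][0]["text"])

No GPU cluster required, just a reasonably modern laptop and some patience per token.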
replies(1): >>35744090
hammyhavoc No.35744090
Not relative to a permanent, much simpler solution that already exists in the form of the source code for the original radio project mentioned in the OP.

I'll give you an example: fabricating an ASIC is expensive. Using FPGAs is cheaper if the potential sales are low, but they're less performant.
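
To put numbers on that trade-off (entirely made-up figures, purely to show the shape of the comparison), the break-even is just the one-time tooling cost divided by the per-unit saving:

    # Hypothetical figures -- the point is the break-even math, not the numbers.
    asic_nre = 2_000_000      # one-time mask/tooling cost ($)
    asic_unit = 5             # per-chip cost at volume ($)
    fpga_unit = 50            # per-board FPGA cost, no NRE ($)

    break_even_units = asic_nre / (fpga_unit - asic_unit)
    print(f"The ASIC only wins above ~{break_even_units:,.0f} units")
    # Below that volume the FPGA is cheaper overall, despite worse performance.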

If a hypothetical AGI a decade from now can do the radio gimmick, but doing so incurs an ongoing cost, and the gimmick has wide appeal, then it makes more sense to build a simple dedicated utility.
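
Same logic applies here, again with entirely hypothetical numbers, comparing a one-time simple utility against the recurring cost of keeping a model answering requests:

    # Entirely hypothetical costs, just to show why the ongoing spend loses.
    one_time_utility = 200    # build the simple script once ($)
    llm_monthly = 30          # keep a hosted model running ($/month)

    for years in (1, 5, 10):
        ongoing = llm_monthly * 12 * years
        print(f"{years:>2} yr: utility ${one_time_utility} vs ongoing ${ongoing}")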

Better yet, the simple utility already exists and doesn't need a hypothetical "benevolent AGI". It doesn't even need an LLM. It's here today.

This entire sub-thread went off at a tangent, trying to shoehorn AI into somewhere it has no place being, just like the fetishizing of blockchain and the attempts to shoehorn it in everywhere a database would be cheaper, more flexible, and more performant.

A hypothetical "benevolent AGI" is going to be vastly larger in scale than an LLM, and thus much more expensive. You won't be running one on a laptop. We may not even have enough compute globally for a hypothetical "benevolent AGI".