I'll give you an example: fabricating an ASIC has a huge up-front cost. Using FPGAs is cheaper if potential sales volumes are low, but they're less performant per unit.
If a hypothetical AGI a decade from now can do the radio gimmick, but doing so incurs an ongoing cost while the feature has wide appeal, it makes more sense to build it as a simple utility.
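For what it's worth, the same break-even logic covers both comparisons. Here's a minimal sketch in Python with made-up numbers (the NRE and unit costs are purely illustrative, not real quotes): the big up-front investment only pays off past a certain volume.

```python
# Rough break-even sketch: when does the big up-front option
# (ASIC mask set / hypothetical AGI) beat the cheap-per-unit
# option (FPGA / a simple utility)? All figures are assumptions.

ASIC_NRE = 2_000_000      # assumed one-off fabrication cost (USD)
ASIC_UNIT_COST = 5        # assumed per-chip cost once fabbed
FPGA_UNIT_COST = 50       # assumed per-unit FPGA cost, no NRE

def total_cost(units: int, nre: float, unit_cost: float) -> float:
    """Total cost of shipping `units` devices."""
    return nre + units * unit_cost

# Break-even volume: the ASIC only wins above this many units.
break_even = ASIC_NRE / (FPGA_UNIT_COST - ASIC_UNIT_COST)
print(f"ASIC pays off beyond ~{break_even:,.0f} units")

for units in (1_000, 10_000, 100_000):
    asic = total_cost(units, ASIC_NRE, ASIC_UNIT_COST)
    fpga = total_cost(units, 0, FPGA_UNIT_COST)
    print(f"{units:>7} units: ASIC ${asic:>12,.0f} vs FPGA ${fpga:>12,.0f}")
```

With those assumed numbers the crossover is around 44,000 units; below that, the cheap simple option wins every time.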
Better yet, the simple utility already exists and doesn't need a hypothetical "benevolent AGI". It doesn't even need an LLM. It's here today.
This entire sub-thread went off on a tangent, trying to shoehorn AI into places it has no business being, just like the fetishizing of blockchain and cramming it into every niche where a plain database would be cheaper, more flexible and more performant.
A hypothetical "benevolent AGI" is going to be incredibly larger in scale than an LLM, thus much more expensive. You won't be running one on a laptop. We may not even have enough compute globally for a hypothetical "benevolent AGI".