
175 points chilipepperhott | 1 comment
mkagenius ◴[] No.44474681[source]
Three clear advantages of local-first software:

1. No network latency; you do not have to send anything across the Atlantic.

2. You get privacy.

3. It's free; you do not need to pay any SaaS business.

An additional one would be that scale is built in. Every person has their own setup, so one central agency doesn't have to take care of all of them.

replies(6): >>44474744 #>>44474793 #>>44474889 #>>44474937 #>>44475878 #>>44516447 #
echelon ◴[] No.44474744[source]
This entire paradigm gets turned on its head with AI. I tried to do this with purely local compute, and it's a bad play. We don't have good edge compute yet.

1. A lot of good models require an amount of VRAM that is only present in data center GPUs.

2. For models which can run locally (Flux, etc.), you get dramatically different performance between top-of-the-line cards and older GPUs. Then you have to serve different models with different sampling techniques to different hardware classes.

3. GPU hardware is expensive and most consumers don't have GPUs. You'll severely limit your TAM if you require a GPU.

4. Mac support is horrible, which alienates half of your potential customers.

It's best to follow the Cursor model where the data center is a necessary evil and the local software is an adapter and visualizer of the local file system.
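To make point 2 concrete, here is a minimal sketch of the hardware-class dispatch problem: picking a model variant and sampling settings based on detected VRAM. All model names and thresholds are hypothetical, for illustration only.

```python
# Hypothetical dispatch: map available VRAM to a model tier.
# Tiers, names, and cutoffs are made up for illustration.

def pick_config(vram_gb: float) -> dict:
    """Choose a (hypothetical) model variant for the detected hardware class."""
    if vram_gb >= 24:      # data-center or top-of-the-line consumer cards
        return {"model": "flux-dev", "steps": 50, "precision": "bf16"}
    if vram_gb >= 12:      # mid-range consumer GPUs
        return {"model": "flux-schnell", "steps": 4, "precision": "fp16"}
    if vram_gb >= 6:       # older cards: quantized weights, fewer steps
        return {"model": "flux-schnell-q8", "steps": 4, "precision": "int8"}
    return {"model": None} # no usable GPU: fall back to a remote API

print(pick_config(8)["model"])   # → flux-schnell-q8
```

Every branch here is a separate model you have to ship, test, and keep in sync, which is the maintenance cost the comment is pointing at.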

replies(2): >>44474864 #>>44475280 #
cyberax ◴[] No.44475280[source]
> This entire paradigm gets turned on its head with AI. I tried to do this with purely local compute, and it's a bad play. We don't have good edge compute yet.

Your TV likely has a good enough CPU to run a decent model for home automation. And a game console most definitely does.

I'd love to see a protocol that would allow devices to upload a model to a computer and then let it sleep until a command is received. Current AI models are really self-contained; they don't need complicated infrastructure to run them.
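The proposed protocol could be sketched roughly like this: a device registers its self-contained model with a host once, and the host blocks (effectively sleeps) until a command arrives. Nothing like this is standardized; every name and the queue-based wakeup below are assumptions for illustration.

```python
# Hypothetical "upload model, sleep until commanded" host.
import queue

class ModelHost:
    def __init__(self):
        self.models = {}            # device_id -> model payload (weights, config)
        self.commands = queue.Queue()

    def upload(self, device_id: str, model_blob: bytes):
        """Device pushes its self-contained model once, up front."""
        self.models[device_id] = model_blob

    def submit(self, device_id: str, command: str):
        """A command wakes the host; until then it does no work."""
        self.commands.put((device_id, command))

    def serve_once(self) -> str:
        # Queue.get() blocks until a command arrives -- the "sleep" phase.
        device_id, command = self.commands.get()
        model = self.models[device_id]
        return f"ran {len(model)}-byte model for {device_id}: {command}"

host = ModelHost()
host.upload("thermostat", b"\x00" * 1024)
host.submit("thermostat", "set 21C")
print(host.serve_once())   # → ran 1024-byte model for thermostat: set 21C
```

A real version would need transport, authentication, and a wire format for the model blob, but the core loop really is this small, which is the comment's point about models being self-contained.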