
Local-first software (2019)

(www.inkandswitch.com)
869 points by gasull | 1 comment
jumploops ◴[] No.44473619[source]
One thing I’m personally excited about is the democratization of software via LLMs.

Unfortunately, if you go to ChatGPT and ask it to build a website/app, it immediately points an unknowing user towards a bunch of cloud-based tools like Fly.io, Firebase, Supabase, etc.

Getting a user to install a local DB and a service to run their app (god forbid, updating said service) is complex even for developers (hence the prevalence of containers). A concrete sketch of what that setup boils down to is below.
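
For concreteness, the "local DB and a service" being described can be as small as a single script. Here is a minimal sketch using only Python's standard library (the file name notes.db and port 8000 are illustrative, not from the comment):

    # minimal_local_app.py - a sketch of a "local DB + service" app:
    # a SQLite file as the database, a small HTTP server as the service.
    import json
    import sqlite3
    from http.server import BaseHTTPRequestHandler, HTTPServer

    DB_PATH = "notes.db"  # local database file, lives next to the script

    def init_db() -> None:
        with sqlite3.connect(DB_PATH) as conn:
            conn.execute(
                "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, body TEXT)"
            )

    class NotesHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # Return all stored notes as JSON.
            with sqlite3.connect(DB_PATH) as conn:
                rows = conn.execute("SELECT id, body FROM notes").fetchall()
            payload = json.dumps(
                [{"id": r[0], "body": r[1]} for r in rows]
            ).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(payload)

        def do_POST(self):
            # Store the raw request body as a new note.
            length = int(self.headers.get("Content-Length", 0))
            body = self.rfile.read(length).decode()
            with sqlite3.connect(DB_PATH) as conn:
                conn.execute("INSERT INTO notes (body) VALUES (?)", (body,))
            self.send_response(201)
            self.end_headers()

    if __name__ == "__main__":
        init_db()
        # Bind to localhost only: the app and its data never leave the machine.
        HTTPServer(("127.0.0.1", 8000), NotesHandler).serve_forever()

Even this toy version carries the friction the comment alludes to: the user needs Python installed, has to keep the process running, and has to handle schema changes when the app updates, which is exactly what containers are usually reached for.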

It will take some time (i.e. pre-training runs), but this is a future I believe is worth fighting for.

replies(2): >>44473639 #>>44475883 #
1. moffkalast ◴[] No.44473639[source]
Local LLMs are even more amazing in concept: all of the world's knowledge, and someone to guide you through learning it, needing nothing but electricity (and a hilariously expensive inference rig) to run.

I would be surprised if, a decade from now, we don't have local models that are an order of magnitude better than current cloud offerings while being smaller and faster, along with affordable ASICs to run them. That'll be the first real challenger to the internet's current position as "the" place for everything. The more the web gets enshittified, commercialized, and ad-ridden, the more people will flock to that sort of option.