
623 points magicalhippo | 1 comment | HN request time: 0.479s | source
blackoil ◴[] No.42620569[source]
Is there any effort in local cloud computing? I can't justify $3000 for a fun device. But if all devices (6 phone, 2 iPads, a desktop and 2 laptops) in my home can leverage that for fast LLM, gaming, and photo/video editing, now it makes so much more sense.
replies(5): >>42620586 #>>42620810 #>>42620877 #>>42621768 #>>42621901 #
1. KeplerBoy ◴[] No.42620586[source]
You can just set up your own local OpenAI-like API endpoints for LLMs. Most devices and apps won't be able to use them, because consumers don't run self-hosted apps, but for a simple ChatGPT-style app this is totally viable. Today.
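As a rough sketch of what that looks like: local servers such as Ollama or llama.cpp's server expose an OpenAI-compatible `/v1/chat/completions` endpoint, so any device on the LAN can talk to it with plain HTTP. The base URL and model name below are assumptions (Ollama's default port is 11434); adjust for whatever server you run.

```python
import json
import urllib.request

# Assumed base URL: Ollama's OpenAI-compatible endpoint on the host machine.
# Replace with your server's LAN address, e.g. "http://192.168.1.10:11434/v1".
BASE_URL = "http://localhost:11434/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def chat(model: str, prompt: str) -> str:
    """POST the payload to the local endpoint and return the reply text."""
    payload = build_chat_request(model, prompt)
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # Same response shape as the hosted OpenAI API.
    return body["choices"][0]["message"]["content"]
```

Because the request and response shapes match the hosted API, any client library that lets you override the base URL (the official OpenAI SDKs do) can point at this endpoint instead, e.g. `chat("llama3", "Say hello")` with a model you have pulled locally.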