
3 points by itstomo | 1 comment

Hey HN,

If there were a pocket-sized hardware device that hosts large open-source LLMs and that you could connect to offline, wouldn't that be helpful?

The benefits:

- You can run large open-source LLMs without consuming your PC's or smartphone's compute

- You can protect your privacy, since prompts never leave your own hardware

- You can use a high-performance LLM while offline
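For what it's worth, pairing such a device with existing tooling could be simple. Here is a minimal Python sketch, assuming the device exposes an OpenAI-compatible chat endpoint on the local network (as llama.cpp's llama-server does); the device address and model name below are made up for illustration.

```python
import json
from urllib import request

# Assumption: the pocket device serves an OpenAI-compatible API at a
# local-network address, e.g. via llama.cpp's llama-server. The URL
# and model name here are hypothetical placeholders.
DEVICE_URL = "http://192.168.0.50:8080/v1/chat/completions"

def build_chat_request(prompt, model="local-model"):
    """Build the JSON body for an OpenAI-style chat completion call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }

def ask_device(prompt):
    """Send a prompt to the device; traffic stays on the local network."""
    body = json.dumps(build_chat_request(prompt)).encode()
    req = request.Request(
        DEVICE_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]
```

The point is that no cloud API key or internet connection is involved: any phone or laptop on the same network could talk to the device with a stock HTTP client.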

1. yamatokaneko No.44507312
I think it’s a very interesting approach. It could serve a niche group, likely LLM power users who are often mobile and value privacy.

Any thoughts on how small this hardware could eventually become?