I think it’s a very interesting approach. Could be for a niche group, likely power users of LLMs who are often mobile and value privacy.
Any thoughts on how small this hardware could eventually become?
If there were pocket-sized hardware that hosted large open-source LLMs and that you could connect to offline, wouldn't that be helpful?
The benefits:
- You can run large open-source LLMs without using up your PC's or smartphone's compute
- You can protect your privacy, since nothing leaves the device
- You can use a high-performance LLM offline
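On the size question, a rough memory estimate suggests what's feasible. This is a back-of-envelope sketch, not a spec for any real device: the bytes-per-weight figures are standard quantization sizes, and the 20% runtime overhead is a loose assumption.

```python
# Back-of-envelope RAM estimate for hosting a quantized open-source LLM
# on a pocket device. Model sizes and overhead factor are illustrative
# assumptions, not measurements of any specific product.

def model_memory_gb(params_billions: float, bits_per_weight: float,
                    overhead: float = 1.2) -> float:
    """Approximate RAM to hold the weights, with ~20% extra assumed
    for KV cache and runtime buffers."""
    weight_bytes = params_billions * 1e9 * (bits_per_weight / 8)
    return round(weight_bytes * overhead / 1e9, 1)

for params in (7, 13, 70):
    for bits in (16, 4):
        print(f"{params}B model @ {bits}-bit ≈ {model_memory_gb(params, bits)} GB")
```

By this estimate a 7B model at 4-bit quantization needs only a few GB, which is already phone-sized, while a 70B model at 4-bit needs on the order of 40 GB, so "pocket-sized" for truly large models hinges on how dense memory gets.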