With more and more personal AI, I think having a truly private device that can run large LLMs (and remember: larger models generally perform better) would be fantastic!
Ideally, we could configure things like Apple Intelligence to use it instead of OpenAI or Apple's cloud.