
317 points diwank | 2 comments
nox101 No.41856751
I've heard through the grapevine that Apple is having trouble getting Apple Intelligence to stop giving lots of bad or wrong advice/suggestions/etc. (same as most LLMs).

It would be amazing to me (as in "get out the popcorn") if Apple decided not to ship Apple Intelligence and came out with a public statement saying LLM tech is not ready or is a dead end, effectively implying that other LLM companies are selling snake oil.

https://arstechnica.com/ai/2024/10/llms-cant-perform-genuine...

replies(2): >>41856768 #>>41856803 #
1. BillyTheKing No.41856768
People just have less tolerance for errors when using Apple products. That doesn't mean other LLM companies sell snake oil; people simply have a higher error tolerance for those companies' products and are learning to work 'with' those LLMs.
replies(1): >>41909054 #
2. fennecbutt No.41909054
No they don't. They have more tolerance for errors, imo, because they've been told "it just works" so many times that when it doesn't work, their brain just skips over it. So many of my iPhone friends point at some problem my Sammy might have as if it's broken all the time, yet completely ignore when their own phones have issues or glaring design flaws, like the lack of a consistent back button, or a major OS feature in 2024 being "you can have a custom lock screen and home screen wallpaper", lmao