
197 points by baylearn | 1 comment
Animats ◴[] No.44474788[source]
"A disturbing amount of effort goes into making AI tools engaging rather than useful or productive."

Right. It worked for social media monetization.

"... hallucinations ..."

The elephant in the room. Until that problem is solved, AI systems can't be trusted to do anything on their own. The solution the AI industry has settled on is to make hallucinations an externality, like pollution. They're fine as long as someone else pays for the mistakes.

LLMs have a similar problem to Level 2-3 self-driving cars. They sort of do the right thing, but a human has to be poised to quickly take over at all times. It took Waymo a decade to get over that hump and reach Level 4, but they did it.

replies(3): >>44474981 #>>44475475 #>>44475555 #
nunez ◴[] No.44475475[source]
Waymo "did it" in very controlled environments, not in general. They're still a ways away from solving self-driving in the general case.
replies(2): >>44475572 #>>44475630 #
Animats ◴[] No.44475572[source]
Los Angeles and San Francisco are not "very controlled environments".