
317 points | laserduck | 1 comment
klabb3 ◴[] No.42157457[source]
I don’t mind LLMs in the ideation and learning phases, which aren’t reproducible anyway. But I still find it hard to believe engineers of all people are eager to put a slow, expensive, non-deterministic black box right at the core of extremely complex systems that need to be reliable, inspectable, understandable…
replies(6): >>42157615 #>>42157652 #>>42158074 #>>42162081 #>>42166294 #>>42167109 #
childintime ◴[] No.42157615[source]
You mean, like humans have been for many decades now.

Edit: I believe that LLMs are eminently useful to replace experts (of all people) 90% of the time.

replies(5): >>42157661 #>>42157674 #>>42157685 #>>42157904 #>>42158502 #
1. lxgr ◴[] No.42158502[source]
Experts of the kind who can talk for hours about the academic consensus on the status quo without once considering how the question at hand might challenge it? Quite likely.

Experts capable of critical thinking and reflecting on evidence that contradicts their world model (and thereby retraining it on the fly)? Most likely not, at least not in their current architecture with all its limitations.