
378 points by todsacerdoti | 1 comment
OutOfHere No.44984298
Although some of the author's concerns are valid, the author seems completely biased against LLMs, which makes their arguments trashworthy. The author is not seeking any sensible middle ground, only a Luddite one.
replies(2): >>44984382 #>>44984579 #
bccdee No.44984579
The author is giving an account of his experience with LLMs. If those experiences were enough to thoroughly bias him against them, then that's hardly his fault. "Sensible middle ground" is what people appeal to when they are uncomfortable engaging with stark realities.

If someone told me that their Tesla's autopilot swerved them into a brick wall and they nearly died, I'm not going to say, "your newfound luddite bias is preventing you from seeking sensible middle ground. Surely there is no serious issue here." I'm going to say, "wow, that's fucked up. Maybe there's something deeply wrong with Tesla autopilot."

replies(4): >>44984833 #>>44984900 #>>44984920 #>>44985135 #
mdale No.44984920
The poorly named "Autopilot" is a good analogy. LLMs can definitely help with the drudgery of stop-and-go traffic with little risk, but take your eye off the road for one second when you're moving too fast and you're dead.
replies(1): >>44986572 #
OutOfHere No.44986572
It isn't, because no one dies from failing to look at the LLM's output in the next second. One is free to review it at one's own pace.