
169 points by mattmarcus | 2 comments
EncomLab ◴[] No.43612568[source]
This is like claiming a photoresistor-controlled night light "understands when it is dark" or that a bimetallic-strip thermostat "understands temperature". You can say those words, and they're syntactically correct, but they're entirely incorrect semantically.
replies(6): >>43612607 #>>43612629 #>>43612689 #>>43612691 #>>43612764 #>>43612767 #
1. aSanchezStern ◴[] No.43612607[source]
The post includes this caveat. Depending on your philosophical position on sentience, you might say that LLMs can't possibly "understand" anything, and the post isn't trying to have that argument. But to the extent that an LLM can "understand" anything, you can study its understanding of nullability.
replies(1): >>43614080 #
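[ed.: For readers curious what "studying its understanding of nullability" could look like in practice, here is a minimal behavioral-probe sketch. It is not the linked post's methodology; the OpenAI chat API, the model name (gpt-4o-mini), the prompt wording, and the toy snippets are all illustrative assumptions.]

    # Minimal sketch of a behavioral probe for "does the model track nullability?"
    # Assumptions not taken from the linked post: the OpenAI chat API, the model
    # name, the prompt wording, and the toy snippets are illustrative choices only.
    from openai import OpenAI

    client = OpenAI()  # expects OPENAI_API_KEY in the environment

    # (function source, variable to ask about, expected yes/no answer)
    CASES = [
        ("def f(flag):\n    x = None\n    if flag:\n        x = 1\n    return x", "x", "yes"),
        ("def f(flag):\n    x = 0\n    if flag:\n        x = 1\n    return x", "x", "no"),
    ]

    def ask_nullable(code: str, var: str) -> str:
        prompt = (
            "Consider this Python function:\n\n"
            f"{code}\n\n"
            f"Can `{var}` be None at the return statement? Answer only 'yes' or 'no'."
        )
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # assumed model; any chat model fits the sketch
            messages=[{"role": "user", "content": prompt}],
            temperature=0,
        )
        return resp.choices[0].message.content.strip().lower()

    if __name__ == "__main__":
        correct = 0
        for code, var, expected in CASES:
            answer = ask_nullable(code, var)
            correct += answer.startswith(expected)
            print(f"expected {expected!r}, got {answer!r}")
        print(f"agreement with a simple dataflow analysis: {correct}/{len(CASES)}")

Measured over many such snippets, agreement with a static nullability analysis gives a purely behavioral notion of "understanding", without taking any stance on sentience.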
2. keybored ◴[] No.43614080[source]
Whether people use “understand” for machines in science isn’t decided by whether any given person believes in the sentience of machines. That would be a weird catering to panpsychism.