
337 points | mooreds | 1 comment
schnitzelstoat No.44487821
Honestly, I think LLMs are a distraction from AGI. It seems to me that the path to AGI will more likely run through some sort of reinforcement learning approach.

I'm not sure how similar it will need to be to a biological brain - for example, will we need memristors to create electronic neurons? Or will it be like flight, where the old ornithopters that tried to mimic the flight of birds failed miserably, and an entirely different approach ultimately succeeded?

replies(1): >>44491050
1. tim333 No.44491050
LLMs seem quite similar to the part of the human brain that lets you speak quickly without thinking. They don't do the thinking and learning that brains do well, though. Something needs to be changed or added on, I guess.