LLMs can get "brain rot"

(llm-brain-rot.github.io)
466 points | tamnd | 1 comment
1. killshotroxs No.45657901
If only I got money every time my LLM kept looping its answers and telling me things I didn't even need. Just recently, I was stuck going in circles with its answers, all while it couldn't even detect simple syntax errors...