LLMs can get "brain rot"

(llm-brain-rot.github.io)
466 points | by tamnd
pixelmelt No.45657074
Isn't this just garbage in garbage out with an attention grabbing title?
replies(6): >>45657153 >>45657205 >>45657394 >>45657412 >>45657896 >>45658420
philipallstar No.45657153
Attention is all you need.
replies(2): >>45657800 >>45658232
echelon No.45657800
In today's hyper-saturated world, attention is everything:

- consumer marketing
- politics
- venture fundraising

When any system has a few power-law winners, it makes sense to grab attention.

Look at Trump and Musk and now Altman. They figured it out.

MrBeast...

Attention, even if negative, wedges you into the system and everyone's awareness. Your mousy, quiet competitors aren't even seen or acknowledged. The attention grabbers suck all the oxygen out of the room and win.

If you go back and look at any victory, was it really better solutions, or was it the fact that better solutions led to more attention?

"Look here" -> build consensus and ignore naysayers -> keep building -> feedback loop -> win

It might not just be a societal algorithm. It might be one of the universe's fundamental greedy optimization algorithms. It might underpin lots of systems, including how we ourselves as individuals think and learn.

Our pain receptors. Our own intellectual interests and hobbies. Children learning on the playground. Ant colonies. Bee swarms. The world is full of signals, and there are mechanisms which focus us on the right stimuli.

replies(4): >>45658156 >>45658531 >>45658557 >>45660567
ghurtado No.45658531
Something that it would be a good idea for you to learn just flew approximately 10 miles above your head.
replies(2): >>45659181 >>45660165
scubbo No.45659181
There were plenty of kinder ways to let someone know that they had missed a reference: https://xkcd.com/1053/