
LLMs can get "brain rot"

(llm-brain-rot.github.io)
466 points by tamnd
pixelmelt
Isn't this just garbage in, garbage out, with an attention-grabbing title?
philipallstar
Attention is all you need.
dormento
In case anyone missed the reference: https://arxiv.org/abs/1706.03762

> (...) We propose a new simple network architecture, the Transformer, based solely on attention mechanisms, dispensing with recurrence and convolutions entirely.
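
For anyone who wants the one-line version of what that paper proposes: the Transformer's core operation is scaled dot-product attention, which the paper defines as

\[
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
\]

where Q, K, and V are the query, key, and value matrices and d_k is the key dimension; stacking many such attention layers (with no recurrence or convolutions) is what the title's pun refers to.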