Just yesterday I used an LLM to write some docs for me. For a little while I mistakenly thought the docs were fine as they were (they weren't, but I had to read them closely to see this), and it felt like, "wow, if the LLM just writes all my docs now, I'm pretty much going to forget how to write docs. Is that something I should worry about?" The LLM almost fooled me. The docs sounded good, because they documented something I was too lazy to re-familiarize myself with, hoping the LLM would just do it for me. Fortunately, the little bit of my brain that still wanted to be able to do things decided to read the docs deeply, and they were wrong. I think this "the LLM made it convincing, we're done, let's go watch TV" mentality is a big danger spot at scale.
There's an actual problem forming here, and it's that human society risks becoming idiocracy all the way down. It might be completely unavoidable. It might even be the answer to the Fermi paradox.