Isn't it irrelevant that students can't answer a basic quiz, though? In a real-life situation, they can just _ask an LLM_ if they need to know something.
I don't believe having this option will make people a lot less functional. Sure, some may slip through the cracks by faking it, but we'll soon develop different metrics to judge somebody's true capabilities. Actually, we'll probably create AI for that as well.
As a professional, you hire people who get things done. If that means hiring skilled LLM users who don't fully understand what they produce, but whose output works about as reliably as classic dev output does, and who deliver it in a fraction of the time... You would be crazy _not_ to hire them.
It's true that inexperienced developers will probably generate massive tech debt during the window when AI is good enough to produce code but not good enough to fish out hidden bugs. It will soon surpass humans at that skill too, though, and can then quickly clean up all the spaghetti.
Over the last two years my knowledge of how to perform and automate repetitive, predictable tasks has gradually worn away, replaced by a higher-level understanding of software architecture. I use it to guide language models to a desired outcome. For those who want to learn, LLMs excel at explaining code. For this, and plenty of other subjects, they're the greatest learning tool we have ever had! All it takes is a curious mind.
We are in a transitional time and we simply need to figure out how to deal with this new technology, warts and all. It's not like there's an alternative scenario; it's not going to go away...