> If "AI" worked (which fortunately isn't the case), humans would be degraded to passive consumers in the last domain in which they were active creators: thinking.
"AI" (depending on what you understand that to be) is already "working" for many, including myself. I've basically stopped using Google because of it.
> humans would be degraded to passive consumers in the last domain in which they were active creators: thinking
Why? I still think (I think, at least). Why would I stop thinking just because I have yet another tool in my toolbox?
> you would have to pay centralized corporations that stole all of humanity's intellectual output for engaging in your profession
Assuming we'll forever be stuck in the "mainframe" phase, then yeah. I agree that local models aren't really close to SOTA yet, but they're already useful in a few focused use cases, and judging by the speed of improvement, we won't always be stuck in this mainframe phase.
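For what it's worth, here's a minimal sketch of the kind of focused local use I mean. It assumes an Ollama server running on its default port with some small model already pulled ("llama3" here is just a placeholder, not a recommendation), so none of this depends on a centralized provider:

```python
# Minimal sketch: querying a locally-running model via Ollama's HTTP API.
# Assumes `ollama serve` is listening on the default port (11434) and a
# small model (here "llama3", purely as an example) has been pulled.
import json
import urllib.request

def ask_local_model(prompt: str, model: str = "llama3") -> str:
    payload = json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, the API returns one JSON object whose
        # "response" field holds the full completion.
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    # A focused, offline-friendly use case: a quick lookup that would
    # otherwise have gone to a search engine.
    print(ask_local_model("Explain what a Python context manager is in two sentences."))
```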
> Mediocre developers are enabled to have a 10x volume (not quality).
In my experience, which admittedly has been mostly in startups and smaller companies, this has always been the case. Most developers seem to prefer producing MORE code over BETTER code. I'm not sure why that is, but I don't think LLMs will change people's minds about it, in either direction. Shitty developers will be shit, with or without LLMs.