We certainly improve productivity, but that is not necessarily good for humanity. It could even make things worse.
E.g.: my company already expects some tasks to take less time because they _know_ I'll probably use AI for them. Which means I can humanly handle more work in a given week if the metric is "labour", but you end up with your brain completely melted.
I think this is really still up for debate.
We certainly produce more output, but if it's overall lower quality than what we produced before, is that really "improved productivity"?
There has to be a tipping point somewhere, where faster output of low-quality work actually decreases productivity because of the effort now required to keep the tower of garbage from toppling.
Does my opinion count?
Programmer here. The answer is 100% no. The programmers who think they're saving time are racking up debts they'll pay later.
The debts will come due when they find they've learned nothing about a problem space and failed to become experts in it, despite having "written" the code and despite owning the feature that deals with it.
Or they'll come due as their failure to hone their technical problem-solving skills catches up with them.
Or they'll come due when they have to fix a bug the LLM produced: either they'll have no idea how, or they'll manage to fix it but then have to explain, to a manager or customer, that they committed code to the codebase that they didn't understand.