We certainly improve productivity, but that is not necessarily good for humanity. It could even make things worse.
For example: my company already expects some tasks to take less time because they _know_ I'll probably use AI for them. So if the metric is "labour", I can handle more context in a given week, but you end up with your brain completely melted.
I think this is really still up for debate.
We certainly produce more output, but if it's overall lower quality than our previous output, is that really "improved productivity"?
There has to be a tipping point somewhere, where faster output of low-quality work actually decreases productivity because of the effort now required to keep the tower of garbage from toppling.
"AI bubble seems close to collapsing" in response to an article about AI being used as a study aid. Does not seem relevant to the actual content of the post at all, and you do not provide any proof or explanation for this statement.
"God knows how many billions have been invested", I am pretty sure it's actually not that difficult to figure out how much investor money has been poured into AI, and this still seems totally irrelevant to a blog post about AI being used as a study aid. Humans 'pour' billions of dollars into all sorts of things, some of which don't work out. What's the suggestion here, that all the money was wasted? Do you have evidence of that?
"We still don't have an actual use case for AI which is good for humanity"... What? We have a lot of use cases for AI, some of which are good for humanity. Like, perhaps, as a study aid.
Are you just typing random sentences into the HN comment box every time you are triggered by the mention of AI? Your post is nonsense.
Does my opinion count?
This frees you up to work on the crunchy unsolved problems.
Programmer here. The answer is 100% no. The programmers who think they're saving time are racking up debts they'll pay later.
The debts will come due when they find they've learned nothing about a problem space and failed to become experts in it, despite having "written" the code and despite owning the feature that deals with it.
Or they'll come due as their failure to hone their skills in technical problem solving catches up to them.
Or they'll come due when they have to fix a bug that the LLM produced: either they'll have no idea how, or they'll manage to fix it but then have to explain, to a manager or customer, that they committed code to the codebase that they didn't understand.