Notice the loophole: there’s no qualification of how much problem context the AI started from. Most of the problem -> code “work” would still be done by a human in that situation — even if technically 50% of the code is “AI generated” [because the human did all the hard work of generating the context necessary for those tokens, including the preceding tokens of code].
As the saying goes… lies, damned lies, and statistics.
A lot of my problems show up on lists like “25 falsehoods programmers believe about addresses”. Those assumptions were maybe acceptable in 1999, because people didn’t know better, but modeling an address with only a single street line is a problem.
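For illustration only (the field names below are my own choice, not taken from that list or any particular standard): a minimal sketch of an address record that avoids the single-street-line assumption.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class PostalAddress:
    # Multiple free-form address lines instead of one "street" field,
    # since real addresses often need a unit, building, or c/o line.
    address_lines: list[str] = field(default_factory=list)
    locality: Optional[str] = None             # city/town; not always present
    administrative_area: Optional[str] = None  # state/province/region
    postal_code: Optional[str] = None          # not all countries use one
    country_code: str = "US"                   # ISO 3166-1 alpha-2

# Example: an address that cannot fit in a single street line
addr = PostalAddress(
    address_lines=["Unit 4B", "Building 7", "1600 Example Parkway"],
    locality="Springfield",
    administrative_area="IL",
    postal_code="62701",
)
```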
I could rant, but half of it would be sales directors who have never used Salesforce shooting us in the foot.
Its customers need big enterprise features, and there isn't a lot of other competition in that space, at least not products with an ecosystem of systems integrators buzzing around them.
If you don't need all that, get HubSpot or Zoho or SugarCRM or something.
I showed him this post and he had the following to say about it:
"Unless your company is a non-profit, then anything the company does is for the purpose of profit, and everything else is subordinate to that. A 'Puritan Work Ethic' culture makes people believe work has inherent value, so expressions of shared value, cohesion, culture, etc. are done to take advantage of that and convince people to work for less. So shared values and cohesion help manage salaries and wages, but if people end up not being needed, then those aren't needed."
I don't know. If AI replaces jobs, or makes most of them "copy-paste what the AI said," what is the meaning of that?
I asked him that, and he said this:
"I guess everyone's gonna have to be blue collar now or join the military."
Infer it from the article:
“as much as 30% to 50% of the company’s work is now completed by AI”
There. That’s not nothing.
You can and should call BS on all corporate claims, but the idea that coding agents at scale don’t work, or are just total fluff, is simply wrong.
What I’m seeing is that people over 25 who like to write code, and who have spent their lives “perfecting” their environment and code-generation process, can’t stand that businesses prefer lower-quality code that’s created faster and cheaper than their “perfect” code.
Software engineers (and engineers generally) are economically closer to day laborers than to theoretical physicists, but we/they refuse to believe that.
This is why unionization matters, but you can’t unionize divas until they actually start losing jobs.
With AI, companies can build rigid analytics, tests, and benchmarks, which could be applied at scale.
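As a minimal sketch of what such an automated gate might look like, assuming a Python project checked with pytest and ruff (the tools, checks, and rejection policy here are placeholders, not a prescription):

```python
import subprocess
import sys

def run(cmd: list[str]) -> int:
    """Run one check and return its exit code."""
    print("running:", " ".join(cmd))
    return subprocess.call(cmd)

def main() -> int:
    # Hypothetical quality gate: every change, human- or AI-written,
    # must pass the test suite and a clean lint run before it merges.
    checks = [
        ["pytest", "--maxfail=1", "-q"],  # unit tests must pass
        ["ruff", "check", "."],           # static lint must be clean
    ]
    for cmd in checks:
        if run(cmd) != 0:
            return 1  # reject the change, regardless of who (or what) wrote it
    return 0

if __name__ == "__main__":
    sys.exit(main())
```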
In reality, we have a system that suppresses wages to the fullest extent possible, and it is getting more and more possible. If you would like "companies" to be like this, you'll need to join with others to build the power to make that happen. This might happen via traditional union organizing, creating alternative structures like worker co-ops to compete directly in the market with the AI slop factories, or via state-level interventions. Presumably all three tactics will be necessary (and possibly some other ones I can't think of or which haven't been invented yet), since the other side has pushed us into this spot using every tool they have access to, legal or not.
It's not the 90s anymore -- we will need to get off our asses and organize if we're going to avoid the worst futures.
Both are useful, but the former situation just makes good engineers more useful and more in demand, imho.
This also demonstrates how meaningful this claim is.