
378 points todsacerdoti | 2 comments | source
aeon_ai ◴[] No.44984252[source]
AI is a change management problem.

Using it well requires a competent team, working together with trust and transparency, to build processes that effectively balance human guidance and expertise with what LLMs are good at. Small teams are doing very big things with it.

Most organizations, especially large organizations, are so far away from a healthy culture that AI is amplifying the impact of that toxicity.

Executives who interpret "Story Points" as "how much time is that going to take" are asking why everything isn't half a point now. They're so far removed from the process of building maintainable and effective software that they're simply looking for AI to serve as a direct pass-through to the bottom line.

The recent study showing that 95% of AI pilots failed to deliver ROI is a case study in the ineffectiveness of modern management to actually do their jobs.

replies(8): >>44984371 #>>44984602 #>>44984660 #>>44984777 #>>44984897 #>>44986307 #>>44989493 #>>44995318 #
1. deepburner ◴[] No.44984897[source]
I'm rather tired of this AI apologism bit where every downside is explained away as "it would've happened anyways". AI destroying people's brains and causing psychosis? They would've gone psychotic anyways! AI causing company culture problems? The company was toxic anyways!

Instruments are not as blameless as you think they are.

replies(1): >>44985810 #
2. anuramat ◴[] No.44985810[source]
What's your point? That we should be more careful with it? This is the "trial and error" part.