AI should not be a goal in itself unless you make and sell AI. Everyone else should stick to their original quality and productivity metrics. If AI can help you improve those, that's great. But don't make AI use itself a goal.
I've got a coworker who complains she's being pressured by management to use AI to write her documents. She already uses AI to review them, and according to her that works great. But they want her to use AI to write the whole thing, and she refuses, because the writing process is also how she organizes her own thinking about the content. If she outsourced that, she wouldn't be building her own mental model of the processes she's describing, and soon she'd have no idea what's going on anymore.
People often ignore the importance of such mental models. I recall a story about automated air traffic control: once the system was automated, controllers lost track in their heads of which plane was where. So the system was changed so they still had to manually hand planes off from one zone to another in an otherwise automated system, just to keep their mental models intact.
The consumers of the incident report aren't the ones who had any say in using LLMs, so they're stuck with less certainty.
This is true of all technology, and it's weird to see it all happening with AI, because it makes me wonder what other nonsense bosses have been insisting people use for no reason other than cargo culting. It seems so wild to imagine someone saying "other people are using this, so we should use it too" without that recommendation being based in any substantive way on the tool's functionality.
It's also how I use AI: summarizing or rewriting text to make it sound better, but never to create code or understand code. Nothing that requires deep understanding of the problem space.
It's the mental models in my head, which don't gel with AI, that prevent AI adoption for me.
Which is fine by management, because the intent is to fire her and have AI generate the reports. The top-down diktats for AI maximization are there to quickly figure out how much can be automated, so companies can massively scale back payroll before their competition does.