
118 points soraminazuki | 1 comment
topkai22 ◴[] No.45080616[source]
The answer, well documented in the article, is yes.

While the article presents cases that appear to be problematic in the particulars, I think the conclusion that bosses/managers shouldn't be pushing or mandating the use of AI tools in general is incorrect.

It's quite possible that any one new AI tool is wrong, but it is unlikely all of them are. Great historical analogies are the adoption of PCs in the 80s and the adoption of the internet/web in the 90s. Not everything we tried back then was an improvement on existing technologies or processes, but in general, if you weren't experimenting across a broad swath of your business, you were going to get left behind.

It's easy to defend the utility of these tools so long as you caveat them. For example, I've had a lot of success with AI-driven code generation for utility scripts, but it is less useful for full-fledged feature development in our main code base. AI-driven code summarization and its ability to do coding standards enforcement on PRs is a huge help.
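That kind of PR standards check is straightforward to wire up. The sketch below is hypothetical, not the commenter's actual setup: the endpoint URL, model name, and `STANDARDS` text are all placeholders you would replace with your own service and team conventions.

```python
# Hypothetical sketch: asking an LLM to check a PR diff against team
# coding standards. Endpoint, model name, and STANDARDS are placeholders.
import json
import subprocess
import urllib.request

STANDARDS = """\
- Functions must have docstrings.
- No bare `except:` clauses.
- Line length <= 100 characters.
"""

def build_review_prompt(diff: str) -> str:
    """Assemble the prompt: standards first, then the diff to review."""
    return (
        "Review the following diff against these coding standards. "
        "List each violation with the file and line.\n\n"
        f"Standards:\n{STANDARDS}\n"
        f"Diff:\n{diff}\n"
    )

def review_pr(base_branch: str = "main") -> str:
    """Collect the diff for the current branch and send it for review."""
    diff = subprocess.run(
        ["git", "diff", base_branch],
        capture_output=True, text=True, check=True,
    ).stdout
    payload = json.dumps({
        "model": "your-model-here",  # placeholder model name
        "prompt": build_review_prompt(diff),
    }).encode()
    req = urllib.request.Request(
        "https://llm.example.internal/v1/complete",  # placeholder endpoint
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["text"]
```

In practice you would run something like this in CI on each pull request and post the model's findings as a review comment, keeping it advisory rather than blocking.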

Finally, I find the worries in the article about using these tools on sensitive data or in scenarios such as ideation to be rather overdrawn. They are just SaaS services. You shouldn't use the free version of most tools for business purposes due to often-problematic licensing, but purchasing and legal should be able to help find an appropriate service. After all, if you are using Google Docs or Microsoft 365 to create and store your documents, why would you treat Gemini or Copilot (or their other LLM options) as presenting higher legal peril, at least with some due diligence that they don't retain or train on your input?

replies(6): >>45080681 #>>45080761 #>>45081020 #>>45081023 #>>45081374 #>>45081429 #
bigstrat2003 ◴[] No.45081374[source]
> I think coming to the conclusion that bosses/managers shouldn't be pushing or mandating the use of AI tools in general is incorrect. It's quite possible that any one new AI tool is wrong, but it is unlikely all of them are.

If the tool is good, then management won't need to mandate it. People will be tripping over themselves to get access to the tool that helps them to do their job better. So perhaps you're right that some of the tools will be good (though I personally haven't yet had that experience), but I think that it is incorrect for managers to push for (let alone mandate) tool usage. Measure the result, not the path an employee takes to get there. If Bob uses AI tools to great effect, but Alice is doing just as well as him without using said tools, it's a mistake to force her to change her workflow thinking that the tools will be just as good for her as for Bob.

replies(1): >>45081455 #
1. pmg101 ◴[] No.45081455[source]
Somewhat true, but let's also recognise that all of us have a certain level of friction. Yes, Alice may be effective using tool A, due to her knowledge and experience, but not have the higher context to realise that she's at a local maximum and could, after a period of confusion and relearning, become even MORE effective using tool B.

However, this is a subtle and nuanced situation requiring careful people management: helping to nudge or lead people, letting them take risks, letting them fail, giving them psychological safety, and praising their attempts. Blanket mandates are just a very tone-deaf and stupid way to try to achieve this.