That's not even considering tool use!
And the safety testing actually makes this worse: it leads people to trust that LLMs are unlikely to give dangerous advice, when in fact they still can.
Manipulation is a genuine concern!
...later someone higher up decided that it's actually great at programming as well, and so now we're all expected to believe it's incredibly useful and necessary for our daily work.
For this reason, o3 is way better than most of the doctors I've had access to, to the point where my PCP just writes up whatever I bring in because she can't follow 3/4 of it.
Yes, the answers are often wrong or incomplete, and it's up to you to guide the model to sort them out, but it's just like vibe coding: if you put in the steering effort, you can get decent output.
Would it be better if you could hire an actual professional to do it? Of course. But most of us are priced out of that level of care.