A lot of the time it's because those tools aren't wired into the surrounding systems well enough to pull in the context they'd actually need to help. Many of them are nothing more than a prompt, with no ability to dig any deeper than their original training data.
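To make the distinction concrete, here is a minimal sketch of the two approaches: a bare prompt that can only lean on training data, versus one that injects context fetched from the user's own systems. Every name here (`bare_prompt`, `contextual_prompt`, `fake_lookup`) is a hypothetical illustration, not any particular product's API.

```python
def bare_prompt(question: str) -> str:
    """An assistant with no wiring: all it has is the question itself."""
    return question


def contextual_prompt(question: str, fetch_context) -> str:
    """An assistant wired into the system: relevant context (logs,
    docs, config, tickets, ...) is fetched and injected alongside
    the question before anything is sent to the model."""
    context = fetch_context(question)
    return f"Context:\n{context}\n\nQuestion: {question}"


def fake_lookup(question: str) -> str:
    """Hypothetical stand-in for a real context source such as a
    search index or log store."""
    knowledge = {"deploy": "Service deploys via the CI pipeline on push to main."}
    hits = [v for k, v in knowledge.items() if k in question.lower()]
    return "\n".join(hits) or "(no matching context found)"


print(contextual_prompt("Why did the deploy fail?", fake_lookup))
```

The difference is entirely in the plumbing: the model is the same either way, but only the second version ever sees information that wasn't in its training data.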