It's not like they're above lying, so why do they even care to update this?
replies(2):
The alignment-faking research seems to indicate that LLMs do exercise this kind of reasoning.
And even if it were, they wouldn't tell the system it was part of the old, non-evil Google.