Actual news articles misrepresent reality more than 45% of the time.
Some very recent discussions on HN:
Who cares if AI does a good job representing the source, when the source is crap?
But even if we concede that to be true, it doesn’t change the fact that LLMs misrepresent the text they’ve been given half the time. That degrades the information even further, which is worse.
I guess I don’t exactly understand the point you’re trying to make.