
423 points sohkamyung | 4 comments
scarmig ◴[] No.45669929[source]
If you dig into the actual report (I know, I know, how passé), you see how they get the numbers. Most of the errors are "sourcing issues": the AI assistant doesn't cite a claim, or it (shocking) cites Wikipedia instead of the BBC.

Other issues: the report doesn't even say which particular models it's querying [ETA: discovered they do list this in an appendix], aside from saying it's the consumer tier. And it leaves off Anthropic (in my experience, by far the best at this type of task), favoring Perplexity and (perplexingly) Copilot. The article also intermingles claims from the recent report and the one on research conducted a year ago, leaving out critical context that... things have changed.

This article contains significant issues.

replies(7): >>45669943 #>>45670942 #>>45671401 #>>45672311 #>>45672577 #>>45675250 #>>45679322 #
amarant ◴[] No.45672311[source]
Human journalists misrepresent the white paper 85% of the time.

With this in mind, 45% doesn't seem so bad anymore

replies(4): >>45672656 #>>45673221 #>>45673610 #>>45674322 #
1. SkyBelow ◴[] No.45674322[source]
Years ago in college, we had a class where, for a few weeks, we analyzed science reporting in the news against the published research itself. I recall a 100% misrepresentation rate when comparing what a news article summarized about a paper versus what the paper itself said. We weren't going off of CNN or similar mainstream news sites, but specialized news sites that were consistently better than mainstream coverage (whenever the underlying research was noteworthy enough to earn a mention on the larger sites). Leaving out details or reporting only some of the findings wasn't enough to count, since any news summary is expected to convey less information than reading the paper directly. The focus was on finding summaries that were incorrect or that made claims the original paper did not support.

Probably the most impactful "easy A" class I had in college.

replies(3): >>45674730 #>>45674960 #>>45679527 #
2. specialist ◴[] No.45674730[source]
That's terrific. Media literacy should be part of the required civics curriculum.

I was on my high school's radio station, part of the broadcast media curriculum. It was awesome.

That early experience erased any esteem I had for mass media. (Much as I loved the actual work.)

We got to visit local stations, job shadow, produce content for public access cable, make commercials, etc. Meet and interview adults.

We also talked with former students who had managed to break into the industry.

Since it was a voc tech program, there was no mention of McLuhan, Chomsky, Postman, or any media criticism of any kind.

I learned that stuff much later. Yet somehow I was able to intuit the rotten core of our media hellscape.

3. Terr_ ◴[] No.45674960[source]
The Science News Cycle: https://phdcomics.com/comics.php?f=1174
4. BuddyPickett ◴[] No.45679527[source]
I never had any college classes that weren't easy A classes. I think that's all they have.