423 points sohkamyung | 2 comments | | HN request time: 0.405s | source
alcide ◴[] No.45669582[source]
Kagi News has been pretty accurate. Source information is provided along with the summary and key details too.

AI summaries are good for getting a feel for whether you want to read an article. Even with Kagi News I verify key facts myself.

replies(6): >>45669784 #>>45670009 #>>45670554 #>>45670632 #>>45672369 #>>45672702 #
delusional ◴[] No.45669784[source]
What if the AI makes an interesting or important article sound like one you don't want to read? You'd never cross-check the facts, and you'd never discover how wrong the AI was.
replies(4): >>45669966 #>>45670174 #>>45670299 #>>45670572 #
1. alcide ◴[] No.45670174[source]
The integrity of an author’s words and intent is important. I understand the point of your hypothetical, but I haven’t run into this issue in practice with Kagi News.

Never share information about an article you have not read. Likewise, never draw definitive conclusions from an article that is not of interest.

If you do not find a headline interesting, the takeaway is that you did not find the headline interesting. Nothing more, nothing less. You should read the key insights before dismissing an article entirely.

I can imagine AI summaries being problematic for a class of people who do not cross-check whether an article is of value to them.

replies(1): >>45671791 #
2. latexr ◴[] No.45671791[source]
> I can imagine AI summaries being problematic for a class of people who do not cross-check whether an article is of value to them.

I feel like that’s “the majority of people” or at least “a large enough group for it to be a societal problem”.