423 points | sohkamyung
visarga:
I recently tried to get Gemini to collect fresh news and show it to me, and instead of using search it hallucinated everything wholesale: titles, abstracts, and links. Not just once, but multiple times. I am now kind of afraid of using Gemini for anything related to web search.

Here is a sample:

> [1] Google DeepMind and Harvard researchers propose a new method for testing the ‘theory of mind’ of LLMs - Researchers have introduced a novel framework for evaluating the "theory of mind" capabilities in large language models. Rather than relying on traditional false-belief tasks, this new method assesses an LLM’s ability to infer the mental states of other agents (including other LLMs) within complex social scenarios. It provides a more nuanced benchmark for understanding if these systems are merely mimicking theory of mind through pattern recognition or developing a more robust, generalizable model of other minds. This directly provides material for the construct_metaphysics position by offering a new empirical tool to stress-test the computational foundations of consciousness-related phenomena.

> https://venturebeat.com/ai/google-deepmind-and-harvard-resea...

The link does not work, and the title is not found in Google Search either.

wat10000:
They can be good for search, but you must click through the provided links and verify that they actually say what the model claims they do.
reaperducer:
> They can be good for search, but you must click through the provided links and verify that they actually say what it says they do.

Then they're not very good at search.

It's like saying the proverbial million monkeys at typewriters are good at search because eventually they type something right.

wat10000:
Huh? All the classic search engines required you to click through the results and read them. There's nothing wrong with that. What's different is that LLMs will give you a summary that might make you think you can get away with not clicking through anymore. This is a mistake. But that doesn't mean that the search itself is bad. I've had plenty of cases where an LLM gave me incorrect summaries of search results, and plenty of cases where it found stuff I had a hard time finding on my own because it was better at figuring out what to search for.
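The verification step described above (click through and check the page really says what the model claims) can be partially automated. Here is a minimal sketch in Python; the function name and the word-overlap heuristic are my own illustration, not anything the commenters propose, and fetching the page itself (e.g. with `urllib.request`) is left to the caller:

```python
import re

def citation_supported(page_text: str, claimed_title: str,
                       threshold: float = 0.6) -> bool:
    """Crude plausibility check for an LLM-provided citation.

    Returns True if the fetched page text contains at least
    `threshold` of the significant words (longer than 3 letters)
    from the title the model attributed to it. A failing check
    suggests a hallucinated or misattributed link; a passing
    check is NOT proof the summary is accurate.
    """
    words = [w for w in re.findall(r"[a-z]+", claimed_title.lower())
             if len(w) > 3]
    if not words:
        return False  # nothing meaningful to match against
    page = page_text.lower()
    hits = sum(1 for w in words if w in page)
    return hits / len(words) >= threshold

# Example: a page that actually covers the claimed topic passes,
# an unrelated page fails.
page = ("Researchers at DeepMind propose a new method for testing "
        "theory of mind in large language models.")
title = "DeepMind researchers propose method for testing theory of mind of LLMs"
print(citation_supported(page, title))        # matching page
print(citation_supported("A pasta recipe.", title))  # unrelated page
```

This only catches the grossest failures, dead links and wholesale fabrications like the VentureBeat example above; whether the page supports the model's *summary* still requires reading it.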