
Nobody knows how to build with AI yet

(worksonmymachine.substack.com)
526 points Stwerner | 4 comments
Flatcircle ◴[] No.44616899[source]
My theory on AI is that it's the next iteration of Google search: a better, more conversational base layer over all the information that exists on the internet.

Of course some people will lose jobs, just like what happened to several industries when search became ubiquitous (newspapers, phone books, encyclopedias, travel agents).

But IMHO this isn't the existential crisis people think it is.

It's just a tool. Smart, clever people can do lots of cool stuff with tools.

But you still have to use it.

Search has just become Chat.

You used to have to search, now you chat and it does the searching, and more!

replies(9): >>44616955 #>>44616960 #>>44616976 #>>44617019 #>>44617060 #>>44617065 #>>44617099 #>>44620763 #>>44623695 #
Quitschquat ◴[] No.44617019[source]
Google doesn’t have to change search. It already returns AI generated crap before anything useful.
replies(5): >>44617042 #>>44617089 #>>44617146 #>>44617504 #>>44617593 #
1. patcon ◴[] No.44617089[source]
I have systemic concerns with how Google is changing roles from "knowledge bridging" to "knowledge translating", but in terms of information: I find it very useful.

You find it gives you poor information?

replies(1): >>44617210 #
2. aDyslecticCrow ◴[] No.44617210[source]
Always check the sources. I've personally found it:

- Using a source to claim the opposite of what that source says.

- Pointing to irrelevant sources.

- Citing a very untrustworthy source.

- Giving sources that have nothing to do with what it says.

- Making up additional things, like any LLM without source or internet search capability, despite having read the sources.

I've specifically found that Gemini (the one Google puts at the top of searches) is hallucination-prone, and I've had far better results with other agents that have search capability.

So... presenting a false or made-up answer to someone searching the web on a topic they don't understand... I'd really like to see a massive lawsuit over this when someone inevitably burns their house down or loses their life.

replies(2): >>44619181 #>>44622911 #
3. siliconwrath ◴[] No.44619181[source]
I’ve had to report AI summaries to Google several times for telling me restaurant items don’t contain ingredients I'm allergic to, when the cited “source” allergen menu says otherwise. They’re gonna kill someone.
4. LtWorf ◴[] No.44622911[source]
> - Using a source to claim the opposite of what the source says.

That's because a lot of people do that all the time when arguing online: they cite something without bothering to read it.