
724 points | simonw | 1 comment
marcusb No.44527530
This reminds me in a way of the old Noam Chomsky/Tucker Carlson exchange where Chomsky says to Carlson:

  "I’m sure you believe everything you’re saying. But what I’m saying is that if you believed something different, you wouldn’t be sitting where you’re sitting."
Simon may well be right that xAI didn't directly instruct Grok to check what the boss thinks before responding. But that doesn't rule out xAI being more likely to release a model that agrees with the boss a lot and privileges what he has said when reasoning.
chatmasta No.44528694
I'm confused why we need a model here when this is just the standard Lucene-style search syntax Twitter has supported for years... is the issue that its owner doesn't realize this exists?

Not only that, but I can even link you directly [0] to it! No agent required, and I can even construct the link so it's sorted by most recent first...

[0] https://x.com/search?q=from%3Aelonmusk%20(Israel%20OR%20Pale...
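For the curious, constructing such a link is just URL-encoding a query string. A minimal sketch in Python: the `f=live` parameter for "most recent first" is an assumption based on the site's public search UI, and the helper name is illustrative, not an official API.

```python
from urllib.parse import urlencode

def x_search_url(query: str, latest_first: bool = True) -> str:
    """Build an x.com search URL for a query like 'from:user (A OR B)'."""
    params = {"q": query}
    if latest_first:
        # f=live appears to request reverse-chronological results (assumption
        # from the public web UI, not a documented API parameter).
        params["f"] = "live"
    # urlencode percent-escapes ':' and parentheses; spaces become '+'.
    return "https://x.com/search?" + urlencode(params)

url = x_search_url("from:elonmusk (Israel OR Palestine)")
print(url)
```

Note that `urlencode` escapes spaces as `+` rather than `%20`; both forms are accepted in query strings.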

yorwba No.44528738
The user did not ask for Musk's opinion. But the model issued that search query (yes, using the standard Twitter search syntax) to inform its response anyway.