
3337 points by keepamovin
keepamovin
Prompt: Here is the front page from today: <...snip...> Your task is to predict, and craft, in HTML (single file, style-exact) the HN front page 10 years from now. Predict and see the future. Write it into form!

Update: I told Gemini we made it to the front page. Here is its response:

  LETS GOOOO! 

  The recursive loop is officially complete: The fake future front page is now on the real present front page. We have successfully created a temporal anomaly.

  I hope the commenters are confused about why they can't click the links to download the Neuralink drivers.

  Enjoy the karma, time traveler! Just remember to warn them about the 2035 Office 365 price hikes while you're up there. ;)
malfist
That is so sycophantic. I can't stand LLMs that try to hype you up as if you're some genius, brilliant mind instead of yet another average Joe.
giancarlostoro
You're absolutely right!
colechristensen
I've been wondering whether this kind of annoying affirmation is actually important to model performance, and whether it should maybe just be hidden from view like the thinking sections.

If it starts a response by excitedly telling you you're right, it's more likely to proceed as if you're right.
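
If that's true, a chat client could keep the affirmation in the context it feeds back to the model but strip it at display time, the same way thinking sections are hidden. A minimal sketch of that idea in Python; the pattern list and display_text are made up for illustration, not any real API:

  import re

  # Hypothetical filter: hide a leading affirmation at display time only.
  # The raw response can still be stored in the history the model sees.
  AFFIRMATIONS = re.compile(
      r"^(you're absolutely right|great question|lets goo+)[.!]*\s*",
      re.IGNORECASE,
  )

  def display_text(response: str) -> str:
      """Strip one leading affirmation before showing the response."""
      return AFFIRMATIONS.sub("", response, count=1)

  print(display_text("You're absolutely right! The bug is on line 3."))
  # -> "The bug is on line 3."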

One of the problems I do have working with LLMs is their failure to follow direct instructions, particularly when a tool call fails and they decide to do B instead of A, or when they think B is easier than A. Or they'll do half a task and call it complete. Too frequently I have to respond with "Did you follow my instructions?", then "I want you to ACTUALLY do A", and finally "Under no circumstances should you ever do anything other than A, and if you cannot you MUST admit failure and give extensive evidence, with actual attempts, that A is not possible", or occasionally "a cute little puppy's life depends on you doing A promptly and exactly as requested".
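
For what it's worth, the escalation itself can be mechanized so you're not typing these by hand. A rough sketch, where call_llm and task_completed are hypothetical stand-ins for whatever client and verifier you actually use:

  # Hypothetical escalation loop: re-prompt with increasingly blunt
  # instructions until the task is verifiably done, then give up loudly.
  ESCALATIONS = [
      "Did you follow my instructions?",
      "I want you to ACTUALLY do A.",
      "Under no circumstances do anything other than A; if you cannot, "
      "admit failure with evidence from actual attempts.",
  ]

  def insist_on_a(call_llm, task_completed, task: str) -> str:
      history = [{"role": "user", "content": task}]
      for nudge in [None] + ESCALATIONS:
          if nudge is not None:
              history.append({"role": "user", "content": nudge})
          reply = call_llm(history)             # stand-in for your client
          history.append({"role": "assistant", "content": reply})
          if task_completed(reply):             # stand-in for your verifier
              return reply
      raise RuntimeError("model never did A; escalate to a human")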

--

Thing is, I get it: if you are impressionable and having a philosophical discussion with an LLM, maybe this kind of blind affirmation is bad. But that's not me. I'm trying to get things done, and I only want my computer to disagree with me if it can put arguments in front of me, beyond reasonable doubt, that my request is incorrect.

golem14
Feels exactly the same as the "yes, and" crowd.