
GPT-5.2

(openai.com)
1053 points by atgctg | 1 comments
onraglanroad ◴[] No.46237160[source]
I suppose this is as good a place as any to mention this. I've now met two different devs who complained about the weird responses from their LLM of choice, and it turned out they were using a single session for everything: from recipes for the night to presents for the wife, and then on to programming issues the next day.

Don't do that. The whole conversation is sent along with every query to the LLM (see the sketch below), so start a new chat for each topic. Otherwise you'll start being told what your wife thinks about global variables and how to cook your Go.

I realise this sounds obvious to many people, but it clearly wasn't to those guys, so maybe it's not!
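
A minimal sketch of what's going on, assuming the OpenAI Python SDK; the model name and messages are illustrative placeholders, and a chat UI does this resending for you on every turn:

```python
# Why one long chat bleeds across topics: the client resends the ENTIRE
# message history with every request, so the model sees the recipe and
# gift-shopping turns alongside the new coding question.
from openai import OpenAI

client = OpenAI()

history = [
    {"role": "user", "content": "Give me a weeknight curry recipe."},
    {"role": "assistant", "content": "Here's a quick chickpea curry: ..."},
    {"role": "user", "content": "What should I get my wife for her birthday?"},
    {"role": "assistant", "content": "A few ideas: ..."},
    # The next, unrelated question still rides on top of everything above.
    {"role": "user", "content": "Why does my Go service leak goroutines?"},
]

resp = client.chat.completions.create(model="gpt-5.2", messages=history)
print(resp.choices[0].message.content)
```

Starting a new chat simply starts a fresh, empty history, which is why it fixes the weird answers.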

replies(14): >>46237301 #>>46237674 #>>46237722 #>>46237855 #>>46237911 #>>46238296 #>>46238727 #>>46239388 #>>46239806 #>>46239829 #>>46240070 #>>46240318 #>>46240785 #>>46241428 #
noname120 ◴[] No.46237674[source]
The problem is that, by default, ChatGPT has the “Reference chat history” option enabled in the Memory settings. This lets any previous conversation leak into the current one. Just creating a new conversation is not enough; you also need to disable that option.
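
A purely conceptual sketch of how that kind of leak happens, not OpenAI's actual implementation; every function and name here is hypothetical:

```python
# Conceptual sketch only -- NOT OpenAI's actual implementation.
# The idea: with a chat-history memory option on, snippets from earlier,
# unrelated chats get retrieved and prepended to the context of a
# brand-new conversation before your question reaches the model.

def looks_relevant(chat_snippet: str, query: str) -> bool:
    # Placeholder relevance check; a real system would use embeddings or similar.
    return any(word in chat_snippet.lower() for word in query.lower().split())

def build_context(new_message: str, past_chats: list[str], memory_enabled: bool) -> list[dict]:
    messages = []
    if memory_enabled:
        recalled = [c for c in past_chats if looks_relevant(c, new_message)]
        if recalled:
            # Old conversations ride along even though this is a "new" chat.
            messages.append({
                "role": "system",
                "content": "Context from the user's previous chats:\n" + "\n".join(recalled),
            })
    messages.append({"role": "user", "content": new_message})
    return messages

# Example: a fresh chat about Go still drags in the gift-shopping chat.
print(build_context(
    "Why is my Go service leaking goroutines?",
    past_chats=["User asked for birthday gift ideas for his wife."],
    memory_enabled=True,
))
```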
replies(3): >>46238018 #>>46238056 #>>46238504 #
redhed ◴[] No.46238056[source]
I'm pretty sure this is also the default in Gemini; at least I remember turning it off. Makes no sense to me why this is the default.
replies(2): >>46238580 #>>46239233 #
astrange ◴[] No.46239233[source]
Mostly because they built the feature, which implicitly means they think it's cool.

I recommend turning it off because it makes the models way more sycophantic and can drive them (or you) insane.