
GPT-5.2 (openai.com)
1019 points | atgctg
onraglanroad No.46237160
I suppose this is as good a place as any to mention this. I've now met two different devs who complained about the weird responses from their LLM of choice, and it turned out they were using a single session for everything: recipes for dinner, presents for the wife, and then programming issues the next day.

Don't do that. The whole conversation context is sent with every query to the LLM, so start a new chat for each topic. Otherwise you'll start being told what your wife thinks about global variables and how to cook your Go.
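
If it helps to see why: a chat session is basically one growing message list that gets resent in full on every turn. Here's a minimal sketch, where send_to_llm is a hypothetical stand-in for whatever provider SDK you actually call, not a real API:

    # Rough sketch of how a chat client manages a session.
    # send_to_llm is a hypothetical placeholder, not a real SDK call.
    messages = []  # the entire session history lives in this one list

    def send_to_llm(history):
        # A real client would send the full history to the API here.
        return f"(reply generated from {len(history)} prior messages)"

    def ask(user_text):
        messages.append({"role": "user", "content": user_text})
        reply = send_to_llm(messages)  # every turn resends ALL earlier turns
        messages.append({"role": "assistant", "content": reply})
        return reply

    ask("Give me a recipe for tonight")
    ask("Gift ideas for my wife?")
    ask("Why does this Go function leak goroutines?")  # answered with the recipe and gift chat still in context

Starting a new chat just means starting over with an empty list, so the recipes never end up in the code review.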

I realise this sounds obvious to many people but it clearly wasn't to those guys so maybe it's not!

eru No.46240070
> I realise this sounds obvious to many people but it clearly wasn't to those guys so maybe it's not!

It's worse: Gemini and, to a lesser extent, ChatGPT have started suggesting random follow-up topics when they decide a session has exhausted its topic. Well, when I say random, I mean they seem to be pulled from the 'memory' of our other chats.

For a naive user without preconceived notions of how to use these tools, this guidance from the tools themselves would serve as a pretty big hint that they should intermingle their sessions.

ghostpepper No.46240643
For ChatGPT you can turn memory off in settings and delete the memories it has already created.
eru No.46242026
I'm not complaining about the memory at all. I was complaining about the suggestion to continue with unrelated topics.