
196 points zmccormick7 | 5 comments
1. maerF0x0 ◴[] No.45387539[source]
I've noticed that ChatGPT doesn't seem to be very good at understanding elapsed time. I have some long-running threads, and unless I prompt it with the elapsed time ("it's now 7 days later"), the responses act as if only a second has passed since the last message.

I think this might be a good leap for agents: the ability not just to review a doc in its current state, but to keep the full evolution of the document in context.

replies(2): >>45387612 #>>45387677 #
2. wat10000 ◴[] No.45387612[source]
They have no ability to even perceive time, unless the system gives them timestamps for the current interaction and past interactions.
replies(1): >>45387710 #
3. HankStallone ◴[] No.45387677[source]
I've noticed the same thing with Grok. One time it predicted an X% chance that something would happen by July 31. On August 1, it was still predicting the thing would happen by July 31, just with lower (but non-zero) odds. Their grasp of time is tenuous at best.
4. multiplegeorges ◴[] No.45387710[source]
Which seems like a trivial addition if it's not there?
replies(1): >>45387783 #
5. wat10000 ◴[] No.45387783{3}[source]
It is, but now you're burning a bit of context on something that might not be necessary, and potentially having the agent focus on time when it's not relevant. Not necessarily a bad idea, but as always, tradeoffs.
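
To make the tradeoff concrete, here's a minimal sketch of the "trivial addition": stamping each message with its send time and putting the current time in the system prompt. It assumes an OpenAI-style role/content message list; the function and tuple format are illustrative, not any particular SDK's API.

    from datetime import datetime, timezone

    def with_timestamps(history):
        """Annotate chat history so the model can reason about elapsed time.
        `history` is a list of (sent_at, role, text) tuples (hypothetical format);
        returns an OpenAI-style messages list."""
        messages = []
        for sent_at, role, text in history:
            stamp = sent_at.astimezone(timezone.utc).isoformat(timespec="seconds")
            # Prefixing each message costs a handful of tokens per turn.
            messages.append({"role": role, "content": f"[sent {stamp}] {text}"})
        # A current-time line lets the model compute the gap since the last message.
        now = datetime.now(timezone.utc).isoformat(timespec="seconds")
        return [{"role": "system", "content": f"The current time is {now}."}] + messages

The cost is exactly what's described above: a few extra tokens per message and a standing hint that time might matter, whether or not it actually does for a given thread.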