109 points by roseway4 | 3 comments

Hi, I'm Daniel from Zep. I've integrated the Cursor IDE with Graphiti, our open-source temporal knowledge graph framework, to provide Cursor with persistent memory across sessions. The goal was simple: help Cursor remember your coding preferences, standards, and project specs, so you don't have to constantly remind it.

Before this integration, Cursor (an AI-assisted IDE many of us already use daily) lacked a robust way to persist user context. To solve this, I used Graphiti’s Model Context Protocol (MCP) server, which allows structured data exchange between the IDE and Graphiti's temporal knowledge graph.
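
Under the hood, the exchange boils down to two operations: writing context into the graph as "episodes" and reading relevant facts back out. Here's a rough sketch of what that looks like against graphiti_core's Python API (simplified: the example text and connection details are made up, and the MCP server's actual code does more than this):

```python
import asyncio
from datetime import datetime, timezone

from graphiti_core import Graphiti
from graphiti_core.nodes import EpisodeType

async def main():
    # Connect to the Neo4j instance backing the knowledge graph.
    # Entity extraction assumes OPENAI_API_KEY is set in the environment.
    graphiti = Graphiti("bolt://localhost:7687", "neo4j", "password")

    # Write: store a piece of user context as an episode; Graphiti extracts
    # entities and relationships from it with the configured LLM.
    await graphiti.add_episode(
        name="cursor_preference",
        episode_body="The user prefers Ruff over Flake8 for Python linting.",
        source=EpisodeType.text,
        source_description="Cursor agent session",
        reference_time=datetime.now(timezone.utc),
    )

    # Read: in a later session, retrieve relevant facts for the agent.
    results = await graphiti.search("Which linter does the user prefer?")
    for edge in results:
        print(edge.fact)

asyncio.run(main())
```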

Key points of how this works:

- Custom entities like 'Requirement', 'Preference', and 'Procedure' precisely capture coding standards and project specs (there's a sketch of what these look like just after this list).

- Real-time updates let Cursor adapt instantly—if you change frameworks or update standards, the memory updates immediately.

- Persistent retrieval ensures Cursor always recalls your latest preferences and project decisions, across new agent sessions, projects, and even after restarting the IDE.
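
To make the first point concrete, here's roughly what those entity definitions look like. Graphiti accepts custom entity types as Pydantic models and uses their docstrings and field descriptions to steer the LLM's extraction; the field names below are illustrative rather than the exact schema in the repo:

```python
from pydantic import BaseModel, Field

class Requirement(BaseModel):
    """Something a project must do, or a property it must have."""
    project_name: str = Field(..., description="Project the requirement belongs to")
    description: str = Field(..., description="The requirement, stated concisely")

class Preference(BaseModel):
    """A user preference: a favored tool, library, style, or way of working."""
    category: str = Field(..., description="Area it applies to, e.g. 'linting'")
    description: str = Field(..., description="The stated preference")

class Procedure(BaseModel):
    """Steps the agent should follow in a given situation."""
    description: str = Field(..., description="What to do, and when it applies")

# Handed to Graphiti so nodes extracted from episodes are classified
# against these types (illustrative; see the repo for the real definitions).
ENTITY_TYPES = {
    "Requirement": Requirement,
    "Preference": Preference,
    "Procedure": Procedure,
}
```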

I’d love your feedback—particularly on the approach and how it fits your workflow.

Here's a detailed write-up: https://www.getzep.com/blog/cursor-adding-memory-with-graphi...

GitHub Repo: https://github.com/getzep/graphiti

-Daniel

1. mellosouls (No.43507337)
This looks interesting, but it's somewhat complicated, and it's not obvious how to get going in the way you'd expect from a classic "Show HN".

The requirement for an OpenAI key may also be a little off-putting, or at least could do with some indication of realistic costs; most Cursor users will likely need significant motivation to add to the subscription they already have.

Don't get me wrong, this could be a really worthwhile addition to the LLM coding toolset, but I think the presentation needs some work on how to get up and running quickly.

replies(1): >>43507538 #
2. roseway4 (No.43507538)
Graphiti uses OpenAI (or other LLM providers) to build the knowledge graph. Setting up the MCP server is fairly straightforward: https://github.com/getzep/graphiti/tree/main/mcp_server

There's also a Docker Compose setup: https://github.com/getzep/graphiti/tree/main/mcp_server#runn...

The Cursor MCP setup is also simple:

```json
{
  "mcpServers": {
    "Graphiti": {
      "url": "http://localhost:8000/sse"
    }
  }
}
```
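
That config assumes the MCP server is already running locally and serving SSE on port 8000; adjust the URL if you run it elsewhere (e.g. under the Docker Compose setup above).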

replies(1): >>43507733 #
3. jasonjmcghee (No.43507733)
How complex is the system? Can a local model or the agent itself be used instead?