The deeper problem
1. Not portable – context is vendor-locked; nothing travels across tools.
2. Not relational – most memory systems store only the latest fact (“sticky notes”) with no history or provenance.
3. Not yours – your AI memory is sensitive first-party data, yet you have no control over where it lives or how it’s queried.
Demo video: https://youtu.be/iANZ32dnK60
Repo: https://github.com/RedPlanetHQ/core
What we built
- CORE (Context Oriented Relational Engine): An open source, shareable knowledge graph (your memory vault) that lets any LLM (ChatGPT, Cursor, Claude, SOL, etc.) share and query the same persistent context.
- Temporal + relational: every fact carries a full version history (who, when, why), and nothing is wiped out when a fact changes; the old version is timestamped and retired (see the sketch after this list).
- Local-first or hosted: Run it offline in Docker, or use our hosted instance. You choose which memories sync and which stay private.
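To make "temporal + relational" concrete, here is a minimal sketch of what a versioned fact could look like. The type and field names are illustrative assumptions, not CORE's actual schema:

```typescript
// Illustrative sketch only: these types are assumptions about how a
// temporal, relational memory entry could be modeled, not CORE's schema.
interface FactVersion {
  value: string;        // the statement as recorded at this point in time
  recordedBy: string;   // who wrote it (user, agent, tool)
  recordedAt: Date;     // when it was written
  reason?: string;      // why it changed (provenance)
  retiredAt?: Date;     // set when superseded; the version is never deleted
}

interface Fact {
  subject: string;         // e.g. "user.favorite_editor"
  versions: FactVersion[]; // full history, newest last
}

// Updating a fact retires the current version instead of overwriting it,
// so the history stays queryable.
function updateFact(fact: Fact, value: string, by: string, reason?: string): void {
  const now = new Date();
  const current = fact.versions.at(-1);
  if (current && !current.retiredAt) current.retiredAt = now;
  fact.versions.push({ value, recordedBy: by, recordedAt: now, reason });
}
```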
Try it
- Hosted free tier (HN launch): https://core.heysol.ai
- MCP server setup: the README (https://github.com/RedPlanetHQ/core/blob/main/README.md) covers how to use the MCP server.
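If you want to poke at the server programmatically rather than through an editor, any generic MCP client should work. Below is a minimal sketch using the official TypeScript SDK; the server launch command and the memory_search tool name are assumptions on my part, so check the README for the real commands and tool list:

```typescript
// Minimal MCP client sketch using @modelcontextprotocol/sdk.
// The server entry point and the "memory_search" tool name are assumptions;
// consult the CORE README for the actual launch command and tool names.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

const transport = new StdioClientTransport({
  command: "node",
  args: ["path/to/core-mcp-server.js"], // hypothetical entry point
});

const client = new Client({ name: "core-demo", version: "0.0.1" });
await client.connect(transport);

// Discover what the server exposes, then query the shared memory.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

const result = await client.callTool({
  name: "memory_search", // hypothetical tool name
  arguments: { query: "what editor does the user prefer?" },
});
console.log(result);
```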
Other memory MCP servers do not require access to OpenAI, and I'm not even sure why I would want any of my data going there when I use a paid Claude subscription.
I like the idea here, but it doesn't seem very usable.
1. Yes, if you self-host it, you currently need an OpenAI API key. We are also working on getting things running with llama, so a fully local setup should be available soon (if you were on the cloud version, you shouldn't have hit this error).
2. We have also moved away from OpenAI embeddings to BGE-M3, hosted in our own cloud (rough sketch below).
For self-hosted setups we already have Claude Code support; happy to help there.
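For anyone following along, the swap described above amounts to putting an interface in front of the embedder so that OpenAI, a hosted BGE-M3 endpoint, or a local llama model are interchangeable. A rough sketch of that shape; every name, the endpoint URL, and the request/response payloads here are illustrative assumptions, not CORE's actual code:

```typescript
// Rough sketch of a pluggable embedding layer; all names are illustrative.
interface EmbeddingProvider {
  embed(texts: string[]): Promise<number[][]>;
}

// Hosted BGE-M3 behind an HTTP endpoint (URL and payload shape assumed).
class HostedBgeM3 implements EmbeddingProvider {
  constructor(private endpoint: string, private apiKey: string) {}

  async embed(texts: string[]): Promise<number[][]> {
    const res = await fetch(this.endpoint, {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${this.apiKey}`,
      },
      body: JSON.stringify({ input: texts }),
    });
    if (!res.ok) throw new Error(`embedding request failed: ${res.status}`);
    const data = await res.json();
    return data.embeddings; // assumed response shape
  }
}

// Indexing depends only on the interface, so self-hosters can drop in a
// local model without touching the rest of the memory pipeline.
async function indexMemories(provider: EmbeddingProvider, memories: string[]) {
  const vectors = await provider.embed(memories);
  return memories.map((text, i) => ({ text, vector: vectors[i] }));
}
```

The design choice this sketch illustrates: once the embedder is behind an interface, "works with llama" becomes a new implementation of one method rather than a change to the graph or retrieval code.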