
111 points by Manik_agg | 1 comment

I keep running into the same problem: each AI app “remembers” me in its own silo. ChatGPT knows my project details, Cursor forgets them, Claude starts from zero… so I end up re-explaining myself dozens of times a day across these apps.

The deeper problem

1. Not portable – context is vendor-locked; nothing travels across tools.

2. Not relational – most memory systems store only the latest fact (“sticky notes”) with no history or provenance.

3. Not yours – your AI memory is sensitive first-party data, yet you have no control over where it lives or how it’s queried.

Demo video: https://youtu.be/iANZ32dnK60

Repo: https://github.com/RedPlanetHQ/core

What we built

- CORE (Context Oriented Relational Engine): An open source, shareable knowledge graph (your memory vault) that lets any LLM (ChatGPT, Cursor, Claude, SOL, etc.) share and query the same persistent context.

- Temporal + relational: Every fact gets a full version history (who, when, why), and nothing is wiped out when you change it—just timestamped and retired.

- Local-first or hosted: Run it offline in Docker, or use our hosted instance. You choose which memories sync and which stay private.
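To make the “temporal + relational” idea concrete, here is a minimal sketch of how versioned facts with provenance might be modeled. This is illustrative only: the `Fact` and `MemoryVault` names and methods are hypothetical, not CORE's actual schema or API.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Fact:
    """One versioned statement in the memory graph (hypothetical schema)."""
    subject: str
    predicate: str
    value: str
    source: str                            # provenance: which app/agent wrote it
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))
    retired_at: Optional[datetime] = None  # never deleted, only retired

class MemoryVault:
    """Append-only store: updating a fact retires the old version, keeping history."""
    def __init__(self):
        self.facts: list[Fact] = []

    def assert_fact(self, subject: str, predicate: str, value: str, source: str):
        now = datetime.now(timezone.utc)
        # Retire (timestamp, don't wipe) any currently-active version of this fact.
        for f in self.facts:
            if f.subject == subject and f.predicate == predicate and f.retired_at is None:
                f.retired_at = now
        self.facts.append(Fact(subject, predicate, value, source, created_at=now))

    def current(self, subject: str, predicate: str) -> Optional[Fact]:
        for f in self.facts:
            if f.subject == subject and f.predicate == predicate and f.retired_at is None:
                return f
        return None

    def history(self, subject: str, predicate: str) -> list[Fact]:
        return [f for f in self.facts if f.subject == subject and f.predicate == predicate]

vault = MemoryVault()
vault.assert_fact("project", "language", "Python", source="ChatGPT")
vault.assert_fact("project", "language", "Rust", source="Cursor")
print(vault.current("project", "language").value)   # latest value: "Rust"
print(len(vault.history("project", "language")))    # full history: 2 versions
```

The key property is that `assert_fact` never overwrites: the old fact keeps its who/when/why and simply gains a `retired_at` timestamp, so any tool querying the vault can see both the latest value and its lineage.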

Try it

- Hosted free tier (HN launch): https://core.heysol.ai

- Docs: https://docs.heysol.ai/core/overview

1. IXCoach No.44471360
Hey, we should talk. If your project is stable, we might have a collaboration that would align us both.

I have a functional UI for storing the knowledge base between my AI agents and can have an MCP server functional within a couple of days.

Right now it accesses personal instances of AI engines or the cloud, but builds my private local knowledge base in the process. I can import in 1s from other systems as well.

An instant, navigable, lightning-fast local UI for managing my AIs' memory. I use semantic search for the lookups at the moment.
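Semantic search for memory lookups typically means embedding each stored memory and ranking by cosine similarity against the query embedding. A minimal sketch with toy 3-d vectors standing in for a real embedding model's output (the vectors and memory texts here are made up for illustration):

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def search(query_vec: list[float], memories: list[tuple[str, list[float]]], top_k: int = 2) -> list[str]:
    """Rank (text, embedding) pairs by similarity to the query; return top_k texts."""
    ranked = sorted(memories, key=lambda m: cosine(query_vec, m[1]), reverse=True)
    return [text for text, _ in ranked[:top_k]]

# Toy embeddings; a real system would call an embedding model here.
memories = [
    ("project uses Postgres", [0.9, 0.1, 0.0]),
    ("user prefers dark mode", [0.0, 0.2, 0.9]),
    ("deploys run on Docker",  [0.7, 0.6, 0.1]),
]
print(search([1.0, 0.2, 0.0], memories, top_k=1))  # ['project uses Postgres']
```

In practice the embeddings come from a model and live in a vector index rather than a Python list, but the lookup logic is the same.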

Sounds like the two tools together could complement each other.