
106 points codingmoh | 1 comment

Hey HN,

I’ve built Open Codex, a fully local, open-source alternative to OpenAI’s Codex CLI.

My initial plan was to fork their project and extend it, and I even started doing that. But it turned out their code had several leaky abstractions, which made it hard to override core behavior cleanly. Shortly after, OpenAI introduced breaking changes, and maintaining my customizations on top became increasingly difficult.

So I rewrote the whole thing from scratch using Python. My version is designed to support local LLMs.

Right now, it only works with phi-4-mini (GGUF) via lmstudio-community/Phi-4-mini-instruct-GGUF, but I plan to support more models. Everything is structured to be extendable.
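To make "structured to be extendable" concrete, here is a minimal sketch of a model-registry pattern that supports this kind of design. This is a hypothetical illustration, not the actual open-codex internals: the registry name, decorator, and stub loader are all my own inventions for the example.

```python
# Hypothetical model registry: each backend registers a loader under a
# name, so new local models can be added without touching the CLI core.
from typing import Callable, Dict

MODEL_REGISTRY: Dict[str, Callable[[], Callable[[str], str]]] = {}

def register_model(name: str):
    """Decorator that registers a model loader under the given name."""
    def wrap(loader: Callable[[], Callable[[str], str]]):
        MODEL_REGISTRY[name] = loader
        return loader
    return wrap

@register_model("phi-4-mini")
def load_phi4_mini() -> Callable[[str], str]:
    # In a real backend this would load the GGUF weights (e.g. via
    # llama-cpp-python); here a stub stands in for illustration.
    return lambda prompt: f"[phi-4-mini] response to: {prompt}"

def get_model(name: str) -> Callable[[str], str]:
    if name not in MODEL_REGISTRY:
        raise KeyError(f"unknown model: {name}")
    return MODEL_REGISTRY[name]()
```

With this shape, adding Qwen 2.5 support later would just mean registering one more loader.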

At the moment I only support single-shot mode, but I intend to add interactive (chat) mode, function calling, and more.
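For readers unfamiliar with the term, "single-shot" here means one request in, one answer out, with no conversation state kept between calls. A rough sketch of that flow, with a stubbed model callable standing in for the real local GGUF model (the prompt wording and function names are assumptions, not the tool's actual code):

```python
# Single-shot flow: build a fresh prompt per request, query the model
# once, return the result. No history is carried between calls.
from typing import Callable

SYSTEM_HINT = "Translate the user's request into a single shell command."

def single_shot(request: str, model: Callable[[str], str]) -> str:
    prompt = f"{SYSTEM_HINT}\nUser: {request}\nCommand:"
    return model(prompt).strip()

# Stub model so the sketch runs without any weights downloaded.
stub = lambda prompt: "  ls -la  "
print(single_shot("show all files with details", stub))  # -> ls -la
```

A chat mode would differ mainly in threading the accumulated message history back into each prompt.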

You can install it using Homebrew:

   brew tap codingmoh/open-codex
   brew install open-codex

It's also published on PyPI:

   pip install open-codex

Source: https://github.com/codingmoh/open-codex
xyproto No.43756300
This is very convenient and nice! But I could not get it to work with the best small coding models available for Ollama, such as https://ollama.com/MFDoom/deepseek-coder-v2-tool-calling
replies(2): >>43756392 >>43756426
smcleod No.43756392
That's a really old model now. Even the old Qwen 2.5 Coder 32B model is better than DSv2.
replies(1): >>43756430
codingmoh No.43756430
I want to add support for Qwen 2.5 next.
replies(2): >>43756625 >>43759843
manmal No.43756625
QwQ-32 might also be worth looking into as a high-level planning tool.
replies(1): >>43756709
codingmoh No.43756709
Thank you so much!