The picture before the prose.
Codencer's architecture is three roles, two contracts, one record. The roles are planner, bridge, and executor. The contracts are TaskSpec (planner → bridge) and ResultSpec (bridge → planner). The record is the run.
If you want the deep prose, the docs are at /docs. This page is the picture.
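The two contracts can be pictured as data shapes. Here is a minimal Python sketch — the field names (`task_id`, `goal`, `constraints`, `status`, `artifacts`) are illustrative assumptions, not Codencer's published schema:

```python
# Hypothetical shapes for the two contracts. Field names are
# illustrative assumptions, not Codencer's actual schema.

def make_task_spec(task_id: str, goal: str, constraints: list[str]) -> dict:
    """What the planner hands to the bridge."""
    return {"task_id": task_id, "goal": goal, "constraints": constraints}

def make_result_spec(task_id: str, status: str, artifacts: list[str]) -> dict:
    """What the bridge hands back to the planner."""
    return {"task_id": task_id, "status": status, "artifacts": artifacts}

task = make_task_spec("step-1", "implement the new billing endpoint",
                      ["no schema changes"])
result = make_result_spec(task["task_id"], "passed", ["diff.patch", "test.log"])
```

The point of the two shapes is symmetry: every TaskSpec that goes out has exactly one ResultSpec that comes back, keyed to the same task, which is what makes the run record auditable.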
Your AI coding workflow today, then with Codencer.
If you use more than one AI tool to ship code — a chat to plan, a coding agent to execute — you are the bridge between them. Codencer is what removes that copy-paste loop without taking either tool away.
Today
You are the bridge.
- You ask ChatGPT (or Claude, Gemini, DeepSeek) to scope a new feature.
- It replies with a 6-step plan.
- You copy step 1 into Claude Code (or Codex, Qwen) in your terminal.
- The coding agent works on it — edits files, runs its own checks, returns a summary.
- You read the summary, decide if it looks right, then go back to the chat.
- You paste the result, ask the chat what to do for step 2.
- Repeat for every step. You hold the plan in your head and shuttle context between two windows.
# In your terminal:
$ git checkout -b feature/new-billing
$ # paste step 1 from chat
$ claude "implement the new billing endpoint ..."
$ # wait, read summary, copy to chat
$ # chat replies with step 2
$ # paste step 2 ... repeat ...
$ # you hold the plan across windows
With Codencer
Codencer is the bridge.
- You ask the same chat to scope the same feature — Codencer is connected.
- The chat sends step 1 to Codencer. Codencer dispatches to your local coding agent.
- The coding agent runs in an isolated git worktree, on your machine, near the code.
- Codencer captures every artifact and returns a structured result to the chat.
- The chat reads the result and decides what's next. Codencer dispatches step 2.
- The loop continues without you holding state between two windows.
- You watch the run tree, review when it matters, work on something else when it doesn't.
# In your terminal, just once:
$ orchestratord &
$ # connect your chat to Codencer's MCP endpoint
$ # then talk to the chat normally:
$ # "scope and ship the new billing feature"
$ # Codencer dispatches each step, captures artifacts, returns results.
$ # The chat reads each result and decides next.
$ # You watch the run:
$ orchestratorctl runs list --json
Planner → Codencer → executor
Planners decide what to do. Executors do it. Codencer sits between them as state, contract, and audit trail.
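That division of labor is a loop: the planner proposes the next step, the bridge dispatches it and records the exchange, the executor does the work. A minimal sketch, where `plan_next`, `execute`, and the toy three-step planner are all hypothetical stand-ins for any planner and any executor adapter:

```python
# Hypothetical bridge loop: planner decides, executor does,
# the bridge records every exchange. All names are illustrative.

def bridge_loop(plan_next, execute, record):
    """plan_next(last_result) -> TaskSpec or None; execute(spec) -> ResultSpec."""
    result = None
    while (spec := plan_next(result)) is not None:
        result = execute(spec)          # runs the coding agent, locally
        record.append((spec, result))   # append-only run record
    return record

# Toy planner: three fixed steps, then stop.
steps = iter(["scope", "implement", "validate"])
log = bridge_loop(
    plan_next=lambda _last: next(steps, None),
    execute=lambda spec: f"done: {spec}",
    record=[],
)
```

Note that the human is absent from the loop body: the planner sees each result and decides the next step, while the record grows on the side — which is the "watch the run tree, review when it matters" posture described above.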
Runs → steps → attempts → artifacts → validations → gates
Run lifecycle — runs, steps, attempts, artifacts, validations, gates
Every TaskSpec opens a run. Every run is a tree. Steps under runs, attempts under steps, artifacts under attempts. Validations and gates close the tree. Every node is recorded, append-only, SQLite-backed.
The state model is the thing other vendors' architectures flatten away. Codencer keeps it, because it's the asset.
Local-only. Self-host relay. Self-host cloud.
Three Codencer deployment modes
Codencer is local-first by design. Execution always runs on the operator's own machine, near the code. The relay layer can run on the operator's infrastructure too.
Local-only is the path of least resistance — single machine, single user, full proof in under a minute. Self-host relay adds a remote planner without exposing a raw remote shell. Self-host cloud adds the multi-tenant control plane.
Comparison matrix
| Product | Planner | Executor | Cross-vendor | AI coding semantics | Local exec | Self-host | Durable run record | Adapters shipped | Worktree isolation |
|---|---|---|---|---|---|---|---|---|---|
| ChatGPT → Codex | OpenAI | OpenAI | ─ | chat only | ─ | ─ | chat history | 1 | ─ |
| Claude → Claude Code | Anthropic | Anthropic | ─ | tool calls only | partial | ─ | tool-call log | 1 | ─ |
| Cursor self-hosted | Cursor | Cursor | ─ | partial | ● | ● | partial | 1 | ─ |
| Copilot cloud | GitHub | GitHub | ─ | partial | ─ | ─ | partial | 1 | ─ |
| DIY MCP glue | any | any | partial | you build it | varies | varies | you build it | 0 | you build it |
| Codencer | any | any | ● | ● | ● | ● | ● | 5 | ● |
Vendors deepen their own stacks. The neutral cross-vendor bridge with executor adapters and durable run records is still missing.