Project Memory MCP Server

Long-term project memory for AI models and coding agents

Project Memory MCP stores architecture, decisions, tasks, warnings, preferences, and session state so your AI tools can resume work exactly where they left off.


Why teams use it

Built for teams that want persistent, Supabase-backed project memory for AI agents and reliable handoff between OpenCode, Claude Code CLI, Qwen Code, Codex, and native sessions.

Unified memory

Store decisions, warnings, tasks, and preferences in one Supabase-backed memory layer.

Interface-aware routing

Adapt responses for native chat, Qwen Code integration, and Codex plugin memory without duplicating logic.
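A minimal sketch of what interface-aware routing can look like: one response builder with per-interface formatting selected from a dispatch table. All names here (`format_for_*`, `INTERFACE_FORMATTERS`, `route_response`) are illustrative, not the server's actual API.

```python
# Hypothetical sketch of interface-aware routing: shared logic, with
# per-interface formatting chosen by a dispatch table instead of
# duplicated branches.

def format_for_native(text: str) -> str:
    # Native chat: return the text unchanged.
    return text

def format_for_qwen_code(text: str) -> str:
    # Qwen Code integration: prefix output so the agent can detect it.
    return f"[project-memory] {text}"

def format_for_codex(text: str) -> str:
    # Codex plugin memory: keep payloads compact.
    return text.strip()

INTERFACE_FORMATTERS = {
    "native": format_for_native,
    "qwen-code": format_for_qwen_code,
    "codex": format_for_codex,
}

def route_response(interface: str, text: str) -> str:
    # Unknown interfaces fall back to native formatting, so new
    # clients work without duplicating logic.
    formatter = INTERFACE_FORMATTERS.get(interface, format_for_native)
    return formatter(text)
```

The fallback to native formatting is the key design choice: adding a new client only requires registering one formatter, not touching the shared logic.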

Context optimization

Trim large payloads safely with model-aware and interface-aware token strategies.
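One way a model-aware token strategy can work, sketched below: cap the context payload at a per-model token budget and drop the oldest entries first so recent decisions survive. The budgets and the 4-characters-per-token estimate are assumptions for illustration, not the server's real strategy.

```python
# Hypothetical sketch of model-aware context trimming.
# Budgets and the token estimate are illustrative assumptions.

MODEL_TOKEN_BUDGETS = {
    "gemini-pro": 8000,
    "default": 4000,
}

def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token.
    return len(text) // 4 + 1

def trim_context(entries: list[str], model: str) -> list[str]:
    budget = MODEL_TOKEN_BUDGETS.get(model, MODEL_TOKEN_BUDGETS["default"])
    kept: list[str] = []
    used = 0
    # Walk newest-first so the most recent entries are kept.
    for entry in reversed(entries):
        cost = estimate_tokens(entry)
        if used + cost > budget:
            break
        kept.append(entry)
        used += cost
    kept.reverse()  # Restore chronological order.
    return kept
```

Trimming newest-first means a session handoff keeps the latest decisions and warnings even when older history has to be dropped.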

Production ready

Use SQL RLS policies, analytics, and test coverage to ship a public MCP server with confidence.
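As an illustration of the RLS approach, a Supabase-style policy scoping memory rows to their owner might look like the following. The table and column names (`project_memory`, `owner_id`) are placeholders for this sketch; the repository's `schema.sql` is authoritative.

```sql
-- Hypothetical sketch of a Supabase RLS policy; see schema.sql for
-- the real tables and policies.

alter table project_memory enable row level security;

create policy "Owners read their own memory"
  on project_memory for select
  using (owner_id = auth.uid());

create policy "Owners write their own memory"
  on project_memory for insert
  with check (owner_id = auth.uid());
```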

Setup in five steps

  1. Clone the repository and install Python dependencies.

  2. Create a Supabase project and run the provided schema.sql file.

  3. Copy .env.example to .env and add your public values.

  4. Register the server in mcp.json or your MCP-compatible client settings.

  5. Start the server and verify shared memory across interfaces.

Use it with natural language

In most MCP-compatible clients, you talk to the model normally and it decides which tool to call when tool use is enabled.

Talk normally

You do not need to manually name a tool for common tasks. Ask for the context, ask what changed, or ask to continue the project.

The model selects tools

If the client exposes Project Memory MCP tools, the model can call `load_unified_context`, `sync_session_state`, or other tools automatically based on your request.

Force a tool only when needed

Manual tool calls are useful for debugging, integrations, or when you want exact control over inputs and outputs.

Prompt examples

Resume this project and tell me where we left off.

Load the stored project memory before continuing the refactor.

Save this architecture decision and mark the current task as in progress.

If a client disables tool use, the model cannot call MCP tools automatically. In that case, enable MCP tools in the client or call the tool explicitly.
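At the protocol level, an explicit call is a standard MCP `tools/call` JSON-RPC request. The request shape below follows the MCP specification; the argument name (`project_id`) is illustrative, since the tool's actual input schema is defined by the server.

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "load_unified_context",
    "arguments": {
      "project_id": "my-project"
    }
  }
}
```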

Configuration

You can keep this repository in the folder you prefer. Most users clone it once, keep a private `.env`, and connect multiple IDEs or AI clients to the same MCP server.

1. Clone and prepare the server

Clone the repository, install dependencies, run `pip install -e .`, create `.env`, run `schema.sql` in Supabase, and keep the folder in a stable location.

git clone https://github.com/dannymaaz/project-memory-mcp.git

cd project-memory-mcp

python -m venv .venv

.venv\Scripts\activate

pip install -r requirements.txt

pip install -e .

macOS and Linux

The server is not Windows-only. After `pip install -e .`, macOS and Linux expose the same `project-memory-mcp` command, so the same MCP server entry works across platforms.

python3 -m venv .venv

source .venv/bin/activate

cp .env.example .env

2. Keep one central installation

Do not copy the server into every project. Keep one stable folder for the MCP server and connect all clients to that same installation.

cp .env.example .env

project-memory-mcp

3. Use one standard MCP command

After installation, configure clients to launch the same command: project-memory-mcp. This makes the server behave like any other stdio MCP server.

"command": "project-memory-mcp"

"env": { "SUPABASE_URL": "..." }

4. Do I need to start it after every reboot?

Usually no. When a client is configured with the `project-memory-mcp` command, it normally starts the server automatically when needed. You only need to run it manually when testing it directly or debugging outside the client.

Works with other MCP clients

If an IDE or agent accepts a standard MCP JSON config with an mcpServers entry, you can usually paste the same server block and only adjust environment values.

mcpServers.project-memory-mcp

OpenCode

Point OpenCode to the shared MCP config or paste the same server block into its settings, then set PROJECT_MEMORY_INTERFACE=opencode.

opencode --mcp-config mcp.json

Codex

Register the same server block in Codex MCP settings, then launch Codex with that configuration.

codex --config mcp.json

Claude Desktop

Add a local MCP server entry to Claude Desktop using the installed project-memory-mcp command and the same environment variables from your local setup.

Claude Desktop path by platform

%APPDATA%\Claude\claude_desktop_config.json

macOS: ~/Library/Application Support/Claude/claude_desktop_config.json

Linux: check your local Claude Desktop app data folder

"command": "project-memory-mcp"
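Putting the pieces together, a complete Claude Desktop entry might look like this. The environment values are placeholders to replace with your own:

```json
{
  "mcpServers": {
    "project-memory-mcp": {
      "command": "project-memory-mcp",
      "env": {
        "SUPABASE_URL": "https://your-project.supabase.co",
        "SUPABASE_KEY": "your-anon-key",
        "OWNER_ID": "your-owner-id",
        "PROJECT_MEMORY_INTERFACE": "native"
      }
    }
  }
}
```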

Claude Code CLI

Run Claude Code with the shared MCP config or the equivalent CLI registration flow.

claude-code --mcp-config mcp.json

Antigravity

Paste the same MCP JSON into Antigravity MCP settings. On Windows, a common config location is shown below.

%USERPROFILE%\.gemini\antigravity\mcp_config.json

"command": "project-memory-mcp"

Qwen Code

Point Qwen Code to the same MCP config or paste the shared server block into its settings, then set PROJECT_MEMORY_INTERFACE=qwen-code.

qwen --mcp-config mcp.json

Generic MCP config snippet

{
  "mcpServers": {
    "project-memory-mcp": {
      "command": "project-memory-mcp",
      "env": {
        "SUPABASE_URL": "https://your-project.supabase.co",
        "SUPABASE_KEY": "your-anon-key",
        "OWNER_ID": "your-owner-id",
        "DEFAULT_MODEL": "gemini-pro",
        "PROJECT_MEMORY_INTERFACE": "claude-code"
      }
    }
  }
}


Core MCP tools

The API focuses on loading shared context, saving decisions, syncing session state, and measuring interface usage.

load_unified_context

Load durable project memory optimized for the current interface and model.

save_cross_interface_decision

Persist a decision so future sessions inherit the same reasoning.

sync_session_state

Store the current working state so another AI client can continue the same task.

get_interface_analytics

Measure usage patterns across OpenCode, Claude Code CLI, Qwen Code, Codex, and native workflows.
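The tool schemas are defined by the server; as a sketch only, a session-state payload handed to `sync_session_state` might bundle the active task, recent decisions, and open warnings. Every field name below is an assumption for illustration.

```python
# Illustrative session-state payload for sync_session_state.
# Field names are assumptions; the server's tool schema is authoritative.

REQUIRED_FIELDS = {"project_id", "active_task", "decisions", "warnings"}

def build_session_state(project_id: str, active_task: str,
                        decisions: list[str], warnings: list[str]) -> dict:
    state = {
        "project_id": project_id,
        "active_task": active_task,
        "decisions": decisions,
        "warnings": warnings,
    }
    # Guard against missing fields before handing the payload to the tool.
    missing = REQUIRED_FIELDS - state.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return state

state = build_session_state(
    "demo-project",
    "refactor auth module",
    decisions=["use Supabase RLS"],
    warnings=["schema migration pending"],
)
```

Validating the payload before the tool call keeps another client from resuming work against an incomplete session state.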

Examples by interface

Compare OpenCode, Claude Code CLI, and Codex workflows that share the same project memory backend.

OpenCode workflow

Load project memory before editing so OpenCode inherits architecture, previous decisions, and active warnings.

Launch OpenCode with the MCP config

opencode --mcp-config mcp.json