An MCP server to wrap the OpenAI Codex CLI tool for use with Claude Code.
This project provides a simple JSON-RPC server that allows Claude Code to interact with the OpenAI Codex CLI tool. This enables Claude Code to use OpenAI's models for code generation, explanation, and problem-solving when needed.
The MCP server has been enhanced to provide a more intuitive and robust interface:
- Specialized Methods: Added dedicated methods for common coding tasks:
  - `write_code`: Generate code for a specific task in a specified language
  - `explain_code`: Get detailed explanations of how code works
  - `debug_code`: Find and fix bugs in problematic code
- Model Selection: Clearly defined model options with defaults:
  - `o4-mini`
  - `o4-preview`
  - `o4-pro`
  - `o4-latest`
- Simplified Syntax: More intuitive parameter naming and structure for easier integration
These improvements make it easier for Claude Code to use OpenAI's models for specific programming tasks without requiring complex prompt engineering.
- Python 3.12+
- OpenAI Codex CLI tool (`npm install -g @openai/codex`)
- Valid OpenAI API key (for the Codex CLI, not for this server)
This project uses a PEP‑621 `pyproject.toml`. Follow these steps:
```shell
# 1. Create & activate a venv
uv venv                     # creates a .venv/ directory
source .venv/bin/activate   # on Windows: .\.venv\Scripts\activate

# 2. Install package and dependencies into the venv
uv pip install .
```
After installation, the `codex_server` entry point is available on your PATH.
Use the provided setup script to automatically set up the environment and start the server:
```shell
./setup_and_run.sh
```
This script will:
- Check for the `codex` CLI installation
- Set up a Python virtual environment if needed
- Install the MCP tool via the Claude CLI if available
- Start the MCP server
If you prefer to start the server manually:
- Make sure the Codex CLI is installed and properly configured with your OpenAI API key
- Start the server:

```shell
codex_server
```

Alternatively, use uvicorn directly:

```shell
uvicorn codex_server:app
```
Once your MCP server is running, you can register it with Claude Code using either of these methods:
The repository includes a configuration file that can be installed directly using the Claude CLI:
```shell
# Install the MCP tool directly from the JSON config
claude mcp add /path/to/openai_codex_mcp.json

# Verify the tool was installed correctly
claude mcp list
```
Alternatively, you can register the tool manually:
- In Claude Code, navigate to Settings → Tools → Manage MCP Tools.
- Create a new tool with:
  - Name: `openai_codex`
  - Description: OpenAI Codex CLI integration via MCP server
  - Base URL: `http://localhost:8000/`
  - Protocol: JSON-RPC 2.0
  - Authentication: leave blank
- Save the tool.
Now, Claude Code can use the OpenAI Codex CLI tool for tasks where a different perspective or approach is desired. You can invoke it by asking Claude to use the OpenAI models for a particular task.
The MCP server provides multiple methods to interact with OpenAI's models for different coding tasks.
For flexible, custom prompts to OpenAI:
```shell
curl -X POST http://localhost:8000/ \
  -H 'Content-Type: application/json' \
  -d '{
    "jsonrpc": "2.0",
    "method": "codex_completion",
    "params": {
      "prompt": "Write a JavaScript function to sort an array of objects by a property value",
      "model": "o4-mini"
    },
    "id": 1
  }'
```
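The same call can be made programmatically. A minimal sketch using only the Python standard library, assuming the server is running at `http://localhost:8000/` as above (the helper names are illustrative, not part of this package):

```python
import json
import urllib.request

def build_rpc_request(method: str, params: dict, request_id: int = 1) -> dict:
    """Assemble a JSON-RPC 2.0 request body for the MCP server."""
    return {"jsonrpc": "2.0", "method": method, "params": params, "id": request_id}

def call_codex(method: str, params: dict, url: str = "http://localhost:8000/") -> dict:
    """POST a JSON-RPC request to the running codex_server and return the parsed reply."""
    body = json.dumps(build_rpc_request(method, params)).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))

if __name__ == "__main__":
    # Mirrors the curl example above
    reply = call_codex(
        "codex_completion",
        {"prompt": "Write a JavaScript function to sort an array of objects by a property value",
         "model": "o4-mini"},
    )
    print(reply)
```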
Specialized method for code generation with language specification:
```shell
curl -X POST http://localhost:8000/ \
  -H 'Content-Type: application/json' \
  -d '{
    "jsonrpc": "2.0",
    "method": "write_code",
    "params": {
      "task": "Calculate the first 100 Fibonacci numbers and return them as an array",
      "language": "python",
      "model": "o4-mini"
    },
    "id": 1
  }'
```
Specialized method for code explanation:
```shell
curl -X POST http://localhost:8000/ \
  -H 'Content-Type: application/json' \
  -d '{
    "jsonrpc": "2.0",
    "method": "explain_code",
    "params": {
      "code": "def quicksort(arr):\n    if len(arr) <= 1:\n        return arr\n    pivot = arr[len(arr) // 2]\n    left = [x for x in arr if x < pivot]\n    middle = [x for x in arr if x == pivot]\n    right = [x for x in arr if x > pivot]\n    return quicksort(left) + middle + quicksort(right)",
      "model": "o4-mini"
    },
    "id": 1
  }'
```
Specialized method for finding and fixing bugs:
```shell
curl -X POST http://localhost:8000/ \
  -H 'Content-Type: application/json' \
  -d '{
    "jsonrpc": "2.0",
    "method": "debug_code",
    "params": {
      "code": "function fibonacci(n) {\n  if (n <= 0) return [];\n  if (n === 1) return [1];\n  let sequence = [1, 1];\n  for (let i = 2; i <= n; i++) {\n    sequence.push(sequence[i-2] + sequence[i-1]);\n  }\n  return sequence;\n}",
      "issue_description": "It generates one too many Fibonacci numbers",
      "model": "o4-mini"
    },
    "id": 1
  }'
```
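Whatever the method, the reply follows JSON-RPC 2.0: a successful call carries a `result` member and a failed one an `error` member with `code` and `message` fields. A small helper to unwrap either case (the field names come from the JSON-RPC 2.0 specification, not from this server's own docs):

```python
def extract_result(response: dict):
    """Return the result of a JSON-RPC 2.0 reply, raising if the server sent an error."""
    if "error" in response:
        err = response["error"]
        raise RuntimeError(f"Codex call failed ({err.get('code')}): {err.get('message')}")
    return response["result"]
```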
- `o4-mini`: Faster, more affordable reasoning model
- `o3`: Most powerful reasoning model
- `o3-mini`: A small model alternative to o3
- `o1`: Previous full o-series reasoning model
- `o1-mini`: A small model alternative to o1
- `o1-pro`: Version of o1 with more compute for better responses
- `gpt-4.1`: Flagship GPT model for complex tasks
- `gpt-4o`: Fast, intelligent, flexible GPT model
- `gpt-4.1-mini`: Balanced for intelligence, speed, and cost
- `gpt-4.1-nano`: Fastest, most cost-effective GPT-4.1 model
- `gpt-4o-mini`: Fast, affordable small model for focused tasks
`codex_completion`:
- `prompt` (required): The prompt to send to Codex
- `model` (optional): The model to use (e.g., "o4-mini", "o3", "gpt-4.1", "gpt-4o-mini")
- `images` (optional): List of image paths or data URIs to include
- `additional_args` (optional): Additional CLI arguments to pass to Codex
`write_code`:
- `task` (required): Description of the coding task
- `language` (required): Programming language for the solution (e.g., "python", "javascript", "java")
- `model` (optional): The model to use
`explain_code`:
- `code` (required): The code to explain
- `model` (optional): The model to use
`debug_code`:
- `code` (required): The code to debug
- `issue_description` (optional): Description of the issue or error
- `model` (optional): The model to use
Once configured, you can ask Claude to use OpenAI Codex for specific tasks using any of the available methods:
User: Can you use OpenAI Codex to write a Python function that generates prime numbers?
Claude: I'll use the OpenAI Codex write_code method to generate that function.
[Claude would use the write_code method with task="Write a Python function that generates prime numbers" and language="python"]
User: Can you ask OpenAI Codex to explain how this quicksort algorithm works?
Claude: I'll use OpenAI Codex to explain this algorithm.
[Claude would use the explain_code method with the code provided]
User: My JavaScript function has a bug. Can you use OpenAI Codex to debug it?
Claude: I'll ask OpenAI Codex to find and fix the bug in your code.
[Claude would use the debug_code method with the code provided]
User: Can you use OpenAI Codex with the o4-preview model to write an efficient implementation of a binary search tree?
Claude: I'll use the o4-preview model to generate that implementation.
[Claude would use the specified model with the appropriate method]
This allows you to leverage both Claude and OpenAI's capabilities seamlessly within the same interface, with specialized methods for common coding tasks.