Codex CLI is OpenAI’s open-source local coding agent that runs in your terminal. You can connect it to NagaAI by defining a custom provider in ~/.codex/config.toml and using your NagaAI API key.
Codex CLI relies on function calling (tools) for agentic actions such as reading files, running commands, and editing code, so choose a NagaAI model that supports tools (https://naga.ac/models?parameters=tools).

Setup

1. Install Codex CLI

Follow the official installation instructions: https://github.com/openai/codex
2. Create a NagaAI API key

Create an API key in the dashboard: https://naga.ac/dashboard.
If you’re new to NagaAI, follow the Quickstart first.
3. Configure Codex CLI to use NagaAI

Open ~/.codex/config.toml and add a custom provider that points to NagaAI’s OpenAI-compatible API:
```toml
model_provider = "naga-ai"
model_reasoning_effort = "high"
model = "gpt-5.3-codex"

[model_providers.naga-ai]
name = "naga-ai"
base_url = "https://api.naga.ac/v1"
env_key = "NAGAAI_API_KEY"
```
We recommend gpt-5.3-codex, but you can use any NagaAI model that supports tools (https://naga.ac/models?parameters=tools).
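If you prefer to script this step, the provider block above can be written and sanity-checked from the shell. This is an illustrative sketch: it writes to a temporary path so it won't clobber an existing ~/.codex/config.toml.

```shell
# Illustrative: write the provider config to a temp path, not the real
# ~/.codex/config.toml, so any existing setup is left untouched.
CONFIG="$(mktemp -d)/config.toml"

cat > "$CONFIG" <<'EOF'
model_provider = "naga-ai"
model_reasoning_effort = "high"
model = "gpt-5.3-codex"

[model_providers.naga-ai]
name = "naga-ai"
base_url = "https://api.naga.ac/v1"
env_key = "NAGAAI_API_KEY"
EOF

# Confirm the values Codex will read
grep -q 'base_url = "https://api.naga.ac/v1"' "$CONFIG" && echo "provider configured"
```

Swap the temporary path for ~/.codex/config.toml once you are happy with the contents.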
4. Set your API key

Codex reads the API key from the environment variable named in env_key.
```shell
export NAGAAI_API_KEY="YOUR_NAGAAI_API_KEY"
```
Keep real API keys out of version control and shell history where possible.
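One way to follow that advice, sketched below: keep the key in a private, mode-600 file outside version control and export it from there. The ~/.config/naga path is just an example location, not something Codex requires.

```shell
# Illustrative: store the key in a private file instead of typing it
# directly into the shell (where it would land in history).
mkdir -p "$HOME/.config/naga"
printf '%s\n' "YOUR_NAGAAI_API_KEY" > "$HOME/.config/naga/api_key"
chmod 600 "$HOME/.config/naga/api_key"

# Export it for Codex; add this line to your shell profile to persist it.
export NAGAAI_API_KEY="$(cat "$HOME/.config/naga/api_key")"
echo "NAGAAI_API_KEY is set"
```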
5. Start Codex

In your project directory:
```shell
codex
```
Your requests will be sent through NagaAI using the provider defined in config.toml.
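Before the first run, a quick pre-flight check can save a confusing error. This sketch only verifies that the binary and the provider config from the steps above are present; it makes no network calls.

```shell
# Illustrative pre-flight: is codex on PATH, and does the provider config exist?
CODEX_STATUS=$(command -v codex >/dev/null 2>&1 && echo found || echo missing)
CONFIG_STATUS=$(grep -qs 'model_providers.naga-ai' "$HOME/.codex/config.toml" && echo found || echo missing)

echo "codex binary:    $CODEX_STATUS"
echo "provider config: $CONFIG_STATUS"
```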

Notes

  • Config file: ~/.codex/config.toml
  • Base URL: https://api.naga.ac/v1
  • API key env var: NAGAAI_API_KEY
  • Recommended model: gpt-5.3-codex (you can also use any NagaAI model with tools support: https://naga.ac/models?parameters=tools)
  • If you use Codex project trust levels, configure them in the same config.toml file

Try it

  • “Summarize this repository: structure, build system, and entry points.”
  • “Find where feature X is implemented and list the relevant files.”
  • “Run tests and fix the first failing test.”
  • “Refactor module Y and update all call sites.”