

Choosing The Right OpenAI Codex Model

2025-11-14 · 5 minute read
Tags: Codex, OpenAI, AI Programming

Meet the Latest Codex Models

OpenAI offers a suite of Codex models tailored for different programming needs, from complex, long-running agentic tasks to faster, more cost-effective solutions. Here’s a look at the primary models available, helping you choose the best fit for your project.

gpt-5.1-codex

This model is optimized for intensive, long-running, and agentic coding tasks. It's the powerhouse of the Codex family and serves as the default model for macOS and Linux users. It is also the exclusive model used for Codex Cloud tasks.

To use it from the command line:

```bash
codex -m gpt-5.1-codex
```

Features & Pricing

Features:

  • Capability: ✨✨✨✨
  • Speed: ⚡️⚡️⚡️
  • Codex CLI & SDK: ✅ Yes
  • Codex IDE Extension: ✅ Yes
  • Codex Cloud: ✅ Yes
  • API Access: ✅ Yes

Usage limits with a ChatGPT plan (per 5 hours):

  • ChatGPT Plus: 45-225 local messages / 10-60 cloud tasks
  • ChatGPT Pro: 300-1,500 local messages / 50-400 cloud tasks

API pricing (per 1M tokens):

  • Input: $1.25
  • Cached input: $0.13
  • Output: $10.00
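
Because the model is exposed over the API, you can also call it outside of Codex. The snippet below is a minimal sketch using the OpenAI Responses API; it assumes OPENAI_API_KEY is exported and that the gpt-5.1-codex identifier is enabled for your account, and the prompt is only a placeholder.

```bash
# Minimal sketch: call gpt-5.1-codex directly via the OpenAI Responses API.
# Assumes OPENAI_API_KEY is exported and the model is enabled for your account.
curl https://api.openai.com/v1/responses \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5.1-codex",
    "input": "Write a Python function that validates an ISO 8601 timestamp."
  }'
```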

gpt-5.1-codex-mini

For developers looking for a more cost-effective and faster alternative, gpt-5.1-codex-mini offers a great balance. It's a smaller, less-capable version of the main Codex model but excels in speed, making it ideal for quicker tasks.

To use it from the command line:

```bash
codex -m gpt-5.1-codex-mini
```

Features & Pricing

Features:

  • Capability: ✨✨✨
  • Speed: ⚡️⚡️⚡️⚡️
  • Codex CLI & SDK: ✅ Yes
  • Codex IDE Extension: ✅ Yes
  • Codex Cloud: ❌ No
  • API Access: ✅ Yes

Usage limits with a ChatGPT plan (per 5 hours):

  • ChatGPT Plus: 180-900 local messages / 40-240 cloud tasks
  • ChatGPT Pro: 1,200-6,000 local messages / 200-1,600 cloud tasks

API pricing (per 1M tokens):

  • Input: $0.25
  • Cached input: $0.03
  • Output: $2.00
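
To make the price gap concrete, here is a quick back-of-the-envelope sketch based on the API rates listed above; the 2M input / 0.5M output token workload is hypothetical and only illustrates the relative cost.

```bash
# Rough cost comparison using the per-1M-token API rates listed above.
# The token counts are hypothetical.
awk 'BEGIN {
  in_tokens = 2.0; out_tokens = 0.5   # millions of tokens (assumed workload)
  printf "gpt-5.1-codex:      $%.2f\n", in_tokens * 1.25 + out_tokens * 10.00
  printf "gpt-5.1-codex-mini: $%.2f\n", in_tokens * 0.25 + out_tokens * 2.00
}'
# Prints about $7.50 for gpt-5.1-codex versus $1.50 for gpt-5.1-codex-mini.
```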

gpt-5.1

This model is a strong generalist, great for a wide range of coding and agentic tasks across different domains. It is the default model for Windows users.

To use it from the command line:

```bash
codex -m gpt-5.1
```

Features & Pricing

Features:

  • Capability: ✨✨✨✨
  • Speed: ⚡️⚡️⚡️
  • Codex CLI & SDK: ✅ Yes
  • Codex IDE Extension: ✅ Yes
  • Codex Cloud: ❌ No
  • API Access: ✅ Yes

Usage limits with a ChatGPT plan (per 5 hours):

  • ChatGPT Plus: 45-225 local messages / 10-60 cloud tasks
  • ChatGPT Pro: 300-1,500 local messages / 50-400 cloud tasks

API pricing (per 1M tokens):

  • Input: $1.25
  • Cached input: $0.13
  • Output: $10.00

How to Select Your Codex Model

You have several ways to configure which model Codex uses, whether you want to set a permanent default or make a temporary choice for a specific task.

Configure Your Default Local Model

Both the Codex CLI and the IDE Extension use a central configuration file named config.toml (typically located at ~/.codex/config.toml). To set your preferred default model, add a model entry to this file. If you don't specify one, the tool will select a default for you.

```toml
model = "gpt-5.1-codex"
```

For users who frequently switch between models and settings, you can also set up different Codex profiles.
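
As a rough sketch of what that can look like, the snippet below appends two named profiles to config.toml and then starts a session with one of them. The [profiles.<name>] tables and the --profile flag follow the Codex CLI documentation, but exact key names can vary between versions, so verify them against your installed CLI.

```bash
# Sketch only: the profile keys and the ~/.codex/config.toml path are assumptions to verify.
cat >> ~/.codex/config.toml <<'EOF'

# Heavyweight profile for long agentic runs
[profiles.deep]
model = "gpt-5.1-codex"

# Faster, cheaper profile for small edits
[profiles.quick]
model = "gpt-5.1-codex-mini"
EOF

# Start a session with one of the profiles
codex --profile quick
```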

Temporarily Choosing a Different Local Model

If you need to switch models on the fly, you can use the /model command during an active session in the Codex CLI. In the IDE Extension, a model selector is available next to the input box.

To start a new CLI session or run a one-off command with a specific model, use the --model or -m flag.

```bash
codex -m gpt-5.1-codex-mini
```

Choosing Your Model for Cloud Tasks

Currently, there is no option to select a different model for Codex Cloud tasks. All cloud-based tasks are run using gpt-5.1-codex.

Legacy Codex Models

The following models have been succeeded by the newer versions listed above but may still be available for use.

  • gpt-5-codex: The predecessor to gpt-5.1-codex, this version was also tuned for long-running, agentic coding tasks.
  • gpt-5-codex-mini: A smaller, cost-effective version of gpt-5-codex that has been succeeded by gpt-5.1-codex-mini.
  • gpt-5: The original reasoning model for coding tasks, which has been succeeded by gpt-5.1.

Using Custom Models with Codex

Codex offers flexibility beyond its native models. If you are authenticating with an API key, you can point Codex to any model and provider that supports either the OpenAI Chat Completions or Responses APIs. This allows you to adapt the tool to your specific use case or infrastructure.
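
As an illustration, the sketch below writes a config.toml that points Codex at a hypothetical Chat Completions-compatible provider. The provider id, base URL, and environment variable are placeholders, and the key names are based on the open-source Codex CLI config schema, so double-check them against your installed version.

```bash
# Sketch: a minimal config.toml for a custom Chat Completions-compatible provider.
# This overwrites ~/.codex/config.toml; merge by hand if you already have settings there.
# "example-provider", the base_url, and EXAMPLE_API_KEY are placeholders.
cat > ~/.codex/config.toml <<'EOF'
model = "my-custom-model"
model_provider = "example-provider"

[model_providers.example-provider]
name = "Example Provider"
base_url = "https://api.example.com/v1"
env_key = "EXAMPLE_API_KEY"   # the API key is read from this environment variable
wire_api = "chat"             # or "responses" if the provider supports the Responses API
EOF
```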

