core-agent-ide/codex-rs
dulikaifazr de1768d3ba
Fix: Claude models return incomplete responses due to empty finish_reason handling (#6728)
## Summary
Fixes streaming issue where Claude models return only 1-4 characters
instead of full responses when used through certain API
providers/proxies.

## Environment
- **OS**: Windows
- **Models affected**: Claude models (e.g., claude-haiku-4-5-20251001)
- **API Provider**: AAAI API proxy (https://api.aaai.vip/v1)
- **Working models**: GLM, Google models work correctly

## Problem
When using Claude models in both TUI and exec modes, only 1-4 characters
are displayed despite the backend receiving the full response. Debug
logs revealed that some API providers send SSE chunks with an empty
string finish_reason during active streaming, rather than null or
omitting the field entirely.

The current code treats any non-null finish_reason as a termination
signal, so when a chunk carries an empty-string finish_reason the
stream exits prematurely after the first chunk.

## Solution
Fix empty finish_reason handling in chat_completions.rs by adding a
check so that only non-empty finish_reason values are treated as a
termination signal. Empty strings are ignored and streaming continues
normally.

## Testing
- Tested on Windows with Claude Haiku model via AAAI API proxy
- Full responses now received and displayed correctly in both TUI and
exec modes
- Other models (GLM, Google) continue to work as expected
- No regression in existing functionality

## Impact
- Improves compatibility with API providers that send empty
finish_reason during streaming
- Enables Claude models to work correctly in Windows environment
- No breaking changes to existing functionality

## Related Issues
This fix resolves the issue where Claude models appeared to return
incomplete responses. The root cause was identified as a compatibility
issue in parsing SSE responses from certain API providers/proxies,
rather than a model-specific problem. This change improves overall
robustness when working with various API endpoints.

---------

Co-authored-by: Eric Traut <etraut@openai.com>
2025-11-16 19:50:36 -08:00
.cargo build: 8mb stacks on win (#5997) 2025-10-30 16:12:50 -07:00
.config Add test timeout (#6612) 2025-11-14 09:30:37 -08:00
ansi-escape Fix transcript mode rendering issue when showing tab chars (#4911) 2025-10-08 11:42:09 -07:00
app-server [App server] add mcp tool call item started/completed events (#6642) 2025-11-14 08:08:43 -08:00
app-server-protocol [App server] add mcp tool call item started/completed events (#6642) 2025-11-14 08:08:43 -08:00
app-server-test-client feat: add app-server-test-client crate for internal use (#5391) 2025-11-14 12:39:58 -08:00
apply-patch Fix apply_patch rename move path resolution (#5486) 2025-11-06 17:02:09 -08:00
arg0 Use codex-linux-sandbox in unified exec (#6480) 2025-11-10 17:17:09 -08:00
async-utils Support graceful agent interruption (#5287) 2025-10-17 18:52:57 +00:00
backend-client [app-server] read rate limits API (#5302) 2025-10-20 14:11:54 -07:00
chatgpt chore: merge git crates (#5909) 2025-10-29 12:11:44 +00:00
cli Revert "templates and build step for validating/submitting winget package" (#6696) 2025-11-15 03:47:58 +00:00
cloud-tasks Add test timeout (#6612) 2025-11-14 09:30:37 -08:00
cloud-tasks-client chore(deps): bump thiserror from 2.0.16 to 2.0.17 in /codex-rs (#4426) 2025-10-30 19:00:00 -07:00
codex-backend-openapi-models fix: icu_decimal version (#5919) 2025-10-29 20:46:45 +00:00
common Reasoning level update (#6586) 2025-11-13 06:24:36 +00:00
core Fix: Claude models return incomplete responses due to empty finish_reason handling (#6728) 2025-11-16 19:50:36 -08:00
docs [App-server] Add auth v2 doc & update codex mcp interface auth section (#6353) 2025-11-07 08:17:19 -08:00
exec feat: better UI for unified_exec (#6515) 2025-11-14 16:31:12 +01:00
execpolicy Use anyhow::Result in tests for error propagation (#4105) 2025-09-23 13:31:36 -07:00
feedback Add feedback upload request handling (#5682) 2025-10-27 05:53:39 +00:00
file-search Follow symlinks during file search (#4453) 2025-11-03 20:28:33 -08:00
keyring-store [Auth] Choose which auth storage to use based on config (#5792) 2025-10-27 19:41:49 -07:00
linux-sandbox chore: unify config crates (#5958) 2025-10-30 10:28:32 +00:00
login [Auth] Choose which auth storage to use based on config (#5792) 2025-10-27 19:41:49 -07:00
mcp-server Add warning on compact (#6052) 2025-10-31 13:27:33 -07:00
mcp-types [app-server] remove serde(skip_serializing_if = "Option::is_none") annotations (#5939) 2025-10-30 18:18:53 +00:00
ollama Use assert_matches (#4756) 2025-10-05 21:12:31 +00:00
otel Changes to sandbox command assessment feature based on initial experiment feedback (#6091) 2025-11-01 14:52:23 -07:00
process-hardening feat: introduce npm module for codex-responses-api-proxy (#4417) 2025-09-28 19:34:06 -07:00
protocol Handle "Don't Trust" directory selection in onboarding (#4941) 2025-11-14 15:23:35 -08:00
responses-api-proxy feat: add options to responses-api-proxy to support Azure (#6129) 2025-11-03 10:06:00 -08:00
rmcp-client fix: resolve Windows MCP server execution for script-based tools (#3828) 2025-11-16 13:41:10 -08:00
scripts feat: add --promote-alpha option to create_github_release script (#6370) 2025-11-07 20:05:22 +00:00
stdio-to-uds feat: experimental codex stdio-to-uds subcommand (#5350) 2025-10-19 21:12:45 -07:00
tui Fix AltGr/backslash input on Windows Codex terminal (#6720) 2025-11-16 19:15:06 -08:00
utils feat: cache tokenizer (#6609) 2025-11-14 17:05:00 +01:00
windows-sandbox-rs fix codex detection, add new security-focused smoketests. (#6682) 2025-11-14 12:08:59 -08:00
.gitignore [MCP] Prefix MCP tools names with mcp__ (#5309) 2025-10-19 20:41:55 -04:00
Cargo.lock fix: resolve Windows MCP server execution for script-based tools (#3828) 2025-11-16 13:41:10 -08:00
Cargo.toml feat: add app-server-test-client crate for internal use (#5391) 2025-11-14 12:39:58 -08:00
clippy.toml fix: switch rate limit reset handling to timestamps (#5304) 2025-10-17 17:39:37 -07:00
code Send text parameter for non-gpt-5 models (#4195) 2025-09-24 22:00:06 +00:00
config.md Fix link to MCP Servers config section (#5301) 2025-10-17 14:58:27 -07:00
default.nix Fix nix build (#6230) 2025-11-04 17:07:37 -08:00
justfile feat: add app-server-test-client crate for internal use (#5391) 2025-11-14 12:39:58 -08:00
README.md add codex debug seatbelt --log-denials (#4098) 2025-11-10 22:48:14 +00:00
rust-toolchain.toml chore: upgrade to Rust 1.90 (#4124) 2025-09-24 08:32:00 -07:00
rustfmt.toml Update cargo to 2024 edition (#842) 2025-05-07 08:37:48 -07:00

Codex CLI (Rust Implementation)

We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.

Installing Codex

Today, the easiest way to install Codex is via npm:

npm i -g @openai/codex
codex

You can also install via Homebrew (brew install --cask codex) or download a platform-specific release directly from our GitHub Releases.

Documentation quickstart

What's new in the Rust CLI

The Rust implementation is now the maintained Codex CLI and serves as the default experience. It includes a number of features that the legacy TypeScript CLI never supported.

Config

Codex supports a rich set of configuration options. Note that the Rust CLI uses config.toml instead of config.json. See docs/config.md for details.

Model Context Protocol Support

MCP client

Codex CLI functions as an MCP client that allows the Codex CLI and IDE extension to connect to MCP servers on startup. See the configuration documentation for details.
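As a rough sketch (see docs/config.md for the authoritative format and field names), an MCP server entry in config.toml looks something like:

```toml
# ~/.codex/config.toml — hypothetical server entry; the server name
# and command here are placeholders, not a real published server.
[mcp_servers.my-server]
command = "npx"
args = ["-y", "some-mcp-server"]
```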

MCP server (experimental)

Codex can be launched as an MCP server by running codex mcp-server. This allows other MCP clients to use Codex as a tool for another agent.

Use the @modelcontextprotocol/inspector to try it out:

npx @modelcontextprotocol/inspector codex mcp-server

Use codex mcp to add/list/get/remove MCP server launchers defined in config.toml, and codex mcp-server to run the MCP server directly.

Notifications

You can enable notifications by configuring a script that is run whenever the agent finishes a turn. The notify documentation includes a detailed example that explains how to get desktop notifications via terminal-notifier on macOS.

codex exec to run Codex programmatically/non-interactively

To run Codex non-interactively, run codex exec PROMPT (you can also pass the prompt via stdin) and Codex will work on your task until it decides that it is done and exits. Output is printed to the terminal directly. You can set the RUST_LOG environment variable to see more about what's going on.

Experimenting with the Codex Sandbox

To see what happens when a command is run under the sandbox provided by Codex, we provide the following subcommands in Codex CLI:

# macOS
codex sandbox macos [--full-auto] [--log-denials] [COMMAND]...

# Linux
codex sandbox linux [--full-auto] [COMMAND]...

# Windows
codex sandbox windows [--full-auto] [COMMAND]...

# Legacy aliases
codex debug seatbelt [--full-auto] [--log-denials] [COMMAND]...
codex debug landlock [--full-auto] [COMMAND]...

Selecting a sandbox policy via --sandbox

The Rust CLI exposes a dedicated --sandbox (-s) flag that lets you pick the sandbox policy without having to reach for the generic -c/--config option:

# Run Codex with the default, read-only sandbox
codex --sandbox read-only

# Allow the agent to write within the current workspace while still blocking network access
codex --sandbox workspace-write

# Danger! Disable sandboxing entirely (only do this if you are already running in a container or other isolated env)
codex --sandbox danger-full-access

The same setting can be persisted in ~/.codex/config.toml via the top-level sandbox_mode = "MODE" key, e.g. sandbox_mode = "workspace-write".
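For example, a minimal config entry to make workspace-write the default:

```toml
# ~/.codex/config.toml
sandbox_mode = "workspace-write"
```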

Code Organization

This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:

  • core/ contains the business logic for Codex. Ultimately, we hope this becomes a library crate that is generally useful for building other Rust/native applications that use Codex.
  • exec/ "headless" CLI for use in automation.
  • tui/ CLI that launches a fullscreen TUI built with Ratatui.
  • cli/ CLI multitool that provides the aforementioned CLIs via subcommands.