core-agent-ide/codex-rs
Michael Bolin 515b6331bd
feat: add support for login with ChatGPT (#1212)
This does not implement the full Login with ChatGPT experience, but it
should unblock people.

**What works**

* The `codex` multitool now has a `login` subcommand, so you can run
`codex login`, which should write `CODEX_HOME/auth.json` if you complete
the flow successfully. The TUI will now read the `OPENAI_API_KEY` from
`auth.json`.
* The TUI should refresh the token if it has expired and the necessary
information is in `auth.json`.
* There is a `LoginScreen` in the TUI that tells you to run `codex
login` if both (1) your model provider expects to use `OPENAI_API_KEY`
as its env var, and (2) `OPENAI_API_KEY` is not set.

**What does not work**

* The `LoginScreen` does not support the login flow from within the TUI.
Instead, it tells you to quit, run `codex login`, and then run `codex`
again.
* `codex exec` does not read from `auth.json` yet, nor does it direct the
user to go through the login flow if `OPENAI_API_KEY` is not found.
* The `maybeRedeemCredits()` function from `get-api-key.tsx` has not
been ported from TypeScript to `login_with_chatgpt.py` yet:


a67a67f325/codex-cli/src/utils/get-api-key.tsx (L84-L89)

**Implementation**

Currently, the OAuth flow requires running a local webserver on
`127.0.0.1:1455`. It seemed wasteful to incur the additional binary cost
of a webserver dependency in the Rust CLI just to support login, so
instead we implement this logic in Python, as Python has an `http.server`
module as part of its standard library. Specifically, we bundle the
contents of a single Python file as a string in the Rust CLI and then
use it to spawn a subprocess as `python3 -c
{{SOURCE_FOR_PYTHON_SERVER}}`.

As such, the most significant files in this PR are:

```
codex-rs/login/src/login_with_chatgpt.py
codex-rs/login/src/lib.rs
```

Now that the CLI may load `OPENAI_API_KEY` from the environment _or_
`CODEX_HOME/auth.json`, we need a new abstraction for reading/writing
this variable, so we introduce:

```
codex-rs/core/src/openai_api_key.rs
```

Note that `std::env::set_var()` is [rightfully] `unsafe` in Rust 2024,
so we store `OPENAI_API_KEY` in a `LazyLock<RwLock<Option<String>>>` so
that it can be read and written in a thread-safe manner.
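A minimal sketch of such a holder, assuming illustrative names that may not match `openai_api_key.rs` exactly: the value is seeded from the environment once, then updated after `auth.json` is read, with the `RwLock` guarding concurrent access.

```rust
use std::sync::{LazyLock, RwLock};

// Lazily initialized from the environment on first access; callers can
// later overwrite it (e.g. with a value loaded from auth.json) without
// touching the process environment, which would require unsafe code.
static OPENAI_API_KEY: LazyLock<RwLock<Option<String>>> =
    LazyLock::new(|| RwLock::new(std::env::var("OPENAI_API_KEY").ok()));

/// Read the current key, if any, under a shared lock.
pub fn get_openai_api_key() -> Option<String> {
    OPENAI_API_KEY.read().unwrap().clone()
}

/// Replace the key under an exclusive lock.
pub fn set_openai_api_key(value: String) {
    *OPENAI_API_KEY.write().unwrap() = Some(value);
}
```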

Ultimately, it should be possible to go through the entire login flow
from the TUI. This PR introduces a placeholder `LoginScreen` UI toward
that goal; until the full UI is ready, the new `codex login` subcommand
should be a viable workaround.

**Testing**

Because the login flow is currently implemented in a standalone Python
file, you can test it without building any Rust code as follows:

```
rm -rf /tmp/codex_home && mkdir /tmp/codex_home
CODEX_HOME=/tmp/codex_home python3 codex-rs/login/src/login_with_chatgpt.py
```

For reference:

* the original TypeScript implementation was introduced in
https://github.com/openai/codex/pull/963
* support for redeeming credits was later added in
https://github.com/openai/codex/pull/974
2025-06-04 08:44:17 -07:00
ansi-escape Update submodules version to come from the workspace (#850) 2025-05-07 10:08:06 -07:00
apply-patch fix: provide tolerance for apply_patch tool (#993) 2025-06-03 09:06:38 -07:00
cli feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
common fix(codex-rs): use codex-mini-latest as default (#1164) 2025-05-29 16:55:19 -07:00
core feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
docs feat: initial import of Rust implementation of Codex CLI in codex-rs/ (#629) 2025-04-24 13:31:40 -07:00
exec feat: make reasoning effort/summaries configurable (#1199) 2025-06-02 16:01:34 -07:00
execpolicy chore: replace regex with regex-lite, where appropriate (#1200) 2025-06-02 17:11:45 -07:00
linux-sandbox fix: overhaul how we spawn commands under seccomp/landlock on Linux (#1086) 2025-05-23 11:37:07 -07:00
login feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
mcp-client fix: honor RUST_LOG in mcp-client CLI and default to DEBUG (#1149) 2025-05-28 17:10:06 -07:00
mcp-server feat: add support for -c/--config to override individual config items (#1137) 2025-05-27 23:11:44 -07:00
mcp-types codex-rs: make tool calls prettier (#1211) 2025-06-03 14:29:26 -07:00
scripts chore: script to create a Rust release (#759) 2025-04-30 12:39:03 -07:00
tui feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
.gitignore feat: initial import of Rust implementation of Codex CLI in codex-rs/ (#629) 2025-04-24 13:31:40 -07:00
Cargo.lock feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
Cargo.toml feat: add support for login with ChatGPT (#1212) 2025-06-04 08:44:17 -07:00
config.md feat: make reasoning effort/summaries configurable (#1199) 2025-06-02 16:01:34 -07:00
default.nix restructure flake for codex-rs (#888) 2025-05-13 13:08:42 -07:00
justfile fix: enable set positional-arguments in justfile (#1169) 2025-05-30 09:11:53 -07:00
README.md docs: split the config-related portion of codex-rs/README.md into its own config.md file (#1165) 2025-05-29 16:59:35 -07:00
rustfmt.toml Update cargo to 2024 edition (#842) 2025-05-07 08:37:48 -07:00

Codex CLI (Rust Implementation)

We provide Codex CLI as a standalone, native executable to ensure a zero-dependency install.

Installing Codex

Today, the easiest way to install Codex is via npm, though we plan to publish Codex to other package managers soon.

npm i -g @openai/codex@native
codex

You can also download a platform-specific release directly from our GitHub Releases.

Config

Codex supports a rich set of configuration options. See config.md for details.

Model Context Protocol Support

Codex CLI functions as an MCP client that can connect to MCP servers on startup. See the mcp_servers section in the configuration documentation for details.

It is still experimental, but you can also launch Codex as an MCP server by running codex mcp. To try it out with the @modelcontextprotocol/inspector, run:

npx @modelcontextprotocol/inspector codex mcp

Code Organization

This folder is the root of a Cargo workspace. It contains quite a bit of experimental code, but here are the key crates:

  • core/ contains the business logic for Codex. Ultimately, we hope this to be a library crate that is generally useful for building other Rust/native applications that use Codex.
  • exec/ "headless" CLI for use in automation.
  • tui/ CLI that launches a fullscreen TUI built with Ratatui.
  • cli/ CLI multitool that provides the aforementioned CLIs via subcommands.