Migrate tui to use UserTurn (#9497)

- `tui/` and `tui2/` submit `Op::UserTurn` and own full turn context
(cwd/approval/sandbox/model/etc.).
- `Op::UserInput` is documented as legacy in `codex-protocol` (doc-only;
no `#[deprecated]` to avoid `-D warnings` fallout).
- Remove obsolete `#[allow(deprecated)]` and the unused `ConversationId`
alias/re-export.
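The shape of the migration can be sketched with a simplified stand-in for the real `codex-protocol` types (field names and policy values here are illustrative, not the actual API): the UI now always builds an `Op::UserTurn` that carries the full turn context, instead of a bare `Op::UserInput` that leans on session defaults.

```rust
use std::path::PathBuf;

// Simplified stand-ins for the real codex-protocol types; the actual
// enum carries more fields and many more variants.
#[derive(Debug, PartialEq)]
enum Op {
    // Legacy: relies on session-level defaults for cwd/approval/sandbox/model.
    UserInput { items: Vec<String> },
    // Preferred: the caller owns the full turn context on every submission.
    UserTurn {
        items: Vec<String>,
        cwd: PathBuf,
        approval_policy: String,
        sandbox_policy: String,
        model: String,
    },
}

// Build the op the way the migrated TUI does: always UserTurn.
fn submit(items: Vec<String>, cwd: PathBuf, model: String) -> Op {
    Op::UserTurn {
        items,
        cwd,
        approval_policy: "on-request".to_string(),
        sandbox_policy: "workspace-write".to_string(),
        model,
    }
}

fn main() {
    let op = submit(vec!["hello".into()], PathBuf::from("/work"), "gpt-5".into());
    // The turn context travels with the op rather than living on the session.
    assert!(matches!(op, Op::UserTurn { .. }));
}
```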
This commit is contained in:
Ahmed Ibrahim 2026-01-19 13:40:39 -08:00 committed by GitHub
parent 0c0c5aeddc
commit 65d3b9e145
No known key found for this signature in database
GPG key ID: B5690EEEBB952194
8 changed files with 61 additions and 69 deletions


@ -1,4 +1,4 @@
Overview of Protocol Defined in [protocol.rs](../core/src/protocol.rs) and [agent.rs](../core/src/agent.rs).
Overview of Protocol defined in [protocol.rs](../protocol/src/protocol.rs) and [agent.rs](../core/src/agent.rs).
The goal of this document is to define terminology used in the system and explain the expected behavior of the system.
@ -23,11 +23,11 @@ These are entities that exist on the codex backend. The intent of this section is to e
3. `Task`
- A `Task` is `Codex` executing work in response to user input.
- `Session` has at most one `Task` running at a time.
- Receiving `Op::UserInput` starts a `Task`
- Receiving `Op::UserTurn` starts a `Task` (`Op::UserInput` is legacy)
- Consists of a series of `Turn`s
- The `Task` executes until:
- The `Model` completes the task and there is no output to feed into an additional `Turn`
- Additional `Op::UserInput` aborts the current task and starts a new one
- Additional user-turn input aborts the current task and starts a new one
- UI interrupts with `Op::Interrupt`
- Fatal errors are encountered, e.g. `Model` connection exceeding retry limits
- Blocked by user approval (executing a command or patch)
@ -42,7 +42,7 @@ These are entities that exist on the codex backend. The intent of this section is to e
The term "UI" is used to refer to the application driving `Codex`. This may be the CLI / TUI chat-like interface that users operate, or it may be a GUI interface like a VSCode extension. The UI is external to `Codex`, as `Codex` is intended to be operated by arbitrary UI implementations.
When a `Turn` completes, the `response_id` from the `Model`'s final `response.completed` message is stored in the `Session` state to resume the thread given the next `Op::UserInput`. The `response_id` is also returned in the `EventMsg::TurnComplete` to the UI, which can be used to fork the thread from an earlier point by providing it in the `Op::UserInput`.
When a `Turn` completes, the `response_id` from the `Model`'s final `response.completed` message is stored in the `Session` state to resume the thread given the next user turn. The `response_id` is also returned in the `EventMsg::TurnComplete` to the UI, which can be used to fork the thread from an earlier point by providing it in a future user turn.
Since only one `Task` can run at a time, it is recommended that a separate `Codex` be run for each thread of parallel work.
@ -57,15 +57,16 @@ Since only 1 `Task` can be run at a time, for parallel tasks it is recommended t
- This enum is `non_exhaustive`; new variants may be added in the future
- `Event`
- These are messages sent on the `EQ` (`Codex` -> UI)
- Each `Event` has a non-unique ID, matching the `sub_id` from the `Op::UserInput` that started the current task.
- Each `Event` has a non-unique ID, matching the `sub_id` from the user-turn op that started the current task.
- `EventMsg` refers to the enum of all possible `Event` payloads
- This enum is `non_exhaustive`; new variants may be added in the future
- It should be expected that new `EventMsg` variants will be added over time to expose more detailed information about the model's actions.
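Because `EventMsg` is `non_exhaustive`, downstream UIs must keep a wildcard arm in any `match` so that new variants do not break compilation. A minimal sketch with a stand-in enum (the real one lives in `codex-protocol`):

```rust
// Simplified stand-in for codex-protocol's EventMsg.
#[non_exhaustive]
#[derive(Debug)]
enum EventMsg {
    TurnStarted,
    TurnComplete { response_id: String },
    Error(String),
}

fn describe(msg: &EventMsg) -> String {
    match msg {
        EventMsg::TurnStarted => "turn started".to_string(),
        EventMsg::TurnComplete { response_id } => format!("done: {response_id}"),
        // Wildcard arm: variants the UI doesn't (yet) handle are ignored
        // gracefully instead of failing to compile when the enum grows.
        _ => "unhandled event".to_string(),
    }
}

fn main() {
    assert_eq!(describe(&EventMsg::TurnStarted), "turn started");
    assert_eq!(
        describe(&EventMsg::TurnComplete { response_id: "r1".to_string() }),
        "done: r1"
    );
}
```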
For complete documentation of the `Op` and `EventMsg` variants, refer to [protocol.rs](../core/src/protocol.rs). Some example payload types:
For complete documentation of the `Op` and `EventMsg` variants, refer to [protocol.rs](../protocol/src/protocol.rs). Some example payload types:
- `Op`
- `Op::UserInput` Any input from the user to kick off a `Turn`
- `Op::UserTurn` Any input from the user to kick off a `Turn`
- `Op::UserInput` Legacy form of user input
- `Op::Interrupt` Interrupts a running turn
- `Op::ExecApproval` Approve or deny code execution
- `Op::UserInputAnswer` Provide answers for a `request_user_input` tool call
@ -114,7 +115,7 @@ sequenceDiagram
user->>codex: Op::ConfigureSession
codex-->>session: create session
codex->>user: Event::SessionConfigured
user->>session: Op::UserInput
user->>session: Op::UserTurn
session-->>+task: start task
task->>user: Event::TurnStarted
task->>agent: prompt
@ -152,7 +153,7 @@ sequenceDiagram
box Rest API
participant agent as Model
end
user->>session: Op::UserInput
user->>session: Op::UserTurn
session-->>+task1: start task
task1->>user: Event::TurnStarted
task1->>agent: prompt
@ -164,7 +165,7 @@ sequenceDiagram
task1->>task1: exec (auto-approved)
user->>task1: Op::Interrupt
task1->>-user: Event::Error("interrupted")
user->>session: Op::UserInput w/ last_response_id
user->>session: Op::UserTurn w/ response bookmark
session-->>+task2: start task
task2->>user: Event::TurnStarted
task2->>agent: prompt + Task1 last_response_id

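The `response_id` bookkeeping shown in the diagrams above can be sketched as follows. This is a hypothetical, trimmed-down model (not the real `codex-core` session type): the session records the `response_id` of each completed turn, and the next user turn either resumes from that bookmark or forks from an explicitly chosen earlier id.

```rust
// Hypothetical model of the session's response-id bookmark.
#[derive(Default)]
struct Session {
    last_response_id: Option<String>,
}

impl Session {
    // Called when EventMsg::TurnComplete arrives with a response_id.
    fn on_turn_complete(&mut self, response_id: &str) {
        self.last_response_id = Some(response_id.to_string());
    }

    // Resume from the latest bookmark, or fork the thread from an
    // explicitly supplied earlier response_id.
    fn next_turn_anchor(&self, fork_from: Option<&str>) -> Option<String> {
        fork_from
            .map(str::to_string)
            .or_else(|| self.last_response_id.clone())
    }
}

fn main() {
    let mut s = Session::default();
    s.on_turn_complete("resp-1");
    s.on_turn_complete("resp-2");
    // Default: continue from the most recent turn.
    assert_eq!(s.next_turn_anchor(None).as_deref(), Some("resp-2"));
    // Fork: rewind to an earlier point in the thread.
    assert_eq!(s.next_turn_anchor(Some("resp-1")).as_deref(), Some("resp-1"));
}
```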

@ -1,7 +1,5 @@
pub mod account;
mod thread_id;
#[allow(deprecated)]
pub use thread_id::ConversationId;
pub use thread_id::ThreadId;
pub mod approvals;
pub mod config_types;


@ -83,7 +83,10 @@ pub enum Op {
/// This server sends [`EventMsg::TurnAborted`] in response.
Interrupt,
/// Input from the user
/// Legacy user input.
///
/// Prefer [`Op::UserTurn`] so the caller provides full turn context
/// (cwd/approval/sandbox/model/etc.) for each turn.
UserInput {
/// User input items, see `InputItem`
items: Vec<UserInput>,
@ -131,7 +134,8 @@ pub enum Op {
///
/// All fields are optional; when omitted, the existing value is preserved.
/// This does not enqueue any input; it only updates defaults used for
/// future `UserInput` turns.
/// turns that rely on persistent session-level context (for example,
/// [`Op::UserInput`]).
OverrideTurnContext {
/// Updated `cwd` for sandbox/tool calls.
#[serde(skip_serializing_if = "Option::is_none")]

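The "omitted fields preserve the existing value" semantics of `OverrideTurnContext` documented above amount to an `Option`-merge over the stored context. A minimal sketch with hypothetical, trimmed-down structs (no serde here; in the real type, `skip_serializing_if = "Option::is_none"` keeps omitted fields out of the wire format):

```rust
// Hypothetical, trimmed-down versions of the real types.
#[derive(Clone, Debug, PartialEq)]
struct TurnContext {
    cwd: String,
    model: String,
}

#[derive(Default)]
struct OverrideTurnContext {
    cwd: Option<String>,
    model: Option<String>,
}

// Each Option field overrides the stored default only when Some;
// None means "preserve the existing value".
fn apply(current: &TurnContext, over: OverrideTurnContext) -> TurnContext {
    TurnContext {
        cwd: over.cwd.unwrap_or_else(|| current.cwd.clone()),
        model: over.model.unwrap_or_else(|| current.model.clone()),
    }
}

fn main() {
    let cur = TurnContext { cwd: "/a".into(), model: "m1".into() };
    let next = apply(
        &cur,
        OverrideTurnContext { model: Some("m2".into()), ..Default::default() },
    );
    // cwd untouched, model overridden.
    assert_eq!(next, TurnContext { cwd: "/a".into(), model: "m2".into() });
}
```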

@ -70,10 +70,6 @@ impl JsonSchema for ThreadId {
}
}
/// Backward-compatible alias for the previous name.
#[deprecated(note = "use ThreadId instead")]
pub type ConversationId = ThreadId;
#[cfg(test)]
mod tests {
use super::*;

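The alias removal above (and the commit message's choice of doc-only deprecation for `Op::UserInput`) both come down to the same mechanics: a `#[deprecated]` item fires a warning at every use site, which becomes a hard error for any crate building with `-D warnings`. An illustrative sketch of the removed pattern:

```rust
// Illustrative stand-in for the real ThreadId newtype.
#[derive(Debug, Clone, Copy, PartialEq)]
pub struct ThreadId(pub u64);

// The removed backward-compatible alias: every use of it warned,
// which is a build failure under `-D warnings`.
#[deprecated(note = "use ThreadId instead")]
pub type ConversationId = ThreadId;

fn main() {
    // Using the alias requires silencing the lint; this noise at every
    // call site is exactly what removing the alias avoids.
    #[allow(deprecated)]
    let id: ConversationId = ThreadId(7);
    assert_eq!(id, ThreadId(7));
}
```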

@ -2296,6 +2296,14 @@ impl ChatWidget {
}
fn submit_user_message(&mut self, user_message: UserMessage) {
let Some(model) = self.current_model().or(self.config.model.as_deref()) else {
tracing::warn!("cannot submit user message before model is known; queueing");
self.queued_user_messages.push_front(user_message);
self.refresh_queued_user_messages();
return;
};
let model = model.to_string();
let UserMessage { text, image_paths } = user_message;
if text.is_empty() && image_paths.is_empty() {
return;
@ -2343,36 +2351,24 @@ impl ChatWidget {
}
}
// TODO(aibrahim): migrate the TUI to submit `Op::UserTurn` by default (and rely less on
// `Op::UserInput`) so session-level settings like collaboration mode are consistently
// applied.
let op = if self.collaboration_modes_enabled() {
let model = self
.current_model()
.unwrap_or(DEFAULT_MODEL_DISPLAY_NAME)
.to_string();
let collaboration_mode = collaboration_modes::resolve_mode_or_fallback(
let collaboration_mode = self.collaboration_modes_enabled().then(|| {
collaboration_modes::resolve_mode_or_fallback(
self.models_manager.as_ref(),
self.collaboration_mode,
model.as_str(),
self.config.model_reasoning_effort,
);
Op::UserTurn {
items,
cwd: self.config.cwd.clone(),
approval_policy: self.config.approval_policy.value(),
sandbox_policy: self.config.sandbox_policy.get().clone(),
model,
effort: self.config.model_reasoning_effort,
summary: self.config.model_reasoning_summary,
final_output_json_schema: None,
collaboration_mode: Some(collaboration_mode),
}
} else {
Op::UserInput {
items,
final_output_json_schema: None,
}
)
});
let op = Op::UserTurn {
items,
cwd: self.config.cwd.clone(),
approval_policy: self.config.approval_policy.value(),
sandbox_policy: self.config.sandbox_policy.get().clone(),
model,
effort: self.config.model_reasoning_effort,
summary: self.config.model_reasoning_summary,
final_output_json_schema: None,
collaboration_mode,
};
self.codex_op_tx.send(op).unwrap_or_else(|e| {

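The refactor above collapses an `if enabled { Op::UserTurn { .. } } else { Op::UserInput { .. } }` split into a single `Op::UserTurn`, computing the optional `collaboration_mode` field with `bool::then`. A minimal demonstration of that pattern (`resolve_mode` is a hypothetical stand-in for `collaboration_modes::resolve_mode_or_fallback`):

```rust
// Hypothetical stand-in for the real resolver.
fn resolve_mode(model: &str) -> String {
    format!("mode-for-{model}")
}

// `bool::then` runs the closure only when the flag is true, yielding
// Some(mode) or None without an if/else over two whole op variants.
fn collaboration_mode(enabled: bool, model: &str) -> Option<String> {
    enabled.then(|| resolve_mode(model))
}

fn main() {
    assert_eq!(
        collaboration_mode(true, "gpt-5").as_deref(),
        Some("mode-for-gpt-5")
    );
    assert_eq!(collaboration_mode(false, "gpt-5"), None);
}
```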

@ -457,7 +457,6 @@ fn next_submit_op(op_rx: &mut tokio::sync::mpsc::UnboundedReceiver<Op>) -> Op {
loop {
match op_rx.try_recv() {
Ok(op @ Op::UserTurn { .. }) => return op,
Ok(op @ Op::UserInput { .. }) => return op,
Ok(_) => continue,
Err(TryRecvError::Empty) => panic!("expected a submit op but queue was empty"),
Err(TryRecvError::Disconnected) => panic!("expected submit op but channel closed"),

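The test helper above drains the op channel, skipping non-submit ops until the submit op (after this change, only `Op::UserTurn`) appears. A self-contained sketch of that pattern using `std::sync::mpsc` in place of tokio's unbounded channel, with a trimmed-down stand-in `Op`:

```rust
use std::sync::mpsc;

// Trimmed-down stand-in for the real Op enum.
#[derive(Debug, PartialEq)]
enum Op {
    Interrupt,
    UserTurn { items: Vec<String> },
}

// Drain the channel until the submit op appears, ignoring everything else.
fn next_submit_op(rx: &mpsc::Receiver<Op>) -> Op {
    loop {
        match rx.try_recv() {
            Ok(op @ Op::UserTurn { .. }) => return op,
            Ok(_) => continue, // skip non-submit ops
            Err(_) => panic!("expected a submit op"),
        }
    }
}

fn main() {
    let (tx, rx) = mpsc::channel();
    tx.send(Op::Interrupt).unwrap();
    tx.send(Op::UserTurn { items: vec!["hi".into()] }).unwrap();
    assert_eq!(
        next_submit_op(&rx),
        Op::UserTurn { items: vec!["hi".into()] }
    );
}
```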

@ -2066,6 +2066,14 @@ impl ChatWidget {
}
fn submit_user_message(&mut self, user_message: UserMessage) {
let Some(model) = self.current_model().or(self.config.model.as_deref()) else {
tracing::warn!("cannot submit user message before model is known; queueing");
self.queued_user_messages.push_front(user_message);
self.refresh_queued_user_messages();
return;
};
let model = model.to_string();
let UserMessage { text, image_paths } = user_message;
if text.is_empty() && image_paths.is_empty() {
return;
@ -2113,33 +2121,24 @@ impl ChatWidget {
}
}
let op = if self.collaboration_modes_enabled() {
let model = self
.current_model()
.unwrap_or(DEFAULT_MODEL_DISPLAY_NAME)
.to_string();
let collaboration_mode = collaboration_modes::resolve_mode_or_fallback(
let collaboration_mode = self.collaboration_modes_enabled().then(|| {
collaboration_modes::resolve_mode_or_fallback(
self.models_manager.as_ref(),
self.collaboration_mode,
model.as_str(),
self.config.model_reasoning_effort,
);
Op::UserTurn {
items,
cwd: self.config.cwd.clone(),
approval_policy: self.config.approval_policy.value(),
sandbox_policy: self.config.sandbox_policy.get().clone(),
model,
effort: self.config.model_reasoning_effort,
summary: self.config.model_reasoning_summary,
final_output_json_schema: None,
collaboration_mode: Some(collaboration_mode),
}
} else {
Op::UserInput {
items,
final_output_json_schema: None,
}
)
});
let op = Op::UserTurn {
items,
cwd: self.config.cwd.clone(),
approval_policy: self.config.approval_policy.value(),
sandbox_policy: self.config.sandbox_policy.get().clone(),
model,
effort: self.config.model_reasoning_effort,
summary: self.config.model_reasoning_summary,
final_output_json_schema: None,
collaboration_mode,
};
if !self.agent_turn_running {


@ -442,7 +442,6 @@ fn next_submit_op(op_rx: &mut tokio::sync::mpsc::UnboundedReceiver<Op>) -> Op {
loop {
match op_rx.try_recv() {
Ok(op @ Op::UserTurn { .. }) => return op,
Ok(op @ Op::UserInput { .. }) => return op,
Ok(_) => continue,
Err(TryRecvError::Empty) => panic!("expected a submit op but queue was empty"),
Err(TryRecvError::Disconnected) => panic!("expected submit op but channel closed"),