Fix flaky CI test: https://github.com/openai/codex/actions/runs/20350530276/job/58473691434?pr=8282

This PR does two things:
1. Move the flaky test to use the Responses API instead of the Chat Completions API.
2. Make `mcp_process` agnostic to the order in which responses/notifications/requests arrive, by buffering messages that have been read but not yet consumed.
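The buffering idea in point 2 can be sketched roughly as follows. This is a minimal, self-contained illustration, not the actual `mcp_process` code: the `Message` enum, `MessageReader` struct, and `read_until` method are all hypothetical names standing in for the real JSON-RPC message types and reader in the test harness.

```rust
use std::collections::VecDeque;

// Hypothetical stand-in for the JSON-RPC messages exchanged with the
// MCP server process (real code would deserialize from stdout).
#[derive(Debug, Clone, PartialEq)]
enum Message {
    Response(u64),
    Notification(String),
    Request(u64),
}

// Buffers messages that arrive before the one a test is waiting for,
// so assertions no longer depend on arrival order.
struct MessageReader {
    incoming: VecDeque<Message>, // stands in for the child's stdout stream
    buffered: Vec<Message>,      // messages read but not yet consumed
}

impl MessageReader {
    // Return the first message (buffered or newly read) matching `pred`,
    // stashing every non-matching message for later reads.
    fn read_until<F: Fn(&Message) -> bool>(&mut self, pred: F) -> Option<Message> {
        if let Some(pos) = self.buffered.iter().position(|m| pred(m)) {
            return Some(self.buffered.remove(pos));
        }
        while let Some(msg) = self.incoming.pop_front() {
            if pred(&msg) {
                return Some(msg);
            }
            self.buffered.push(msg);
        }
        None
    }
}

fn main() {
    let mut reader = MessageReader {
        incoming: VecDeque::from(vec![
            Message::Notification("progress".into()),
            Message::Response(1),
        ]),
        buffered: Vec::new(),
    };
    // The response arrives after a notification, but the test can still
    // wait for it first; the notification is buffered, not lost.
    assert_eq!(
        reader.read_until(|m| matches!(m, Message::Response(_))),
        Some(Message::Response(1))
    );
    assert_eq!(
        reader.read_until(|m| matches!(m, Message::Notification(_))),
        Some(Message::Notification("progress".into()))
    );
}
```

With this pattern, a test that waits for a response and then a notification passes regardless of which one the server happens to emit first.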