Because conversations that use the Responses API can have encrypted
reasoning messages, trying to resume a conversation with a different
provider could lead to confusing "failed to decrypt" errors. (This is
reproducible by starting a conversation using ChatGPT login and resuming
it as a conversation that uses OpenAI models via Azure.)
This changes `ListConversationsParams` to take a `model_providers:
Option<Vec<String>>` and adds `model_provider` on each
`ConversationSummary` it returns so these cases can be disambiguated.
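As a rough sketch of the shape of this change (field names beyond the two quoted above, and the filtering helper, are assumptions for illustration, not the actual `codex-rs` definitions):

```rust
/// Hypothetical sketch of the params/summary shapes described above.
#[derive(Debug, Clone)]
struct ListConversationsParams {
    /// When set, only conversations recorded under one of these
    /// providers are returned.
    model_providers: Option<Vec<String>>,
}

#[derive(Debug, Clone, PartialEq)]
struct ConversationSummary {
    id: String,
    /// Provider the conversation was recorded under, so callers can
    /// avoid resuming it with a provider that cannot decrypt its
    /// reasoning messages.
    model_provider: String,
}

/// Assumed filtering behavior: `None` means "no filter".
fn filter_summaries(
    params: &ListConversationsParams,
    all: Vec<ConversationSummary>,
) -> Vec<ConversationSummary> {
    match &params.model_providers {
        None => all,
        Some(providers) => all
            .into_iter()
            .filter(|s| providers.contains(&s.model_provider))
            .collect(),
    }
}

fn main() {
    let all = vec![
        ConversationSummary { id: "a".into(), model_provider: "openai".into() },
        ConversationSummary { id: "b".into(), model_provider: "azure".into() },
    ];
    let params = ListConversationsParams {
        model_providers: Some(vec!["openai".to_string()]),
    };
    let filtered = filter_summaries(&params, all);
    println!("{}", filtered.len()); // prints 1
    println!("{}", filtered[0].id); // prints a
}
```

A caller that wants to resume a conversation can then compare the summary's `model_provider` against its own active provider before attempting decryption, instead of surfacing a "failed to decrypt" error.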
Note this also required changes to
`codex-rs/core/src/rollout/tests.rs`: a number of its cases expected
`Some` for the value of `next_cursor` even though the list of rollouts
was complete, in which case, per the docstring for `next_cursor`, the
value should be `None`.
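The corrected expectation can be sketched with a toy pager (names like `Page` and the cursor type are assumptions, not the real rollout types):

```rust
/// Hypothetical paginated listing result.
struct Page {
    items: Vec<String>,
    /// None when the listing is complete, Some(offset) otherwise.
    next_cursor: Option<usize>,
}

fn list_page(all: &[String], start: usize, page_size: usize) -> Page {
    let end = (start + page_size).min(all.len());
    Page {
        items: all[start..end].to_vec(),
        // When every remaining rollout fits in this page, there is
        // nothing left to fetch, so next_cursor must be None.
        next_cursor: if end < all.len() { Some(end) } else { None },
    }
}

fn main() {
    let rollouts = vec!["r1".to_string(), "r2".to_string()];
    // Page larger than the remaining list: complete, so no cursor.
    let page = list_page(&rollouts, 0, 5);
    println!("{:?}", page.next_cursor); // prints None
    println!("{}", page.items.len()); // prints 2
}
```

The fixed tests assert `None` in exactly this situation, where the old ones asserted `Some`.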