config: enforce enterprise feature requirements (#13388)

## Why

Enterprises can already constrain approvals, sandboxing, and web search through `requirements.toml` and MDM, but feature flags were still only configurable as managed defaults. That meant an enterprise could suggest feature values, but it could not actually pin them. This change closes that gap and makes enterprise feature requirements behave like the other constrained settings. The effective feature set now stays consistent with enterprise requirements during config load, when config writes are validated, and when runtime code mutates feature flags later in the session.

It also tightens the runtime API for managed features. `ManagedFeatures` now follows the same constraint-oriented shape as `Constrained<T>` instead of exposing panic-prone mutation helpers, and production code can no longer construct it through an unconstrained `From<Features>` path.

The PR also hardens the `compact_resume_fork` integration coverage on Windows. After the feature-management changes, `compact_resume_after_second_compaction_preserves_history` was overflowing the libtest/Tokio thread stacks on Windows, so the test now uses an explicit larger-stack harness as a pragmatic mitigation. That may not be the ideal root-cause fix, and it merits a parallel investigation into whether part of the async future chain should be boxed to reduce stack pressure instead.

## What Changed

Enterprises can now pin feature values in `requirements.toml` with the requirements-side `features` table:

```toml
[features]
personality = true
unified_exec = false
```

Only canonical feature keys are allowed in the requirements `features` table; omitted keys remain unconstrained.

- Added a requirements-side pinned feature map to `ConfigRequirementsToml`, threaded it through the source-preserving requirements merge and normalization in `codex-config`, and made the TOML surface use `[features]` (while still accepting the legacy `[feature_requirements]` for compatibility).
- Exposed `featureRequirements` from `configRequirements/read`, regenerated the JSON/TypeScript schema artifacts, and updated the app-server README.
- Wrapped the effective feature set in `ManagedFeatures`, backed by `ConstrainedWithSource<Features>`, and changed its API to mirror `Constrained<T>`: `can_set(...)`, `set(...) -> ConstraintResult<()>`, and result-returning `enable` / `disable` / `set_enabled` helpers.
- Removed the legacy-usage and bulk-map passthroughs from `ManagedFeatures`; callers that need those behaviors now mutate a plain `Features` value and reapply it through `set(...)`, so the constrained wrapper remains the enforcement boundary.
- Removed the production loophole for constructing an unconstrained `ManagedFeatures`. Non-test code now creates it through the configured feature-loading path, and `impl From<Features> for ManagedFeatures` is restricted to `#[cfg(test)]`.
- Rejected legacy feature aliases in enterprise feature requirements, and returned a load error when a pinned combination cannot survive dependency normalization.
- Validated config writes against enterprise feature requirements before persisting changes, including explicit conflicting writes and profile-specific feature states that normalize into invalid combinations.
- Updated runtime and TUI feature-toggle paths to use the constrained setter API and to persist or apply the effective post-constraint value rather than the requested value.
- Updated the `core_test_support` Bazel target to include the bundled core model-catalog fixtures in its runtime data, so helper code that resolves `core/models.json` through runfiles works in remote Bazel test environments.
- Renamed the core config test coverage to emphasize that effective feature values are normalized at runtime, while conflicting persisted config writes are rejected.
- Ran `compact_resume_after_second_compaction_preserves_history` inside an explicit 8 MiB test thread and Tokio runtime worker stack, following the existing larger-stack integration-test pattern, to keep the Windows `compact_resume_fork` test slice from aborting while a parallel investigation continues into whether some of the underlying async futures should be boxed.

## Verification

- `cargo test -p codex-config`
- `cargo test -p codex-core feature_requirements_ -- --nocapture`
- `cargo test -p codex-core load_requirements_toml_produces_expected_constraints -- --nocapture`
- `cargo test -p codex-core compact_resume_after_second_compaction_preserves_history -- --nocapture`
- `cargo test -p codex-core compact_resume_fork -- --nocapture`
- Re-ran the built `codex-core` `tests/all` binary with `RUST_MIN_STACK=262144` for `compact_resume_after_second_compaction_preserves_history` to confirm the explicit-stack harness fixes the deterministic low-stack repro.
- `cargo test -p codex-core` — this still fails locally in unrelated integration areas that expect the `codex` / `test_stdio_server` binaries or hit existing `search_tool` wiremock mismatches.

## Docs

`developers.openai.com/codex` should document the requirements-side `[features]` table for enterprise and MDM-managed configuration, including that it only accepts canonical feature keys and that conflicting config writes are rejected.
Parent: e6773f856c
Commit: bfff0c729f
70 changed files with 1718 additions and 268 deletions
```diff
@@ -9007,6 +9007,15 @@
           "type": "null"
         }
       ]
     },
+    "featureRequirements": {
+      "additionalProperties": {
+        "type": "boolean"
+      },
+      "type": [
+        "object",
+        "null"
+      ]
+    }
   },
   "type": "object"
```
```diff
@@ -2881,6 +2881,15 @@
           "type": "null"
         }
       ]
     },
+    "featureRequirements": {
+      "additionalProperties": {
+        "type": "boolean"
+      },
+      "type": [
+        "object",
+        "null"
+      ]
+    }
   },
   "type": "object"
```
```diff
@@ -81,6 +81,15 @@
           "type": "null"
         }
       ]
     },
+    "featureRequirements": {
+      "additionalProperties": {
+        "type": "boolean"
+      },
+      "type": [
+        "object",
+        "null"
+      ]
+    }
   },
   "type": "object"
```
```diff
@@ -6,4 +6,4 @@ import type { AskForApproval } from "./AskForApproval";
 import type { ResidencyRequirement } from "./ResidencyRequirement";
 import type { SandboxMode } from "./SandboxMode";
 
-export type ConfigRequirements = {allowedApprovalPolicies: Array<AskForApproval> | null, allowedSandboxModes: Array<SandboxMode> | null, allowedWebSearchModes: Array<WebSearchMode> | null, enforceResidency: ResidencyRequirement | null};
+export type ConfigRequirements = {allowedApprovalPolicies: Array<AskForApproval> | null, allowedSandboxModes: Array<SandboxMode> | null, allowedWebSearchModes: Array<WebSearchMode> | null, featureRequirements: { [key in string]?: boolean } | null, enforceResidency: ResidencyRequirement | null};
```
```diff
@@ -1,3 +1,4 @@
+use std::collections::BTreeMap;
 use std::collections::HashMap;
 use std::path::PathBuf;
 
@@ -611,6 +612,7 @@ pub struct ConfigRequirements {
     pub allowed_approval_policies: Option<Vec<AskForApproval>>,
     pub allowed_sandbox_modes: Option<Vec<SandboxMode>>,
     pub allowed_web_search_modes: Option<Vec<WebSearchMode>>,
+    pub feature_requirements: Option<BTreeMap<String, bool>>,
     pub enforce_residency: Option<ResidencyRequirement>,
     #[experimental("configRequirements/read.network")]
     pub network: Option<NetworkRequirements>,
```
```diff
@@ -165,7 +165,7 @@ Example with notification opt-out:
 - `externalAgentConfig/import` — apply selected external-agent migration items by passing explicit `migrationItems` with `cwd` (`null` for home).
 - `config/value/write` — write a single config key/value to the user's config.toml on disk.
 - `config/batchWrite` — apply multiple config edits atomically to the user's config.toml on disk.
-- `configRequirements/read` — fetch loaded requirements constraints from `requirements.toml` and/or MDM (or `null` if none are configured), including allow-lists (`allowedApprovalPolicies`, `allowedSandboxModes`, `allowedWebSearchModes`), `enforceResidency`, and `network` constraints.
+- `configRequirements/read` — fetch loaded requirements constraints from `requirements.toml` and/or MDM (or `null` if none are configured), including allow-lists (`allowedApprovalPolicies`, `allowedSandboxModes`, `allowedWebSearchModes`), pinned feature values (`featureRequirements`), `enforceResidency`, and `network` constraints.
 
 ### Example: Start or resume a thread
```
```diff
@@ -4550,11 +4550,9 @@ impl CodexMessageProcessor {
             }
         };
 
-        if thread.enabled(Feature::Apps) {
-            config.features.enable(Feature::Apps);
-        } else {
-            config.features.disable(Feature::Apps);
-        }
+        let _ = config
+            .features
+            .set_enabled(Feature::Apps, thread.enabled(Feature::Apps));
     }
 
     if !config.features.enabled(Feature::Apps) {
```
```diff
@@ -127,6 +127,9 @@ fn map_requirements_toml_to_api(requirements: ConfigRequirementsToml) -> ConfigR
             }
             normalized
         }),
+        feature_requirements: requirements
+            .feature_requirements
+            .map(|requirements| requirements.entries),
         enforce_residency: requirements
             .enforce_residency
             .map(map_residency_requirement_to_api),
@@ -212,6 +215,12 @@ mod tests {
             allowed_web_search_modes: Some(vec![
                 codex_core::config_loader::WebSearchModeRequirement::Cached,
             ]),
+            feature_requirements: Some(codex_core::config_loader::FeatureRequirementsToml {
+                entries: std::collections::BTreeMap::from([
+                    ("apps".to_string(), false),
+                    ("personality".to_string(), true),
+                ]),
+            }),
             mcp_servers: None,
             rules: None,
             enforce_residency: Some(CoreResidencyRequirement::Us),
@@ -247,6 +256,13 @@ mod tests {
             mapped.allowed_web_search_modes,
             Some(vec![WebSearchMode::Cached, WebSearchMode::Disabled]),
         );
+        assert_eq!(
+            mapped.feature_requirements,
+            Some(std::collections::BTreeMap::from([
+                ("apps".to_string(), false),
+                ("personality".to_string(), true),
+            ])),
+        );
         assert_eq!(
             mapped.enforce_residency,
             Some(codex_app_server_protocol::ResidencyRequirement::Us),
@@ -275,6 +291,7 @@ mod tests {
             allowed_approval_policies: None,
             allowed_sandbox_modes: None,
             allowed_web_search_modes: Some(Vec::new()),
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
```
```diff
@@ -761,6 +761,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -803,6 +804,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -856,6 +858,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -912,6 +915,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -939,6 +943,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -986,6 +991,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::OnRequest]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1032,6 +1038,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::OnRequest]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1082,6 +1089,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1133,6 +1141,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1184,6 +1193,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1272,6 +1282,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -1295,6 +1306,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::OnRequest]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
```
```diff
@@ -79,6 +79,7 @@ pub struct ConfigRequirements {
     pub approval_policy: ConstrainedWithSource<AskForApproval>,
     pub sandbox_policy: ConstrainedWithSource<SandboxPolicy>,
     pub web_search_mode: ConstrainedWithSource<WebSearchMode>,
+    pub feature_requirements: Option<Sourced<FeatureRequirementsToml>>,
     pub mcp_servers: Option<Sourced<BTreeMap<String, McpServerRequirement>>>,
     pub exec_policy: Option<Sourced<RequirementsExecPolicy>>,
     pub enforce_residency: ConstrainedWithSource<Option<ResidencyRequirement>>,
@@ -101,6 +102,7 @@ impl Default for ConfigRequirements {
                 Constrained::allow_any(WebSearchMode::Cached),
                 None,
             ),
+            feature_requirements: None,
             mcp_servers: None,
             exec_policy: None,
             enforce_residency: ConstrainedWithSource::new(Constrained::allow_any(None), None),
@@ -227,12 +229,26 @@ impl fmt::Display for WebSearchModeRequirement {
     }
 }
 
+#[derive(Deserialize, Debug, Clone, Default, PartialEq, Eq)]
+pub struct FeatureRequirementsToml {
+    #[serde(flatten)]
+    pub entries: BTreeMap<String, bool>,
+}
+
+impl FeatureRequirementsToml {
+    pub fn is_empty(&self) -> bool {
+        self.entries.is_empty()
+    }
+}
+
 /// Base config deserialized from system `requirements.toml` or MDM.
 #[derive(Deserialize, Debug, Clone, Default, PartialEq)]
 pub struct ConfigRequirementsToml {
     pub allowed_approval_policies: Option<Vec<AskForApproval>>,
     pub allowed_sandbox_modes: Option<Vec<SandboxModeRequirement>>,
     pub allowed_web_search_modes: Option<Vec<WebSearchModeRequirement>>,
+    #[serde(rename = "features", alias = "feature_requirements")]
+    pub feature_requirements: Option<FeatureRequirementsToml>,
     pub mcp_servers: Option<BTreeMap<String, McpServerRequirement>>,
     pub rules: Option<RequirementsExecPolicyToml>,
     pub enforce_residency: Option<ResidencyRequirement>,
@@ -267,6 +283,7 @@ pub struct ConfigRequirementsWithSources {
     pub allowed_approval_policies: Option<Sourced<Vec<AskForApproval>>>,
     pub allowed_sandbox_modes: Option<Sourced<Vec<SandboxModeRequirement>>>,
     pub allowed_web_search_modes: Option<Sourced<Vec<WebSearchModeRequirement>>>,
+    pub feature_requirements: Option<Sourced<FeatureRequirementsToml>>,
     pub mcp_servers: Option<Sourced<BTreeMap<String, McpServerRequirement>>>,
     pub rules: Option<Sourced<RequirementsExecPolicyToml>>,
     pub enforce_residency: Option<Sourced<ResidencyRequirement>>,
@@ -302,6 +319,7 @@ impl ConfigRequirementsWithSources {
             allowed_approval_policies,
             allowed_sandbox_modes,
             allowed_web_search_modes,
+            feature_requirements,
             mcp_servers,
             rules,
             enforce_residency,
@@ -315,6 +333,7 @@ impl ConfigRequirementsWithSources {
             allowed_approval_policies,
             allowed_sandbox_modes,
             allowed_web_search_modes,
+            feature_requirements,
             mcp_servers,
             rules,
             enforce_residency,
@@ -324,6 +343,7 @@ impl ConfigRequirementsWithSources {
             allowed_approval_policies: allowed_approval_policies.map(|sourced| sourced.value),
             allowed_sandbox_modes: allowed_sandbox_modes.map(|sourced| sourced.value),
             allowed_web_search_modes: allowed_web_search_modes.map(|sourced| sourced.value),
+            feature_requirements: feature_requirements.map(|sourced| sourced.value),
             mcp_servers: mcp_servers.map(|sourced| sourced.value),
             rules: rules.map(|sourced| sourced.value),
             enforce_residency: enforce_residency.map(|sourced| sourced.value),
@@ -370,6 +390,10 @@ impl ConfigRequirementsToml {
         self.allowed_approval_policies.is_none()
             && self.allowed_sandbox_modes.is_none()
             && self.allowed_web_search_modes.is_none()
+            && self
+                .feature_requirements
+                .as_ref()
+                .is_none_or(FeatureRequirementsToml::is_empty)
             && self.mcp_servers.is_none()
             && self.rules.is_none()
             && self.enforce_residency.is_none()
@@ -385,6 +409,7 @@ impl TryFrom<ConfigRequirementsWithSources> for ConfigRequirements {
             allowed_approval_policies,
             allowed_sandbox_modes,
             allowed_web_search_modes,
+            feature_requirements,
             mcp_servers,
             rules,
             enforce_residency,
@@ -521,6 +546,8 @@ impl TryFrom<ConfigRequirementsWithSources> for ConfigRequirements {
             }
             None => ConstrainedWithSource::new(Constrained::allow_any(WebSearchMode::Cached), None),
         };
+        let feature_requirements =
+            feature_requirements.filter(|requirements| !requirements.value.is_empty());
 
         let enforce_residency = match enforce_residency {
             Some(Sourced {
@@ -553,6 +580,7 @@ impl TryFrom<ConfigRequirementsWithSources> for ConfigRequirements {
             approval_policy,
             sandbox_policy,
             web_search_mode,
+            feature_requirements,
             mcp_servers,
             exec_policy,
             enforce_residency,
@@ -588,6 +616,7 @@ mod tests {
             allowed_approval_policies,
             allowed_sandbox_modes,
             allowed_web_search_modes,
+            feature_requirements,
             mcp_servers,
             rules,
             enforce_residency,
@@ -600,6 +629,8 @@ mod tests {
                 .map(|value| Sourced::new(value, RequirementSource::Unknown)),
             allowed_web_search_modes: allowed_web_search_modes
                 .map(|value| Sourced::new(value, RequirementSource::Unknown)),
+            feature_requirements: feature_requirements
+                .map(|value| Sourced::new(value, RequirementSource::Unknown)),
             mcp_servers: mcp_servers.map(|value| Sourced::new(value, RequirementSource::Unknown)),
             rules: rules.map(|value| Sourced::new(value, RequirementSource::Unknown)),
             enforce_residency: enforce_residency
@@ -622,6 +653,9 @@ mod tests {
             WebSearchModeRequirement::Cached,
             WebSearchModeRequirement::Live,
         ];
+        let feature_requirements = FeatureRequirementsToml {
+            entries: BTreeMap::from([("personality".to_string(), true)]),
+        };
        let enforce_residency = ResidencyRequirement::Us;
        let enforce_source = source.clone();
 
@@ -631,6 +665,7 @@ mod tests {
             allowed_approval_policies: Some(allowed_approval_policies.clone()),
             allowed_sandbox_modes: Some(allowed_sandbox_modes.clone()),
             allowed_web_search_modes: Some(allowed_web_search_modes.clone()),
+            feature_requirements: Some(feature_requirements.clone()),
             mcp_servers: None,
             rules: None,
             enforce_residency: Some(enforce_residency),
@@ -651,6 +686,10 @@ mod tests {
                 allowed_web_search_modes,
                 enforce_source.clone(),
             )),
+            feature_requirements: Some(Sourced::new(
+                feature_requirements,
+                enforce_source.clone(),
+            )),
             mcp_servers: None,
             rules: None,
             enforce_residency: Some(Sourced::new(enforce_residency, enforce_source)),
@@ -683,6 +722,7 @@ mod tests {
             )),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -723,6 +763,7 @@ mod tests {
             )),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -809,6 +850,8 @@ mod tests {
             allowed_sandbox_modes = ["read-only"]
             allowed_web_search_modes = ["cached"]
             enforce_residency = "us"
+            [features]
+            personality = true
             "#,
         )?;
 
@@ -829,6 +872,13 @@ mod tests {
             requirements.web_search_mode.source,
             Some(source_location.clone())
         );
+        assert_eq!(
+            requirements
+                .feature_requirements
+                .as_ref()
+                .map(|requirements| requirements.source.clone()),
+            Some(source_location.clone())
+        );
        assert_eq!(requirements.enforce_residency.source, Some(source_location));
 
        Ok(())
@@ -1038,6 +1088,32 @@ mod tests {
        Ok(())
    }
 
+    #[test]
+    fn deserialize_feature_requirements() -> Result<()> {
+        let toml_str = r#"
+            [features]
+            apps = false
+            personality = true
+        "#;
+        let config: ConfigRequirementsToml = from_str(toml_str)?;
+        let requirements: ConfigRequirements = with_unknown_source(config).try_into()?;
+
+        assert_eq!(
+            requirements.feature_requirements,
+            Some(Sourced::new(
+                FeatureRequirementsToml {
+                    entries: BTreeMap::from([
+                        ("apps".to_string(), false),
+                        ("personality".to_string(), true),
+                    ]),
+                },
+                RequirementSource::Unknown,
+            ))
+        );
+
+        Ok(())
+    }
 
    #[test]
    fn network_requirements_are_preserved_as_constraints_with_source() -> Result<()> {
        let toml_str = r#"
```
```diff
@@ -16,6 +16,7 @@ pub use config_requirements::ConfigRequirements;
 pub use config_requirements::ConfigRequirementsToml;
 pub use config_requirements::ConfigRequirementsWithSources;
 pub use config_requirements::ConstrainedWithSource;
+pub use config_requirements::FeatureRequirementsToml;
 pub use config_requirements::McpServerIdentity;
 pub use config_requirements::McpServerRequirement;
 pub use config_requirements::NetworkConstraints;
```
```diff
@@ -1381,7 +1381,10 @@ mod tests {
     #[tokio::test]
     async fn resume_thread_subagent_restores_stored_nickname_and_role() {
         let (home, mut config) = test_config().await;
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow sqlite");
         let manager = ThreadManager::with_models_provider_and_home_for_tests(
             CodexAuth::from_api_key("dummy"),
             config.model_provider.clone(),
```
```diff
@@ -23,11 +23,11 @@ use crate::compact::InitialContextInjection;
 use crate::compact::run_inline_auto_compact_task;
 use crate::compact::should_use_remote_compact_task;
 use crate::compact_remote::run_inline_remote_auto_compact_task;
+use crate::config::ManagedFeatures;
 use crate::connectors;
 use crate::exec_policy::ExecPolicyManager;
 use crate::features::FEATURES;
 use crate::features::Feature;
 use crate::features::Features;
 use crate::features::maybe_push_unstable_features_warning;
 #[cfg(test)]
 use crate::models_manager::collaboration_mode_presets::CollaborationModesConfig;
@@ -373,18 +373,24 @@ impl Codex {
     if let SessionSource::SubAgent(SubAgentSource::ThreadSpawn { depth, .. }) = session_source
         && depth >= config.agent_max_depth
     {
-        config.features.disable(Feature::Collab);
+        let _ = config.features.disable(Feature::Collab);
     }
 
     if config.features.enabled(Feature::JsRepl)
         && let Err(err) = resolve_compatible_node(config.js_repl_node_path.as_deref()).await
     {
-        let message = format!(
-            "Disabled `js_repl` for this session because the configured Node runtime is unavailable or incompatible. {err}"
-        );
+        let _ = config.features.disable(Feature::JsRepl);
+        let _ = config.features.disable(Feature::JsReplToolsOnly);
+        let message = if config.features.enabled(Feature::JsRepl) {
+            format!(
+                "`js_repl` remains enabled because enterprise requirements pin it on, but the configured Node runtime is unavailable or incompatible. {err}"
+            )
+        } else {
+            format!(
+                "Disabled `js_repl` for this session because the configured Node runtime is unavailable or incompatible. {err}"
+            )
+        };
         warn!("{message}");
-        config.features.disable(Feature::JsRepl);
-        config.features.disable(Feature::JsReplToolsOnly);
         config.startup_warnings.push(message);
     }
 
@@ -620,7 +626,7 @@ pub(crate) struct Session {
     state: Mutex<SessionState>,
     /// The set of enabled features should be invariant for the lifetime of the
     /// session.
-    features: Features,
+    features: ManagedFeatures,
     pending_mcp_server_refresh_config: Mutex<Option<McpServerRefreshConfig>>,
     pub(crate) conversation: Arc<RealtimeConversationManager>,
     pub(crate) active_turn: Mutex<Option<ActiveTurn>>,
@@ -674,7 +680,7 @@ pub(crate) struct TurnContext {
     pub(crate) windows_sandbox_level: WindowsSandboxLevel,
     pub(crate) shell_environment_policy: ShellEnvironmentPolicy,
     pub(crate) tools_config: ToolsConfig,
-    pub(crate) features: Features,
+    pub(crate) features: ManagedFeatures,
     pub(crate) ghost_snapshot: GhostSnapshotConfig,
     pub(crate) final_output_json_schema: Option<Value>,
     pub(crate) codex_linux_sandbox_exe: Option<PathBuf>,
@@ -3020,7 +3026,7 @@ impl Session {
         self.features.enabled(feature)
     }
 
-    pub(crate) fn features(&self) -> Features {
+    pub(crate) fn features(&self) -> ManagedFeatures {
         self.features.clone()
     }
 
@@ -4719,9 +4725,8 @@ async fn spawn_review_thread(
         .await;
     // For reviews, disable web_search and view_image regardless of global settings.
     let mut review_features = sess.features.clone();
-    review_features
-        .disable(crate::features::Feature::WebSearchRequest)
-        .disable(crate::features::Feature::WebSearchCached);
+    let _ = review_features.disable(crate::features::Feature::WebSearchRequest);
+    let _ = review_features.disable(crate::features::Feature::WebSearchCached);
     let review_web_search_mode = WebSearchMode::Disabled;
     let tools_config = ToolsConfig::new(&ToolsConfigParams {
         model_info: &review_model_info,
@@ -8260,7 +8265,10 @@ mod tests {
     async fn session_new_fails_when_zsh_fork_enabled_without_zsh_path() {
         let codex_home = tempfile::tempdir().expect("create temp dir");
         let mut config = build_test_config(codex_home.path()).await;
-        config.features.enable(Feature::ShellZshFork);
+        config
+            .features
+            .enable(Feature::ShellZshFork)
+            .expect("test config should allow shell_zsh_fork");
         config.zsh_path = None;
         let config = Arc::new(config);
 
@@ -8813,7 +8821,7 @@ mod tests {
     #[tokio::test]
     async fn record_model_warning_appends_user_message() {
         let (mut session, turn_context) = make_session_and_context().await;
-        let features = Features::with_defaults();
+        let features = crate::features::Features::with_defaults().into();
         session.features = features;
 
         session
```
344
codex-rs/core/src/config/managed_features.rs
Normal file
344
codex-rs/core/src/config/managed_features.rs
Normal file
|
|
@ -0,0 +1,344 @@
|
|||
use std::collections::BTreeMap;
|
||||
|
||||
use codex_config::Constrained;
|
||||
use codex_config::ConstrainedWithSource;
|
||||
use codex_config::ConstraintError;
|
||||
use codex_config::ConstraintResult;
|
||||
use codex_config::FeatureRequirementsToml;
|
||||
use codex_config::RequirementSource;
|
||||
use codex_config::Sourced;
|
||||
|
||||
use crate::config::ConfigToml;
The new `ManagedFeatures` module (excerpted; earlier imports elided in this dump):

```rust
use crate::config::profile::ConfigProfile;
use crate::features::Feature;
use crate::features::FeatureOverrides;
use crate::features::Features;
use crate::features::canonical_feature_for_key;
use crate::features::feature_for_key;

/// Wrapper around [`Features`] which enforces constraints defined in
/// `FeatureRequirementsToml` and provides normalization to ensure constraints
/// are satisfied. Constraints are enforced on construction and mutation of
/// `ManagedFeatures`.
#[derive(Debug, Clone, PartialEq)]
pub struct ManagedFeatures {
    value: ConstrainedWithSource<Features>,
    pinned_features: BTreeMap<Feature, bool>,
}

impl ManagedFeatures {
    pub(crate) fn from_configured(
        configured_features: Features,
        feature_requirements: Option<Sourced<FeatureRequirementsToml>>,
    ) -> std::io::Result<Self> {
        let (pinned_features, source) = match feature_requirements {
            Some(Sourced {
                value: feature_requirements,
                source,
            }) => (
                parse_feature_requirements(feature_requirements, &source)?,
                Some(source),
            ),
            None => (BTreeMap::new(), None),
        };

        let normalized_features = normalize_candidate(configured_features, &pinned_features);
        validate_pinned_features(&normalized_features, &pinned_features, source.as_ref())?;
        Ok(Self {
            value: ConstrainedWithSource::new(Constrained::allow_any(normalized_features), source),
            pinned_features,
        })
    }

    pub fn get(&self) -> &Features {
        self.value.get()
    }

    fn normalize_and_validate(&self, candidate: Features) -> ConstraintResult<Features> {
        let normalized = normalize_candidate(candidate, &self.pinned_features);
        self.value.can_set(&normalized)?;
        validate_pinned_features_constraint(
            &normalized,
            &self.pinned_features,
            self.value.source.as_ref(),
        )?;
        Ok(normalized)
    }

    pub fn can_set(&self, candidate: &Features) -> ConstraintResult<()> {
        self.normalize_and_validate(candidate.clone()).map(|_| ())
    }

    pub fn set(&mut self, candidate: Features) -> ConstraintResult<()> {
        let normalized = self.normalize_and_validate(candidate)?;
        self.value.value.set(normalized)
    }

    pub fn set_enabled(&mut self, feature: Feature, enabled: bool) -> ConstraintResult<()> {
        let mut next = self.get().clone();
        next.set_enabled(feature, enabled);
        self.set(next)
    }

    pub fn enable(&mut self, feature: Feature) -> ConstraintResult<()> {
        self.set_enabled(feature, true)
    }

    pub fn disable(&mut self, feature: Feature) -> ConstraintResult<()> {
        self.set_enabled(feature, false)
    }
}

/// Only available for tests to ensure `ManagedFeatures` is constructed with
/// any required constraints taken into account.
#[cfg(test)]
impl From<Features> for ManagedFeatures {
    fn from(features: Features) -> Self {
        Self {
            value: ConstrainedWithSource::new(Constrained::allow_any(features), None),
            pinned_features: BTreeMap::new(),
        }
    }
}

impl std::ops::Deref for ManagedFeatures {
    type Target = Features;

    fn deref(&self) -> &Self::Target {
        self.get()
    }
}

fn normalize_candidate(
    mut candidate: Features,
    pinned_features: &BTreeMap<Feature, bool>,
) -> Features {
    for (feature, enabled) in pinned_features {
        candidate.set_enabled(*feature, *enabled);
    }
    candidate.normalize_dependencies();
    candidate
}
```
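The pin-then-normalize order in `normalize_candidate` matters: pinned values are overlaid first, then dependency cleanup runs, so a mere disagreement with a pin is corrected rather than rejected. A stand-alone sketch of the same shape (string keys stand in for the crate's `Feature` enum; the dependency rule mirrors the real `js_repl_tools_only` → `js_repl` one):

```rust
use std::collections::{BTreeMap, BTreeSet};

// Overlay enterprise-pinned values onto a candidate feature set, then drop
// any feature whose dependency is unsatisfied.
fn normalize(
    mut candidate: BTreeSet<&'static str>,
    pinned: &BTreeMap<&'static str, bool>,
) -> BTreeSet<&'static str> {
    for (feature, enabled) in pinned {
        if *enabled {
            candidate.insert(*feature);
        } else {
            candidate.remove(*feature);
        }
    }
    // Dependency cleanup: `js_repl_tools_only` requires `js_repl`.
    if candidate.contains("js_repl_tools_only") && !candidate.contains("js_repl") {
        candidate.remove("js_repl_tools_only");
    }
    candidate
}

fn main() {
    let pinned = BTreeMap::from([("personality", true), ("shell_tool", false)]);
    // The user config enabled `shell_tool` and omitted `personality`; pins win.
    let normalized = normalize(BTreeSet::from(["shell_tool"]), &pinned);
    assert!(normalized.contains("personality"));
    assert!(!normalized.contains("shell_tool"));
}
```

Because normalization runs before validation, only values that still violate a pin after normalization (for example, when dependency cleanup undoes a pin) surface as a `ConstraintError`.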
```rust
fn validate_pinned_features_constraint(
    normalized_features: &Features,
    pinned_features: &BTreeMap<Feature, bool>,
    source: Option<&RequirementSource>,
) -> ConstraintResult<()> {
    let Some(source) = source else {
        return Ok(());
    };
    let allowed = feature_requirements_display(pinned_features);
    for (feature, enabled) in pinned_features {
        if normalized_features.enabled(*feature) != *enabled {
            return Err(ConstraintError::InvalidValue {
                field_name: "features",
                candidate: format!(
                    "{}={}",
                    feature.key(),
                    normalized_features.enabled(*feature)
                ),
                allowed,
                requirement_source: source.clone(),
            });
        }
    }

    Ok(())
}

fn validate_pinned_features(
    normalized_features: &Features,
    pinned_features: &BTreeMap<Feature, bool>,
    source: Option<&RequirementSource>,
) -> std::io::Result<()> {
    validate_pinned_features_constraint(normalized_features, pinned_features, source)
        .map_err(|err| std::io::Error::new(std::io::ErrorKind::InvalidData, err))
}

fn feature_requirements_display(feature_requirements: &BTreeMap<Feature, bool>) -> String {
    let values = feature_requirements
        .iter()
        .map(|(feature, enabled)| format!("{}={enabled}", feature.key()))
        .collect::<Vec<_>>();
    format!("[{}]", values.join(", "))
}

fn parse_feature_requirements(
    feature_requirements: FeatureRequirementsToml,
    source: &RequirementSource,
) -> std::io::Result<BTreeMap<Feature, bool>> {
    let mut pinned_features = BTreeMap::new();
    for (key, enabled) in feature_requirements.entries {
        if let Some(feature) = canonical_feature_for_key(&key) {
            pinned_features.insert(feature, enabled);
            continue;
        }

        if let Some(feature) = feature_for_key(&key) {
            return Err(std::io::Error::new(
                std::io::ErrorKind::InvalidData,
                format!(
                    "invalid `features` requirement `{key}` from {source}: use canonical feature key `{}`",
                    feature.key()
                ),
            ));
        }

        return Err(std::io::Error::new(
            std::io::ErrorKind::InvalidData,
            format!("invalid `features` requirement `{key}` from {source}"),
        ));
    }

    Ok(pinned_features)
}

fn explicit_feature_settings_in_config(cfg: &ConfigToml) -> Vec<(String, Feature, bool)> {
    let mut explicit_settings = Vec::new();

    if let Some(features) = cfg.features.as_ref() {
        for (key, enabled) in &features.entries {
            if let Some(feature) = feature_for_key(key) {
                explicit_settings.push((format!("features.{key}"), feature, *enabled));
            }
        }
    }
    if let Some(enabled) = cfg.experimental_use_unified_exec_tool {
        explicit_settings.push((
            "experimental_use_unified_exec_tool".to_string(),
            Feature::UnifiedExec,
            enabled,
        ));
    }
    if let Some(enabled) = cfg.experimental_use_freeform_apply_patch {
        explicit_settings.push((
            "experimental_use_freeform_apply_patch".to_string(),
            Feature::ApplyPatchFreeform,
            enabled,
        ));
    }
    if let Some(enabled) = cfg.tools.as_ref().and_then(|tools| tools.web_search) {
        explicit_settings.push((
            "tools.web_search".to_string(),
            Feature::WebSearchRequest,
            enabled,
        ));
    }

    for (profile_name, profile) in &cfg.profiles {
        if let Some(features) = profile.features.as_ref() {
            for (key, enabled) in &features.entries {
                if let Some(feature) = feature_for_key(key) {
                    explicit_settings.push((
                        format!("profiles.{profile_name}.features.{key}"),
                        feature,
                        *enabled,
                    ));
                }
            }
        }
        if let Some(enabled) = profile.include_apply_patch_tool {
            explicit_settings.push((
                format!("profiles.{profile_name}.include_apply_patch_tool"),
                Feature::ApplyPatchFreeform,
                enabled,
            ));
        }
        if let Some(enabled) = profile.experimental_use_unified_exec_tool {
            explicit_settings.push((
                format!("profiles.{profile_name}.experimental_use_unified_exec_tool"),
                Feature::UnifiedExec,
                enabled,
            ));
        }
        if let Some(enabled) = profile.experimental_use_freeform_apply_patch {
            explicit_settings.push((
                format!("profiles.{profile_name}.experimental_use_freeform_apply_patch"),
                Feature::ApplyPatchFreeform,
                enabled,
            ));
        }
        if let Some(enabled) = profile.tools_web_search {
            explicit_settings.push((
                format!("profiles.{profile_name}.tools_web_search"),
                Feature::WebSearchRequest,
                enabled,
            ));
        }
    }

    explicit_settings
}

pub(crate) fn validate_explicit_feature_settings_in_config_toml(
    cfg: &ConfigToml,
    feature_requirements: Option<&Sourced<FeatureRequirementsToml>>,
) -> std::io::Result<()> {
    let Some(Sourced {
        value: feature_requirements,
        source,
    }) = feature_requirements
    else {
        return Ok(());
    };

    let pinned_features = parse_feature_requirements(feature_requirements.clone(), source)?;
    if pinned_features.is_empty() {
        return Ok(());
    }

    let allowed = feature_requirements_display(&pinned_features);
    for (path, feature, enabled) in explicit_feature_settings_in_config(cfg) {
        if pinned_features
            .get(&feature)
            .is_some_and(|required| *required != enabled)
        {
            return Err(std::io::Error::new(
                std::io::ErrorKind::InvalidData,
                ConstraintError::InvalidValue {
                    field_name: "features",
                    candidate: format!("{path}={enabled}"),
                    allowed,
                    requirement_source: source.clone(),
                },
            ));
        }
    }

    Ok(())
}

pub(crate) fn validate_feature_requirements_in_config_toml(
    cfg: &ConfigToml,
    feature_requirements: Option<&Sourced<FeatureRequirementsToml>>,
) -> std::io::Result<()> {
    fn validate_profile(
        cfg: &ConfigToml,
        profile_name: Option<&str>,
        profile: &ConfigProfile,
        feature_requirements: Option<&Sourced<FeatureRequirementsToml>>,
    ) -> std::io::Result<()> {
        let configured_features = Features::from_config(cfg, profile, FeatureOverrides::default());
        ManagedFeatures::from_configured(configured_features, feature_requirements.cloned())
            .map(|_| ())
            .map_err(|err| {
                if let Some(profile_name) = profile_name {
                    std::io::Error::new(
                        err.kind(),
                        format!(
                            "invalid feature configuration for profile `{profile_name}`: {err}"
                        ),
                    )
                } else {
                    err
                }
            })
    }

    validate_profile(cfg, None, &ConfigProfile::default(), feature_requirements)?;
    for (profile_name, profile) in &cfg.profiles {
        validate_profile(cfg, Some(profile_name), profile, feature_requirements)?;
    }
    Ok(())
}
```
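Two failure modes are handled above: at load time, validation after normalization only fires when dependency cleanup undoes a pin; at write time, `validate_explicit_feature_settings_in_config_toml` fails closed on any explicitly configured key that contradicts a pin. A self-contained sketch of the write-time rule (config paths and feature keys are illustrative):

```rust
use std::collections::BTreeMap;

// Reject any explicitly configured feature value that contradicts a pin.
// Returns the first offending `path=value` string, if any.
fn first_conflict(
    explicit: &[(String, &'static str, bool)], // (config path, feature key, value)
    pinned: &BTreeMap<&'static str, bool>,
) -> Option<String> {
    explicit.iter().find_map(|(path, feature, enabled)| {
        (pinned.get(feature).is_some_and(|required| required != enabled))
            .then(|| format!("{path}={enabled}"))
    })
}

fn main() {
    let pinned = BTreeMap::from([("personality", true)]);
    let explicit = vec![
        // Unpinned feature: explicit values are fine.
        ("features.unified_exec".to_string(), "unified_exec", true),
        // Pinned feature set to the wrong value: rejected.
        (
            "profiles.enterprise.features.personality".to_string(),
            "personality",
            false,
        ),
    ];
    let conflict = first_conflict(&explicit, &pinned);
    assert_eq!(
        conflict.as_deref(),
        Some("profiles.enterprise.features.personality=false")
    );
}
```

Failing the write (instead of normalizing it away) keeps `config.toml` from accumulating values that would silently be overridden on every load.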
```diff
@@ -91,6 +91,7 @@ use toml::Value as TomlValue;
 use toml_edit::DocumentMut;
 
 pub mod edit;
+mod managed_features;
 mod network_proxy_spec;
 mod permissions;
 pub mod profile;
@@ -102,6 +103,7 @@ pub use codex_config::ConstraintError;
 pub use codex_config::ConstraintResult;
 pub use codex_network_proxy::NetworkProxyAuditMetadata;
 
+pub use managed_features::ManagedFeatures;
 pub use network_proxy_spec::NetworkProxySpec;
 pub use network_proxy_spec::StartedNetworkProxy;
 pub use permissions::NetworkToml;
@@ -475,7 +477,7 @@ pub struct Config {
     pub ghost_snapshot: GhostSnapshotConfig,
 
     /// Centralized feature flags; source of truth for feature gating.
-    pub features: Features,
+    pub features: ManagedFeatures,
 
     /// When `true`, suppress warnings about unstable (under development) features.
     pub suppress_unstable_features_warning: bool,
@@ -1684,7 +1686,19 @@
         codex_home: PathBuf,
         config_layer_stack: ConfigLayerStack,
     ) -> std::io::Result<Self> {
-        let requirements = config_layer_stack.requirements().clone();
+        // Ensure that every field of ConfigRequirements is applied to the final
+        // Config.
+        let ConfigRequirements {
+            approval_policy: mut constrained_approval_policy,
+            sandbox_policy: mut constrained_sandbox_policy,
+            web_search_mode: mut constrained_web_search_mode,
+            feature_requirements,
+            mcp_servers,
+            exec_policy: _,
+            enforce_residency,
+            network: network_requirements,
+        } = config_layer_stack.requirements().clone();
 
         let user_instructions = Self::load_instructions(Some(&codex_home));
         let mut startup_warnings = Vec::new();
@@ -1739,7 +1753,8 @@
             web_search_request: override_tools_web_search_request,
         };
 
-        let features = Features::from_config(&cfg, &config_profile, feature_overrides);
+        let configured_features = Features::from_config(&cfg, &config_profile, feature_overrides);
+        let features = ManagedFeatures::from_configured(configured_features, feature_requirements)?;
         let windows_sandbox_mode = resolve_windows_sandbox_mode(&cfg, &config_profile);
         let resolved_cwd = {
             use std::env;
@@ -1780,7 +1795,7 @@
             config_profile.sandbox_mode,
             windows_sandbox_level,
             &resolved_cwd,
-            Some(&requirements.sandbox_policy),
+            Some(&constrained_sandbox_policy),
         );
         if let SandboxPolicy::WorkspaceWrite { writable_roots, .. } = &mut sandbox_policy {
             for path in additional_writable_roots {
@@ -1805,13 +1820,13 @@
             }
         });
         if !approval_policy_was_explicit
-            && let Err(err) = requirements.approval_policy.can_set(&approval_policy)
+            && let Err(err) = constrained_approval_policy.can_set(&approval_policy)
         {
             tracing::warn!(
                 error = %err,
                 "default approval policy is disallowed by requirements; falling back to required default"
             );
-            approval_policy = requirements.approval_policy.value();
+            approval_policy = constrained_approval_policy.value();
         }
         let web_search_mode = resolve_web_search_mode(&cfg, &config_profile, &features)
             .unwrap_or(WebSearchMode::Cached);
@@ -2052,18 +2067,6 @@
             .or_else(|| resolve_sqlite_home_env(&resolved_cwd))
             .unwrap_or_else(|| codex_home.to_path_buf());
 
-        // Ensure that every field of ConfigRequirements is applied to the final
-        // Config.
-        let ConfigRequirements {
-            approval_policy: mut constrained_approval_policy,
-            sandbox_policy: mut constrained_sandbox_policy,
-            web_search_mode: mut constrained_web_search_mode,
-            mcp_servers,
-            exec_policy: _,
-            enforce_residency,
-            network: network_requirements,
-        } = requirements;
-
         apply_requirement_constrained_value(
             "approval_policy",
             approval_policy,
@@ -4967,7 +4970,7 @@ model_verbosity = "high"
             use_experimental_unified_exec_tool: !cfg!(windows),
             background_terminal_max_timeout: DEFAULT_MAX_BACKGROUND_TERMINAL_TIMEOUT_MS,
             ghost_snapshot: GhostSnapshotConfig::default(),
-            features: Features::with_defaults(),
+            features: Features::with_defaults().into(),
             suppress_unstable_features_warning: false,
             active_profile: Some("o3".to_string()),
             active_project: ProjectConfig { trust_level: None },
@@ -5097,7 +5100,7 @@ model_verbosity = "high"
             use_experimental_unified_exec_tool: !cfg!(windows),
             background_terminal_max_timeout: DEFAULT_MAX_BACKGROUND_TERMINAL_TIMEOUT_MS,
             ghost_snapshot: GhostSnapshotConfig::default(),
-            features: Features::with_defaults(),
+            features: Features::with_defaults().into(),
             suppress_unstable_features_warning: false,
             active_profile: Some("gpt3".to_string()),
             active_project: ProjectConfig { trust_level: None },
@@ -5225,7 +5228,7 @@ model_verbosity = "high"
             use_experimental_unified_exec_tool: !cfg!(windows),
             background_terminal_max_timeout: DEFAULT_MAX_BACKGROUND_TERMINAL_TIMEOUT_MS,
             ghost_snapshot: GhostSnapshotConfig::default(),
-            features: Features::with_defaults(),
+            features: Features::with_defaults().into(),
             suppress_unstable_features_warning: false,
             active_profile: Some("zdr".to_string()),
             active_project: ProjectConfig { trust_level: None },
@@ -5339,7 +5342,7 @@ model_verbosity = "high"
             use_experimental_unified_exec_tool: !cfg!(windows),
             background_terminal_max_timeout: DEFAULT_MAX_BACKGROUND_TERMINAL_TIMEOUT_MS,
             ghost_snapshot: GhostSnapshotConfig::default(),
-            features: Features::with_defaults(),
+            features: Features::with_defaults().into(),
             suppress_unstable_features_warning: false,
             active_profile: Some("gpt5".to_string()),
             active_project: ProjectConfig { trust_level: None },
@@ -5394,6 +5397,7 @@ model_verbosity = "high"
             allowed_web_search_modes: Some(vec![
                 crate::config_loader::WebSearchModeRequirement::Cached,
             ]),
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -5998,6 +6002,7 @@ mcp_oauth_callback_url = "https://example.com/callback"
                 crate::config_loader::SandboxModeRequirement::ReadOnly,
             ]),
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -6116,6 +6121,146 @@ trust_level = "untrusted"
         );
         Ok(())
     }
 
+    #[tokio::test]
+    async fn feature_requirements_normalize_effective_feature_values() -> std::io::Result<()> {
+        let codex_home = TempDir::new()?;
+
+        let config = ConfigBuilder::default()
+            .codex_home(codex_home.path().to_path_buf())
+            .cloud_requirements(CloudRequirementsLoader::new(async {
+                Ok(Some(crate::config_loader::ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([
+                            ("personality".to_string(), true),
+                            ("shell_tool".to_string(), false),
+                        ]),
+                    }),
+                    ..Default::default()
+                }))
+            }))
+            .build()
+            .await?;
+
+        assert!(config.features.enabled(Feature::Personality));
+        assert!(!config.features.enabled(Feature::ShellTool));
+        assert!(
+            !config
+                .startup_warnings
+                .iter()
+                .any(|warning| warning.contains("Configured value for `features`")),
+            "{:?}",
+            config.startup_warnings
+        );
+
+        Ok(())
+    }
+
+    #[tokio::test]
+    async fn explicit_feature_config_is_normalized_by_requirements() -> std::io::Result<()> {
+        let codex_home = TempDir::new()?;
+        std::fs::write(
+            codex_home.path().join(CONFIG_TOML_FILE),
+            r#"
+[features]
+personality = false
+shell_tool = true
+"#,
+        )?;
+
+        let config = ConfigBuilder::default()
+            .codex_home(codex_home.path().to_path_buf())
+            .fallback_cwd(Some(codex_home.path().to_path_buf()))
+            .cloud_requirements(CloudRequirementsLoader::new(async {
+                Ok(Some(crate::config_loader::ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([
+                            ("personality".to_string(), true),
+                            ("shell_tool".to_string(), false),
+                        ]),
+                    }),
+                    ..Default::default()
+                }))
+            }))
+            .build()
+            .await?;
+
+        assert!(config.features.enabled(Feature::Personality));
+        assert!(!config.features.enabled(Feature::ShellTool));
+        assert!(
+            !config
+                .startup_warnings
+                .iter()
+                .any(|warning| warning.contains("Configured value for `features`")),
+            "{:?}",
+            config.startup_warnings
+        );
+
+        Ok(())
+    }
+
+    #[tokio::test]
+    async fn feature_requirements_normalize_runtime_feature_mutations() -> std::io::Result<()> {
+        let codex_home = TempDir::new()?;
+
+        let mut config = ConfigBuilder::default()
+            .codex_home(codex_home.path().to_path_buf())
+            .cloud_requirements(CloudRequirementsLoader::new(async {
+                Ok(Some(crate::config_loader::ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([
+                            ("personality".to_string(), true),
+                            ("shell_tool".to_string(), false),
+                        ]),
+                    }),
+                    ..Default::default()
+                }))
+            }))
+            .build()
+            .await?;
+
+        let mut requested = config.features.get().clone();
+        requested
+            .disable(Feature::Personality)
+            .enable(Feature::ShellTool);
+        assert!(config.features.can_set(&requested).is_ok());
+        config
+            .features
+            .set(requested)
+            .expect("managed feature mutations should normalize successfully");
+
+        assert!(config.features.enabled(Feature::Personality));
+        assert!(!config.features.enabled(Feature::ShellTool));
+
+        Ok(())
+    }
+
+    #[tokio::test]
+    async fn feature_requirements_reject_legacy_aliases() {
+        let codex_home = TempDir::new().expect("tempdir");
+
+        let err = ConfigBuilder::default()
+            .codex_home(codex_home.path().to_path_buf())
+            .cloud_requirements(CloudRequirementsLoader::new(async {
+                Ok(Some(crate::config_loader::ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([("collab".to_string(), true)]),
+                    }),
+                    ..Default::default()
+                }))
+            }))
+            .build()
+            .await
+            .expect_err("legacy aliases should be rejected");
+
+        assert_eq!(err.kind(), std::io::ErrorKind::InvalidData);
+        assert!(
+            err.to_string()
+                .contains("use canonical feature key `multi_agent`"),
+            "{err}"
+        );
+    }
+
     #[test]
     fn experimental_realtime_ws_base_url_loads_from_config_toml() -> std::io::Result<()> {
         let cfg: ConfigToml = toml::from_str(
```
```diff
@@ -1,6 +1,9 @@
 use super::ConfigToml;
 use super::deserialize_config_toml_with_base;
 use crate::config::edit::ConfigEdit;
 use crate::config::edit::ConfigEditsBuilder;
+use crate::config::managed_features::validate_explicit_feature_settings_in_config_toml;
+use crate::config::managed_features::validate_feature_requirements_in_config_toml;
+use crate::config_loader::CloudRequirementsLoader;
 use crate::config_loader::ConfigLayerEntry;
 use crate::config_loader::ConfigLayerStack;
@@ -331,6 +334,35 @@
                     format!("Invalid configuration: {err}"),
                 )
             })?;
+        let user_config_toml =
+            deserialize_config_toml_with_base(user_config.clone(), &self.codex_home).map_err(
+                |err| {
+                    ConfigServiceError::write(
+                        ConfigWriteErrorCode::ConfigValidationError,
+                        format!("Invalid configuration: {err}"),
+                    )
+                },
+            )?;
+        validate_explicit_feature_settings_in_config_toml(
+            &user_config_toml,
+            layers.requirements().feature_requirements.as_ref(),
+        )
+        .map_err(|err| {
+            ConfigServiceError::write(
+                ConfigWriteErrorCode::ConfigValidationError,
+                format!("Invalid configuration: {err}"),
+            )
+        })?;
+        validate_feature_requirements_in_config_toml(
+            &user_config_toml,
+            layers.requirements().feature_requirements.as_ref(),
+        )
+        .map_err(|err| {
+            ConfigServiceError::write(
+                ConfigWriteErrorCode::ConfigValidationError,
+                format!("Invalid configuration: {err}"),
+            )
+        })?;
 
         let updated_layers = layers.with_user_config(&provided_path, user_config.clone());
         let effective = updated_layers.effective_config();
@@ -706,6 +738,7 @@
     use codex_app_server_protocol::AskForApproval;
     use codex_utils_absolute_path::AbsolutePathBuf;
    use pretty_assertions::assert_eq;
+    use std::collections::BTreeMap;
     use tempfile::tempdir;
 
     #[test]
@@ -1088,6 +1121,108 @@ personality = true
         assert_eq!(contents.trim(), "model = \"user\"");
     }
 
+    #[tokio::test]
+    async fn write_value_rejects_feature_requirement_conflict() {
+        let tmp = tempdir().expect("tempdir");
+        std::fs::write(tmp.path().join(CONFIG_TOML_FILE), "").unwrap();
+
+        let service = ConfigService::new(
+            tmp.path().to_path_buf(),
+            vec![],
+            LoaderOverrides {
+                managed_config_path: None,
+                #[cfg(target_os = "macos")]
+                managed_preferences_base64: None,
+                macos_managed_config_requirements_base64: None,
+            },
+            CloudRequirementsLoader::new(async {
+                Ok(Some(ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([("personality".to_string(), true)]),
+                    }),
+                    ..Default::default()
+                }))
+            }),
+        );
+
+        let error = service
+            .write_value(ConfigValueWriteParams {
+                file_path: Some(tmp.path().join(CONFIG_TOML_FILE).display().to_string()),
+                key_path: "features.personality".to_string(),
+                value: serde_json::json!(false),
+                merge_strategy: MergeStrategy::Replace,
+                expected_version: None,
+            })
+            .await
+            .expect_err("conflicting feature write should fail");
+
+        assert_eq!(
+            error.write_error_code(),
+            Some(ConfigWriteErrorCode::ConfigValidationError)
+        );
+        assert!(
+            error
+                .to_string()
+                .contains("invalid value for `features`: `features.personality=false`"),
+            "{error}"
+        );
+        assert_eq!(
+            std::fs::read_to_string(tmp.path().join(CONFIG_TOML_FILE)).unwrap(),
+            ""
+        );
+    }
+
+    #[tokio::test]
+    async fn write_value_rejects_profile_feature_requirement_conflict() {
+        let tmp = tempdir().expect("tempdir");
+        std::fs::write(tmp.path().join(CONFIG_TOML_FILE), "").unwrap();
+
+        let service = ConfigService::new(
+            tmp.path().to_path_buf(),
+            vec![],
+            LoaderOverrides {
+                managed_config_path: None,
+                #[cfg(target_os = "macos")]
+                managed_preferences_base64: None,
+                macos_managed_config_requirements_base64: None,
+            },
+            CloudRequirementsLoader::new(async {
+                Ok(Some(ConfigRequirementsToml {
+                    feature_requirements: Some(crate::config_loader::FeatureRequirementsToml {
+                        entries: BTreeMap::from([("personality".to_string(), true)]),
+                    }),
+                    ..Default::default()
+                }))
+            }),
+        );
+
+        let error = service
+            .write_value(ConfigValueWriteParams {
+                file_path: Some(tmp.path().join(CONFIG_TOML_FILE).display().to_string()),
+                key_path: "profiles.enterprise.features.personality".to_string(),
+                value: serde_json::json!(false),
+                merge_strategy: MergeStrategy::Replace,
+                expected_version: None,
+            })
+            .await
+            .expect_err("conflicting profile feature write should fail");
+
+        assert_eq!(
+            error.write_error_code(),
+            Some(ConfigWriteErrorCode::ConfigValidationError)
+        );
+        assert!(
+            error.to_string().contains(
+                "invalid value for `features`: `profiles.enterprise.features.personality=false`"
+            ),
+            "{error}"
+        );
+        assert_eq!(
+            std::fs::read_to_string(tmp.path().join(CONFIG_TOML_FILE)).unwrap(),
+            ""
+        );
+    }
+
     #[tokio::test]
     async fn read_reports_managed_overrides_user_and_session_flags() {
         let tmp = tempdir().expect("tempdir");
```
```diff
@@ -34,6 +34,7 @@ pub use codex_config::ConfigLoadError;
 pub use codex_config::ConfigRequirements;
 pub use codex_config::ConfigRequirementsToml;
 pub use codex_config::ConstrainedWithSource;
+pub use codex_config::FeatureRequirementsToml;
 pub use codex_config::LoaderOverrides;
 pub use codex_config::McpServerIdentity;
 pub use codex_config::McpServerRequirement;
```
```diff
@@ -23,6 +23,7 @@ use codex_protocol::protocol::AskForApproval;
 use codex_protocol::protocol::SandboxPolicy;
 use codex_utils_absolute_path::AbsolutePathBuf;
 use pretty_assertions::assert_eq;
+use std::collections::BTreeMap;
 use std::collections::HashMap;
 use std::path::Path;
 use tempfile::tempdir;
@@ -494,6 +495,9 @@ async fn load_requirements_toml_produces_expected_constraints() -> anyhow::Resul
 allowed_approval_policies = ["never", "on-request"]
 allowed_web_search_modes = ["cached"]
 enforce_residency = "us"
+
+[features]
+personality = true
 "#,
     )
     .await?;
@@ -515,6 +519,15 @@ enforce_residency = "us"
             .cloned(),
         Some(vec![crate::config_loader::WebSearchModeRequirement::Cached])
     );
+    assert_eq!(
+        config_requirements_toml
+            .feature_requirements
+            .as_ref()
+            .map(|requirements| requirements.value.clone()),
+        Some(crate::config_loader::FeatureRequirementsToml {
+            entries: BTreeMap::from([("personality".to_string(), true)]),
+        })
+    );
    let config_requirements: ConfigRequirements = config_requirements_toml.try_into()?;
     assert_eq!(
         config_requirements.approval_policy.value(),
@@ -552,6 +565,15 @@ enforce_residency = "us"
         config_requirements.enforce_residency.value(),
         Some(crate::config_loader::ResidencyRequirement::Us)
     );
+    assert_eq!(
+        config_requirements
+            .feature_requirements
+            .as_ref()
+            .map(|requirements| requirements.value.clone()),
+        Some(crate::config_loader::FeatureRequirementsToml {
+            entries: BTreeMap::from([("personality".to_string(), true)]),
+        })
+    );
     Ok(())
 }
 
@@ -581,6 +603,7 @@ allowed_approval_policies = ["on-request"]
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -629,6 +652,7 @@ allowed_approval_policies = ["on-request"]
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
@@ -666,6 +690,7 @@ async fn load_config_layers_includes_cloud_requirements() -> anyhow::Result<()>
             allowed_approval_policies: Some(vec![AskForApproval::Never]),
             allowed_sandbox_modes: None,
             allowed_web_search_modes: None,
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
```
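Requirements parsing accepts only canonical feature keys: a legacy alias such as `collab` is recognized but rejected with a pointer to its canonical name (`multi_agent`). That behavior falls out of keeping two lookups side by side — a strict canonical lookup next to a permissive one that also resolves aliases. A stand-alone sketch (the canonical and alias tables here are illustrative stand-ins for the crate's `FEATURES` table and `legacy` module):

```rust
// Canonical keys resolve directly; legacy aliases resolve only through the
// permissive lookup, so a requirements parser can detect and reject them.
const CANONICAL: &[&str] = &["multi_agent", "personality", "unified_exec"];
const ALIASES: &[(&str, &str)] = &[("collab", "multi_agent")];

fn canonical_feature_for_key(key: &str) -> Option<&'static str> {
    CANONICAL.iter().copied().find(|k| *k == key)
}

fn feature_for_key(key: &str) -> Option<&'static str> {
    canonical_feature_for_key(key).or_else(|| {
        ALIASES
            .iter()
            .find(|&&(alias, _)| alias == key)
            .map(|&(_, canonical)| canonical)
    })
}

fn main() {
    // A requirements parser distinguishes three cases:
    assert_eq!(canonical_feature_for_key("multi_agent"), Some("multi_agent"));
    assert_eq!(canonical_feature_for_key("collab"), None); // alias: reject
    assert_eq!(feature_for_key("collab"), Some("multi_agent")); // suggestion source
    assert_eq!(feature_for_key("bogus"), None); // unknown: reject outright
}
```

Keeping aliases working in `config.toml` while rejecting them in `requirements.toml` lets user configs stay backward compatible without letting the alias set leak into the enterprise contract.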
```diff
@@ -245,6 +245,14 @@
         self
     }
 
+    pub fn set_enabled(&mut self, f: Feature, enabled: bool) -> &mut Self {
+        if enabled {
+            self.enable(f)
+        } else {
+            self.disable(f)
+        }
+    }
+
     pub fn record_legacy_usage_force(&mut self, alias: &str, feature: Feature) {
         let (summary, details) = legacy_usage_notice(alias, feature);
         self.legacy_usages.insert(LegacyFeatureUsage {
@@ -353,10 +361,7 @@
         }
 
         overrides.apply(&mut features);
-        if features.enabled(Feature::JsReplToolsOnly) && !features.enabled(Feature::JsRepl) {
-            tracing::warn!("js_repl_tools_only requires js_repl; disabling js_repl_tools_only");
-            features.disable(Feature::JsReplToolsOnly);
-        }
+        features.normalize_dependencies();
 
         features
     }
@@ -364,6 +369,13 @@
     pub fn enabled_features(&self) -> Vec<Feature> {
         self.enabled.iter().copied().collect()
     }
+
+    pub(crate) fn normalize_dependencies(&mut self) {
+        if self.enabled(Feature::JsReplToolsOnly) && !self.enabled(Feature::JsRepl) {
+            tracing::warn!("js_repl_tools_only requires js_repl; disabling js_repl_tools_only");
+            self.disable(Feature::JsReplToolsOnly);
+        }
+    }
 }
 
 fn legacy_usage_notice(alias: &str, feature: Feature) -> (String, Option<String>) {
@@ -404,7 +416,7 @@ fn web_search_details() -> &'static str {
 }
 
 /// Keys accepted in `[features]` tables.
-fn feature_for_key(key: &str) -> Option<Feature> {
+pub(crate) fn feature_for_key(key: &str) -> Option<Feature> {
     for spec in FEATURES {
         if spec.key == key {
             return Some(spec.id);
@@ -413,6 +425,13 @@ fn feature_for_key(key: &str) -> Option<Feature> {
     legacy::feature_for_key(key)
 }
 
+pub(crate) fn canonical_feature_for_key(key: &str) -> Option<Feature> {
+    FEATURES
+        .iter()
+        .find(|spec| spec.key == key)
+        .map(|spec| spec.id)
+}
+
 /// Returns `true` if the provided string matches a known feature toggle key.
 pub fn is_known_feature_key(key: &str) -> bool {
     feature_for_key(key).is_some()
```

```diff
@@ -566,7 +566,10 @@
     fn codex_apps_mcp_url_uses_openai_connectors_gateway_when_feature_is_enabled() {
         let mut config = crate::config::test_config();
         config.chatgpt_base_url = "https://chatgpt.com".to_string();
-        config.features.enable(Feature::AppsMcpGateway);
+        config
+            .features
+            .enable(Feature::AppsMcpGateway)
+            .expect("test config should allow apps gateway");
 
         assert_eq!(
             codex_apps_mcp_url(&config),
@@ -582,7 +585,10 @@
         let mut servers = with_codex_apps_mcp(HashMap::new(), false, None, &config);
         assert!(!servers.contains_key(CODEX_APPS_MCP_SERVER_NAME));
 
-        config.features.enable(Feature::Apps);
+        config
+            .features
+            .enable(Feature::Apps)
+            .expect("test config should allow apps");
 
         servers = with_codex_apps_mcp(servers, true, None, &config);
         let server = servers
@@ -595,7 +601,10 @@
 
         assert_eq!(url, "https://chatgpt.com/backend-api/wham/apps");
 
-        config.features.enable(Feature::AppsMcpGateway);
+        config
+            .features
+            .enable(Feature::AppsMcpGateway)
+            .expect("test config should allow apps gateway");
         servers = with_codex_apps_mcp(servers, true, None, &config);
         let server = servers
             .get(CODEX_APPS_MCP_SERVER_NAME)
```
|
||||
|
|
|
@@ -266,7 +266,7 @@ mod agent {
     // Approval policy
     agent_config.permissions.approval_policy = Constrained::allow_only(AskForApproval::Never);
     // Consolidation runs as an internal sub-agent and must not recursively delegate.
-    agent_config.features.disable(Feature::Collab);
+    let _ = agent_config.features.disable(Feature::Collab);

     // Sandbox policy
     let mut writable_roots = Vec::new();
@@ -475,7 +475,9 @@ mod tests {
     async fn js_repl_instructions_are_appended_when_enabled() {
         let tmp = tempfile::tempdir().expect("tempdir");
         let mut cfg = make_config(&tmp, 4096, None).await;
-        cfg.features.enable(Feature::JsRepl);
+        cfg.features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow js_repl");

         let res = get_user_instructions(&cfg, None)
             .await

@@ -488,9 +490,13 @@ mod tests {
     async fn js_repl_tools_only_instructions_are_feature_gated() {
         let tmp = tempfile::tempdir().expect("tempdir");
         let mut cfg = make_config(&tmp, 4096, None).await;
-        cfg.features
+        let mut features = cfg.features.get().clone();
+        features
             .enable(Feature::JsRepl)
             .enable(Feature::JsReplToolsOnly);
+        cfg.features
+            .set(features)
+            .expect("test config should allow js_repl tool restrictions");

         let res = get_user_instructions(&cfg, None)
             .await

@@ -503,9 +509,13 @@ mod tests {
     async fn js_repl_original_resolution_guidance_is_feature_gated() {
         let tmp = tempfile::tempdir().expect("tempdir");
         let mut cfg = make_config(&tmp, 4096, None).await;
-        cfg.features
+        let mut features = cfg.features.get().clone();
+        features
             .enable(Feature::JsRepl)
             .enable(Feature::ImageDetailOriginal);
+        cfg.features
+            .set(features)
+            .expect("test config should allow js_repl image detail settings");

         let res = get_user_instructions(&cfg, None)
             .await

@@ -730,7 +740,9 @@ mod tests {
     async fn apps_feature_does_not_emit_user_instructions_by_itself() {
         let tmp = tempfile::tempdir().expect("tempdir");
         let mut cfg = make_config(&tmp, 4096, None).await;
-        cfg.features.enable(Feature::Apps);
+        cfg.features
+            .enable(Feature::Apps)
+            .expect("test config should allow apps");

         let res = get_user_instructions(&cfg, None).await;
         assert_eq!(res, None);

@@ -742,7 +754,9 @@ mod tests {
         fs::write(tmp.path().join("AGENTS.md"), "base doc").unwrap();

         let mut cfg = make_config(&tmp, 4096, None).await;
-        cfg.features.enable(Feature::Apps);
+        cfg.features
+            .enable(Feature::Apps)
+            .expect("test config should allow apps");

         let res = get_user_instructions(&cfg, None)
             .await
@@ -1207,7 +1207,10 @@ mod tests {
         .codex_home(home.path().to_path_buf())
         .build()
         .await?;
-    config.features.disable(Feature::Sqlite);
+    config
+        .features
+        .disable(Feature::Sqlite)
+        .expect("test config should allow sqlite to be disabled");

     let newest = write_session_file(home.path(), "2025-01-03T12-00-00", Uuid::from_u128(9001))?;
     let middle = write_session_file(home.path(), "2025-01-02T12-00-00", Uuid::from_u128(9002))?;

@@ -1253,7 +1256,10 @@ mod tests {
         .codex_home(home.path().to_path_buf())
         .build()
         .await?;
-    config.features.enable(Feature::Sqlite);
+    config
+        .features
+        .enable(Feature::Sqlite)
+        .expect("test config should allow sqlite");

     let uuid = Uuid::from_u128(9010);
     let thread_id = ThreadId::from_string(&uuid.to_string()).expect("valid thread id");

@@ -1319,7 +1325,10 @@ mod tests {
         .codex_home(home.path().to_path_buf())
         .build()
         .await?;
-    config.features.enable(Feature::Sqlite);
+    config
+        .features
+        .enable(Feature::Sqlite)
+        .expect("test config should allow sqlite");

     let uuid = Uuid::from_u128(9011);
     let thread_id = ThreadId::from_string(&uuid.to_string()).expect("valid thread id");
@@ -95,7 +95,7 @@ async fn start_review_conversation(
     {
         panic!("by construction Constrained<WebSearchMode> must always support Disabled: {err}");
     }
-    sub_agent_config.features.disable(Feature::Collab);
+    let _ = sub_agent_config.features.disable(Feature::Collab);

     // Set explicit review rubric for the sub-agent
     sub_agent_config.base_instructions = Some(crate::REVIEW_PROMPT.to_string());
@@ -973,7 +973,7 @@ fn apply_spawn_agent_runtime_overrides(

 fn apply_spawn_agent_overrides(config: &mut Config, child_depth: i32) {
     if child_depth >= config.agent_max_depth {
-        config.features.disable(Feature::Collab);
+        let _ = config.features.disable(Feature::Collab);
     }
 }
@@ -1989,7 +1989,8 @@ mod tests {
         let (_session, mut turn) = make_session_and_context().await;
         Arc::make_mut(&mut turn.config)
             .features
-            .enable(Feature::ImageDetailOriginal);
+            .enable(Feature::ImageDetailOriginal)
+            .expect("test config should allow feature update");
         turn.model_info.supports_image_detail_original = true;

         let content_item =
@@ -4,4 +4,7 @@ codex_rust_crate(
     name = "common",
     crate_name = "core_test_support",
     crate_srcs = glob(["*.rs"]),
+    lib_data_extra = [
+        "//codex-rs/core:model_availability_nux_fixtures",
+    ],
 )
@@ -3,6 +3,7 @@ use std::path::Path;
 use std::path::PathBuf;
+use std::sync::Arc;

 use anyhow::Context;
 use anyhow::Result;
 use codex_core::CodexAuth;
 use codex_core::CodexThread;

@@ -11,12 +12,15 @@ use codex_core::ThreadManager;
 use codex_core::built_in_model_providers;
 use codex_core::config::Config;
 use codex_core::features::Feature;
+use codex_core::models_manager::collaboration_mode_presets::CollaborationModesConfig;
 use codex_protocol::config_types::ServiceTier;
+use codex_protocol::openai_models::ModelsResponse;
 use codex_protocol::protocol::AskForApproval;
 use codex_protocol::protocol::EventMsg;
 use codex_protocol::protocol::Op;
 use codex_protocol::protocol::SandboxPolicy;
 use codex_protocol::protocol::SessionConfiguredEvent;
+use codex_protocol::protocol::SessionSource;
 use codex_protocol::user_input::UserInput;
 use serde_json::Value;
 use tempfile::TempDir;

@@ -28,11 +32,13 @@ use crate::responses::output_value_to_text;
 use crate::responses::start_mock_server;
 use crate::streaming_sse::StreamingSseServer;
 use crate::wait_for_event;
+use crate::wait_for_event_match;
 use wiremock::Match;
 use wiremock::matchers::path_regex;

 type ConfigMutator = dyn FnOnce(&mut Config) + Send;
 type PreBuildHook = dyn FnOnce(&Path) + Send + 'static;
+const TEST_MODEL_WITH_EXPERIMENTAL_TOOLS: &str = "test-gpt-5.1-codex";

 /// A collection of different ways the model can output an apply_patch call
 #[derive(Clone, Copy, Debug, PartialEq, Eq, Hash)]

@@ -128,7 +134,10 @@ impl TestCodexBuilder {
         self.config_mutators.push(Box::new(move |config| {
             config.model_provider.base_url = Some(base_url_clone);
             config.experimental_realtime_ws_model = Some("realtime-test-model".to_string());
-            config.features.enable(Feature::ResponsesWebsockets);
+            config
+                .features
+                .enable(Feature::ResponsesWebsockets)
+                .expect("test config should allow feature update");
         }));
         self.build_with_home_and_base_url(base_url, home, None)
             .await

@@ -172,11 +181,21 @@ impl TestCodexBuilder {
         resume_from: Option<PathBuf>,
     ) -> anyhow::Result<TestCodex> {
         let auth = self.auth.clone();
-        let thread_manager = codex_core::test_support::thread_manager_with_models_provider_and_home(
-            auth.clone(),
-            config.model_provider.clone(),
-            config.codex_home.clone(),
-        );
+        let thread_manager = if let Some(model_catalog) = config.model_catalog.clone() {
+            ThreadManager::new(
+                config.codex_home.clone(),
+                codex_core::test_support::auth_manager_from_auth(auth.clone()),
+                SessionSource::Exec,
+                Some(model_catalog),
+                CollaborationModesConfig::default(),
+            )
+        } else {
+            codex_core::test_support::thread_manager_with_models_provider_and_home(
+                auth.clone(),
+                config.model_provider.clone(),
+                config.codex_home.clone(),
+            )
+        };
         let thread_manager = Arc::new(thread_manager);

         let new_conversation = match resume_from {

@@ -232,17 +251,56 @@ impl TestCodexBuilder {
         for mutator in mutators {
             mutator(&mut config);
         }
+        ensure_test_model_catalog(&mut config)?;

         if config.include_apply_patch_tool {
-            config.features.enable(Feature::ApplyPatchFreeform);
+            config.features.enable(Feature::ApplyPatchFreeform)?;
         } else {
-            config.features.disable(Feature::ApplyPatchFreeform);
+            config.features.disable(Feature::ApplyPatchFreeform)?;
         }

         Ok((config, cwd))
     }
 }

+fn ensure_test_model_catalog(config: &mut Config) -> Result<()> {
+    if config.model.as_deref() != Some(TEST_MODEL_WITH_EXPERIMENTAL_TOOLS)
+        || config.model_catalog.is_some()
+    {
+        return Ok(());
+    }
+
+    let bundled_models_path = codex_utils_cargo_bin::find_resource!("../../models.json")
+        .context("bundled models.json")?;
+    let bundled_models_contents =
+        std::fs::read_to_string(&bundled_models_path).with_context(|| {
+            format!(
+                "read bundled models.json from {}",
+                bundled_models_path.display()
+            )
+        })?;
+    let bundled_models: ModelsResponse =
+        serde_json::from_str(&bundled_models_contents).context("parse bundled models.json")?;
+    let mut model = bundled_models
+        .models
+        .iter()
+        .find(|candidate| candidate.slug == "gpt-5.1-codex")
+        .cloned()
+        .unwrap_or_else(|| panic!("missing bundled model gpt-5.1-codex"));
+    model.slug = TEST_MODEL_WITH_EXPERIMENTAL_TOOLS.to_string();
+    model.display_name = TEST_MODEL_WITH_EXPERIMENTAL_TOOLS.to_string();
+    model.experimental_supported_tools = vec![
+        "test_sync_tool".to_string(),
+        "read_file".to_string(),
+        "grep_files".to_string(),
+        "list_dir".to_string(),
+    ];
+    config.model_catalog = Some(ModelsResponse {
+        models: vec![model],
+    });
+    Ok(())
+}
+
 pub struct TestCodex {
     pub home: Arc<TempDir>,
     pub cwd: Arc<TempDir>,

@@ -334,8 +392,14 @@ impl TestCodex {
         })
         .await?;

-        wait_for_event(&self.codex, |event| {
-            matches!(event, EventMsg::TurnComplete(_))
+        let turn_id = wait_for_event_match(&self.codex, |event| match event {
+            EventMsg::TurnStarted(event) => Some(event.turn_id.clone()),
+            _ => None,
+        })
+        .await;
+        wait_for_event(&self.codex, |event| match event {
+            EventMsg::TurnComplete(event) => event.turn_id == turn_id,
+            _ => false,
         })
         .await;
         Ok(())
@@ -24,8 +24,14 @@ impl ZshForkRuntime {
         approval_policy: AskForApproval,
         sandbox_policy: SandboxPolicy,
     ) {
-        config.features.enable(Feature::ShellTool);
-        config.features.enable(Feature::ShellZshFork);
+        config
+            .features
+            .enable(Feature::ShellTool)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::ShellZshFork)
+            .expect("test config should allow feature update");
         config.zsh_path = Some(self.zsh_path.clone());
         config.main_execve_wrapper_exe = Some(self.main_execve_wrapper_exe.clone());
         config.permissions.allow_login_shell = false;
@@ -222,8 +222,14 @@ fn parse_simple_csv_line(line: &str) -> Vec<String> {
 async fn report_agent_job_result_rejects_wrong_thread() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -282,8 +288,14 @@ async fn report_agent_job_result_rejects_wrong_thread() -> Result<()> {
 async fn spawn_agents_on_csv_runs_and_exports() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -319,8 +331,14 @@ async fn spawn_agents_on_csv_dedupes_item_ids() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -371,8 +389,14 @@ async fn spawn_agents_on_csv_stop_halts_future_items() -> Result<()> {
 async fn spawn_agents_on_csv_stop_halts_future_items() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
@@ -183,7 +183,10 @@ async fn websocket_v2_test_codex_shell_chain() -> Result<()> {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ResponsesWebsocketsV2);
+        config
+            .features
+            .enable(Feature::ResponsesWebsocketsV2)
+            .expect("test config should allow feature update");
     });

     let test = builder.build_with_websocket_server(&server).await?;

@@ -262,7 +265,10 @@ async fn websocket_v2_first_turn_uses_updated_fast_tier_after_startup_prewarm()
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ResponsesWebsocketsV2);
+        config
+            .features
+            .enable(Feature::ResponsesWebsocketsV2)
+            .expect("test config should allow feature update");
     });
     let test = builder.build_with_websocket_server(&server).await?;

@@ -311,7 +317,10 @@ async fn websocket_v2_first_turn_drops_fast_tier_after_startup_prewarm() -> Resu
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ResponsesWebsocketsV2);
+        config
+            .features
+            .enable(Feature::ResponsesWebsocketsV2)
+            .expect("test config should allow feature update");
         config.service_tier = Some(ServiceTier::Fast);
     });
     let test = builder.build_with_websocket_server(&server).await?;

@@ -365,7 +374,10 @@ async fn websocket_v2_next_turn_uses_updated_service_tier() -> Result<()> {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ResponsesWebsocketsV2);
+        config
+            .features
+            .enable(Feature::ResponsesWebsocketsV2)
+            .expect("test config should allow feature update");
     });
     let test = builder.build_with_websocket_server(&server).await?;
@@ -674,7 +674,10 @@ async fn apply_patch_cli_verification_failure_has_no_side_effects(

     let harness = apply_patch_harness_with(|builder| {
         builder.with_config(|config| {
-            config.features.enable(Feature::ApplyPatchFreeform);
+            config
+                .features
+                .enable(Feature::ApplyPatchFreeform)
+                .expect("test config should allow feature update");
         })
     })
     .await?;
@@ -1598,7 +1598,10 @@ async fn run_scenario(scenario: &ScenarioSpec) -> Result<()> {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy.clone());
         for feature in features {
-            config.features.enable(feature);
+            config
+                .features
+                .enable(feature)
+                .expect("test config should allow feature update");
         }
     });
     let test = builder.build(&server).await?;
@@ -939,8 +939,14 @@ async fn includes_apps_guidance_as_developer_message_when_enabled() {
     let mut builder = test_codex()
         .with_auth(CodexAuth::from_api_key("Test API Key"))
         .with_config(move |config| {
-            config.features.enable(Feature::Apps);
-            config.features.disable(Feature::AppsMcpGateway);
+            config
+                .features
+                .enable(Feature::Apps)
+                .expect("test config should allow feature update");
+            config
+                .features
+                .disable(Feature::AppsMcpGateway)
+                .expect("test config should allow feature update");
             config.chatgpt_base_url = apps_base_url;
         });
     let codex = builder
@@ -1481,15 +1481,27 @@ async fn websocket_harness_with_options(
     let mut config = load_default_config_for_test(&codex_home).await;
     config.model = Some(MODEL.to_string());
     if websocket_enabled {
-        config.features.enable(Feature::ResponsesWebsockets);
+        config
+            .features
+            .enable(Feature::ResponsesWebsockets)
+            .expect("test config should allow feature update");
     } else {
-        config.features.disable(Feature::ResponsesWebsockets);
+        config
+            .features
+            .disable(Feature::ResponsesWebsockets)
+            .expect("test config should allow feature update");
     }
     if runtime_metrics_enabled {
-        config.features.enable(Feature::RuntimeMetrics);
+        config
+            .features
+            .enable(Feature::RuntimeMetrics)
+            .expect("test config should allow feature update");
     }
     if websocket_v2_enabled {
-        config.features.enable(Feature::ResponsesWebsocketsV2);
+        config
+            .features
+            .enable(Feature::ResponsesWebsocketsV2)
+            .expect("test config should allow feature update");
     }
     let config = Arc::new(config);
     let mut model_info = codex_core::test_support::construct_model_info_offline(MODEL, &config);
@@ -3112,7 +3112,7 @@ async fn snapshot_request_shape_pre_turn_compaction_strips_incoming_model_switch
         .with_config(move |config| {
             config.model_provider = model_provider;
             set_test_compact_prompt(config);
-            config
+            let _ = config
                 .features
                 .enable(codex_core::features::Feature::RemoteModels);
             config.model_auto_compact_token_limit = Some(200);
@@ -10,6 +10,7 @@
 use super::compact::COMPACT_WARNING_MESSAGE;
 use super::compact::FIRST_REPLY;
 use super::compact::SUMMARY_TEXT;
+use anyhow::Result;
 use codex_core::CodexThread;
 use codex_core::ThreadManager;
 use codex_core::compact::SUMMARIZATION_PROMPT;

@@ -291,13 +292,36 @@ async fn compact_resume_and_fork_preserve_model_history_view() {
     assert_eq!(requests.len(), 5);
 }

-#[tokio::test(flavor = "multi_thread", worker_threads = 2)]
+#[test]
 /// Scenario: after the forked branch is compacted, resuming again should reuse
 /// the compacted history and only append the new user message.
-async fn compact_resume_after_second_compaction_preserves_history() {
+fn compact_resume_after_second_compaction_preserves_history() -> Result<()> {
+    const TEST_STACK_SIZE_BYTES: usize = 8 * 1024 * 1024;
+
+    let handle = std::thread::Builder::new()
+        .name("compact_resume_after_second_compaction_preserves_history".to_string())
+        .stack_size(TEST_STACK_SIZE_BYTES)
+        .spawn(|| -> Result<()> {
+            let runtime = tokio::runtime::Builder::new_multi_thread()
+                .worker_threads(2)
+                .thread_stack_size(TEST_STACK_SIZE_BYTES)
+                .enable_all()
+                .build()?;
+            runtime.block_on(compact_resume_after_second_compaction_preserves_history_impl())
+        })?;
+
+    match handle.join() {
+        Ok(result) => result,
+        Err(_) => Err(anyhow::anyhow!(
+            "compact_resume_after_second_compaction_preserves_history thread panicked"
+        )),
+    }
+}
+
+async fn compact_resume_after_second_compaction_preserves_history_impl() -> Result<()> {
     if network_disabled() {
         println!("Skipping test because network is disabled in this sandbox");
-        return;
+        return Ok(());
     }

     // 1. Arrange mocked SSE responses for the initial flow plus the second compact.

@@ -402,6 +426,7 @@ async fn compact_resume_after_second_compaction_preserves_history() {
             assert_eq!(chunk, seeded_user_prefix);
         }
     }
+    Ok(())
 }

 fn normalize_line_endings(value: &mut Value) {
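The harness above works around a Windows stack overflow by running the async body on a thread with an explicit 8 MiB stack. A std-only sketch of the same pattern, with a deep recursion standing in for the test's deep future chain (the real harness additionally builds a Tokio runtime inside the spawned thread):

```rust
fn main() {
    const TEST_STACK_SIZE_BYTES: usize = 8 * 1024 * 1024;

    // Deep enough to stress a small default test-thread stack; tame on 8 MiB.
    fn depth(n: u32) -> u32 {
        if n == 0 { 0 } else { 1 + depth(n - 1) }
    }

    // std::thread::Builder lets the test pick its own stack size instead of
    // inheriting the harness default.
    let handle = std::thread::Builder::new()
        .name("big-stack-test".to_string())
        .stack_size(TEST_STACK_SIZE_BYTES)
        .spawn(|| depth(50_000))
        .expect("spawn test thread");

    assert_eq!(handle.join().expect("join test thread"), 50_000);
}
```

Joining the handle and propagating the inner result keeps panics and errors visible to the test runner, which is why the real harness maps a join error back into an `anyhow` error.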
@@ -26,10 +26,14 @@ async fn emits_deprecation_notice_for_legacy_feature_flag() -> anyhow::Result<()
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        let mut features = config.features.get().clone();
+        features.enable(Feature::UnifiedExec);
+        features
+            .record_legacy_usage_force("use_experimental_unified_exec_tool", Feature::UnifiedExec);
         config
             .features
-            .record_legacy_usage_force("use_experimental_unified_exec_tool", Feature::UnifiedExec);
+            .set(features)
+            .expect("test config should allow managed feature metadata updates");
         config.use_experimental_unified_exec_tool = true;
     });

@@ -122,7 +126,12 @@ async fn emits_deprecation_notice_for_web_search_feature_flag_values() -> anyhow
     let mut builder = test_codex().with_config(move |config| {
         let mut entries = BTreeMap::new();
         entries.insert("web_search_request".to_string(), enabled);
-        config.features.apply_map(&entries);
+        let mut features = config.features.get().clone();
+        features.apply_map(&entries);
+        config
+            .features
+            .set(features)
+            .expect("test config should allow managed feature map updates");
     });

     let TestCodex { codex, .. } = builder.build(&server).await?;
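The hunks above switch the tests from in-place mutation to a get-clone-mutate-set round trip, so every write goes through the managed wrapper's validation. A hedged sketch of that shape, assuming a hypothetical `required_off` pin set rather than the real `ManagedFeatures` internals:

```rust
use std::collections::HashSet;

#[derive(Clone, Copy, PartialEq, Eq, Hash, Debug)]
enum Feature {
    UnifiedExec,
}

#[derive(Clone, Default)]
struct Features {
    enabled: HashSet<Feature>,
}

impl Features {
    fn enable(&mut self, f: Feature) {
        self.enabled.insert(f);
    }
}

// Constrained-style wrapper: reads hand out a snapshot, writes go through a
// fallible `set` that can reject values violating requirements.
#[derive(Default)]
struct ManagedFeatures {
    current: Features,
    required_off: HashSet<Feature>,
}

impl ManagedFeatures {
    fn get(&self) -> &Features {
        &self.current
    }

    fn set(&mut self, candidate: Features) -> Result<(), String> {
        if candidate
            .enabled
            .intersection(&self.required_off)
            .next()
            .is_some()
        {
            return Err("candidate enables a feature pinned off by requirements".into());
        }
        self.current = candidate;
        Ok(())
    }
}

fn main() {
    let mut managed = ManagedFeatures::default();

    // get-clone-mutate-set: edit a snapshot, then write it back atomically.
    let mut features = managed.get().clone();
    features.enable(Feature::UnifiedExec);
    managed.set(features).expect("no pins, snapshot accepted");
    assert!(managed.get().enabled.contains(&Feature::UnifiedExec));

    // Once the feature is pinned off, the same round trip is rejected.
    managed.required_off.insert(Feature::UnifiedExec);
    let snapshot = managed.get().clone();
    assert!(managed.set(snapshot).is_err());
}
```

The round trip makes each write an all-or-nothing transaction, which is what lets enterprise requirements stay enforced even when runtime code mutates feature flags mid-session.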
@@ -167,7 +167,10 @@ async fn execpolicy_blocks_shell_invocation() -> Result<()> {
 async fn shell_command_empty_script_with_collaboration_mode_does_not_panic() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
-        config.features.enable(Feature::CollaborationModes);
+        config
+            .features
+            .enable(Feature::CollaborationModes)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
     let call_id = "shell-empty-script-collab";

@@ -219,8 +222,14 @@ async fn shell_command_empty_script_with_collaboration_mode_does_not_panic() ->
 async fn unified_exec_empty_script_with_collaboration_mode_does_not_panic() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
-        config.features.enable(Feature::CollaborationModes);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::CollaborationModes)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
     let call_id = "unified-exec-empty-script-collab";

@@ -272,7 +281,10 @@ async fn unified_exec_empty_script_with_collaboration_mode_does_not_panic() -> R
 async fn shell_command_whitespace_script_with_collaboration_mode_does_not_panic() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
-        config.features.enable(Feature::CollaborationModes);
+        config
+            .features
+            .enable(Feature::CollaborationModes)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
     let call_id = "shell-whitespace-script-collab";

@@ -324,8 +336,14 @@ async fn shell_command_whitespace_script_with_collaboration_mode_does_not_panic(
 async fn unified_exec_whitespace_script_with_collaboration_mode_does_not_panic() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
-        config.features.enable(Feature::CollaborationModes);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::CollaborationModes)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
     let call_id = "unified-exec-whitespace-script-collab";
@@ -19,7 +19,10 @@ async fn hierarchical_agents_appends_to_project_doc_in_user_instructions() {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ChildAgentsMd);
+        config
+            .features
+            .enable(Feature::ChildAgentsMd)
+            .expect("test config should allow feature update");
         std::fs::write(config.cwd.join("AGENTS.md"), "be nice").expect("write AGENTS.md");
     });
     let test = builder.build(&server).await.expect("build test codex");

@@ -58,7 +61,10 @@ async fn hierarchical_agents_emits_when_no_project_doc() {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ChildAgentsMd);
+        config
+            .features
+            .enable(Feature::ChildAgentsMd)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await.expect("build test codex");
@@ -76,7 +76,10 @@ async fn run_js_repl_turn(
     calls: &[(&str, &str)],
 ) -> Result<ResponseMock> {
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::JsRepl);
+        config
+            .features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(server).await?;

@@ -112,7 +115,10 @@ async fn js_repl_is_not_advertised_when_startup_node_is_incompatible() -> Result
     let old_node = write_too_old_node_script(temp.path())?;

     let mut builder = test_codex().with_config(move |config| {
-        config.features.enable(Feature::JsRepl);
+        config
+            .features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow feature update");
         config.js_repl_node_path = Some(old_node);
     });
     let test = builder.build(&server).await?;

@@ -164,7 +170,10 @@ async fn js_repl_persists_top_level_bindings_and_supports_tla() -> Result<()> {

     let server = responses::start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::JsRepl);
+        config
+            .features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
@@ -166,8 +166,14 @@ async fn web_search_pollution_moves_selected_thread_into_removed_phase2_inputs()
     let db = init_state_db(&home).await?;

     let mut initial_builder = test_codex().with_home(home.clone()).with_config(|config| {
-        config.features.enable(Feature::Sqlite);
-        config.features.enable(Feature::MemoryTool);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::MemoryTool)
+            .expect("test config should allow feature update");
         config.memories.max_raw_memories_for_consolidation = 1;
         config.memories.no_memories_if_mcp_or_web_search = true;
     });

@@ -232,8 +238,14 @@ async fn web_search_pollution_moves_selected_thread_into_removed_phase2_inputs()
     .await;

     let mut resumed_builder = test_codex().with_home(home.clone()).with_config(|config| {
-        config.features.enable(Feature::Sqlite);
-        config.features.enable(Feature::MemoryTool);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::MemoryTool)
+            .expect("test config should allow feature update");
         config.memories.max_raw_memories_for_consolidation = 1;
         config.memories.no_memories_if_mcp_or_web_search = true;
     });

@@ -310,9 +322,16 @@ async fn web_search_pollution_moves_selected_thread_into_removed_phase2_inputs()
 }

 async fn build_test_codex(server: &wiremock::MockServer, home: Arc<TempDir>) -> Result<TestCodex> {
+    #[allow(clippy::expect_used)]
     let mut builder = test_codex().with_home(home).with_config(|config| {
-        config.features.enable(Feature::Sqlite);
-        config.features.enable(Feature::MemoryTool);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::MemoryTool)
+            .expect("test config should allow feature update");
         config.memories.max_raw_memories_for_consolidation = 1;
     });
     builder.build(server).await
|||
|
|
@@ -133,7 +133,10 @@ async fn model_and_personality_change_only_appends_model_instructions() -> Resul
     let mut builder = test_codex()
         .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
         });
     let test = builder.build(&server).await?;
     let next_model = "exp-codex-personality";
 
@@ -102,7 +102,10 @@ async fn snapshot_model_visible_layout_turn_overrides() -> Result<()> {
     let mut builder = test_codex()
         .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.personality = Some(Personality::Pragmatic);
         });
     let test = builder.build(&server).await?;
@@ -332,7 +335,10 @@ async fn snapshot_model_visible_layout_resume_with_personality_change() -> Resul
 
     let mut resume_builder = test_codex().with_config(|config| {
         config.model = Some("gpt-5.2-codex".to_string());
-        config.features.enable(Feature::Personality);
+        config
+            .features
+            .enable(Feature::Personality)
+            .expect("test config should allow feature update");
         config.personality = Some(Personality::Pragmatic);
     });
     let resumed = resume_builder.resume(&server, home, rollout_path).await?;
 
@@ -176,7 +176,10 @@ async fn process_sse_emits_failed_event_on_parse_error() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -217,7 +220,10 @@ async fn process_sse_records_failed_event_when_stream_closes_without_completed()
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -278,7 +284,10 @@ async fn process_sse_failed_event_records_response_error_message() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -337,7 +346,10 @@ async fn process_sse_failed_event_logs_parse_error() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -383,7 +395,10 @@ async fn process_sse_failed_event_logs_missing_error() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -438,7 +453,10 @@ async fn process_sse_failed_event_logs_response_completed_parse_error() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -559,7 +577,10 @@ async fn handle_responses_span_records_response_kind_and_tool_name() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(|config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -625,7 +646,10 @@ async fn record_responses_sets_span_fields_for_response_events() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(|config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -706,7 +730,10 @@ async fn handle_response_item_records_tool_result_for_custom_tool_call() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -776,7 +803,10 @@ async fn handle_response_item_records_tool_result_for_function_call() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -856,7 +886,10 @@ async fn handle_response_item_records_tool_result_for_local_shell_missing_ids()
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
@@ -920,7 +953,10 @@ async fn handle_response_item_records_tool_result_for_local_shell_call() {
 
     let TestCodex { codex, .. } = test_codex()
         .with_config(move |config| {
-            config.features.disable(Feature::GhostCommit);
+            config
+                .features
+                .disable(Feature::GhostCommit)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await
 
@@ -44,7 +44,10 @@ const LOCAL_PRAGMATIC_TEMPLATE: &str = "You are a deeply pragmatic, effective so
 async fn personality_does_not_mutate_base_instructions_without_template() {
     let codex_home = TempDir::new().expect("create temp dir");
     let mut config = load_default_config_for_test(&codex_home).await;
-    config.features.enable(Feature::Personality);
+    config
+        .features
+        .enable(Feature::Personality)
+        .expect("test config should allow feature update");
     config.personality = Some(Personality::Friendly);
 
     let model_info = codex_core::test_support::construct_model_info_offline("gpt-5.1", &config);
@@ -58,7 +61,10 @@ async fn personality_does_not_mutate_base_instructions_without_template() {
 async fn base_instructions_override_disables_personality_template() {
     let codex_home = TempDir::new().expect("create temp dir");
     let mut config = load_default_config_for_test(&codex_home).await;
-    config.features.enable(Feature::Personality);
+    config
+        .features
+        .enable(Feature::Personality)
+        .expect("test config should allow feature update");
     config.personality = Some(Personality::Friendly);
     config.base_instructions = Some("override instructions".to_string());
 
@@ -81,7 +87,10 @@ async fn user_turn_personality_none_does_not_add_update_message() -> anyhow::Res
     let mut builder = test_codex()
         .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
         });
     let test = builder.build(&server).await?;
 
@@ -127,7 +136,10 @@ async fn config_personality_some_sets_instructions_template() -> anyhow::Result<
     let mut builder = test_codex()
         .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.personality = Some(Personality::Friendly);
         });
     let test = builder.build(&server).await?;
@@ -181,7 +193,10 @@ async fn config_personality_none_sends_no_personality() -> anyhow::Result<()> {
     let mut builder = test_codex()
        .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.personality = Some(Personality::None);
         });
     let test = builder.build(&server).await?;
@@ -242,7 +257,10 @@ async fn default_personality_is_pragmatic_without_config_toml() -> anyhow::Resul
     let mut builder = test_codex()
         .with_model("gpt-5.2-codex")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
         });
     let test = builder.build(&server).await?;
 
@@ -290,7 +308,10 @@ async fn user_turn_personality_some_adds_update_message() -> anyhow::Result<()>
     let mut builder = test_codex()
         .with_model("exp-codex-personality")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
         });
     let test = builder.build(&server).await?;
 
@@ -388,7 +409,10 @@ async fn user_turn_personality_same_value_does_not_add_update_message() -> anyho
     let mut builder = test_codex()
         .with_model("exp-codex-personality")
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.personality = Some(Personality::Pragmatic);
         });
     let test = builder.build(&server).await?;
@@ -472,7 +496,10 @@ async fn user_turn_personality_same_value_does_not_add_update_message() -> anyho
 async fn instructions_uses_base_if_feature_disabled() -> anyhow::Result<()> {
     let codex_home = TempDir::new().expect("create temp dir");
     let mut config = load_default_config_for_test(&codex_home).await;
-    config.features.disable(Feature::Personality);
+    config
+        .features
+        .disable(Feature::Personality)
+        .expect("test config should allow feature update");
     config.personality = Some(Personality::Friendly);
 
     let model_info =
@@ -498,7 +525,10 @@ async fn user_turn_personality_skips_if_feature_disabled() -> anyhow::Result<()>
     let mut builder = test_codex()
         .with_model("exp-codex-personality")
         .with_config(|config| {
-            config.features.disable(Feature::Personality);
+            config
+                .features
+                .disable(Feature::Personality)
+                .expect("test config should allow feature update");
         });
     let test = builder.build(&server).await?;
 
@@ -642,7 +672,10 @@ async fn remote_model_friendly_personality_instructions_with_feature() -> anyhow
     let mut builder = test_codex()
         .with_auth(codex_core::CodexAuth::create_dummy_chatgpt_auth_for_testing())
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.model = Some(remote_slug.to_string());
             config.personality = Some(Personality::Friendly);
         });
@@ -757,7 +790,10 @@ async fn user_turn_personality_remote_model_template_includes_update_message() -
     let mut builder = test_codex()
         .with_auth(codex_core::CodexAuth::create_dummy_chatgpt_auth_for_testing())
         .with_config(|config| {
-            config.features.enable(Feature::Personality);
+            config
+                .features
+                .enable(Feature::Personality)
+                .expect("test config should allow feature update");
             config.model = Some("gpt-5.2-codex".to_string());
         });
     let test = builder.build(&server).await?;
 
@@ -203,12 +203,19 @@ async fn plugin_apps_expose_tools_after_canonical_name_mention() -> Result<()> {
 
     let codex_home = Arc::new(TempDir::new()?);
     write_plugin_app_plugin(codex_home.as_ref());
+    #[allow(clippy::expect_used)]
     let mut builder = test_codex()
         .with_home(codex_home)
         .with_auth(CodexAuth::from_api_key("Test API Key"))
         .with_config(move |config| {
-            config.features.enable(Feature::Apps);
-            config.features.disable(Feature::AppsMcpGateway);
+            config
+                .features
+                .enable(Feature::Apps)
+                .expect("test config should allow feature update");
+            config
+                .features
+                .disable(Feature::AppsMcpGateway)
+                .expect("test config should allow feature update");
             config.chatgpt_base_url = apps_server.chatgpt_base_url;
         });
     let codex = builder
 
@@ -125,7 +125,10 @@ async fn prompt_tools_are_consistent_across_requests() -> anyhow::Result<()> {
                 .web_search_mode
                 .set(WebSearchMode::Cached)
                 .expect("test web_search_mode should satisfy constraints");
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
@@ -219,8 +222,14 @@ async fn gpt_5_tools_without_apply_patch_append_apply_patch_instructions() -> an
     let TestCodex { codex, .. } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.disable(Feature::ApplyPatchFreeform);
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .disable(Feature::ApplyPatchFreeform)
+                .expect("test config should allow feature update");
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
             config.model = Some("gpt-5".to_string());
         })
         .build(&server)
@@ -291,7 +300,10 @@ async fn prefixes_context_and_instructions_once_and_consistently_across_requests
     let TestCodex { codex, config, .. } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
@@ -379,7 +391,10 @@ async fn overrides_turn_context_but_keeps_cached_prefix_and_key_constant() -> an
     let TestCodex { codex, .. } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
@@ -643,7 +658,10 @@ async fn per_turn_overrides_keep_cached_prefix_and_key_constant() -> anyhow::Res
     let TestCodex { codex, .. } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
@@ -767,7 +785,10 @@ async fn send_user_turn_with_no_changes_does_not_send_environment_context() -> a
     } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
@@ -888,7 +909,10 @@ async fn send_user_turn_with_changes_sends_environment_context() -> anyhow::Resu
     } = test_codex()
         .with_config(|config| {
             config.user_instructions = Some("be consistent and helpful".to_string());
-            config.features.enable(Feature::CollaborationModes);
+            config
+                .features
+                .enable(Feature::CollaborationModes)
+                .expect("test config should allow feature update");
         })
         .build(&server)
         .await?;
 
@@ -30,7 +30,10 @@ async fn request_body_is_zstd_compressed_for_codex_backend_when_enabled() -> any
     let mut builder = test_codex()
         .with_auth(CodexAuth::create_dummy_chatgpt_auth_for_testing())
         .with_config(move |config| {
-            config.features.enable(Feature::EnableRequestCompression);
+            config
+                .features
+                .enable(Feature::EnableRequestCompression)
+                .expect("test config should allow feature update");
             config.model_provider.base_url = Some(base_url);
         });
     let codex = builder.build(&server).await?.codex;
@@ -74,7 +77,10 @@ async fn request_body_is_not_compressed_for_api_key_auth_even_when_enabled() ->
 
     let base_url = format!("{}/backend-api/codex/v1", server.uri());
     let mut builder = test_codex().with_config(move |config| {
-        config.features.enable(Feature::EnableRequestCompression);
+        config
+            .features
+            .enable(Feature::EnableRequestCompression)
+            .expect("test config should allow feature update");
         config.model_provider.base_url = Some(base_url);
     });
     let codex = builder.build(&server).await?.codex;
 
@@ -201,7 +201,10 @@ async fn with_additional_permissions_requires_approval_under_on_request() -> Res
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -280,7 +283,10 @@ async fn relative_additional_permissions_resolve_against_tool_workdir() -> Resul
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -372,7 +378,10 @@ async fn read_only_with_additional_permissions_widens_to_unrequested_cwd_write()
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -462,7 +471,10 @@ async fn read_only_with_additional_permissions_widens_to_unrequested_tmp_write()
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -553,7 +565,10 @@ async fn workspace_write_with_additional_permissions_can_write_outside_cwd() ->
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -651,7 +666,10 @@ async fn with_additional_permissions_denied_approval_blocks_execution() -> Resul
     let mut builder = test_codex().with_config(move |config| {
         config.permissions.approval_policy = Constrained::allow_any(approval_policy);
         config.permissions.sandbox_policy = Constrained::allow_any(sandbox_policy_for_config);
-        config.features.enable(Feature::RequestPermissions);
+        config
+            .features
+            .enable(Feature::RequestPermissions)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
 
@@ -79,6 +79,7 @@ async fn request_user_input_round_trip_for_mode(mode: ModeKind) -> anyhow::Resul
     let server = start_mock_server().await;
 
     let builder = test_codex();
+    #[allow(clippy::expect_used)]
     let TestCodex {
         codex,
         cwd,
@@ -87,7 +88,10 @@ async fn request_user_input_round_trip_for_mode(mode: ModeKind) -> anyhow::Resul
     } = builder
         .with_config(move |config| {
             if mode == ModeKind::Default {
-                config.features.enable(Feature::DefaultModeRequestUserInput);
+                config
+                    .features
+                    .enable(Feature::DefaultModeRequestUserInput)
+                    .expect("test config should allow feature update");
             }
         })
         .build(&server)
 
@@ -22,6 +22,7 @@ use core_test_support::responses::ev_assistant_message;
 use core_test_support::responses::ev_completed;
 use core_test_support::responses::ev_function_call;
 use core_test_support::responses::ev_response_created;
+use core_test_support::responses::mount_sse_once;
 use core_test_support::responses::mount_sse_sequence;
 use core_test_support::responses::sse;
 use core_test_support::responses::start_mock_server;
@@ -142,8 +143,14 @@ fn configure_apps_with_optional_rmcp(
     apps_base_url: &str,
     rmcp_server_bin: Option<String>,
 ) {
-    config.features.enable(Feature::Apps);
-    config.features.disable(Feature::AppsMcpGateway);
+    config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
+    config
+        .features
+        .disable(Feature::AppsMcpGateway)
+        .expect("test config should allow feature update");
    config.chatgpt_base_url = apps_base_url.to_string();
     if let Some(command) = rmcp_server_bin {
         let mut servers = config.mcp_servers.get().clone();
@@ -181,13 +188,13 @@ async fn search_tool_flag_adds_tool() -> Result<()> {
 
     let server = start_mock_server().await;
     let apps_server = AppsTestServer::mount(&server).await?;
-    let mock = mount_sse_sequence(
+    let mock = mount_sse_once(
         &server,
-        vec![sse(vec![
+        sse(vec![
             ev_response_created("resp-1"),
             ev_assistant_message("msg-1", "done"),
             ev_completed("resp-1"),
-        ])],
+        ]),
     )
     .await;
 
@@ -251,9 +251,13 @@ async fn shell_command_times_out_with_timeout_ms() -> anyhow::Result<()> {
 async fn unicode_output(login: bool) -> anyhow::Result<()> {
     skip_if_no_network!(Ok(()));
 
+    #[allow(clippy::expect_used)]
     let harness = shell_command_harness_with(|builder| {
         builder.with_model("gpt-5.2").with_config(|config| {
-            config.features.enable(Feature::PowershellUtf8);
+            config
+                .features
+                .enable(Feature::PowershellUtf8)
+                .expect("test config should allow feature update");
         })
     })
     .await?;
@@ -281,9 +285,13 @@ async fn unicode_output(login: bool) -> anyhow::Result<()> {
 async fn unicode_output_with_newlines(login: bool) -> anyhow::Result<()> {
     skip_if_no_network!(Ok(()));
 
+    #[allow(clippy::expect_used)]
     let harness = shell_command_harness_with(|builder| {
         builder.with_model("gpt-5.2").with_config(|config| {
-            config.features.enable(Feature::PowershellUtf8);
+            config
+                .features
+                .enable(Feature::PowershellUtf8)
+                .expect("test config should allow feature update");
         })
     })
     .await?;
 
@@ -119,8 +119,14 @@ async fn run_snapshot_command_with_options(
     } = options;
     let builder = test_codex().with_config(move |config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
         config.permissions.shell_environment_policy.r#set = shell_environment_set;
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
@@ -207,7 +213,10 @@ async fn run_shell_command_snapshot_with_options(
         shell_environment_set,
     } = options;
     let builder = test_codex().with_config(move |config| {
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
         config.permissions.shell_environment_policy.r#set = shell_environment_set;
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
@@ -399,7 +408,10 @@ async fn linux_shell_command_uses_shell_snapshot() -> Result<()> {
 #[tokio::test(flavor = "multi_thread", worker_threads = 2)]
 async fn shell_command_snapshot_preserves_shell_environment_policy_set() -> Result<()> {
     let builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
         config.permissions.shell_environment_policy.r#set = policy_set_path_for_test();
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
@@ -446,8 +458,14 @@ async fn shell_command_snapshot_preserves_shell_environment_policy_set() -> Resu
 async fn linux_unified_exec_snapshot_preserves_shell_environment_policy_set() -> Result<()> {
     let builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
         config.permissions.shell_environment_policy.r#set = policy_set_path_for_test();
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
@@ -493,7 +511,10 @@ async fn linux_unified_exec_snapshot_preserves_shell_environment_policy_set() ->
 #[tokio::test(flavor = "multi_thread", worker_threads = 2)]
 async fn shell_command_snapshot_still_intercepts_apply_patch() -> Result<()> {
     let builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
         config.include_apply_patch_tool = true;
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
@@ -562,7 +583,10 @@ async fn shell_command_snapshot_still_intercepts_apply_patch() -> Result<()> {
 #[tokio::test(flavor = "multi_thread", worker_threads = 2)]
 async fn shell_snapshot_deleted_after_shutdown_with_skills() -> Result<()> {
     let builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ShellSnapshot);
+        config
+            .features
+            .enable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
     });
     let harness = TestCodexHarness::with_builder(builder).await?;
     let home = harness.test().home.clone();
 
@@ -40,7 +40,10 @@ use uuid::Uuid;
 async fn new_thread_is_recorded_in_state_db() -> Result<()> {
     let server = start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -170,7 +173,10 @@ async fn backfill_scans_existing_rollouts() -> Result<()> {
         fs::write(&rollout_path, format!("{jsonl}\n")).expect("should write rollout file");
     })
     .with_config(|config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });

     let test = builder.build(&server).await?;

@@ -230,7 +236,10 @@ async fn user_messages_persist_in_state_db() -> Result<()> {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -281,7 +290,10 @@ async fn web_search_marks_thread_memory_mode_polluted_when_configured() -> Resul
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
         config.memories.no_memories_if_mcp_or_web_search = true;
     });
     let test = builder.build(&server).await?;

@@ -331,7 +343,10 @@ async fn mcp_call_marks_thread_memory_mode_polluted_when_configured() -> Result<

     let rmcp_test_server_bin = stdio_server_bin()?;
     let mut builder = test_codex().with_config(move |config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
         config.memories.no_memories_if_mcp_or_web_search = true;

         let mut servers = config.mcp_servers.get().clone();

@@ -434,7 +449,10 @@ async fn tool_call_logs_include_thread_id() -> Result<()> {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Sqlite);
+        config
+            .features
+            .enable(Feature::Sqlite)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;
     let db = test.codex.state_db().expect("state db enabled");

@@ -140,8 +140,12 @@ async fn setup_turn_one_with_spawned_child(
     )
     .await;

+    #[allow(clippy::expect_used)]
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(server).await?;
     test.submit_turn(TURN_1_PROMPT).await?;

@@ -252,7 +256,10 @@ async fn spawned_child_receives_forked_parent_context() -> Result<()> {
     .await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::Collab);
+        config
+            .features
+            .enable(Feature::Collab)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -284,7 +284,10 @@ async fn apply_patch_tool_executes_and_emits_patch_events() -> anyhow::Result<()
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ApplyPatchFreeform);
+        config
+            .features
+            .enable(Feature::ApplyPatchFreeform)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -389,7 +392,10 @@ async fn apply_patch_reports_parse_diagnostics() -> anyhow::Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::ApplyPatchFreeform);
+        config
+            .features
+            .enable(Feature::ApplyPatchFreeform)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -293,9 +293,15 @@ async fn collect_tools(use_unified_exec: bool) -> Result<Vec<String>> {

     let mut builder = test_codex().with_config(move |config| {
         if use_unified_exec {
-            config.features.enable(Feature::UnifiedExec);
+            config
+                .features
+                .enable(Feature::UnifiedExec)
+                .expect("test config should allow feature update");
         } else {
-            config.features.disable(Feature::UnifiedExec);
+            config
+                .features
+                .disable(Feature::UnifiedExec)
+                .expect("test config should allow feature update");
         }
     });
     let test = builder.build(&server).await?;

@@ -29,7 +29,10 @@ use pretty_assertions::assert_eq;
 async fn undo_harness() -> Result<TestCodexHarness> {
     let builder = test_codex().with_model("gpt-5.1").with_config(|config| {
         config.include_apply_patch_tool = true;
-        config.features.enable(Feature::GhostCommit);
+        config
+            .features
+            .enable(Feature::GhostCommit)
+            .expect("test config should allow feature update");
     });
     TestCodexHarness::with_builder(builder).await
 }

@@ -164,7 +164,10 @@ async fn unified_exec_intercepts_apply_patch_exec_command() -> Result<()> {
     let builder = test_codex().with_config(|config| {
         config.include_apply_patch_tool = true;
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let harness = TestCodexHarness::with_builder(builder).await?;

@@ -294,7 +297,10 @@ async fn unified_exec_emits_exec_command_begin_event() -> Result<()> {

     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -370,7 +376,10 @@ async fn unified_exec_resolves_relative_workdir() -> Result<()> {

     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -452,7 +461,10 @@ async fn unified_exec_respects_workdir_override() -> Result<()> {

     let mut builder = test_codex().with_model("gpt-5").with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -535,7 +547,10 @@ async fn unified_exec_emits_exec_command_end_event() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -626,7 +641,10 @@ async fn unified_exec_emits_output_delta_for_exec_command() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -702,7 +720,10 @@ async fn unified_exec_full_lifecycle_with_background_end_event() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -813,7 +834,10 @@ async fn unified_exec_emits_terminal_interaction_for_write_stdin() -> Result<()>

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -917,7 +941,10 @@ async fn unified_exec_terminal_interaction_captures_delayed_output() -> Result<(

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1112,7 +1139,10 @@ async fn unified_exec_emits_one_begin_and_one_end_event() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1232,7 +1262,10 @@ async fn exec_command_reports_chunk_and_exit_metadata() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1351,7 +1384,10 @@ async fn unified_exec_defaults_to_pipe() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1441,7 +1477,10 @@ async fn unified_exec_can_enable_tty() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1525,7 +1564,10 @@ async fn unified_exec_respects_early_exit_notifications() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1621,7 +1663,10 @@ async fn write_stdin_returns_exit_metadata_and_clears_session() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1788,7 +1833,10 @@ async fn unified_exec_emits_end_event_when_session_dies_via_stdin() -> Result<()

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1898,7 +1946,10 @@ async fn unified_exec_keeps_long_running_session_after_turn_end() -> Result<()>

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -1994,7 +2045,10 @@ async fn unified_exec_interrupt_terminates_long_running_session() -> Result<()>

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2070,7 +2124,10 @@ async fn unified_exec_reuses_session_via_stdin() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2187,7 +2244,10 @@ async fn unified_exec_streams_after_lagged_output() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2322,7 +2382,10 @@ async fn unified_exec_timeout_and_followup_poll() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2432,7 +2495,10 @@ async fn unified_exec_formats_large_output_summary() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2521,7 +2587,10 @@ async fn unified_exec_runs_under_sandbox() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2605,7 +2674,10 @@ async fn unified_exec_python_prompt_under_seatbelt() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2724,7 +2796,10 @@ async fn unified_exec_runs_on_all_platforms() -> Result<()> {
     let server = start_mock_server().await;

     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -2801,7 +2876,10 @@ async fn unified_exec_prunes_exited_sessions_first() -> Result<()> {

     let mut builder = test_codex().with_config(|config| {
         config.use_experimental_unified_exec_tool = true;
-        config.features.enable(Feature::UnifiedExec);
+        config
+            .features
+            .enable(Feature::UnifiedExec)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -19,7 +19,10 @@ use toml::toml;
 async fn emits_warning_when_unstable_features_enabled_via_config() {
     let home = TempDir::new().expect("tempdir");
     let mut config = load_default_config_for_test(&home).await;
-    config.features.enable(Feature::ChildAgentsMd);
+    config
+        .features
+        .enable(Feature::ChildAgentsMd)
+        .expect("test config should allow feature update");
     let user_config_path =
         AbsolutePathBuf::from_absolute_path(config.codex_home.join(CONFIG_TOML_FILE))
             .expect("absolute user config path");

@@ -56,7 +59,10 @@ async fn emits_warning_when_unstable_features_enabled_via_config() {
 async fn suppresses_warning_when_configured() {
     let home = TempDir::new().expect("tempdir");
     let mut config = load_default_config_for_test(&home).await;
-    config.features.enable(Feature::ChildAgentsMd);
+    config
+        .features
+        .enable(Feature::ChildAgentsMd)
+        .expect("test config should allow feature update");
     config.suppress_unstable_features_warning = true;
     let user_config_path =
         AbsolutePathBuf::from_absolute_path(config.codex_home.join(CONFIG_TOML_FILE))

@@ -250,7 +250,10 @@ async fn user_shell_command_history_is_persisted_and_shared_with_model() -> anyh
     let server = responses::start_mock_server().await;
     // Disable it to ease command matching.
     let mut builder = core_test_support::test_codex::test_codex().with_config(move |config| {
-        config.features.disable(Feature::ShellSnapshot);
+        config
+            .features
+            .disable(Feature::ShellSnapshot)
+            .expect("test config should allow feature update");
     });
     let test = builder.build(&server).await?;

@@ -299,7 +299,10 @@ async fn view_image_tool_can_preserve_original_resolution_on_gpt5_3_codex() -> a
     let mut builder = test_codex()
         .with_model("gpt-5.3-codex")
         .with_config(|config| {
-            config.features.enable(Feature::ImageDetailOriginal);
+            config
+                .features
+                .enable(Feature::ImageDetailOriginal)
+                .expect("test config should allow feature update");
         });
     let TestCodex {
         codex,

@@ -398,7 +401,10 @@ async fn view_image_tool_keeps_legacy_behavior_below_gpt5_3_codex() -> anyhow::R

     let server = start_mock_server().await;
     let mut builder = test_codex().with_model("gpt-5.2").with_config(|config| {
-        config.features.enable(Feature::ImageDetailOriginal);
+        config
+            .features
+            .enable(Feature::ImageDetailOriginal)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -499,7 +505,10 @@ async fn js_repl_emit_image_attaches_local_image() -> anyhow::Result<()> {

     let server = start_mock_server().await;
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::JsRepl);
+        config
+            .features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -613,8 +622,12 @@ async fn js_repl_view_image_requires_explicit_emit() -> anyhow::Result<()> {
     skip_if_no_network!(Ok(()));

     let server = start_mock_server().await;
+    #[allow(clippy::expect_used)]
     let mut builder = test_codex().with_config(|config| {
-        config.features.enable(Feature::JsRepl);
+        config
+            .features
+            .enable(Feature::JsRepl)
+            .expect("test config should allow feature update");
     });
     let TestCodex {
         codex,

@@ -74,7 +74,10 @@ async fn web_search_mode_takes_precedence_over_legacy_flags() {
     let mut builder = test_codex()
         .with_model("gpt-5-codex")
         .with_config(|config| {
-            config.features.enable(Feature::WebSearchRequest);
+            config
+                .features
+                .enable(Feature::WebSearchRequest)
+                .expect("test config should allow feature update");
             config
                 .web_search_mode
                 .set(WebSearchMode::Cached)

@@ -119,8 +122,14 @@ async fn web_search_mode_defaults_to_cached_when_features_disabled() {
                 .web_search_mode
                 .set(WebSearchMode::Cached)
                 .expect("test web_search_mode should satisfy constraints");
-            config.features.disable(Feature::WebSearchCached);
-            config.features.disable(Feature::WebSearchRequest);
+            config
+                .features
+                .disable(Feature::WebSearchCached)
+                .expect("test config should allow feature update");
+            config
+                .features
+                .disable(Feature::WebSearchRequest)
+                .expect("test config should allow feature update");
         });
     let test = builder
         .build(&server)

@@ -170,8 +179,14 @@ async fn web_search_mode_updates_between_turns_with_sandbox_policy() {
                 .web_search_mode
                 .set(WebSearchMode::Cached)
                 .expect("test web_search_mode should satisfy constraints");
-            config.features.disable(Feature::WebSearchCached);
-            config.features.disable(Feature::WebSearchRequest);
+            config
+                .features
+                .disable(Feature::WebSearchCached)
+                .expect("test config should allow feature update");
+            config
+                .features
+                .disable(Feature::WebSearchRequest)
+                .expect("test config should allow feature update");
         });
     let test = builder
         .build(&server)

@@ -45,7 +45,10 @@ async fn websocket_fallback_switches_to_http_on_upgrade_required_connect() -> Re
         move |config| {
             config.model_provider.base_url = Some(base_url);
             config.model_provider.wire_api = codex_core::WireApi::Responses;
-            config.features.enable(Feature::ResponsesWebsockets);
+            config
+                .features
+                .enable(Feature::ResponsesWebsockets)
+                .expect("test config should allow feature update");
             // If we don't treat 426 specially, the sampling loop would retry the WebSocket
             // handshake before switching to the HTTP transport.
             config.model_provider.stream_max_retries = Some(2);

@@ -91,7 +94,10 @@ async fn websocket_fallback_switches_to_http_after_retries_exhausted() -> Result
         move |config| {
             config.model_provider.base_url = Some(base_url);
             config.model_provider.wire_api = codex_core::WireApi::Responses;
-            config.features.enable(Feature::ResponsesWebsockets);
+            config
+                .features
+                .enable(Feature::ResponsesWebsockets)
+                .expect("test config should allow feature update");
             config.model_provider.stream_max_retries = Some(2);
             config.model_provider.request_max_retries = Some(0);
         }

@@ -136,7 +142,10 @@ async fn websocket_fallback_hides_first_websocket_retry_stream_error() -> Result
         move |config| {
             config.model_provider.base_url = Some(base_url);
             config.model_provider.wire_api = codex_core::WireApi::Responses;
-            config.features.enable(Feature::ResponsesWebsockets);
+            config
+                .features
+                .enable(Feature::ResponsesWebsockets)
+                .expect("test config should allow feature update");
             config.model_provider.stream_max_retries = Some(2);
             config.model_provider.request_max_retries = Some(0);
         }

@@ -211,7 +220,10 @@ async fn websocket_fallback_is_sticky_across_turns() -> Result<()> {
         move |config| {
             config.model_provider.base_url = Some(base_url);
             config.model_provider.wire_api = codex_core::WireApi::Responses;
-            config.features.enable(Feature::ResponsesWebsockets);
+            config
+                .features
+                .enable(Feature::ResponsesWebsockets)
+                .expect("test config should allow feature update");
             config.model_provider.stream_max_retries = Some(2);
             config.model_provider.request_max_retries = Some(0);
         }

@@ -2835,25 +2835,31 @@ impl App {
             .with_profile(self.active_profile.as_deref());
         for (feature, enabled) in &updates {
             let feature_key = feature.key();
-            if *enabled {
-                // Update the in-memory configs.
-                self.config.features.enable(*feature);
-                self.chat_widget.set_feature_enabled(*feature, true);
+            if let Err(err) = self.config.features.set_enabled(*feature, *enabled) {
+                tracing::error!(
+                    error = %err,
+                    feature = feature_key,
+                    "failed to update constrained feature flags"
+                );
+                self.chat_widget.add_error_message(format!(
+                    "Failed to update experimental feature `{feature_key}`: {err}"
+                ));
+                continue;
+            }
+            let effective_enabled = self.config.features.enabled(*feature);
+            self.chat_widget
+                .set_feature_enabled(*feature, effective_enabled);
+            if effective_enabled {
                 builder = builder.set_feature_enabled(feature_key, true);
+            } else if feature.default_enabled() {
+                builder = builder.set_feature_enabled(feature_key, false);
             } else {
-                // Update the in-memory configs.
-                self.config.features.disable(*feature);
-                self.chat_widget.set_feature_enabled(*feature, false);
-                if feature.default_enabled() {
-                    builder = builder.set_feature_enabled(feature_key, false);
-                } else {
-                    // If the feature already default to `false`, we drop the key
-                    // in the config file so that the user does not miss the feature
-                    // once it gets globally released.
-                    builder = builder.with_edits(vec![ConfigEdit::ClearPath {
-                        segments: vec!["features".to_string(), feature_key.to_string()],
-                    }]);
-                }
+                // If the feature already default to `false`, we drop the key
+                // in the config file so that the user does not miss the feature
+                // once it gets globally released.
+                builder = builder.with_edits(vec![ConfigEdit::ClearPath {
+                    segments: vec!["features".to_string(), feature_key.to_string()],
+                }]);
             }
         }
         if windows_sandbox_changed {

@@ -6939,12 +6939,15 @@ impl ChatWidget {
     }

     #[cfg_attr(not(target_os = "windows"), allow(dead_code))]
-    pub(crate) fn set_feature_enabled(&mut self, feature: Feature, enabled: bool) {
-        if enabled {
-            self.config.features.enable(feature);
-        } else {
-            self.config.features.disable(feature);
+    pub(crate) fn set_feature_enabled(&mut self, feature: Feature, enabled: bool) -> bool {
+        if let Err(err) = self.config.features.set_enabled(feature, enabled) {
+            tracing::warn!(
+                error = %err,
+                feature = feature.key(),
+                "failed to update constrained chat widget feature state"
+            );
         }
+        let enabled = self.config.features.enabled(feature);
         if feature == Feature::VoiceTranscription {
             self.bottom_pane.set_voice_transcription_enabled(enabled);
         }

@@ -6985,6 +6988,7 @@ impl ChatWidget {
                 ),
             );
         }
+        enabled
     }

     pub(crate) fn set_full_access_warning_acknowledged(&mut self, acknowledged: bool) {

@ -6059,7 +6059,10 @@ fn render_bottom_popup(chat: &ChatWidget, width: u16) -> String {
|
|||
#[tokio::test]
|
||||
async fn apps_popup_refreshes_when_connectors_snapshot_updates() {
|
||||
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
|
||||
chat.config.features.enable(Feature::Apps);
|
||||
chat.config
|
||||
.features
|
||||
.enable(Feature::Apps)
|
||||
.expect("test config should allow feature update");
|
||||
chat.bottom_pane.set_connectors_enabled(true);
|
||||
let notion_id = "unit_test_apps_popup_refresh_connector_1";
|
||||
let linear_id = "unit_test_apps_popup_refresh_connector_2";
|
||||
|
|
@ -6149,7 +6152,10 @@ async fn apps_popup_refreshes_when_connectors_snapshot_updates() {
|
|||
#[tokio::test]
|
||||
async fn apps_refresh_failure_keeps_existing_full_snapshot() {
|
||||
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
|
||||
chat.config.features.enable(Feature::Apps);
|
||||
chat.config
|
||||
.features
|
||||
.enable(Feature::Apps)
|
||||
.expect("test config should allow feature update");
|
||||
chat.bottom_pane.set_connectors_enabled(true);
|
||||
let notion_id = "unit_test_apps_refresh_failure_connector_1";
|
||||
let linear_id = "unit_test_apps_refresh_failure_connector_2";
|
||||
|
|
@ -6228,7 +6234,10 @@ async fn apps_refresh_failure_keeps_existing_full_snapshot() {
|
|||
#[tokio::test]
|
||||
async fn apps_refresh_failure_with_cached_snapshot_triggers_pending_force_refetch() {
|
||||
let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
|
||||
chat.config.features.enable(Feature::Apps);
|
||||
chat.config
|
||||
.features
|
||||
.enable(Feature::Apps)
|
||||
.expect("test config should allow feature update");
|
||||
chat.bottom_pane.set_connectors_enabled(true);
|
||||
chat.connectors_prefetch_in_flight = true;
|
||||
```diff
         chat.connectors_force_refetch_pending = true;
@@ -6264,7 +6273,10 @@ async fn apps_refresh_failure_with_cached_snapshot_triggers_pending_force_refetc
 #[tokio::test]
 async fn apps_partial_refresh_uses_same_filtering_as_full_refresh() {
     let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
-    chat.config.features.enable(Feature::Apps);
+    chat.config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
     chat.bottom_pane.set_connectors_enabled(true);

     let full_connectors = vec![
@@ -6360,7 +6372,10 @@ async fn apps_partial_refresh_uses_same_filtering_as_full_refresh() {
 #[tokio::test]
 async fn apps_popup_shows_disabled_status_for_installed_but_disabled_apps() {
     let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
-    chat.config.features.enable(Feature::Apps);
+    chat.config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
     chat.bottom_pane.set_connectors_enabled(true);

     chat.on_connectors_loaded(
@@ -6398,7 +6413,10 @@ async fn apps_popup_shows_disabled_status_for_installed_but_disabled_apps() {
 #[tokio::test]
 async fn apps_initial_load_applies_enabled_state_from_config() {
     let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
-    chat.config.features.enable(Feature::Apps);
+    chat.config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
     chat.bottom_pane.set_connectors_enabled(true);

     let temp = tempdir().expect("tempdir");
@@ -6447,7 +6465,10 @@ async fn apps_initial_load_applies_enabled_state_from_config() {
 #[tokio::test]
 async fn apps_refresh_preserves_toggled_enabled_state() {
     let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
-    chat.config.features.enable(Feature::Apps);
+    chat.config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
     chat.bottom_pane.set_connectors_enabled(true);

     chat.on_connectors_loaded(
@@ -6512,7 +6533,10 @@ async fn apps_refresh_preserves_toggled_enabled_state() {
 #[tokio::test]
 async fn apps_popup_for_not_installed_app_uses_install_only_selected_description() {
     let (mut chat, _rx, _op_rx) = make_chatwidget_manual(None).await;
-    chat.config.features.enable(Feature::Apps);
+    chat.config
+        .features
+        .enable(Feature::Apps)
+        .expect("test config should allow feature update");
     chat.bottom_pane.set_connectors_enabled(true);

     chat.on_connectors_loaded(
```
```diff
@@ -527,6 +527,7 @@ mod tests {
             allowed_approval_policies: Some(vec![AskForApproval::OnRequest]),
             allowed_sandbox_modes: Some(vec![SandboxModeRequirement::ReadOnly]),
             allowed_web_search_modes: Some(vec![WebSearchModeRequirement::Cached]),
+            feature_requirements: None,
             mcp_servers: Some(BTreeMap::from([(
                 "docs".to_string(),
                 McpServerRequirement {
@@ -652,6 +653,7 @@ approval_policy = "never"
             allowed_approval_policies: None,
             allowed_sandbox_modes: None,
             allowed_web_search_modes: Some(Vec::new()),
+            feature_requirements: None,
             mcp_servers: None,
             rules: None,
             enforce_residency: None,
```
```diff
@@ -1501,7 +1501,10 @@ trust_level = "untrusted"
 async fn read_session_cwd_prefers_sqlite_when_thread_id_present() -> std::io::Result<()> {
     let temp_dir = TempDir::new()?;
     let mut config = build_config(&temp_dir).await?;
-    config.features.enable(Feature::Sqlite);
+    config
+        .features
+        .enable(Feature::Sqlite)
+        .expect("test config should allow sqlite");

     let thread_id = ThreadId::new();
     let rollout_cwd = temp_dir.path().join("rollout-cwd");
```
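The pattern behind every hunk above is the same: `Features::enable` is no longer infallible, so test call sites must handle a `Result`. A minimal standalone sketch of that shape — with hypothetical names (`Features`, `pinned`, `ConstraintViolation` are illustrative, not the actual `codex-config` API) — assuming enterprise requirements pin a feature to a fixed value and any mutation that contradicts the pin is rejected:

```rust
use std::collections::BTreeMap;

// Hypothetical feature enum standing in for the real canonical feature keys.
#[derive(Debug, Clone, Copy, PartialEq, Eq, PartialOrd, Ord)]
enum Feature {
    Apps,
    Sqlite,
}

// Error returned when a mutation contradicts an enterprise-pinned value.
#[derive(Debug, PartialEq)]
struct ConstraintViolation(Feature);

#[derive(Default)]
struct Features {
    // Effective feature values for this session.
    enabled: BTreeMap<Feature, bool>,
    // Values pinned by requirements (e.g. a `[features]` table in
    // requirements.toml); omitted keys remain unconstrained.
    pinned: BTreeMap<Feature, bool>,
}

impl Features {
    fn enable(&mut self, f: Feature) -> Result<(), ConstraintViolation> {
        self.set(f, true)
    }

    fn disable(&mut self, f: Feature) -> Result<(), ConstraintViolation> {
        self.set(f, false)
    }

    fn set(&mut self, f: Feature, value: bool) -> Result<(), ConstraintViolation> {
        match self.pinned.get(&f) {
            // The requirement pins the opposite value: refuse the mutation.
            Some(&pinned) if pinned != value => Err(ConstraintViolation(f)),
            // Unconstrained, or the mutation agrees with the pin: apply it.
            _ => {
                self.enabled.insert(f, value);
                Ok(())
            }
        }
    }
}

fn main() {
    let mut features = Features::default();
    features.pinned.insert(Feature::Apps, false);

    // Unpinned feature: mutation succeeds, as in the tests' `.expect(...)` calls.
    assert!(features.enable(Feature::Sqlite).is_ok());
    // Pinned-off feature: the runtime rejects the mutation instead of
    // silently diverging from the enterprise requirement.
    assert!(features.enable(Feature::Apps).is_err());
}
```

Under this shape, tests that were previously fire-and-forget (`chat.config.features.enable(Feature::Apps);`) must now assert that their config genuinely allows the update, which is exactly what the `.expect("test config should allow feature update")` chains in the diff do.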