## Overview

Adds LM Studio OSS support. Closes #1883.

### Changes

This PR enhances the `--oss` flag to support LM Studio as a provider. It also introduces a new `--local-provider` flag, which accepts `lmstudio` or `ollama` when the user wants to choose a provider explicitly. If no provider is specified, `codex --oss` auto-selects the provider based on whichever is running (a sketch of this probe appears at the end of this description).

#### Additional enhancements

The default can be set with `oss_provider` in the config:

```toml
oss_provider = "lmstudio"
```

Non-interactive users must either pass the provider as an argument or set it in their `config.toml`.

### Notes

For best performance, [set the default context length](https://lmstudio.ai/docs/app/advanced/per-model) for gpt-oss to the maximum your machine can support.

---------

Co-authored-by: Matt Clayton <matt@lmstudio.ai>
Co-authored-by: Eric Traut <etraut@openai.com>
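A minimal sketch of that auto-selection, not the PR's actual implementation: probe each provider's documented default local port and pick whichever answers. The helper name `detect_local_provider` is hypothetical.

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

/// Hypothetical helper: returns the first local OSS provider that is running.
fn detect_local_provider() -> Option<&'static str> {
    // True if something is listening at `addr` within a short timeout.
    let is_up = |addr: &str| -> bool {
        addr.parse::<SocketAddr>()
            .ok()
            .and_then(|a| TcpStream::connect_timeout(&a, Duration::from_millis(300)).ok())
            .is_some()
    };
    if is_up("127.0.0.1:1234") {
        Some("lmstudio") // LM Studio's local server default port
    } else if is_up("127.0.0.1:11434") {
        Some("ollama") // Ollama's default port
    } else {
        None // neither provider is running; caller should report an error
    }
}
```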
# codex-common
This crate is designed for utilities that need to be shared across other crates in the workspace but should not live in `core`.
For narrow utility features, the pattern is to introduce a new feature under `[features]` in `Cargo.toml` and then gate it with `#[cfg]` in `lib.rs`, as appropriate (see the sketch below).
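A minimal sketch of that pattern, with a hypothetical `elapsed` feature (the feature and module names are illustrative, not necessarily what the crate defines):

```rust
// In Cargo.toml, declare the opt-in feature:
//
//   [features]
//   elapsed = []
//
// In lib.rs, compile the module only when a dependent crate enables it:
#[cfg(feature = "elapsed")]
pub mod elapsed;
```

A dependent crate then opts in from its own `Cargo.toml`, e.g. `codex-common = { path = "../common", features = ["elapsed"] }` (path illustrative).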