core-agent-ide/codex-rs/exec/src
rugvedS07 837bc98a1d
LM Studio OSS Support (#2312)
## Overview

Adds LM Studio OSS support. Closes #1883


### Changes
This PR enhances the behavior of the `--oss` flag to support LM Studio as a
provider. It also introduces a new flag, `--local-provider`, which accepts
`lmstudio` or `ollama` as values when the user wants to choose a provider
explicitly.

If no provider is specified, `codex --oss` auto-selects the provider
based on whichever is running.
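A minimal sketch of how that auto-selection might work, assuming detection by probing the providers' default local ports (11434 for Ollama, 1234 for LM Studio); the names and detection strategy here are illustrative, not the actual codex-rs internals:

```rust
use std::net::{SocketAddr, TcpStream};
use std::time::Duration;

#[derive(Debug, PartialEq)]
enum OssProvider {
    Ollama,
    LmStudio,
}

/// Pick whichever provider appears to be running; `None` when neither is.
/// (Hypothetical tie-break: prefer Ollama when both respond.)
fn auto_select(ollama_running: bool, lmstudio_running: bool) -> Option<OssProvider> {
    match (ollama_running, lmstudio_running) {
        (true, _) => Some(OssProvider::Ollama),
        (false, true) => Some(OssProvider::LmStudio),
        (false, false) => None,
    }
}

/// Crude liveness check: can we open a TCP connection to localhost:port?
fn is_port_open(port: u16) -> bool {
    let addr: SocketAddr = ([127, 0, 0, 1], port).into();
    TcpStream::connect_timeout(&addr, Duration::from_millis(200)).is_ok()
}

fn main() {
    // Default ports: Ollama serves on 11434, LM Studio on 1234.
    let provider = auto_select(is_port_open(11434), is_port_open(1234));
    println!("auto-selected: {provider:?}");
}
```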

#### Additional enhancements 
The default can be set using `oss_provider` in config like:

```toml
oss_provider = "lmstudio"
```

Non-interactive users will need to either pass the provider as an
argument or set it in their `config.toml`.
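The resolution order this implies (explicit flag, then config, then whatever detection yields) can be sketched as follows; `resolve_provider` and its parameters are hypothetical names, not the actual codex-rs API:

```rust
/// Hypothetical provider resolution: CLI flag wins, then the
/// `oss_provider` config value, then any runtime detection result.
/// Returns `None` when nothing resolves, which is why non-interactive
/// runs must supply the provider via flag or config.
fn resolve_provider(
    cli_flag: Option<&str>,     // e.g. --local-provider lmstudio
    config_value: Option<&str>, // e.g. oss_provider = "lmstudio"
    detected: Option<&str>,     // whichever local server responded, if any
) -> Option<String> {
    cli_flag.or(config_value).or(detected).map(str::to_string)
}

fn main() {
    // Without a flag, config entry, or detected server, resolution fails.
    assert_eq!(resolve_provider(None, None, None), None);
    // A config value is enough for non-interactive use.
    assert_eq!(
        resolve_provider(None, Some("lmstudio"), None),
        Some("lmstudio".to_string())
    );
    println!("ok");
}
```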

### Notes
For best performance, [set the default context
length](https://lmstudio.ai/docs/app/advanced/per-model) for gpt-oss to
the maximum your machine can support.

---------

Co-authored-by: Matt Clayton <matt@lmstudio.ai>
Co-authored-by: Eric Traut <etraut@openai.com>
2025-11-17 11:49:09 -08:00
cli.rs LM Studio OSS Support (#2312) 2025-11-17 11:49:09 -08:00
event_processor.rs Minor cleanup of codex exec output (#4585) 2025-10-02 14:17:42 -07:00
event_processor_with_human_output.rs core/tui: non-blocking MCP startup (#6334) 2025-11-17 11:26:11 -08:00
event_processor_with_jsonl_output.rs Add warning on compact (#6052) 2025-10-31 13:27:33 -07:00
exec_events.rs [exec] Add MCP tool arguments and results (#5899) 2025-10-29 14:23:57 -07:00
lib.rs LM Studio OSS Support (#2312) 2025-11-17 11:49:09 -08:00
main.rs fix: move arg0 handling out of codex-linux-sandbox and into its own crate (#1697) 2025-07-28 08:31:24 -07:00