core-agent-ide/codex-rs/tui
a191945ed6
fix: token usage display and context calculation (#2117)
- In a recent conversation the one-liner showed 11M tokens used, but 10M of
those were cached. Digging in, this looks like a regression. ->
- Use blended total tokens for chat composer usage display
- Compute remaining context using tokens_in_context_window helper
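The two bullets above can be sketched as a small Rust example. The struct and method names here are illustrative stand-ins, not the actual codex-rs types; only `tokens_in_context_window` is named in the commit message. The idea: count cached input tokens once when reporting a "blended" total (so a mostly-cached conversation doesn't show an inflated 11M), and derive remaining context from the tokens actually occupying the window.

```rust
/// Illustrative token accounting, loosely modeled on the commit message.
/// Field and method names are assumptions, not the real codex-rs API.
struct TokenUsage {
    input_tokens: u64,
    cached_input_tokens: u64,
    output_tokens: u64,
}

impl TokenUsage {
    /// Blended total for the usage display: cached input tokens are not
    /// double-counted, so heavy cache reuse doesn't inflate the number.
    fn blended_total(&self) -> u64 {
        self.input_tokens - self.cached_input_tokens + self.output_tokens
    }

    /// Tokens currently occupying the context window (cached input is
    /// still physically in the window, so it is not subtracted here).
    fn tokens_in_context_window(&self) -> u64 {
        self.input_tokens + self.output_tokens
    }

    /// Remaining context, computed from the window-occupancy helper.
    fn remaining_context(&self, window_size: u64) -> u64 {
        window_size.saturating_sub(self.tokens_in_context_window())
    }
}

fn main() {
    // Roughly the scenario from the commit: ~11M raw input tokens,
    // ~10M of them served from cache.
    let usage = TokenUsage {
        input_tokens: 11_000_000,
        cached_input_tokens: 10_000_000,
        output_tokens: 50_000,
    };
    println!("blended total: {}", usage.blended_total());
    println!("remaining: {}", usage.remaining_context(200_000_000));
}
```

With these numbers the blended total is 1,050,000 rather than 11M+, which matches the motivation in the commit description.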

------
https://chatgpt.com/codex/tasks/task_i_68981a16c0a4832cbf416017390930e5
2025-08-11 07:19:15 -07:00
src fix: token usage display and context calculation (#2117) 2025-08-11 07:19:15 -07:00
tests Revert "Streaming markdown (#1920)" (#1981) 2025-08-08 01:38:39 +00:00
Cargo.toml Change the UI of apply patch (#1907) 2025-08-07 05:25:41 +00:00
prompt_for_init_command.md chore: rename INIT.md to prompt_for_init_command.md and move closer to usage (#1886) 2025-08-06 11:58:57 -07:00