- Add --max-context flag to serve (sliding window, default 4 messages)
  to prevent unbounded KV-cache growth on multi-turn conversations
  (sketched below)
- Pass the server's max-tokens to the chat UI via an HTML attribute
  instead of the hardcoded 2048 in JavaScript
- Add chat.js and chat_embed.go for embedded LEM chat UI
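
A minimal sketch of the sliding-window truncation, assuming the serve
handler keeps the conversation as a slice of chat messages and
`maxContext` carries the flag value; the type and function names here
are illustrative, not the actual implementation:

```go
package serve

// Message is one chat turn ("system", "user", or "assistant").
type Message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// keepRecent returns at most maxContext of the newest messages,
// preserving a leading system message so the model keeps its
// instructions while the KV cache stays bounded.
func keepRecent(msgs []Message, maxContext int) []Message {
	if maxContext <= 0 || len(msgs) <= maxContext {
		return msgs
	}
	out := make([]Message, 0, maxContext+1)
	if msgs[0].Role == "system" {
		out = append(out, msgs[0])
		msgs = msgs[1:]
	}
	if len(msgs) > maxContext {
		msgs = msgs[len(msgs)-maxContext:]
	}
	return append(out, msgs...)
}
```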
Co-Authored-By: Virgil <virgil@lethean.io>
Runs the same prompts through baseline and fine-tuned models, scores
both with the heuristic scorer, and outputs a comparison report with
LEK score deltas and improvement/regression counts.
Uses built-in content probes by default, or a custom prompts file.
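
As a rough illustration of how the report's deltas and counts could be
tallied (the `Result` shape and field names are assumptions, not the
tool's actual types):

```go
package bench

// Result pairs one prompt's heuristic LEK scores from both models.
type Result struct {
	Prompt    string
	Baseline  float64
	FineTuned float64
}

// Summarize computes per-prompt score deltas plus the
// improvement/regression counts shown in the comparison report.
func Summarize(results []Result) (improved, regressed int, meanDelta float64) {
	for _, r := range results {
		d := r.FineTuned - r.Baseline
		meanDelta += d
		switch {
		case d > 0:
			improved++
		case d < 0:
			regressed++
		}
	}
	if n := len(results); n > 0 {
		meanDelta /= float64(n)
	}
	return
}
```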
Co-Authored-By: Virgil <virgil@lethean.io>
The lesson command runs prompts from YAML definitions with state
tracking, sandwich signing, and an interactive review mode. The
sequence command runs multiple lessons in order (vertical/strict or
horizontal/flexible). State files enable resuming after an
interruption. Both output chat JSONL compatible with 'core ml train'.
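
A sketch of how a state file could support resuming, assuming completed
prompt IDs are persisted as JSON between runs (field names and file
layout are assumptions):

```go
package lesson

import (
	"encoding/json"
	"os"
)

// State records which prompts already produced output, so a rerun
// can skip them and continue where the interrupted run stopped.
type State struct {
	Lesson    string   `json:"lesson"`
	Completed []string `json:"completed"` // prompt IDs already written to JSONL
}

// Load returns an empty state when no file exists yet.
func Load(path string) (State, error) {
	var s State
	b, err := os.ReadFile(path)
	if os.IsNotExist(err) {
		return s, nil
	}
	if err != nil {
		return s, err
	}
	return s, json.Unmarshal(b, &s)
}

// Save writes the state after each completed prompt.
func Save(path string, s State) error {
	b, err := json.MarshalIndent(s, "", "  ")
	if err != nil {
		return err
	}
	return os.WriteFile(path, b, 0o644)
}
```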
Co-Authored-By: Virgil <virgil@lethean.io>
The sandwich format wraps each seed prompt with a KB preamble (axioms
framework) and the LEK-1 kernel postfix, then generates responses via
local MLX inference. Output is chat JSONL compatible with
'core ml train'. A --dry-run mode emits the assembled prompts without
running inference.
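
A minimal sketch of the wrapping step, assuming the KB preamble and
LEK-1 kernel postfix are available as plain strings (how they are
loaded, and the separator used, are assumptions):

```go
package sandwich

import "strings"

// Wrap builds one sandwich prompt: KB preamble (axioms framework),
// then the seed prompt, then the LEK-1 kernel postfix. With --dry-run
// this assembled text would be emitted instead of being sent to the
// MLX backend.
func Wrap(preamble, seed, postfix string) string {
	return strings.Join([]string{
		strings.TrimSpace(preamble),
		strings.TrimSpace(seed),
		strings.TrimSpace(postfix),
	}, "\n\n")
}
```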
Co-Authored-By: Virgil <virgil@lethean.io>
Native MLX LoRA training on Apple Silicon — no Python required.
Reads chat-format JSONL, applies LoRA to target projections,
trains with AdamW + masked cross-entropy loss on assistant tokens.
Usage: core ml train --model-path /path/to/model --data training.jsonl
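
For context, a sketch of the chat-format record and the masking idea:
only tokens from assistant turns contribute to the cross-entropy loss.
The struct shape mirrors common chat JSONL; the per-token role mapping
is an assumed preprocessing step, not the trainer's actual code:

```go
package train

// ChatRecord is one training example in chat-format JSONL.
type ChatRecord struct {
	Messages []struct {
		Role    string `json:"role"`
		Content string `json:"content"`
	} `json:"messages"`
}

// LossMask marks which token positions count toward the loss:
// roles[i] is the role of the message that produced token i, and
// only assistant tokens are left unmasked.
func LossMask(roles []string) []bool {
	mask := make([]bool, len(roles))
	for i, r := range roles {
		mask[i] = r == "assistant"
	}
	return mask
}
```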
Co-Authored-By: Virgil <virgil@lethean.io>
## Summary
- Extract PHP/Laravel commands to `core/php` repo (42 files, standalone module)
- Extract CI/release + SDK commands to `core/ci` repo (10 files)
- Remove `internal/variants/` build tag system entirely
- Move all 30 remaining command packages from `internal/cmd/` to top-level `cmd/`
- Rewrite `main.go` with direct imports — no more variant selection
- PHP and CI are now optional via commented-out import lines in `main.go` (see the sketch below)
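
A compilable toy sketch of the pattern the rewrite moves to: commands
wired by direct, explicit registration, with optional modules enabled by
uncommenting their lines. Names and the dispatch mechanism are
hypothetical, not the repository's real API:

```go
package main

import (
	"fmt"
	"os"
)

type command struct {
	name string
	run  func(args []string) error
}

var commands = map[string]command{
	"serve": {name: "serve", run: func(args []string) error { fmt.Println("serve", args); return nil }},
	"ml":    {name: "ml", run: func(args []string) error { fmt.Println("ml", args); return nil }},
	// "php": phpCommand, // optional: uncomment alongside the core/php import
	// "ci":  ciCommand,  // optional: uncomment alongside the core/ci import
}

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: core <command> [args]")
		os.Exit(2)
	}
	c, ok := commands[os.Args[1]]
	if !ok {
		fmt.Fprintf(os.Stderr, "unknown command %q\n", os.Args[1])
		os.Exit(2)
	}
	if err := c.run(os.Args[2:]); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
```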
Co-authored-by: Claude <developers@lethean.io>
Reviewed-on: #2
Co-authored-by: Charon <charon@lthn.ai>
Co-committed-by: Charon <charon@lthn.ai>