commit 0eaf3d5a17
Author: Snider
Date:   2026-02-17 17:25:42 +00:00

    feat(mlx): add LoRA adapter layers and AdamW optimizer

    LoRA: low-rank adaptation with trainable A/B matrices, Kaiming normal
    init, safetensors save/load. AdamW: decoupled weight decay optimizer
    with positional moment tracking for gradient-replaced params.

    14 tests passing, including an end-to-end LoRA+AdamW training loop.

    Co-Authored-By: Virgil <virgil@lethean.io>
agentic           test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
ai                feat: extract AI/ML packages from core/go                         2026-02-16 15:25:55 +00:00
mcp               test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
ml                refactor(mlx): drop mlx build tag, auto-enable on darwin/arm64    2026-02-17 16:57:41 +00:00
mlx               feat(mlx): add LoRA adapter layers and AdamW optimizer            2026-02-17 17:25:42 +00:00
rag               test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
go.mod            test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
go.sum            test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
test-mlx.go       test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00
TEST-RESULTS.md   test: validate MLX inference and scoring pipeline on M3 Ultra     2026-02-16 17:24:36 +00:00