Snider e9973aef3c feat(mlx): add autograd — VJP, JVP, ValueAndGrad, loss functions
Native Go bindings for MLX-C gradient computation on Apple Silicon.
Foundation for LoRA training without Python.

- VJP (reverse-mode autodiff) for backward pass
- JVP (forward-mode autodiff) for directional derivatives
- ValueAndGrad for combined loss + gradient computation
- Checkpoint for memory-efficient gradient recomputation
- CrossEntropyLoss (numerically stable via LogSumExp)
- MSELoss, Log, SumAll, MeanAll, OnesLike helpers
- TakeAlongAxis and LogSumExp ops
- Fix closure callback null vector bug (affects compile.go too)
- Fix Float() returning 0 for float32 arrays

14 tests passing on Metal GPU.
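The commit lists VJP (reverse mode) and JVP (forward mode) but not the Go API itself, so here is a self-contained sketch of what the two transforms compute, using a toy function with a hand-written Jacobian. All names (`jvp`, `vjp`, the function `f`) are illustrative only, not the package's actual bindings:

```go
package main

import "fmt"

// Toy function f(x, y) = (x*y, x+y) with Jacobian
//   J = | y  x |
//       | 1  1 |

// jvp pushes a tangent (tx, ty) forward through f: returns J·t (forward mode).
func jvp(x, y, tx, ty float64) (float64, float64) {
	return y*tx + x*ty, tx + ty
}

// vjp pulls a cotangent (v0, v1) back through f: returns vᵀ·J (reverse mode).
func vjp(x, y, v0, v1 float64) (float64, float64) {
	return y*v0 + v1, x*v0 + v1
}

func main() {
	x, y := 3.0, 4.0
	tx, ty := 1.0, 2.0 // tangent vector
	v0, v1 := 5.0, 6.0 // cotangent vector

	j0, j1 := jvp(x, y, tx, ty)
	g0, g1 := vjp(x, y, v0, v1)

	// Duality check: v·(J·t) must equal (vᵀ·J)·t.
	fmt.Println(v0*j0+v1*j1 == g0*tx+g1*ty) // true
}
```

For a scalar loss, the VJP with cotangent 1 is exactly the gradient, which is why reverse mode backs the `ValueAndGrad` combination mentioned above.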

Co-Authored-By: Virgil <virgil@lethean.io>
2026-02-17 17:18:47 +00:00
File             Last commit                                                        Date
agentic          test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
ai               feat: extract AI/ML packages from core/go                          2026-02-16 15:25:55 +00:00
mcp              test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
ml               refactor(mlx): drop mlx build tag, auto-enable on darwin/arm64     2026-02-17 16:57:41 +00:00
mlx              feat(mlx): add autograd — VJP, JVP, ValueAndGrad, loss functions   2026-02-17 17:18:47 +00:00
rag              test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
go.mod           test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
go.sum           test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
test-mlx.go      test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00
TEST-RESULTS.md  test: validate MLX inference and scoring pipeline on M3 Ultra      2026-02-16 17:24:36 +00:00