Native MLX LoRA training on Apple Silicon — no Python required.

Reads chat-format JSONL, applies LoRA to target projections, trains with AdamW + masked cross-entropy loss on assistant tokens.

Usage: core ml train --model-path /path/to/model --data training.jsonl

Co-Authored-By: Virgil <virgil@lethean.io>
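The change itself does not show the data layout or the loss masking, so here is a minimal, standalone Go sketch of both ideas under stated assumptions: chat-format JSONL records carry a messages array with role/content fields (field names assumed, not taken from this commit), and a 0/1 mask keeps only assistant tokens in the cross-entropy average. The MLX model, LoRA layers, and AdamW step are omitted.

// Sketch only: chat-format JSONL loading and an assistant-token loss mask.
// Field names (messages, role, content) and the masking convention are
// assumptions; the real training code may differ.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"os"
)

// Message is one turn of a conversation.
type Message struct {
	Role    string `json:"role"` // "system", "user", or "assistant"
	Content string `json:"content"`
}

// Record is one JSONL line: a full conversation.
type Record struct {
	Messages []Message `json:"messages"`
}

// readJSONL decodes one Record per non-empty line of the file.
func readJSONL(path string) ([]Record, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer f.Close()

	var records []Record
	sc := bufio.NewScanner(f)
	sc.Buffer(make([]byte, 0, 1<<20), 1<<20) // allow long conversations
	for sc.Scan() {
		line := sc.Bytes()
		if len(line) == 0 {
			continue
		}
		var r Record
		if err := json.Unmarshal(line, &r); err != nil {
			return nil, err
		}
		records = append(records, r)
	}
	return records, sc.Err()
}

// lossMask marks which tokens contribute to the loss: 1 for tokens produced
// inside an assistant turn, 0 for system/user/template tokens.
func lossMask(tokenRoles []string) []float32 {
	mask := make([]float32, len(tokenRoles))
	for i, role := range tokenRoles {
		if role == "assistant" {
			mask[i] = 1
		}
	}
	return mask
}

// maskedMean averages per-token cross-entropy over masked-in tokens only.
func maskedMean(perTokenLoss, mask []float32) float32 {
	var num, den float32
	for i := range perTokenLoss {
		num += perTokenLoss[i] * mask[i]
		den += mask[i]
	}
	if den == 0 {
		return 0
	}
	return num / den
}

func main() {
	records, err := readJSONL("training.jsonl")
	if err != nil {
		fmt.Fprintln(os.Stderr, "read:", err)
		os.Exit(1)
	}
	fmt.Printf("loaded %d conversations\n", len(records))
}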
7 lines · 84 B · Go
//go:build darwin && arm64

package ml

// Register the LoRA training subcommand under the ml command group.
func init() {
	mlCmd.AddCommand(trainCmd)
}
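The init above only wires trainCmd into mlCmd; the AddCommand call suggests the cobra CLI library. Below is a hypothetical, self-contained sketch of how the core ml train command and the two flags from the usage line could be declared; apart from mlCmd, trainCmd, and the flag names, every identifier is illustrative and not taken from this repository.

// Hypothetical sketch, assuming cobra; not the repository's actual code.
package main

import (
	"fmt"
	"os"

	"github.com/spf13/cobra"
)

var rootCmd = &cobra.Command{Use: "core"}

var mlCmd = &cobra.Command{
	Use:   "ml",
	Short: "Machine-learning commands",
}

var trainCmd = &cobra.Command{
	Use:   "train",
	Short: "LoRA fine-tune a model with MLX on Apple Silicon",
	RunE: func(cmd *cobra.Command, args []string) error {
		modelPath, _ := cmd.Flags().GetString("model-path")
		dataPath, _ := cmd.Flags().GetString("data")
		// A real implementation would load the model, attach LoRA adapters
		// to the target projections, and run the AdamW loop; this sketch
		// only echoes the parsed flags.
		fmt.Printf("training %s on %s\n", modelPath, dataPath)
		return nil
	},
}

func init() {
	trainCmd.Flags().String("model-path", "", "path to the base model")
	trainCmd.Flags().String("data", "", "chat-format JSONL training data")
	mlCmd.AddCommand(trainCmd)
	rootCmd.AddCommand(mlCmd)
}

func main() {
	if err := rootCmd.Execute(); err != nil {
		os.Exit(1)
	}
}

Invoked as core ml train --model-path /path/to/model --data training.jsonl, this sketch mirrors the usage line in the commit message.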