Interactive conversation with local MLX models.

Supports:
- Streaming token output
- Conversation capture to JSONL for 'core ml train'
- Optional sandwich signing (--kb + --kernel)
- Commands: /quit, /save, /clear, /system, /undo, /help

Co-Authored-By: Virgil <virgil@lethean.io>
7 lines
83 B
Go
//go:build darwin && arm64

package ml

// init registers the interactive chat subcommand under the ml command group.
func init() {
	mlCmd.AddCommand(chatCmd)
}
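The registration above only wires `chatCmd` into `mlCmd`; the slash commands listed in the description are handled inside the chat REPL itself. As a minimal, hypothetical sketch of how that dispatch and the JSONL capture might look (the `turn` struct, the `handleSlash` helper, and the exact JSONL schema consumed by 'core ml train' are all assumptions, not the actual implementation):

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
	"strings"
)

// turn is a hypothetical record shape for one conversation turn; the real
// JSONL schema expected by 'core ml train' may differ.
type turn struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

// handleSlash dispatches the REPL commands listed in the description
// (/quit, /save, /clear, /system, /undo, /help).
// It returns false when the session should end.
func handleSlash(line string, history *[]turn) bool {
	cmd, arg, _ := strings.Cut(strings.TrimSpace(line), " ")
	switch cmd {
	case "/quit":
		return false
	case "/clear":
		*history = (*history)[:0]
	case "/undo":
		// Drop the most recent turn.
		if n := len(*history); n > 0 {
			*history = (*history)[:n-1]
		}
	case "/system":
		// Prepending the system prompt is an assumption about the real behavior.
		*history = append([]turn{{Role: "system", Content: arg}}, *history...)
	case "/save":
		// Write each turn as one JSON object per line (JSONL) to the given path.
		f, err := os.OpenFile(arg, os.O_CREATE|os.O_WRONLY|os.O_APPEND, 0o644)
		if err != nil {
			fmt.Println("save failed:", err)
			break
		}
		enc := json.NewEncoder(f)
		for _, t := range *history {
			enc.Encode(t) // Encoder appends a trailing newline, yielding JSONL.
		}
		f.Close()
	case "/help":
		fmt.Println("commands: /quit /save /clear /system /undo /help")
	default:
		fmt.Println("unknown command:", cmd)
	}
	return true
}
```

`json.Encoder` is a convenient fit for JSONL capture because each `Encode` call emits exactly one newline-terminated JSON object.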