HTTP backend returns Result{Text: text} with nil Metrics, since remote APIs don't expose Metal-level inference metrics.

Co-Authored-By: Virgil <virgil@lethean.io>
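A minimal sketch of the described behavior: the HTTP backend fills only the Text field, leaving Metrics nil because the remote API has no device-level data to report. The type and function names (Metrics fields, httpGenerate) are assumptions for illustration, not the repository's actual definitions.

```go
package main

import "fmt"

// Metrics is a hypothetical stand-in for Metal-level inference metrics
// that local backends (MLX, llama.cpp) could populate.
type Metrics struct {
	TokensPerSecond float64
	PeakMemoryBytes uint64
}

// Result mirrors the commit's Result{Text: text} literal: Metrics is a
// pointer so it can be nil when a backend cannot report metrics.
type Result struct {
	Text    string
	Metrics *Metrics
}

// httpGenerate sketches the HTTP backend path: the remote API returns
// only text, so the zero value leaves Metrics as nil.
func httpGenerate(text string) Result {
	return Result{Text: text}
}

func main() {
	r := httpGenerate("hello")
	fmt.Println(r.Text, r.Metrics == nil) // prints "hello true"
}
```

Callers that record metrics would then branch on `r.Metrics != nil` rather than assuming every backend provides them.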
Adds inference backends (MLX, llama.cpp, HTTP), a scoring engine, an agent orchestrator, GGUF management, DuckDB storage, and Parquet I/O, plus CLAUDE.md/TODO.md/FINDINGS.md.

Co-Authored-By: Virgil <virgil@lethean.io>