Go client for the LEM distributed inference API (BugSETI/Agentic).
Workers register via Forgejo PAT auth, pull prompt batches, run local
inference (MLX/vLLM/llama.cpp), and submit results. Credits are tracked
as a Phase 1 stub for the Phase 2 blockchain LEM token.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
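A minimal sketch of the worker flow described above, assuming hypothetical
endpoint paths, payload shapes, and a "token <PAT>" Authorization scheme;
the inference step is stubbed and nothing here is the client's actual API:

```go
// Hypothetical worker flow: register, pull a batch, infer locally, submit.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

type batch struct {
	ID      string   `json:"id"`
	Prompts []string `json:"prompts"`
}

// call sends an authenticated JSON request; the header scheme is assumed.
func call(method, url, pat string, body []byte) (*http.Response, error) {
	req, err := http.NewRequest(method, url, bytes.NewReader(body))
	if err != nil {
		return nil, err
	}
	req.Header.Set("Authorization", "token "+pat)
	req.Header.Set("Content-Type", "application/json")
	return http.DefaultClient.Do(req)
}

func main() {
	base := "https://example.org/api/v1" // assumed coordinator URL
	pat := os.Getenv("FORGEJO_PAT")

	// 1. Register this worker.
	reg, _ := json.Marshal(map[string]string{"hostname": "worker-1", "backend": "mlx"})
	if _, err := call("POST", base+"/workers/register", pat, reg); err != nil {
		panic(err)
	}

	// 2. Pull the next prompt batch.
	resp, err := call("GET", base+"/batches/next", pat, nil)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var b batch
	if err := json.NewDecoder(resp.Body).Decode(&b); err != nil {
		panic(err)
	}

	// 3. Run local inference (MLX/vLLM/llama.cpp); stubbed here.
	results := make([]string, len(b.Prompts))
	for i, p := range b.Prompts {
		results[i] = "completion for: " + p
	}

	// 4. Submit results; the server credits the worker (Phase 1 stub).
	out, _ := json.Marshal(map[string]any{"batch_id": b.ID, "results": results})
	if _, err := call("POST", base+"/batches/"+b.ID+"/results", pat, out); err != nil {
		panic(err)
	}
	fmt.Println("submitted", len(results), "results for batch", b.ID)
}
```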
Go scoring daemon that polls M3 for unscored LoRA checkpoints, converts
MLX→PEFT, runs 23 binary capability probes via an OpenAI-compatible API,
and pushes results to InfluxDB. Zero Python deps.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
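A rough sketch of the poll/probe cycle only; checkpoint discovery, the
MLX→PEFT conversion, the OpenAI-compatible probe call, and the InfluxDB
write are stubbed placeholders, and all names are illustrative:

```go
// Hypothetical shape of the scoring daemon's main loop.
package main

import (
	"fmt"
	"time"
)

type checkpoint struct{ Name string }

// The next four functions stand in for the real steps: querying the M3
// host, converting the adapter, scoring one pass/fail probe, and writing
// the aggregate score to InfluxDB.
func listUnscored() []checkpoint              { return []checkpoint{{Name: "0001200"}} }
func convertToPEFT(c checkpoint) string       { return "/tmp/peft/" + c.Name }
func runProbe(adapterDir string, n int) bool  { return n%2 == 0 }
func writeScore(name string, passed, total int) {
	fmt.Printf("influx write: checkpoint=%s score=%d/%d\n", name, passed, total)
}

func main() {
	const numProbes = 23
	for {
		for _, c := range listUnscored() {
			dir := convertToPEFT(c)
			passed := 0
			for i := 0; i < numProbes; i++ {
				if runProbe(dir, i) {
					passed++
				}
			}
			writeScore(c.Name, passed, numProbes)
		}
		time.Sleep(5 * time.Minute) // assumed poll interval
	}
}
```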
Ingests benchmark data (content scores, capability scores, training
curves) from JSONL files and mlx_lm logs into InfluxDB. Writes are
batched, and iteration numbers are extracted from checkpoint labels.
Also adds github.com/hupe1980/go-huggingface for future HF sync.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
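A small sketch of iteration extraction plus batched line-protocol writes;
the checkpoint label format, measurement name, and batch size below are
assumptions for illustration, not the tool's actual schema:

```go
// Hypothetical iteration extraction and batched InfluxDB writes.
package main

import (
	"fmt"
	"regexp"
	"strconv"
	"strings"
)

var iterRe = regexp.MustCompile(`\d+`) // first run of digits in the label

// iterationFromLabel pulls the training iteration out of a checkpoint label.
func iterationFromLabel(label string) (int, bool) {
	m := iterRe.FindString(label)
	if m == "" {
		return 0, false
	}
	n, err := strconv.Atoi(m)
	return n, err == nil
}

// flush stands in for the real batched InfluxDB write call.
func flush(lines []string) {
	if len(lines) == 0 {
		return
	}
	fmt.Println(strings.Join(lines, "\n"))
}

func main() {
	labels := []string{"0000600_adapters.safetensors", "0001200_adapters.safetensors"}
	var batch []string
	for _, l := range labels {
		it, ok := iterationFromLabel(l)
		if !ok {
			continue
		}
		// One line-protocol point per checkpoint, flushed in batches of 500.
		batch = append(batch, fmt.Sprintf("training_curve,checkpoint=%s iteration=%di", l, it))
		if len(batch) >= 500 {
			flush(batch)
			batch = batch[:0]
		}
	}
	flush(batch)
}
```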
Ports conversational_training.py to Go with InfluxDB reporting.
Includes 24 built-in seed conversations (Vi identity, philosophy,
mindfulness), supports extra JSONL files, and converts the golden set
to chat format. Also fixes the InfluxDB client to accept 204 No Content
on writes.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
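The 204 fix likely amounts to a status check along these lines, since
InfluxDB's v2 write endpoint returns 204 No Content when points are
accepted; the package and function names here are illustrative:

```go
package influx

import (
	"fmt"
	"io"
	"net/http"
)

// checkWriteStatus treats both 200 OK and 204 No Content as success.
// Accepting only 200 would fail every write, because /api/v2/write
// responds with 204 on success.
func checkWriteStatus(resp *http.Response) error {
	if resp.StatusCode == http.StatusOK || resp.StatusCode == http.StatusNoContent {
		return nil
	}
	body, _ := io.ReadAll(resp.Body)
	return fmt.Errorf("influx write failed: %s: %s", resp.Status, body)
}
```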
All scoring/influx/export/expand logic moves to pkg/lem as an
importable package, and main.go is now a thin CLI dispatcher. New
commands can import the shared library directly, paving the way for
converting Python scripts to Go subcommands.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
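The dispatcher shape this implies might look like the following; the
subcommand names mirror the list above, but the pkg/lem entry points are
assumed and stubbed, not the package's actual exported API:

```go
// Hypothetical thin CLI dispatcher: main.go only routes subcommands.
package main

import (
	"fmt"
	"os"
)

func main() {
	if len(os.Args) < 2 {
		fmt.Fprintln(os.Stderr, "usage: lem <score|influx|export|expand> [args]")
		os.Exit(2)
	}
	var err error
	switch cmd := os.Args[1]; cmd {
	case "score", "influx", "export", "expand":
		err = run(cmd, os.Args[2:]) // would dispatch into pkg/lem
	default:
		err = fmt.Errorf("unknown command %q", cmd)
	}
	if err != nil {
		fmt.Fprintln(os.Stderr, "error:", err)
		os.Exit(1)
	}
}

// run stands in for calls into the shared pkg/lem library; the real
// dispatcher would invoke the corresponding exported function.
func run(cmd string, args []string) error {
	fmt.Println(cmd, args)
	return nil
}
```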