feat(templates): add 5 audit templates for self-healing codebase

New scan templates:
- dependency-audit: find code rolling its own vs using framework
- dead-code: unreachable functions, unused exports, orphaned files
- test-gaps: untested functions, missing error path coverage
- api-consistency: endpoint naming, response shapes, error formats
- doc-sync: documentation vs code drift

Co-Authored-By: Virgil <virgil@lethean.io>
Snider 2026-03-15 18:33:52 +00:00
parent e6dbce3a78
commit 670e4c9a10
5 changed files with 197 additions and 0 deletions


@@ -0,0 +1,39 @@
name: API Consistency Audit
description: Check REST endpoint naming, response shapes, and error formats
category: audit
guidelines:
- All endpoints should follow the core/api conventions
- Response shapes should be consistent across providers
- Error responses must include structured error objects
- UK English in all user-facing strings
phases:
- name: Endpoint Naming
description: Check route naming conventions
tasks:
- "Check all registered routes follow /api/v1/{resource} pattern"
- "Check HTTP methods match CRUD semantics (GET=read, POST=create, PATCH=update, DELETE=remove)"
- "Check for inconsistent pluralisation (e.g. /provider vs /providers)"
- "Check for path parameter naming consistency"
- name: Response Shapes
description: Check response format consistency
tasks:
- "Check all success responses return consistent wrapper structure"
- "Check pagination uses consistent format (page, per_page, total)"
- "Check list endpoints return arrays, not objects"
- "Check single-item endpoints return the item directly"
- name: Error Handling
description: Check error response consistency
tasks:
- "Check all error responses include a structured error object"
- "Check HTTP status codes are correct (400 for validation, 404 for missing, 500 for internal)"
- "Check error messages use UK English"
- "Check no stack traces leak in production error responses"
- name: Report
description: Document findings
tasks:
- "List each inconsistency with endpoint, expected format, and actual format"


@@ -0,0 +1,37 @@
name: Dead Code Scan
description: Find unreachable functions, unused exports, and orphaned files
category: audit
guidelines:
- Only flag code that is genuinely unused, not just potentially unused
- Check go.work and cross-repo imports before flagging exports as unused
- Orphaned files include leftover migrations, unused configs, stale test fixtures
phases:
- name: Unused Exports
description: Find exported functions/types with no callers
tasks:
- "List all exported functions and types"
- "Search for callers within the package and known consumers"
- "Flag exports with zero callers (check CONSUMERS.md for cross-repo usage)"
- name: Dead Functions
description: Find unexported functions with no internal callers
tasks:
- "List unexported (lowercase) functions"
- "Check for callers within the same package"
- "Flag functions with zero callers"
- name: Orphaned Files
description: Find files that serve no purpose
tasks:
- "Check for .go files not imported by any other file in the package"
- "Check for test fixtures not referenced by any test"
- "Check for config files not loaded by any code"
- "Check for documentation files referencing deleted code"
- name: Report
description: Document findings
tasks:
- "List each dead code item with file:line and last git commit that touched it"
- "Classify as safe-to-remove or needs-investigation"


@@ -0,0 +1,41 @@
name: Dependency Audit
description: Find code that rolls its own instead of using framework packages
category: audit
variables:
focus:
description: Specific area to focus on (e.g. filesystem, logging, process management)
required: false
guidelines:
- Check imports for stdlib usage where a core package exists
- The framework packages are the canonical implementations
- Flag but don't fix — report only
phases:
- name: Framework Package Check
description: Identify stdlib usage that should use core packages
tasks:
- "Check for raw os.ReadFile/os.WriteFile/os.MkdirAll — should use go-io Medium"
- "Check for raw log.Printf/log.Println — should use go-log"
- "Check for raw exec.Command — should use go-process"
- "Check for raw http.Client without timeouts — should use shared client patterns"
- "Check for raw json.Marshal/Unmarshal of config — should use core/config"
- "Check for raw filepath.Walk — should use go-io Medium"
- name: Duplicate Implementation Check
description: Find re-implementations of existing framework functionality
tasks:
- "Search for custom error types — should extend go-log error patterns"
- "Search for custom retry/backoff logic — should use shared patterns"
- "Search for custom rate limiting — should use go-ratelimit"
- "Search for custom caching — should use go-cache"
- "Search for custom store/persistence — should use go-store"
- "Search for custom WebSocket handling — should use go-ws Hub"
- name: Report
description: Document findings with file:line references
tasks:
- "List each violation with file:line, what it does, and which core package should replace it"
- "Rank by impact — packages with many consumers are higher priority"
- "Note any cases where the framework package genuinely doesn't cover the use case"


@@ -0,0 +1,40 @@
name: Documentation Sync
description: Check that documentation matches the current code
category: audit
guidelines:
- CLAUDE.md is the primary developer reference — it must be accurate
- README.md should match the current API surface
- Code comments should match actual behaviour
- Examples in docs should compile and run
phases:
- name: CLAUDE.md Audit
description: Verify CLAUDE.md matches the codebase
tasks:
- "Check listed commands still work"
- "Check architecture description matches current package structure"
- "Check coding standards section matches actual conventions used"
- "Check dependency list is current"
- "Flag outdated sections"
- name: API Documentation
description: Check inline docs match behaviour
tasks:
- "Check all exported functions have doc comments"
- "Check doc comments describe current behaviour (not historical)"
- "Check parameter descriptions are accurate"
- "Check return value descriptions match actual returns"
- name: Example Validation
description: Verify examples are correct
tasks:
- "Check code examples in docs use current API signatures"
- "Check import paths are correct"
- "Flag examples that reference removed or renamed functions"
- name: Report
description: Document findings
tasks:
- "List each outdated doc with file, section, and what needs updating"
- "Classify as stale (wrong info) vs missing (no docs for new code)"


@@ -0,0 +1,40 @@
name: Test Coverage Gaps
description: Find functions without test coverage and missing edge case tests
category: audit
guidelines:
- Focus on exported functions first (public API)
- Error paths are more important than happy paths (they're usually untested)
- Tests should follow _Good/_Bad/_Ugly naming convention
- Use testify assert/require, table-driven preferred
phases:
- name: Coverage Analysis
description: Identify untested exported functions
tasks:
- "List all exported functions and methods"
- "Check each has at least one test calling it"
- "Flag functions with zero test coverage"
- "Note which are critical paths (called by many consumers)"
- name: Error Path Analysis
description: Find untested error conditions
tasks:
- "For each function that returns error, check if error paths are tested"
- "Check for nil/empty input handling tests"
- "Check for boundary condition tests (zero, max, negative)"
- "Flag error paths with no test coverage"
- name: Missing Edge Cases
description: Find obvious gaps in existing tests
tasks:
- "Check concurrent access tests for types with mutexes"
- "Check for tests with hardcoded paths or environment assumptions"
- "Check for tests that only test the happy path"
- name: Report
description: Document findings with priority
tasks:
- "List untested functions with file:line, consumer count, and priority"
- "List untested error paths with the error condition"
- "Suggest specific test cases to add"