forked from Snider/Poindexter

Compare commits: main...audit/code (1 commit, SHA1 92ed819bb7)

12 changed files with 221 additions and 790 deletions

AUDIT-COMPLEXITY.md (new file, +221 lines)
# Code Complexity & Maintainability Audit

This document analyzes the code quality of the Poindexter library, identifies maintainability issues, and provides recommendations for improvement. The audit focuses on cyclomatic and cognitive complexity, code duplication, and other maintainability metrics.

## High-Impact Findings

This section summarizes the most critical issues that should be prioritized for refactoring.

## Detailed Findings

This section provides a detailed breakdown of all findings, categorized by file.

### `kdtree.go`

Overall, `kdtree.go` is well-structured and maintainable. The complexity is low, and the code is easy to understand. The following are minor points for consideration.

| Finding | Severity | Description |
| --- | --- | --- |
| Mild Feature Envy | Low | The `KDTree` methods directly call analytics recording functions (`t.analytics.RecordQuery`, `t.peerAnalytics.RecordSelection`). This creates tight coupling between the core tree logic and the analytics subsystem. |
| Minor Code Duplication | Low | The query methods (`Nearest`, `KNearest`, `Radius`) share boilerplate for dimension checks, analytics timing, and handling the `gonum` backend. |

#### Recommendations

**1. Decouple Analytics from Core Logic**

To address the feature envy, the analytics recording could be decoupled from the core tree operations. This would improve separation of concerns and make the `KDTree` struct more focused on its primary responsibility.

* **Refactoring Approach**: Introduce an interface for a "query observer" or use an event-based system. The `KDTree` would accept an observer during construction and notify it of events such as "query started," "query completed," and "point selected."
* **Design Pattern**: Observer pattern or a simple event emitter.

**Example (Conceptual)**:
```go
// In kdtree.go
type QueryObserver[T any] interface {
	OnQueryStart(t *KDTree[T])
	OnQueryEnd(t *KDTree[T], duration time.Duration)
	OnPointSelected(p KDPoint[T], distance float64)
}

// KDTree would gain a field:
// observer QueryObserver[T]

// In the Nearest method:
// if t.observer != nil { t.observer.OnQueryStart(t) }
// ...
// if t.observer != nil { t.observer.OnPointSelected(p, bestDist) }
// ...
// if t.observer != nil { t.observer.OnQueryEnd(t, time.Since(start)) }
```
**2. Reduce Boilerplate in Query Methods**

The minor code duplication in the query methods could be reduced by extracting the common setup and teardown logic into a helper function. However, given that there are only three methods and the duplication is minimal, this is a very low-priority change.

* **Refactoring Approach**: Create a private helper function that takes a query function as an argument and handles the boilerplate.

**Example (Conceptual)**:

```go
// In kdtree.go
func (t *KDTree[T]) executeQuery(query []float64, fn func() (any, any)) (any, any) {
	if len(query) != t.dim || t.Len() == 0 {
		return nil, nil
	}
	start := time.Now()
	defer func() {
		if t.analytics != nil {
			t.analytics.RecordQuery(time.Since(start).Nanoseconds())
		}
	}()
	return fn()
}

// KNearest would then be simplified:
// func (t *KDTree[T]) KNearest(query []float64, k int) ([]KDPoint[T], []float64) {
//	res, dists := t.executeQuery(query, func() (any, any) {
//		// ... core logic of KNearest ...
//		return neighbors, dists
//	})
//	return res.([]KDPoint[T]), dists.([]float64)
// }
```

This approach trades readability for less duplication and forces type assertions at every call site, so it should be weighed carefully.
### `kdtree_helpers.go`

This file contains significant code duplication and could be simplified by consolidating redundant functions.

| Finding | Severity | Description |
| --- | --- | --- |
| High Code Duplication | High | The functions `Build2D`, `Build3D`, `Build4D` and their `ComputeNormStats` counterparts are nearly identical. They should be removed in favor of the existing generic `BuildND` and `ComputeNormStatsND` functions. |
| Long Parameter Lists | Medium | Functions like `BuildND` accept a large number of parameters (`items`, `id`, `features`, `weights`, `invert`). This can be improved by introducing a configuration struct. |

#### Recommendations

**1. Consolidate Builder Functions (High Priority)**

The most impactful change would be to remove the duplicated builder functions. The generic `BuildND` function already provides the same functionality in a more flexible way.

* **Refactoring Approach**:
  1. Mark `Build2D`, `Build3D`, `Build4D`, `ComputeNormStats2D`, `ComputeNormStats3D`, and `ComputeNormStats4D` as deprecated.
  2. Update all internal call sites and examples to use `BuildND` and `ComputeNormStatsND`.
  3. In a future major version, remove the deprecated functions.

* **Code Example of Improvement**:

Instead of calling the specialized function:
```go
// Before
pts, err := Build3D(
	items,
	func(it MyType) string { return it.ID },
	func(it MyType) float64 { return it.Feature1 },
	func(it MyType) float64 { return it.Feature2 },
	func(it MyType) float64 { return it.Feature3 },
	[3]float64{1.0, 2.0, 0.5},
	[3]bool{false, true, false},
)
```

The code would be refactored to use the generic version:

```go
// After
features := []func(MyType) float64{
	func(it MyType) float64 { return it.Feature1 },
	func(it MyType) float64 { return it.Feature2 },
	func(it MyType) float64 { return it.Feature3 },
}
weights := []float64{1.0, 2.0, 0.5}
invert := []bool{false, true, false}

pts, err := BuildND(
	items,
	func(it MyType) string { return it.ID },
	features,
	weights,
	invert,
)
```
**2. Introduce a Parameter Object for Configuration**

To make the function signatures cleaner and more extensible, a configuration struct could be used for the builder functions.

* **Design Pattern**: Introduce Parameter Object.

* **Code Example of Improvement**:

```go
// Define a configuration struct
type BuildConfig[T any] struct {
	IDFunc   func(T) string
	Features []func(T) float64
	Weights  []float64
	Invert   []bool
	Stats    *NormStats // Optional: for building with pre-computed stats
}

// Refactor BuildND to accept the config
func BuildND[T any](items []T, config BuildConfig[T]) ([]KDPoint[T], error) {
	// ... logic using config fields ...
}

// Example usage
config := BuildConfig[MyType]{
	IDFunc:   func(it MyType) string { return it.ID },
	Features: features,
	Weights:  weights,
	Invert:   invert,
}
pts, err := BuildND(items, config)
```

This makes the call site cleaner, and adding new options in the future would not require changing the function signature.
### `kdtree_analytics.go`

This file has the most significant maintainability issues in the codebase. It appears to be a "God Class" that has accumulated multiple, loosely related responsibilities over time.

| Finding | Severity | Description |
| --- | --- | --- |
| God Class / Low Cohesion | High | The file contains logic for tree performance analytics, peer selection analytics, statistical distribution calculations, NAT routing metrics, and peer trust/reputation. These are all distinct concerns. |
| Speculative Generality | Medium | The file includes many structs and functions related to NAT routing and peer trust (`NATRoutingMetrics`, `TrustMetrics`, `PeerQualityScore`) that may not be used by all consumers of the library. This adds complexity for users who only need the core k-d tree functionality. |
| Magic Numbers | Low | The `PeerQualityScore` function contains several magic numbers for weighting and normalization (e.g., `metrics.AvgRTTMs/1000.0`, `metrics.BandwidthMbps/100.0`). These should be extracted into named constants. |

#### Recommendations

**1. Decompose the God Class (High Priority)**

The most important refactoring is to break `kdtree_analytics.go` into smaller, more focused files. This will improve cohesion and make the code easier to navigate and maintain.

* **Refactoring Approach**:
  1. **`tree_analytics.go`**: Keep `TreeAnalytics` and `TreeAnalyticsSnapshot`.
  2. **`peer_analytics.go`**: Move `PeerAnalytics` and `PeerStats`.
  3. **`stats.go`**: Move `DistributionStats`, `ComputeDistributionStats`, `percentile`, `AxisDistribution`, and `ComputeAxisDistributions`.
  4. **`p2p_metrics.go`**: Create a new file for all the peer-to-peer and networking-specific logic, including `NATRoutingMetrics`, `QualityWeights`, `PeerQualityScore`, `TrustMetrics`, `ComputeTrustScore`, `StandardPeerFeatures`, etc. This makes it clear that this functionality serves a specific domain (P2P networking) and is not part of the core k-d tree library.
**2. Extract Magic Numbers as Constants**

The magic numbers in `PeerQualityScore` should be replaced with named constants to improve readability and make them easier to modify.

* **Code Example of Improvement**:

```go
// Before
latencyScore := 1.0 - math.Min(metrics.AvgRTTMs/1000.0, 1.0)
bandwidthScore := math.Min(metrics.BandwidthMbps/100.0, 1.0)

// After
const (
	maxAcceptableRTTMs     = 1000.0
	excellentBandwidthMbps = 100.0
)
latencyScore := 1.0 - math.Min(metrics.AvgRTTMs/maxAcceptableRTTMs, 1.0)
bandwidthScore := math.Min(metrics.BandwidthMbps/excellentBandwidthMbps, 1.0)
```

**3. Isolate Domain-Specific Logic**

By moving the P2P-specific logic into its own file (`p2p_metrics.go`), it becomes clearer that this is an optional, domain-specific extension to the core library. This reduces the cognitive load for developers who only need the generic k-d tree functionality. Build tags could even be used to make this functionality optional at compile time.
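If compile-time optionality is pursued, a build constraint at the top of the new file would gate it. The tag name below is illustrative, not an existing tag in this repository:

```go
//go:build p2p

// p2p_metrics.go is compiled only when built with a tag,
// e.g. `go build -tags p2p`.
package poindexter
```

Consumers who never set the tag would pay no compile or binary-size cost for the P2P metrics code.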
(deleted file, -46 lines)

# Poindexter Math Expansion

**Date:** 2026-02-16
**Status:** Approved

## Context

Poindexter serves as the math pillar (alongside Borg=data, Enchantrix=encryption) in the Lethean ecosystem. It currently provides KD-Tree spatial queries, 5 distance metrics, sorting utilities, and normalization helpers.

Analysis of math operations scattered across core/go, core/go-ai, and core/mining revealed common patterns that Poindexter should centralize: descriptive statistics, scaling/interpolation, approximate equality, weighted scoring, and signal generation.

## New Modules

### stats.go — Descriptive statistics
Sum, Mean, Variance, StdDev, MinMax, IsUnderrepresented.
Consumers: ml/coverage.go, lab/handler/chart.go

### scale.go — Normalization and interpolation
Lerp, InverseLerp, Remap, RoundToN, Clamp, MinMaxScale.
Consumers: lab/handler/chart.go, i18n/numbers.go

### epsilon.go — Approximate equality
ApproxEqual, ApproxZero.
Consumers: ml/exact.go

### score.go — Weighted composite scoring
Factor type, WeightedScore, Ratio, Delta, DeltaPercent.
Consumers: ml/heuristic.go, ml/compare.go

### signal.go — Time-series primitives
RampUp, SineWave, Oscillate, Noise (seeded RNG).
Consumers: mining/simulated_miner.go
## Constraints

- Zero external dependencies (WASM-compilable)
- Pure Go, stdlib only (math, math/rand)
- Same package (`poindexter`), flat structure
- Table-driven tests for every function
- No changes to existing files

## Not In Scope

- MLX tensor ops (hardware-accelerated, stays in go-ai)
- DNS tools migration to go-netops (separate PR)
- gonum backend integration (future work)
epsilon.go (deleted file, -14 lines)

```go
package poindexter

import "math"

// ApproxEqual returns true if the absolute difference between a and b
// is less than epsilon.
func ApproxEqual(a, b, epsilon float64) bool {
	return math.Abs(a-b) < epsilon
}

// ApproxZero returns true if the absolute value of v is less than epsilon.
func ApproxZero(v, epsilon float64) bool {
	return math.Abs(v) < epsilon
}
```
(deleted test file for epsilon.go, -50 lines)

```go
package poindexter

import "testing"

func TestApproxEqual(t *testing.T) {
	tests := []struct {
		name    string
		a, b    float64
		epsilon float64
		want    bool
	}{
		{"equal", 1.0, 1.0, 0.01, true},
		{"close", 1.0, 1.005, 0.01, true},
		{"not_close", 1.0, 1.02, 0.01, false},
		{"negative", -1.0, -1.005, 0.01, true},
		{"zero", 0, 0.0001, 0.001, true},
		{"at_boundary", 1.0, 1.01, 0.01, false},
		{"large_epsilon", 100, 200, 150, true},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := ApproxEqual(tt.a, tt.b, tt.epsilon)
			if got != tt.want {
				t.Errorf("ApproxEqual(%v, %v, %v) = %v, want %v", tt.a, tt.b, tt.epsilon, got, tt.want)
			}
		})
	}
}

func TestApproxZero(t *testing.T) {
	tests := []struct {
		name    string
		v       float64
		epsilon float64
		want    bool
	}{
		{"zero", 0, 0.01, true},
		{"small_pos", 0.005, 0.01, true},
		{"small_neg", -0.005, 0.01, true},
		{"not_zero", 0.02, 0.01, false},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := ApproxZero(tt.v, tt.epsilon)
			if got != tt.want {
				t.Errorf("ApproxZero(%v, %v) = %v, want %v", tt.v, tt.epsilon, got, tt.want)
			}
		})
	}
}
```
scale.go (deleted file, -61 lines)

```go
package poindexter

import "math"

// Lerp performs linear interpolation between a and b.
// t=0 returns a, t=1 returns b, t=0.5 returns the midpoint.
func Lerp(t, a, b float64) float64 {
	return a + t*(b-a)
}

// InverseLerp returns where v falls between a and b as a fraction [0,1].
// Returns 0 if a == b.
func InverseLerp(v, a, b float64) float64 {
	if a == b {
		return 0
	}
	return (v - a) / (b - a)
}

// Remap maps v from the range [inMin, inMax] to [outMin, outMax].
// Equivalent to Lerp(InverseLerp(v, inMin, inMax), outMin, outMax).
func Remap(v, inMin, inMax, outMin, outMax float64) float64 {
	return Lerp(InverseLerp(v, inMin, inMax), outMin, outMax)
}

// RoundToN rounds f to n decimal places.
func RoundToN(f float64, decimals int) float64 {
	mul := math.Pow(10, float64(decimals))
	return math.Round(f*mul) / mul
}

// Clamp restricts v to the range [min, max].
func Clamp(v, min, max float64) float64 {
	if v < min {
		return min
	}
	if v > max {
		return max
	}
	return v
}

// ClampInt restricts v to the range [min, max].
func ClampInt(v, min, max int) int {
	if v < min {
		return min
	}
	if v > max {
		return max
	}
	return v
}

// MinMaxScale normalizes v into [0,1] given its range [min, max].
// Returns 0 if min == max.
func MinMaxScale(v, min, max float64) float64 {
	if min == max {
		return 0
	}
	return (v - min) / (max - min)
}
```
scale_test.go (deleted file, -148 lines)

```go
package poindexter

import (
	"math"
	"testing"
)

func TestLerp(t *testing.T) {
	tests := []struct {
		name     string
		t_, a, b float64
		want     float64
	}{
		{"start", 0, 10, 20, 10},
		{"end", 1, 10, 20, 20},
		{"mid", 0.5, 10, 20, 15},
		{"quarter", 0.25, 0, 100, 25},
		{"extrapolate", 2, 0, 10, 20},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Lerp(tt.t_, tt.a, tt.b)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Lerp(%v, %v, %v) = %v, want %v", tt.t_, tt.a, tt.b, got, tt.want)
			}
		})
	}
}

func TestInverseLerp(t *testing.T) {
	tests := []struct {
		name    string
		v, a, b float64
		want    float64
	}{
		{"start", 10, 10, 20, 0},
		{"end", 20, 10, 20, 1},
		{"mid", 15, 10, 20, 0.5},
		{"equal_range", 5, 5, 5, 0},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := InverseLerp(tt.v, tt.a, tt.b)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("InverseLerp(%v, %v, %v) = %v, want %v", tt.v, tt.a, tt.b, got, tt.want)
			}
		})
	}
}

func TestRemap(t *testing.T) {
	tests := []struct {
		name                            string
		v, inMin, inMax, outMin, outMax float64
		want                            float64
	}{
		{"identity", 5, 0, 10, 0, 10, 5},
		{"scale_up", 5, 0, 10, 0, 100, 50},
		{"reverse", 3, 0, 10, 10, 0, 7},
		{"offset", 0, 0, 1, 100, 200, 100},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Remap(tt.v, tt.inMin, tt.inMax, tt.outMin, tt.outMax)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Remap = %v, want %v", got, tt.want)
			}
		})
	}
}

func TestRoundToN(t *testing.T) {
	tests := []struct {
		name     string
		f        float64
		decimals int
		want     float64
	}{
		{"zero_dec", 3.456, 0, 3},
		{"one_dec", 3.456, 1, 3.5},
		{"two_dec", 3.456, 2, 3.46},
		{"three_dec", 3.4564, 3, 3.456},
		{"negative", -2.555, 2, -2.56},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := RoundToN(tt.f, tt.decimals)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("RoundToN(%v, %v) = %v, want %v", tt.f, tt.decimals, got, tt.want)
			}
		})
	}
}

func TestClamp(t *testing.T) {
	tests := []struct {
		name        string
		v, min, max float64
		want        float64
	}{
		{"within", 5, 0, 10, 5},
		{"below", -5, 0, 10, 0},
		{"above", 15, 0, 10, 10},
		{"at_min", 0, 0, 10, 0},
		{"at_max", 10, 0, 10, 10},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Clamp(tt.v, tt.min, tt.max)
			if got != tt.want {
				t.Errorf("Clamp(%v, %v, %v) = %v, want %v", tt.v, tt.min, tt.max, got, tt.want)
			}
		})
	}
}

func TestClampInt(t *testing.T) {
	if got := ClampInt(5, 0, 10); got != 5 {
		t.Errorf("ClampInt(5, 0, 10) = %v, want 5", got)
	}
	if got := ClampInt(-1, 0, 10); got != 0 {
		t.Errorf("ClampInt(-1, 0, 10) = %v, want 0", got)
	}
	if got := ClampInt(15, 0, 10); got != 10 {
		t.Errorf("ClampInt(15, 0, 10) = %v, want 10", got)
	}
}

func TestMinMaxScale(t *testing.T) {
	tests := []struct {
		name        string
		v, min, max float64
		want        float64
	}{
		{"mid", 5, 0, 10, 0.5},
		{"at_min", 0, 0, 10, 0},
		{"at_max", 10, 0, 10, 1},
		{"equal_range", 5, 5, 5, 0},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := MinMaxScale(tt.v, tt.min, tt.max)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("MinMaxScale(%v, %v, %v) = %v, want %v", tt.v, tt.min, tt.max, got, tt.want)
			}
		})
	}
}
```
score.go (deleted file, -40 lines)

```go
package poindexter

// Factor is a value–weight pair for composite scoring.
type Factor struct {
	Value  float64
	Weight float64
}

// WeightedScore computes the weighted sum of factors.
// Each factor contributes Value * Weight to the total.
// Returns 0 for empty slices.
func WeightedScore(factors []Factor) float64 {
	var total float64
	for _, f := range factors {
		total += f.Value * f.Weight
	}
	return total
}

// Ratio returns part/whole safely. Returns 0 if whole is 0.
func Ratio(part, whole float64) float64 {
	if whole == 0 {
		return 0
	}
	return part / whole
}

// Delta returns the difference new_ - old.
func Delta(old, new_ float64) float64 {
	return new_ - old
}

// DeltaPercent returns the percentage change from old to new_.
// Returns 0 if old is 0.
func DeltaPercent(old, new_ float64) float64 {
	if old == 0 {
		return 0
	}
	return (new_ - old) / old * 100
}
```
(deleted test file for score.go, -86 lines)

```go
package poindexter

import (
	"math"
	"testing"
)

func TestWeightedScore(t *testing.T) {
	tests := []struct {
		name    string
		factors []Factor
		want    float64
	}{
		{"empty", nil, 0},
		{"single", []Factor{{Value: 5, Weight: 2}}, 10},
		{"multiple", []Factor{
			{Value: 3, Weight: 2},  // 6
			{Value: 1, Weight: -5}, // -5
		}, 1},
		{"lek_heuristic", []Factor{
			{Value: 2, Weight: 2},   // engagement × 2
			{Value: 1, Weight: 3},   // creative × 3
			{Value: 1, Weight: 1.5}, // first person × 1.5
			{Value: 3, Weight: -5},  // compliance × -5
		}, -6.5},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := WeightedScore(tt.factors)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("WeightedScore = %v, want %v", got, tt.want)
			}
		})
	}
}

func TestRatio(t *testing.T) {
	tests := []struct {
		name        string
		part, whole float64
		want        float64
	}{
		{"half", 5, 10, 0.5},
		{"full", 10, 10, 1},
		{"zero_whole", 5, 0, 0},
		{"zero_part", 0, 10, 0},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Ratio(tt.part, tt.whole)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Ratio(%v, %v) = %v, want %v", tt.part, tt.whole, got, tt.want)
			}
		})
	}
}

func TestDelta(t *testing.T) {
	if got := Delta(10, 15); got != 5 {
		t.Errorf("Delta(10, 15) = %v, want 5", got)
	}
	if got := Delta(15, 10); got != -5 {
		t.Errorf("Delta(15, 10) = %v, want -5", got)
	}
}

func TestDeltaPercent(t *testing.T) {
	tests := []struct {
		name       string
		old, new_  float64
		want       float64
	}{
		{"increase", 100, 150, 50},
		{"decrease", 100, 75, -25},
		{"zero_old", 0, 10, 0},
		{"no_change", 50, 50, 0},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := DeltaPercent(tt.old, tt.new_)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("DeltaPercent(%v, %v) = %v, want %v", tt.old, tt.new_, got, tt.want)
			}
		})
	}
}
```
signal.go (deleted file, -57 lines)

```go
package poindexter

import (
	"math"
	"math/rand"
)

// RampUp returns a linear ramp from 0 to 1 over the given duration.
// The result is clamped to [0, 1].
func RampUp(elapsed, duration float64) float64 {
	if duration <= 0 {
		return 1
	}
	return Clamp(elapsed/duration, 0, 1)
}

// SineWave returns a sine value with the given period and amplitude.
// Output range is [-amplitude, amplitude].
func SineWave(t, period, amplitude float64) float64 {
	if period == 0 {
		return 0
	}
	return math.Sin(t/period*2*math.Pi) * amplitude
}

// Oscillate modulates a base value with a sine wave.
// Returns base * (1 + sin(t/period*2π) * amplitude).
func Oscillate(base, t, period, amplitude float64) float64 {
	if period == 0 {
		return base
	}
	return base * (1 + math.Sin(t/period*2*math.Pi)*amplitude)
}

// Noise generates seeded pseudo-random values.
type Noise struct {
	rng *rand.Rand
}

// NewNoise creates a seeded noise generator.
func NewNoise(seed int64) *Noise {
	return &Noise{rng: rand.New(rand.NewSource(seed))}
}

// Float64 returns a random value in [-variance, variance].
func (n *Noise) Float64(variance float64) float64 {
	return (n.rng.Float64() - 0.5) * 2 * variance
}

// Int returns a random integer in [0, max).
// Returns 0 if max <= 0.
func (n *Noise) Int(max int) int {
	if max <= 0 {
		return 0
	}
	return n.rng.Intn(max)
}
```
signal_test.go (deleted file, -103 lines)

```go
package poindexter

import (
	"math"
	"testing"
)

func TestRampUp(t *testing.T) {
	tests := []struct {
		name              string
		elapsed, duration float64
		want              float64
	}{
		{"start", 0, 30, 0},
		{"mid", 15, 30, 0.5},
		{"end", 30, 30, 1},
		{"over", 60, 30, 1},
		{"negative", -5, 30, 0},
		{"zero_duration", 10, 0, 1},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := RampUp(tt.elapsed, tt.duration)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("RampUp(%v, %v) = %v, want %v", tt.elapsed, tt.duration, got, tt.want)
			}
		})
	}
}

func TestSineWave(t *testing.T) {
	// At t=0, sin(0) = 0
	if got := SineWave(0, 10, 5); math.Abs(got) > 1e-9 {
		t.Errorf("SineWave(0, 10, 5) = %v, want 0", got)
	}
	// At t=period/4, sin(π/2) = 1, so result = amplitude
	got := SineWave(2.5, 10, 5)
	if math.Abs(got-5) > 1e-9 {
		t.Errorf("SineWave(2.5, 10, 5) = %v, want 5", got)
	}
	// Zero period returns 0
	if got := SineWave(5, 0, 5); got != 0 {
		t.Errorf("SineWave(5, 0, 5) = %v, want 0", got)
	}
}

func TestOscillate(t *testing.T) {
	// At t=0, sin(0)=0, so result = base * (1 + 0) = base
	got := Oscillate(100, 0, 10, 0.05)
	if math.Abs(got-100) > 1e-9 {
		t.Errorf("Oscillate(100, 0, 10, 0.05) = %v, want 100", got)
	}
	// At t=period/4, sin=1, so result = base * (1 + amplitude)
	got = Oscillate(100, 2.5, 10, 0.05)
	if math.Abs(got-105) > 1e-9 {
		t.Errorf("Oscillate(100, 2.5, 10, 0.05) = %v, want 105", got)
	}
	// Zero period returns base
	if got := Oscillate(100, 5, 0, 0.05); got != 100 {
		t.Errorf("Oscillate(100, 5, 0, 0.05) = %v, want 100", got)
	}
}

func TestNoise(t *testing.T) {
	n := NewNoise(42)

	// Float64 should be within [-variance, variance]
	for i := 0; i < 1000; i++ {
		v := n.Float64(0.1)
		if v < -0.1 || v > 0.1 {
			t.Fatalf("Float64(0.1) = %v, outside [-0.1, 0.1]", v)
		}
	}

	// Int should be within [0, max)
	n2 := NewNoise(42)
	for i := 0; i < 1000; i++ {
		v := n2.Int(10)
		if v < 0 || v >= 10 {
			t.Fatalf("Int(10) = %v, outside [0, 10)", v)
		}
	}

	// Int with zero max returns 0
	if got := n.Int(0); got != 0 {
		t.Errorf("Int(0) = %v, want 0", got)
	}
	if got := n.Int(-1); got != 0 {
		t.Errorf("Int(-1) = %v, want 0", got)
	}
}

func TestNoiseDeterministic(t *testing.T) {
	n1 := NewNoise(123)
	n2 := NewNoise(123)
	for i := 0; i < 100; i++ {
		a := n1.Float64(1.0)
		b := n2.Float64(1.0)
		if a != b {
			t.Fatalf("iteration %d: different values for same seed: %v != %v", i, a, b)
		}
	}
}
```
stats.go (deleted file, -63 lines)

```go
package poindexter

import "math"

// Sum returns the sum of all values. Returns 0 for empty slices.
func Sum(data []float64) float64 {
	var s float64
	for _, v := range data {
		s += v
	}
	return s
}

// Mean returns the arithmetic mean. Returns 0 for empty slices.
func Mean(data []float64) float64 {
	if len(data) == 0 {
		return 0
	}
	return Sum(data) / float64(len(data))
}

// Variance returns the population variance. Returns 0 for empty slices.
func Variance(data []float64) float64 {
	if len(data) == 0 {
		return 0
	}
	m := Mean(data)
	var ss float64
	for _, v := range data {
		d := v - m
		ss += d * d
	}
	return ss / float64(len(data))
}

// StdDev returns the population standard deviation.
func StdDev(data []float64) float64 {
	return math.Sqrt(Variance(data))
}

// MinMax returns the minimum and maximum values.
// Returns (0, 0) for empty slices.
func MinMax(data []float64) (min, max float64) {
	if len(data) == 0 {
		return 0, 0
	}
	min, max = data[0], data[0]
	for _, v := range data[1:] {
		if v < min {
			min = v
		}
		if v > max {
			max = v
		}
	}
	return min, max
}

// IsUnderrepresented returns true if val is below threshold fraction of avg.
// For example, IsUnderrepresented(3, 10, 0.5) returns true because 3 < 10*0.5.
func IsUnderrepresented(val, avg, threshold float64) bool {
	return val < avg*threshold
}
```
stats_test.go (deleted file, -122 lines)

```go
package poindexter

import (
	"math"
	"testing"
)

func TestSum(t *testing.T) {
	tests := []struct {
		name string
		data []float64
		want float64
	}{
		{"empty", nil, 0},
		{"single", []float64{5}, 5},
		{"multiple", []float64{1, 2, 3, 4, 5}, 15},
		{"negative", []float64{-1, -2, 3}, 0},
		{"floats", []float64{0.1, 0.2, 0.3}, 0.6},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Sum(tt.data)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Sum(%v) = %v, want %v", tt.data, got, tt.want)
			}
		})
	}
}

func TestMean(t *testing.T) {
	tests := []struct {
		name string
		data []float64
		want float64
	}{
		{"empty", nil, 0},
		{"single", []float64{5}, 5},
		{"multiple", []float64{2, 4, 6}, 4},
		{"floats", []float64{1.5, 2.5}, 2},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Mean(tt.data)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Mean(%v) = %v, want %v", tt.data, got, tt.want)
			}
		})
	}
}

func TestVariance(t *testing.T) {
	tests := []struct {
		name string
		data []float64
		want float64
	}{
		{"empty", nil, 0},
		{"constant", []float64{5, 5, 5}, 0},
		{"simple", []float64{2, 4, 4, 4, 5, 5, 7, 9}, 4},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := Variance(tt.data)
			if math.Abs(got-tt.want) > 1e-9 {
				t.Errorf("Variance(%v) = %v, want %v", tt.data, got, tt.want)
			}
		})
	}
}

func TestStdDev(t *testing.T) {
	got := StdDev([]float64{2, 4, 4, 4, 5, 5, 7, 9})
	if math.Abs(got-2) > 1e-9 {
		t.Errorf("StdDev = %v, want 2", got)
	}
}

func TestMinMax(t *testing.T) {
	tests := []struct {
		name    string
		data    []float64
		wantMin float64
		wantMax float64
	}{
		{"empty", nil, 0, 0},
		{"single", []float64{3}, 3, 3},
		{"ordered", []float64{1, 2, 3}, 1, 3},
		{"reversed", []float64{3, 2, 1}, 1, 3},
		{"negative", []float64{-5, 0, 5}, -5, 5},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			gotMin, gotMax := MinMax(tt.data)
			if gotMin != tt.wantMin || gotMax != tt.wantMax {
				t.Errorf("MinMax(%v) = (%v, %v), want (%v, %v)", tt.data, gotMin, gotMax, tt.wantMin, tt.wantMax)
			}
		})
	}
}

func TestIsUnderrepresented(t *testing.T) {
	tests := []struct {
		name      string
		val       float64
		avg       float64
		threshold float64
		want      bool
	}{
		{"below", 3, 10, 0.5, true},
		{"at", 5, 10, 0.5, false},
		{"above", 7, 10, 0.5, false},
		{"zero_avg", 0, 0, 0.5, false},
	}
	for _, tt := range tests {
		t.Run(tt.name, func(t *testing.T) {
			got := IsUnderrepresented(tt.val, tt.avg, tt.threshold)
			if got != tt.want {
				t.Errorf("IsUnderrepresented(%v, %v, %v) = %v, want %v", tt.val, tt.avg, tt.threshold, got, tt.want)
			}
		})
	}
}
```