feat: BugSETI app, WebSocket hub, browser automation, and MCP tools (#336)
* feat: add security logging and fix framework regressions

  This commit implements comprehensive security event logging and resolves critical regressions in the core framework.

  Security Logging:
  - Enhanced `pkg/log` with a `Security` level and helper.
  - Added `log.Username()` to consistently identify the executing user.
  - Instrumented GitHub CLI auth, Agentic configuration, filesystem sandbox, MCP handlers, and MCP TCP transport with security logs.
  - Added `SecurityStyle` to the CLI for consistent visual representation of security events.

  UniFi Security (CodeQL):
  - Refactored `pkg/unifi` to remove hardcoded `InsecureSkipVerify`, resolving a high-severity alert.
  - Added a `--verify-tls` flag and configuration option to control TLS verification.
  - Updated command handlers to support the new verification parameter.

  Framework Fixes:
  - Restored original signatures for `MustServiceFor`, `Config()`, and `Display()` in `pkg/framework/core`, which had been corrupted during a merge.
  - Fixed `pkg/framework/framework.go` and `pkg/framework/core/runtime_pkg.go` to match the restored signatures.
  - These fixes resolve project-wide compilation errors caused by the signature mismatches.

  I encountered significant blockers due to a corrupted state of the `dev` branch after a merge, which introduced breaking changes in the core framework's DI system. I had to manually reconcile these signatures with the expected usage across the codebase to restore build stability.

* feat(mcp): add RAG tools (query, ingest, collections)

  Add vector database tools to the MCP server for RAG operations:
  - rag_query: Search for relevant documentation using semantic similarity
  - rag_ingest: Ingest files or directories into the vector database
  - rag_collections: List available collections

  Uses existing internal/cmd/rag exports (QueryDocs, IngestDirectory, IngestFile) and pkg/rag for Qdrant client access. Default collection is "hostuk-docs" with topK=5 for queries.

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat(mcp): add metrics tools (record, query)

  Add MCP tools for recording and querying AI/security metrics events. The metrics_record tool writes events to daily JSONL files, and the metrics_query tool provides aggregated statistics by type, repo, and agent.

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat: add 'core mcp serve' command

  Add CLI command to start the MCP server for AI tool integration.
  - Create internal/cmd/mcpcmd package with serve subcommand
  - Support --workspace flag for directory restriction
  - Handle SIGINT/SIGTERM for clean shutdown
  - Register in full.go build variant

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat(ws): add WebSocket hub package for real-time streaming

  Add pkg/ws package implementing a hub pattern for WebSocket connections:
  - Hub manages client connections, broadcasts, and channel subscriptions
  - Client struct represents connected WebSocket clients
  - Message types: process_output, process_status, event, error, ping/pong
  - Channel-based subscription system (subscribe/unsubscribe)
  - SendProcessOutput and SendProcessStatus for process streaming integration
  - Full test coverage including concurrency tests

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat(mcp): add process management and WebSocket MCP tools

  Add MCP tools for process management:
  - process_start: Start a new external process
  - process_stop: Gracefully stop a running process
  - process_kill: Force kill a process
  - process_list: List all managed processes
  - process_output: Get captured process output
  - process_input: Send input to process stdin

  Add MCP tools for WebSocket:
  - ws_start: Start WebSocket server for real-time streaming
  - ws_info: Get hub statistics (clients, channels)

  Update Service struct with optional process.Service and ws.Hub fields, new WithProcessService and WithWSHub options, getter methods, and Shutdown method for cleanup.

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat(webview): add browser automation package via Chrome DevTools Protocol

  Add pkg/webview package for browser automation:
  - webview.go: Main interface with Connect, Navigate, Click, Type, QuerySelector, Screenshot, Evaluate
  - cdp.go: Chrome DevTools Protocol WebSocket client implementation
  - actions.go: DOM action types (Click, Type, Hover, Scroll, etc.) and ActionSequence builder
  - console.go: Console message capture and filtering with ConsoleWatcher and ExceptionWatcher
  - angular.go: Angular-specific helpers for router navigation, component access, and Zone.js stability

  Add MCP tools for webview:
  - webview_connect/disconnect: Connection management
  - webview_navigate: Page navigation
  - webview_click/type/query/wait: DOM interaction
  - webview_console: Console output capture
  - webview_eval: JavaScript execution
  - webview_screenshot: Screenshot capture

  Add documentation:
  - docs/mcp/angular-testing.md: Guide for Angular application testing

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* docs: document new packages and BugSETI application

  - Update CLAUDE.md with documentation for:
    - pkg/ws (WebSocket hub for real-time streaming)
    - pkg/webview (Browser automation via CDP)
    - pkg/mcp (MCP server tools: process, ws, webview)
    - BugSETI application overview
  - Add comprehensive README for BugSETI with:
    - Installation and configuration guide
    - Usage workflow documentation
    - Architecture overview
    - Contributing guidelines

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* feat(bugseti): add BugSETI system tray app with auto-update

  BugSETI - Distributed Bug Fixing like SETI@home but for code.

  Features:
  - System tray app with Wails v3
  - GitHub issue fetching with label filters
  - Issue queue with priority management
  - AI context seeding via seed-agent-developer skill
  - Automated PR submission flow
  - Stats tracking and leaderboard
  - Cross-platform notifications
  - Self-updating with stable/beta/nightly channels

  Includes:
  - cmd/bugseti: Main application with Angular frontend
  - internal/bugseti: Core services (fetcher, queue, seeder, submit, config, stats, notify)
  - internal/bugseti/updater: Auto-update system (checker, downloader, installer)
  - .github/workflows/bugseti-release.yml: CI/CD for all platforms

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix: resolve import cycle and code duplication

  - Remove pkg/log import from pkg/io/local to break import cycle (pkg/log/rotation.go imports pkg/io, creating circular dependency)
  - Use stderr logging for security events in sandbox escape detection
  - Remove unused sync/atomic import from core.go
  - Fix duplicate LogSecurity function declarations in cli/log.go
  - Update workspace/service.go Crypt() call to match interface

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

* fix: update tests for new function signatures and format code

  - Update core_test.go: Config(), Display() now panic instead of returning error
  - Update runtime_pkg_test.go: sr.Config() now panics instead of returning error
  - Update MustServiceFor tests to use assert.Panics
  - Format BugSETI, MCP tools, and webview packages with gofmt

  Co-Authored-By: Claude Opus 4.5 <noreply@anthropic.com>

---------

Co-authored-by: Snider <631881+Snider@users.noreply.github.com>
Co-authored-by: Claude <developers@lethean.io>
Co-authored-by: Claude Opus 4.5 <noreply@anthropic.com>
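A minimal wiring sketch of the pieces named in the message above (WithProcessService, WithWSHub, Shutdown, and the pkg/ws hub). The option and method names come from the commit message and the CLAUDE.md excerpt further down this page; the package paths and constructors are assumptions, so treat this as an illustration rather than the actual API.

```go
// Sketch of wiring the optional process service and WebSocket hub into the
// MCP service. WithProcessService, WithWSHub, and Shutdown are named in the
// commit message; the import paths and the NewService/process.New
// constructors are assumptions for illustration.
package main

import (
	"context"
	"net/http"

	"github.com/host-uk/core/pkg/mcp"     // assumed import path
	"github.com/host-uk/core/pkg/process" // assumed import path
	"github.com/host-uk/core/pkg/ws"
)

func main() {
	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	// pkg/ws hub (NewHub/Run/Handler/SendProcessOutput are shown in CLAUDE.md below).
	hub := ws.NewHub()
	go hub.Run(ctx)
	http.HandleFunc("/ws", hub.Handler()) // clients subscribe to channels here

	procs := process.New() // assumed constructor for the process manager

	svc := mcp.NewService( // assumed constructor
		mcp.WithProcessService(procs),
		mcp.WithWSHub(hub),
	)
	defer svc.Shutdown() // named in the commit message; exact signature assumed

	// Stream a captured output line to WebSocket subscribers of that process.
	hub.SendProcessOutput("proc-123", "build started")

	_ = http.ListenAndServe(":8080", nil)
}
```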
parent
4a1600e9be
commit
4debdc1449
132 changed files with 33034 additions and 257 deletions
@@ -24,6 +24,12 @@ publishers:
    - type: github
      prerelease: false
      draft: false
+   - type: homebrew
+     tap: host-uk/homebrew-tap
+     formula: core
+   - type: scoop
+     bucket: host-uk/scoop-bucket
+     manifest: core

  changelog:
    include:

396  .github/workflows/alpha-release.yml  vendored
@@ -58,20 +58,155 @@ jobs:
|
|||
run: |
|
||||
EXT=""
|
||||
if [ "$GOOS" = "windows" ]; then EXT=".exe"; fi
|
||||
go build -o "./bin/core${EXT}" .
|
||||
BINARY="core${EXT}"
|
||||
ARCHIVE_PREFIX="core-${GOOS}-${GOARCH}"
|
||||
|
||||
APP_VERSION="${{ env.NEXT_VERSION }}-alpha.${{ github.run_number }}"
|
||||
go build -ldflags "-s -w -X github.com/host-uk/core/pkg/cli.AppVersion=${APP_VERSION}" -o "./bin/${BINARY}" .
|
||||
|
||||
# Create tar.gz for Homebrew (non-Windows)
|
||||
if [ "$GOOS" != "windows" ]; then
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "${BINARY}"
|
||||
fi
|
||||
|
||||
# Create zip for Scoop (Windows)
|
||||
if [ "$GOOS" = "windows" ]; then
|
||||
cd ./bin && zip "${ARCHIVE_PREFIX}.zip" "${BINARY}" && cd ..
|
||||
fi
|
||||
|
||||
# Rename raw binary to platform-specific name for release
|
||||
mv "./bin/${BINARY}" "./bin/${ARCHIVE_PREFIX}${EXT}"
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: core-${{ matrix.goos }}-${{ matrix.goarch }}
|
||||
path: ./bin/core*
|
||||
path: ./bin/core-*
|
||||
|
||||
release:
|
||||
needs: build
|
||||
runs-on: ubuntu-latest
|
||||
build-ide:
|
||||
strategy:
|
||||
matrix:
|
||||
include:
|
||||
- os: macos-latest
|
||||
goos: darwin
|
||||
goarch: arm64
|
||||
- os: ubuntu-latest
|
||||
goos: linux
|
||||
goarch: amd64
|
||||
- os: windows-latest
|
||||
goos: windows
|
||||
goarch: amd64
|
||||
runs-on: ${{ matrix.os }}
|
||||
env:
|
||||
GOOS: ${{ matrix.goos }}
|
||||
GOARCH: ${{ matrix.goarch }}
|
||||
defaults:
|
||||
run:
|
||||
working-directory: internal/core-ide
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Setup Go
|
||||
uses: host-uk/build/actions/setup/go@v4.0.0
|
||||
with:
|
||||
go-version: "1.25"
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: "20"
|
||||
|
||||
- name: Install Wails CLI
|
||||
run: go install github.com/wailsapp/wails/v3/cmd/wails3@latest
|
||||
|
||||
- name: Install frontend dependencies
|
||||
working-directory: internal/core-ide/frontend
|
||||
run: npm ci
|
||||
|
||||
- name: Generate bindings
|
||||
run: wails3 generate bindings -f '-tags production' -clean=false -ts -i
|
||||
|
||||
- name: Build frontend
|
||||
working-directory: internal/core-ide/frontend
|
||||
run: npm run build
|
||||
|
||||
- name: Install Linux dependencies
|
||||
if: matrix.goos == 'linux'
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.0-dev
|
||||
|
||||
- name: Build IDE
|
||||
shell: bash
|
||||
run: |
|
||||
EXT=""
|
||||
if [ "$GOOS" = "windows" ]; then EXT=".exe"; fi
|
||||
BINARY="core-ide${EXT}"
|
||||
ARCHIVE_PREFIX="core-ide-${GOOS}-${GOARCH}"
|
||||
|
||||
BUILD_FLAGS="-tags production -trimpath -buildvcs=false"
|
||||
|
||||
if [ "$GOOS" = "windows" ]; then
|
||||
# Windows: no CGO, use windowsgui linker flag
|
||||
export CGO_ENABLED=0
|
||||
LDFLAGS="-w -s -H windowsgui"
|
||||
|
||||
# Generate Windows syso resource
|
||||
cd build
|
||||
wails3 generate syso -arch ${GOARCH} -icon windows/icon.ico -manifest windows/wails.exe.manifest -info windows/info.json -out ../wails_windows_${GOARCH}.syso
|
||||
cd ..
|
||||
elif [ "$GOOS" = "darwin" ]; then
|
||||
export CGO_ENABLED=1
|
||||
export CGO_CFLAGS="-mmacosx-version-min=10.15"
|
||||
export CGO_LDFLAGS="-mmacosx-version-min=10.15"
|
||||
export MACOSX_DEPLOYMENT_TARGET="10.15"
|
||||
LDFLAGS="-w -s"
|
||||
else
|
||||
export CGO_ENABLED=1
|
||||
LDFLAGS="-w -s"
|
||||
fi
|
||||
|
||||
go build ${BUILD_FLAGS} -ldflags="${LDFLAGS}" -o "./bin/${BINARY}"
|
||||
|
||||
# Clean up syso files
|
||||
rm -f *.syso
|
||||
|
||||
# Package
|
||||
if [ "$GOOS" = "darwin" ]; then
|
||||
# Create .app bundle
|
||||
mkdir -p "./bin/Core IDE.app/Contents/"{MacOS,Resources}
|
||||
cp build/darwin/icons.icns "./bin/Core IDE.app/Contents/Resources/"
|
||||
cp "./bin/${BINARY}" "./bin/Core IDE.app/Contents/MacOS/"
|
||||
cp build/darwin/Info.plist "./bin/Core IDE.app/Contents/"
|
||||
codesign --force --deep --sign - "./bin/Core IDE.app"
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "Core IDE.app"
|
||||
elif [ "$GOOS" = "windows" ]; then
|
||||
cd ./bin && zip "${ARCHIVE_PREFIX}.zip" "${BINARY}" && cd ..
|
||||
else
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "${BINARY}"
|
||||
fi
|
||||
|
||||
# Rename raw binary
|
||||
mv "./bin/${BINARY}" "./bin/${ARCHIVE_PREFIX}${EXT}"
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: core-ide-${{ matrix.goos }}-${{ matrix.goarch }}
|
||||
path: internal/core-ide/bin/core-ide-*
|
||||
|
||||
release:
|
||||
needs: [build, build-ide]
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
version: ${{ steps.version.outputs.version }}
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Set version
|
||||
id: version
|
||||
run: echo "version=v${{ env.NEXT_VERSION }}-alpha.${{ github.run_number }}" >> "$GITHUB_OUTPUT"
|
||||
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
|
|
@ -87,9 +222,8 @@ jobs:
|
|||
- name: Create alpha release
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
VERSION: ${{ steps.version.outputs.version }}
|
||||
run: |
|
||||
VERSION="v${{ env.NEXT_VERSION }}-alpha.${{ github.run_number }}"
|
||||
|
||||
gh release create "$VERSION" \
|
||||
--title "Alpha: $VERSION" \
|
||||
--notes "Canary build from dev branch.
|
||||
|
|
@ -110,7 +244,14 @@ jobs:
|
|||
## Installation
|
||||
|
||||
\`\`\`bash
|
||||
# macOS/Linux
|
||||
# Homebrew (macOS/Linux)
|
||||
brew install host-uk/tap/core
|
||||
|
||||
# Scoop (Windows)
|
||||
scoop bucket add host-uk https://github.com/host-uk/scoop-bucket
|
||||
scoop install core
|
||||
|
||||
# Direct download (example: Linux amd64)
|
||||
curl -fsSL https://github.com/host-uk/core/releases/download/$VERSION/core-linux-amd64 -o core
|
||||
chmod +x core && sudo mv core /usr/local/bin/
|
||||
\`\`\`
|
||||
|
|
@ -118,3 +259,242 @@ jobs:
|
|||
--prerelease \
|
||||
--target dev \
|
||||
release/*
|
||||
|
||||
update-tap:
|
||||
needs: release
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
path: dist
|
||||
merge-multiple: true
|
||||
|
||||
- name: Generate checksums
|
||||
run: |
|
||||
cd dist
|
||||
for f in *.tar.gz; do
|
||||
sha256sum "$f" | awk '{print $1}' > "${f}.sha256"
|
||||
done
|
||||
echo "=== Checksums ==="
|
||||
cat *.sha256
|
||||
|
||||
- name: Update Homebrew formula
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.HOMEBREW_TAP_TOKEN }}
|
||||
VERSION: ${{ needs.release.outputs.version }}
|
||||
run: |
|
||||
# Strip leading 'v' for formula version
|
||||
FORMULA_VERSION="${VERSION#v}"
|
||||
|
||||
# Read checksums
|
||||
DARWIN_ARM64=$(cat dist/core-darwin-arm64.tar.gz.sha256)
|
||||
LINUX_AMD64=$(cat dist/core-linux-amd64.tar.gz.sha256)
|
||||
LINUX_ARM64=$(cat dist/core-linux-arm64.tar.gz.sha256)
|
||||
|
||||
# Clone tap repo (configure auth for push)
|
||||
gh repo clone host-uk/homebrew-tap /tmp/tap -- --depth=1
|
||||
cd /tmp/tap
|
||||
git remote set-url origin "https://x-access-token:${GH_TOKEN}@github.com/host-uk/homebrew-tap.git"
|
||||
cd -
|
||||
mkdir -p /tmp/tap/Formula
|
||||
|
||||
# Write formula
|
||||
cat > /tmp/tap/Formula/core.rb << FORMULA
|
||||
# typed: false
|
||||
# frozen_string_literal: true
|
||||
|
||||
class Core < Formula
|
||||
desc "Host UK development CLI"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
version "${FORMULA_VERSION}"
|
||||
license "EUPL-1.2"
|
||||
|
||||
on_macos do
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-darwin-arm64.tar.gz"
|
||||
sha256 "${DARWIN_ARM64}"
|
||||
end
|
||||
|
||||
on_linux do
|
||||
if Hardware::CPU.arm?
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-linux-arm64.tar.gz"
|
||||
sha256 "${LINUX_ARM64}"
|
||||
else
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-linux-amd64.tar.gz"
|
||||
sha256 "${LINUX_AMD64}"
|
||||
end
|
||||
end
|
||||
|
||||
def install
|
||||
bin.install "core"
|
||||
end
|
||||
|
||||
test do
|
||||
system "\#{bin}/core", "--version"
|
||||
end
|
||||
end
|
||||
FORMULA
|
||||
|
||||
# Remove leading whitespace from heredoc
|
||||
sed -i 's/^ //' /tmp/tap/Formula/core.rb
|
||||
|
||||
# Read IDE checksums (may not exist if build-ide failed)
|
||||
IDE_DARWIN_ARM64=$(cat dist/core-ide-darwin-arm64.tar.gz.sha256 2>/dev/null || echo "")
|
||||
IDE_LINUX_AMD64=$(cat dist/core-ide-linux-amd64.tar.gz.sha256 2>/dev/null || echo "")
|
||||
|
||||
# Write core-ide Formula (Linux binary)
|
||||
if [ -n "${IDE_LINUX_AMD64}" ]; then
|
||||
cat > /tmp/tap/Formula/core-ide.rb << FORMULA
|
||||
# typed: false
|
||||
# frozen_string_literal: true
|
||||
|
||||
class CoreIde < Formula
|
||||
desc "Host UK desktop development environment"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
version "${FORMULA_VERSION}"
|
||||
license "EUPL-1.2"
|
||||
|
||||
on_linux do
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-linux-amd64.tar.gz"
|
||||
sha256 "${IDE_LINUX_AMD64}"
|
||||
end
|
||||
|
||||
def install
|
||||
bin.install "core-ide"
|
||||
end
|
||||
end
|
||||
FORMULA
|
||||
sed -i 's/^ //' /tmp/tap/Formula/core-ide.rb
|
||||
fi
|
||||
|
||||
# Write core-ide Cask (macOS .app bundle)
|
||||
if [ -n "${IDE_DARWIN_ARM64}" ]; then
|
||||
mkdir -p /tmp/tap/Casks
|
||||
cat > /tmp/tap/Casks/core-ide.rb << CASK
|
||||
cask "core-ide" do
|
||||
version "${FORMULA_VERSION}"
|
||||
sha256 "${IDE_DARWIN_ARM64}"
|
||||
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-darwin-arm64.tar.gz"
|
||||
name "Core IDE"
|
||||
desc "Host UK desktop development environment"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
|
||||
app "Core IDE.app"
|
||||
end
|
||||
CASK
|
||||
sed -i 's/^ //' /tmp/tap/Casks/core-ide.rb
|
||||
fi
|
||||
|
||||
cd /tmp/tap
|
||||
git config user.name "github-actions[bot]"
|
||||
git config user.email "github-actions[bot]@users.noreply.github.com"
|
||||
git add .
|
||||
git diff --cached --quiet && echo "No changes to tap" && exit 0
|
||||
git commit -m "Update core to ${FORMULA_VERSION}"
|
||||
git push
|
||||
|
||||
update-scoop:
|
||||
needs: release
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
path: dist
|
||||
merge-multiple: true
|
||||
|
||||
- name: Generate checksums
|
||||
run: |
|
||||
cd dist
|
||||
for f in *.zip; do
|
||||
[ -f "$f" ] || continue
|
||||
sha256sum "$f" | awk '{print $1}' > "${f}.sha256"
|
||||
done
|
||||
echo "=== Checksums ==="
|
||||
cat *.sha256 2>/dev/null || echo "No zip checksums"
|
||||
|
||||
- name: Update Scoop manifests
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.HOMEBREW_TAP_TOKEN }}
|
||||
VERSION: ${{ needs.release.outputs.version }}
|
||||
run: |
|
||||
# Strip leading 'v' for manifest version
|
||||
MANIFEST_VERSION="${VERSION#v}"
|
||||
|
||||
# Read checksums
|
||||
WIN_AMD64=$(cat dist/core-windows-amd64.zip.sha256 2>/dev/null || echo "")
|
||||
IDE_WIN_AMD64=$(cat dist/core-ide-windows-amd64.zip.sha256 2>/dev/null || echo "")
|
||||
|
||||
# Clone scoop bucket
|
||||
gh repo clone host-uk/scoop-bucket /tmp/scoop -- --depth=1
|
||||
cd /tmp/scoop
|
||||
git remote set-url origin "https://x-access-token:${GH_TOKEN}@github.com/host-uk/scoop-bucket.git"
|
||||
|
||||
# Write core.json manifest
|
||||
cat > core.json << 'MANIFEST'
|
||||
{
|
||||
"version": "VERSION_PLACEHOLDER",
|
||||
"description": "Host UK development CLI",
|
||||
"homepage": "https://github.com/host-uk/core",
|
||||
"license": "EUPL-1.2",
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "URL_PLACEHOLDER",
|
||||
"hash": "HASH_PLACEHOLDER",
|
||||
"bin": "core.exe"
|
||||
}
|
||||
},
|
||||
"checkver": "github",
|
||||
"autoupdate": {
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "https://github.com/host-uk/core/releases/download/v$version/core-windows-amd64.zip"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
MANIFEST
|
||||
|
||||
sed -i "s|VERSION_PLACEHOLDER|${MANIFEST_VERSION}|g" core.json
|
||||
sed -i "s|URL_PLACEHOLDER|https://github.com/host-uk/core/releases/download/${VERSION}/core-windows-amd64.zip|g" core.json
|
||||
sed -i "s|HASH_PLACEHOLDER|${WIN_AMD64}|g" core.json
|
||||
sed -i 's/^ //' core.json
|
||||
|
||||
# Write core-ide.json manifest
|
||||
if [ -n "${IDE_WIN_AMD64}" ]; then
|
||||
cat > core-ide.json << 'MANIFEST'
|
||||
{
|
||||
"version": "VERSION_PLACEHOLDER",
|
||||
"description": "Host UK desktop development environment",
|
||||
"homepage": "https://github.com/host-uk/core",
|
||||
"license": "EUPL-1.2",
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "URL_PLACEHOLDER",
|
||||
"hash": "HASH_PLACEHOLDER",
|
||||
"bin": "core-ide.exe"
|
||||
}
|
||||
},
|
||||
"checkver": "github",
|
||||
"autoupdate": {
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "https://github.com/host-uk/core/releases/download/v$version/core-ide-windows-amd64.zip"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
MANIFEST
|
||||
sed -i "s|VERSION_PLACEHOLDER|${MANIFEST_VERSION}|g" core-ide.json
|
||||
sed -i "s|URL_PLACEHOLDER|https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-windows-amd64.zip|g" core-ide.json
|
||||
sed -i "s|HASH_PLACEHOLDER|${IDE_WIN_AMD64}|g" core-ide.json
|
||||
sed -i 's/^ //' core-ide.json
|
||||
fi
|
||||
|
||||
git config user.name "github-actions[bot]"
|
||||
git config user.email "github-actions[bot]@users.noreply.github.com"
|
||||
git add .
|
||||
git diff --cached --quiet && echo "No changes to scoop bucket" && exit 0
|
||||
git commit -m "Update core to ${MANIFEST_VERSION}"
|
||||
git push
|
||||
|
|
|
|||
34  .github/workflows/auto-merge.yml  vendored
@@ -4,22 +4,27 @@ on:
|
|||
pull_request:
|
||||
types: [opened, reopened, ready_for_review]
|
||||
|
||||
permissions:
|
||||
contents: write
|
||||
pull-requests: write
|
||||
|
||||
env:
|
||||
GH_REPO: ${{ github.repository }}
|
||||
|
||||
jobs:
|
||||
merge:
|
||||
runs-on: ubuntu-latest
|
||||
permissions:
|
||||
contents: write
|
||||
pull-requests: write
|
||||
if: github.event.pull_request.draft == false
|
||||
steps:
|
||||
- name: Checkout
|
||||
uses: actions/checkout@v6
|
||||
- name: Auto Merge
|
||||
- name: Enable auto-merge
|
||||
uses: actions/github-script@v7
|
||||
env:
|
||||
PR_NUMBER: ${{ github.event.pull_request.number }}
|
||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
GH_REPO: ${{ github.repository }}
|
||||
with:
|
||||
github-token: ${{ secrets.GITHUB_TOKEN }}
|
||||
script: |
|
||||
const author = context.payload.pull_request.user.login;
|
||||
const association = context.payload.pull_request.author_association;
|
||||
|
|
@ -28,15 +33,22 @@ jobs:
|
|||
const trustedBots = ['google-labs-jules[bot]'];
|
||||
const isTrustedBot = trustedBots.includes(author);
|
||||
|
||||
// Check author association from webhook payload (no API call needed)
|
||||
// Check author association from webhook payload
|
||||
const trusted = ['MEMBER', 'OWNER', 'COLLABORATOR'];
|
||||
if (!isTrustedBot && !trusted.includes(association)) {
|
||||
core.info(`${author} is ${association} — skipping auto-merge`);
|
||||
return;
|
||||
}
|
||||
|
||||
await exec.exec('gh', [
|
||||
'pr', 'merge', process.env.PR_NUMBER,
|
||||
'--auto',
|
||||
]);
|
||||
core.info(`Auto-merge enabled for #${process.env.PR_NUMBER}`);
|
||||
try {
|
||||
await exec.exec('gh', [
|
||||
'pr', 'merge', process.env.PR_NUMBER,
|
||||
'--auto',
|
||||
'--merge',
|
||||
'-R', `${context.repo.owner}/${context.repo.repo}`
|
||||
]);
|
||||
core.info(`Auto-merge enabled for #${process.env.PR_NUMBER}`);
|
||||
} catch (error) {
|
||||
core.error(`Failed to enable auto-merge: ${error.message}`);
|
||||
throw error;
|
||||
}
|
||||
|
|
|
|||
309  .github/workflows/bugseti-release.yml  (new file, vendored)
@@ -0,0 +1,309 @@
|
|||
# BugSETI Release Workflow
|
||||
# Builds for all platforms and creates GitHub releases
|
||||
name: "BugSETI Release"
|
||||
|
||||
on:
|
||||
push:
|
||||
tags:
|
||||
- 'bugseti-v*.*.*' # Stable: bugseti-v1.0.0
|
||||
- 'bugseti-v*.*.*-beta.*' # Beta: bugseti-v1.0.0-beta.1
|
||||
- 'bugseti-nightly-*' # Nightly: bugseti-nightly-20260205
|
||||
|
||||
permissions:
|
||||
contents: write
|
||||
|
||||
env:
|
||||
APP_NAME: bugseti
|
||||
WAILS_VERSION: "3"
|
||||
|
||||
jobs:
|
||||
# Determine release channel from tag
|
||||
prepare:
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
version: ${{ steps.version.outputs.version }}
|
||||
channel: ${{ steps.version.outputs.channel }}
|
||||
prerelease: ${{ steps.version.outputs.prerelease }}
|
||||
steps:
|
||||
- name: Determine version and channel
|
||||
id: version
|
||||
env:
|
||||
TAG: ${{ github.ref_name }}
|
||||
run: |
|
||||
if [[ "$TAG" == bugseti-nightly-* ]]; then
|
||||
VERSION="${TAG#bugseti-}"
|
||||
CHANNEL="nightly"
|
||||
PRERELEASE="true"
|
||||
elif [[ "$TAG" == *-beta.* ]]; then
|
||||
VERSION="${TAG#bugseti-v}"
|
||||
CHANNEL="beta"
|
||||
PRERELEASE="true"
|
||||
else
|
||||
VERSION="${TAG#bugseti-v}"
|
||||
CHANNEL="stable"
|
||||
PRERELEASE="false"
|
||||
fi
|
||||
|
||||
echo "version=${VERSION}" >> "$GITHUB_OUTPUT"
|
||||
echo "channel=${CHANNEL}" >> "$GITHUB_OUTPUT"
|
||||
echo "prerelease=${PRERELEASE}" >> "$GITHUB_OUTPUT"
|
||||
|
||||
echo "Tag: $TAG"
|
||||
echo "Version: $VERSION"
|
||||
echo "Channel: $CHANNEL"
|
||||
echo "Prerelease: $PRERELEASE"
|
||||
|
||||
build:
|
||||
needs: prepare
|
||||
strategy:
|
||||
fail-fast: false
|
||||
matrix:
|
||||
include:
|
||||
# macOS ARM64 (Apple Silicon)
|
||||
- os: macos-latest
|
||||
goos: darwin
|
||||
goarch: arm64
|
||||
ext: ""
|
||||
archive: tar.gz
|
||||
# macOS AMD64 (Intel)
|
||||
- os: macos-13
|
||||
goos: darwin
|
||||
goarch: amd64
|
||||
ext: ""
|
||||
archive: tar.gz
|
||||
# Linux AMD64
|
||||
- os: ubuntu-latest
|
||||
goos: linux
|
||||
goarch: amd64
|
||||
ext: ""
|
||||
archive: tar.gz
|
||||
# Linux ARM64
|
||||
- os: ubuntu-24.04-arm
|
||||
goos: linux
|
||||
goarch: arm64
|
||||
ext: ""
|
||||
archive: tar.gz
|
||||
# Windows AMD64
|
||||
- os: windows-latest
|
||||
goos: windows
|
||||
goarch: amd64
|
||||
ext: ".exe"
|
||||
archive: zip
|
||||
|
||||
runs-on: ${{ matrix.os }}
|
||||
env:
|
||||
GOOS: ${{ matrix.goos }}
|
||||
GOARCH: ${{ matrix.goarch }}
|
||||
VERSION: ${{ needs.prepare.outputs.version }}
|
||||
CHANNEL: ${{ needs.prepare.outputs.channel }}
|
||||
|
||||
defaults:
|
||||
run:
|
||||
working-directory: cmd/bugseti
|
||||
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Setup Go
|
||||
uses: host-uk/build/actions/setup/go@v4.0.0
|
||||
with:
|
||||
go-version: "1.25"
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: "20"
|
||||
|
||||
- name: Install Wails CLI
|
||||
run: go install github.com/wailsapp/wails/v3/cmd/wails3@latest
|
||||
|
||||
- name: Install frontend dependencies
|
||||
working-directory: cmd/bugseti/frontend
|
||||
run: npm ci
|
||||
|
||||
- name: Generate bindings
|
||||
run: wails3 generate bindings -f '-tags production' -clean=false -ts -i
|
||||
|
||||
- name: Build frontend
|
||||
working-directory: cmd/bugseti/frontend
|
||||
run: npm run build
|
||||
|
||||
- name: Install Linux dependencies
|
||||
if: matrix.goos == 'linux'
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.1-dev libayatana-appindicator3-dev
|
||||
|
||||
- name: Build BugSETI
|
||||
shell: bash
|
||||
env:
|
||||
EXT: ${{ matrix.ext }}
|
||||
ARCHIVE: ${{ matrix.archive }}
|
||||
COMMIT_SHA: ${{ github.sha }}
|
||||
run: |
|
||||
BINARY="${APP_NAME}${EXT}"
|
||||
ARCHIVE_PREFIX="${APP_NAME}-${GOOS}-${GOARCH}"
|
||||
|
||||
BUILD_FLAGS="-tags production -trimpath -buildvcs=false"
|
||||
|
||||
# Version injection via ldflags
|
||||
LDFLAGS="-s -w"
|
||||
LDFLAGS="${LDFLAGS} -X github.com/host-uk/core/internal/bugseti.Version=${VERSION}"
|
||||
LDFLAGS="${LDFLAGS} -X github.com/host-uk/core/internal/bugseti.Channel=${CHANNEL}"
|
||||
LDFLAGS="${LDFLAGS} -X github.com/host-uk/core/internal/bugseti.Commit=${COMMIT_SHA}"
|
||||
LDFLAGS="${LDFLAGS} -X github.com/host-uk/core/internal/bugseti.BuildTime=$(date -u +%Y-%m-%dT%H:%M:%SZ)"
|
||||
|
||||
if [ "$GOOS" = "windows" ]; then
|
||||
export CGO_ENABLED=0
|
||||
LDFLAGS="${LDFLAGS} -H windowsgui"
|
||||
|
||||
# Generate Windows syso resource
|
||||
cd build
|
||||
wails3 generate syso -arch ${GOARCH} -icon windows/icon.ico -manifest windows/wails.exe.manifest -info windows/info.json -out ../wails_windows_${GOARCH}.syso 2>/dev/null || true
|
||||
cd ..
|
||||
elif [ "$GOOS" = "darwin" ]; then
|
||||
export CGO_ENABLED=1
|
||||
export CGO_CFLAGS="-mmacosx-version-min=10.15"
|
||||
export CGO_LDFLAGS="-mmacosx-version-min=10.15"
|
||||
export MACOSX_DEPLOYMENT_TARGET="10.15"
|
||||
else
|
||||
export CGO_ENABLED=1
|
||||
fi
|
||||
|
||||
mkdir -p bin
|
||||
go build ${BUILD_FLAGS} -ldflags="${LDFLAGS}" -o "./bin/${BINARY}"
|
||||
|
||||
# Clean up syso files
|
||||
rm -f *.syso
|
||||
|
||||
# Package based on platform
|
||||
if [ "$GOOS" = "darwin" ]; then
|
||||
# Create .app bundle
|
||||
mkdir -p "./bin/BugSETI.app/Contents/"{MacOS,Resources}
|
||||
cp build/darwin/icons.icns "./bin/BugSETI.app/Contents/Resources/" 2>/dev/null || true
|
||||
cp "./bin/${BINARY}" "./bin/BugSETI.app/Contents/MacOS/"
|
||||
cp build/darwin/Info.plist "./bin/BugSETI.app/Contents/"
|
||||
codesign --force --deep --sign - "./bin/BugSETI.app" 2>/dev/null || true
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "BugSETI.app"
|
||||
elif [ "$GOOS" = "windows" ]; then
|
||||
cd ./bin && zip "${ARCHIVE_PREFIX}.zip" "${BINARY}" && cd ..
|
||||
else
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "${BINARY}"
|
||||
fi
|
||||
|
||||
# Rename raw binary for individual download
|
||||
mv "./bin/${BINARY}" "./bin/${ARCHIVE_PREFIX}${EXT}"
|
||||
|
||||
# Generate checksum
|
||||
cd ./bin
|
||||
sha256sum "${ARCHIVE_PREFIX}.${ARCHIVE}" > "${ARCHIVE_PREFIX}.${ARCHIVE}.sha256"
|
||||
sha256sum "${ARCHIVE_PREFIX}${EXT}" > "${ARCHIVE_PREFIX}${EXT}.sha256"
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: bugseti-${{ matrix.goos }}-${{ matrix.goarch }}
|
||||
path: |
|
||||
cmd/bugseti/bin/bugseti-*
|
||||
retention-days: 7
|
||||
|
||||
release:
|
||||
needs: [prepare, build]
|
||||
runs-on: ubuntu-latest
|
||||
env:
|
||||
TAG_NAME: ${{ github.ref_name }}
|
||||
VERSION: ${{ needs.prepare.outputs.version }}
|
||||
CHANNEL: ${{ needs.prepare.outputs.channel }}
|
||||
PRERELEASE: ${{ needs.prepare.outputs.prerelease }}
|
||||
REPO: ${{ github.repository }}
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Download all artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
path: dist
|
||||
merge-multiple: true
|
||||
|
||||
- name: List release files
|
||||
run: |
|
||||
echo "=== Release files ==="
|
||||
ls -la dist/
|
||||
echo "=== Checksums ==="
|
||||
cat dist/*.sha256
|
||||
|
||||
- name: Create release
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
run: |
|
||||
# Determine release title
|
||||
if [ "$CHANNEL" = "nightly" ]; then
|
||||
TITLE="BugSETI Nightly (${VERSION})"
|
||||
elif [ "$CHANNEL" = "beta" ]; then
|
||||
TITLE="BugSETI v${VERSION} (Beta)"
|
||||
else
|
||||
TITLE="BugSETI v${VERSION}"
|
||||
fi
|
||||
|
||||
# Create release notes
|
||||
cat > release-notes.md << EOF
|
||||
## BugSETI ${VERSION}
|
||||
|
||||
**Channel:** ${CHANNEL}
|
||||
|
||||
### Downloads
|
||||
|
||||
| Platform | Architecture | Binary | Archive |
|
||||
|----------|-------------|--------|---------|
|
||||
| macOS | ARM64 (Apple Silicon) | [bugseti-darwin-arm64](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-darwin-arm64) | [tar.gz](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-darwin-arm64.tar.gz) |
|
||||
| macOS | AMD64 (Intel) | [bugseti-darwin-amd64](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-darwin-amd64) | [tar.gz](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-darwin-amd64.tar.gz) |
|
||||
| Linux | AMD64 | [bugseti-linux-amd64](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-linux-amd64) | [tar.gz](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-linux-amd64.tar.gz) |
|
||||
| Linux | ARM64 | [bugseti-linux-arm64](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-linux-arm64) | [tar.gz](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-linux-arm64.tar.gz) |
|
||||
| Windows | AMD64 | [bugseti-windows-amd64.exe](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-windows-amd64.exe) | [zip](https://github.com/${REPO}/releases/download/${TAG_NAME}/bugseti-windows-amd64.zip) |
|
||||
|
||||
### Checksums (SHA256)
|
||||
|
||||
\`\`\`
|
||||
$(cat dist/*.sha256)
|
||||
\`\`\`
|
||||
|
||||
---
|
||||
*BugSETI - Distributed Bug Fixing, like SETI@home but for code*
|
||||
EOF
|
||||
|
||||
# Build release command
|
||||
RELEASE_ARGS=(
|
||||
--title "$TITLE"
|
||||
--notes-file release-notes.md
|
||||
)
|
||||
|
||||
if [ "$PRERELEASE" = "true" ]; then
|
||||
RELEASE_ARGS+=(--prerelease)
|
||||
fi
|
||||
|
||||
# Create the release
|
||||
gh release create "$TAG_NAME" \
|
||||
"${RELEASE_ARGS[@]}" \
|
||||
dist/*
|
||||
|
||||
# Scheduled nightly builds
|
||||
nightly:
|
||||
if: github.event_name == 'schedule'
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Create nightly tag
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
|
||||
run: |
|
||||
DATE=$(date -u +%Y%m%d)
|
||||
TAG="bugseti-nightly-${DATE}"
|
||||
|
||||
# Delete existing nightly tag for today if it exists
|
||||
gh release delete "$TAG" --yes 2>/dev/null || true
|
||||
git push origin ":refs/tags/$TAG" 2>/dev/null || true
|
||||
|
||||
# Create new tag
|
||||
git tag "$TAG"
|
||||
git push origin "$TAG"
|
||||
2  .github/workflows/coverage.yml  vendored
@@ -40,7 +40,7 @@ jobs:
        run: go generate ./internal/cmd/updater/...

      - name: Run coverage
-       run: core go cov
+       run: core go cov --output coverage.txt --threshold 40 --branch-threshold 35

      - name: Upload coverage reports to Codecov
        uses: codecov/codecov-action@v5

394  .github/workflows/release.yml  vendored
@@ -33,16 +33,6 @@ jobs:
|
|||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
# GUI build disabled until build action supports Wails v3
|
||||
# - name: Wails Build Action
|
||||
# uses: host-uk/build@v4.0.0
|
||||
# with:
|
||||
# build-name: core
|
||||
# build-platform: ${{ matrix.goos }}/${{ matrix.goarch }}
|
||||
# build: true
|
||||
# package: true
|
||||
# sign: false
|
||||
|
||||
- name: Setup Go
|
||||
uses: host-uk/build/actions/setup/go@v4.0.0
|
||||
with:
|
||||
|
|
@ -53,20 +43,155 @@ jobs:
|
|||
run: |
|
||||
EXT=""
|
||||
if [ "$GOOS" = "windows" ]; then EXT=".exe"; fi
|
||||
go build -o "./bin/core${EXT}" .
|
||||
BINARY="core${EXT}"
|
||||
ARCHIVE_PREFIX="core-${GOOS}-${GOARCH}"
|
||||
|
||||
APP_VERSION="${GITHUB_REF_NAME#v}"
|
||||
go build -ldflags "-s -w -X github.com/host-uk/core/pkg/cli.AppVersion=${APP_VERSION}" -o "./bin/${BINARY}" .
|
||||
|
||||
# Create tar.gz for Homebrew (non-Windows)
|
||||
if [ "$GOOS" != "windows" ]; then
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "${BINARY}"
|
||||
fi
|
||||
|
||||
# Create zip for Scoop (Windows)
|
||||
if [ "$GOOS" = "windows" ]; then
|
||||
cd ./bin && zip "${ARCHIVE_PREFIX}.zip" "${BINARY}" && cd ..
|
||||
fi
|
||||
|
||||
# Rename raw binary to platform-specific name for release
|
||||
mv "./bin/${BINARY}" "./bin/${ARCHIVE_PREFIX}${EXT}"
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: core-${{ matrix.goos }}-${{ matrix.goarch }}
|
||||
path: ./bin/core*
|
||||
path: ./bin/core-*
|
||||
|
||||
release:
|
||||
needs: build
|
||||
runs-on: ubuntu-latest
|
||||
build-ide:
|
||||
strategy:
|
||||
matrix:
|
||||
include:
|
||||
- os: macos-latest
|
||||
goos: darwin
|
||||
goarch: arm64
|
||||
- os: ubuntu-latest
|
||||
goos: linux
|
||||
goarch: amd64
|
||||
- os: windows-latest
|
||||
goos: windows
|
||||
goarch: amd64
|
||||
runs-on: ${{ matrix.os }}
|
||||
env:
|
||||
GOOS: ${{ matrix.goos }}
|
||||
GOARCH: ${{ matrix.goarch }}
|
||||
defaults:
|
||||
run:
|
||||
working-directory: internal/core-ide
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Setup Go
|
||||
uses: host-uk/build/actions/setup/go@v4.0.0
|
||||
with:
|
||||
go-version: "1.25"
|
||||
|
||||
- name: Setup Node.js
|
||||
uses: actions/setup-node@v4
|
||||
with:
|
||||
node-version: "20"
|
||||
|
||||
- name: Install Wails CLI
|
||||
run: go install github.com/wailsapp/wails/v3/cmd/wails3@latest
|
||||
|
||||
- name: Install frontend dependencies
|
||||
working-directory: internal/core-ide/frontend
|
||||
run: npm ci
|
||||
|
||||
- name: Generate bindings
|
||||
run: wails3 generate bindings -f '-tags production' -clean=false -ts -i
|
||||
|
||||
- name: Build frontend
|
||||
working-directory: internal/core-ide/frontend
|
||||
run: npm run build
|
||||
|
||||
- name: Install Linux dependencies
|
||||
if: matrix.goos == 'linux'
|
||||
run: |
|
||||
sudo apt-get update
|
||||
sudo apt-get install -y libgtk-3-dev libwebkit2gtk-4.0-dev
|
||||
|
||||
- name: Build IDE
|
||||
shell: bash
|
||||
run: |
|
||||
EXT=""
|
||||
if [ "$GOOS" = "windows" ]; then EXT=".exe"; fi
|
||||
BINARY="core-ide${EXT}"
|
||||
ARCHIVE_PREFIX="core-ide-${GOOS}-${GOARCH}"
|
||||
|
||||
BUILD_FLAGS="-tags production -trimpath -buildvcs=false"
|
||||
|
||||
if [ "$GOOS" = "windows" ]; then
|
||||
# Windows: no CGO, use windowsgui linker flag
|
||||
export CGO_ENABLED=0
|
||||
LDFLAGS="-w -s -H windowsgui"
|
||||
|
||||
# Generate Windows syso resource
|
||||
cd build
|
||||
wails3 generate syso -arch ${GOARCH} -icon windows/icon.ico -manifest windows/wails.exe.manifest -info windows/info.json -out ../wails_windows_${GOARCH}.syso
|
||||
cd ..
|
||||
elif [ "$GOOS" = "darwin" ]; then
|
||||
export CGO_ENABLED=1
|
||||
export CGO_CFLAGS="-mmacosx-version-min=10.15"
|
||||
export CGO_LDFLAGS="-mmacosx-version-min=10.15"
|
||||
export MACOSX_DEPLOYMENT_TARGET="10.15"
|
||||
LDFLAGS="-w -s"
|
||||
else
|
||||
export CGO_ENABLED=1
|
||||
LDFLAGS="-w -s"
|
||||
fi
|
||||
|
||||
go build ${BUILD_FLAGS} -ldflags="${LDFLAGS}" -o "./bin/${BINARY}"
|
||||
|
||||
# Clean up syso files
|
||||
rm -f *.syso
|
||||
|
||||
# Package
|
||||
if [ "$GOOS" = "darwin" ]; then
|
||||
# Create .app bundle
|
||||
mkdir -p "./bin/Core IDE.app/Contents/"{MacOS,Resources}
|
||||
cp build/darwin/icons.icns "./bin/Core IDE.app/Contents/Resources/"
|
||||
cp "./bin/${BINARY}" "./bin/Core IDE.app/Contents/MacOS/"
|
||||
cp build/darwin/Info.plist "./bin/Core IDE.app/Contents/"
|
||||
codesign --force --deep --sign - "./bin/Core IDE.app"
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "Core IDE.app"
|
||||
elif [ "$GOOS" = "windows" ]; then
|
||||
cd ./bin && zip "${ARCHIVE_PREFIX}.zip" "${BINARY}" && cd ..
|
||||
else
|
||||
tar czf "./bin/${ARCHIVE_PREFIX}.tar.gz" -C ./bin "${BINARY}"
|
||||
fi
|
||||
|
||||
# Rename raw binary
|
||||
mv "./bin/${BINARY}" "./bin/${ARCHIVE_PREFIX}${EXT}"
|
||||
|
||||
- name: Upload artifact
|
||||
uses: actions/upload-artifact@v4
|
||||
with:
|
||||
name: core-ide-${{ matrix.goos }}-${{ matrix.goarch }}
|
||||
path: internal/core-ide/bin/core-ide-*
|
||||
|
||||
release:
|
||||
needs: [build, build-ide]
|
||||
runs-on: ubuntu-latest
|
||||
outputs:
|
||||
version: ${{ steps.version.outputs.version }}
|
||||
steps:
|
||||
- uses: actions/checkout@v6
|
||||
|
||||
- name: Set version
|
||||
id: version
|
||||
run: echo "version=${{ github.ref_name }}" >> "$GITHUB_OUTPUT"
|
||||
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
|
|
@ -88,3 +213,242 @@ jobs:
|
|||
--title "Release $TAG_NAME" \
|
||||
--generate-notes \
|
||||
release/*
|
||||
|
||||
update-tap:
|
||||
needs: release
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
path: dist
|
||||
merge-multiple: true
|
||||
|
||||
- name: Generate checksums
|
||||
run: |
|
||||
cd dist
|
||||
for f in *.tar.gz; do
|
||||
sha256sum "$f" | awk '{print $1}' > "${f}.sha256"
|
||||
done
|
||||
echo "=== Checksums ==="
|
||||
cat *.sha256
|
||||
|
||||
- name: Update Homebrew formula
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.HOMEBREW_TAP_TOKEN }}
|
||||
VERSION: ${{ needs.release.outputs.version }}
|
||||
run: |
|
||||
# Strip leading 'v' for formula version
|
||||
FORMULA_VERSION="${VERSION#v}"
|
||||
|
||||
# Read checksums
|
||||
DARWIN_ARM64=$(cat dist/core-darwin-arm64.tar.gz.sha256)
|
||||
LINUX_AMD64=$(cat dist/core-linux-amd64.tar.gz.sha256)
|
||||
LINUX_ARM64=$(cat dist/core-linux-arm64.tar.gz.sha256)
|
||||
|
||||
# Clone tap repo (configure auth for push)
|
||||
gh repo clone host-uk/homebrew-tap /tmp/tap -- --depth=1
|
||||
cd /tmp/tap
|
||||
git remote set-url origin "https://x-access-token:${GH_TOKEN}@github.com/host-uk/homebrew-tap.git"
|
||||
cd -
|
||||
mkdir -p /tmp/tap/Formula
|
||||
|
||||
# Write formula
|
||||
cat > /tmp/tap/Formula/core.rb << FORMULA
|
||||
# typed: false
|
||||
# frozen_string_literal: true
|
||||
|
||||
class Core < Formula
|
||||
desc "Host UK development CLI"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
version "${FORMULA_VERSION}"
|
||||
license "EUPL-1.2"
|
||||
|
||||
on_macos do
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-darwin-arm64.tar.gz"
|
||||
sha256 "${DARWIN_ARM64}"
|
||||
end
|
||||
|
||||
on_linux do
|
||||
if Hardware::CPU.arm?
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-linux-arm64.tar.gz"
|
||||
sha256 "${LINUX_ARM64}"
|
||||
else
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-linux-amd64.tar.gz"
|
||||
sha256 "${LINUX_AMD64}"
|
||||
end
|
||||
end
|
||||
|
||||
def install
|
||||
bin.install "core"
|
||||
end
|
||||
|
||||
test do
|
||||
system "\#{bin}/core", "--version"
|
||||
end
|
||||
end
|
||||
FORMULA
|
||||
|
||||
# Remove leading whitespace from heredoc
|
||||
sed -i 's/^ //' /tmp/tap/Formula/core.rb
|
||||
|
||||
# Read IDE checksums (may not exist if build-ide failed)
|
||||
IDE_DARWIN_ARM64=$(cat dist/core-ide-darwin-arm64.tar.gz.sha256 2>/dev/null || echo "")
|
||||
IDE_LINUX_AMD64=$(cat dist/core-ide-linux-amd64.tar.gz.sha256 2>/dev/null || echo "")
|
||||
|
||||
# Write core-ide Formula (Linux binary)
|
||||
if [ -n "${IDE_LINUX_AMD64}" ]; then
|
||||
cat > /tmp/tap/Formula/core-ide.rb << FORMULA
|
||||
# typed: false
|
||||
# frozen_string_literal: true
|
||||
|
||||
class CoreIde < Formula
|
||||
desc "Host UK desktop development environment"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
version "${FORMULA_VERSION}"
|
||||
license "EUPL-1.2"
|
||||
|
||||
on_linux do
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-linux-amd64.tar.gz"
|
||||
sha256 "${IDE_LINUX_AMD64}"
|
||||
end
|
||||
|
||||
def install
|
||||
bin.install "core-ide"
|
||||
end
|
||||
end
|
||||
FORMULA
|
||||
sed -i 's/^ //' /tmp/tap/Formula/core-ide.rb
|
||||
fi
|
||||
|
||||
# Write core-ide Cask (macOS .app bundle)
|
||||
if [ -n "${IDE_DARWIN_ARM64}" ]; then
|
||||
mkdir -p /tmp/tap/Casks
|
||||
cat > /tmp/tap/Casks/core-ide.rb << CASK
|
||||
cask "core-ide" do
|
||||
version "${FORMULA_VERSION}"
|
||||
sha256 "${IDE_DARWIN_ARM64}"
|
||||
|
||||
url "https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-darwin-arm64.tar.gz"
|
||||
name "Core IDE"
|
||||
desc "Host UK desktop development environment"
|
||||
homepage "https://github.com/host-uk/core"
|
||||
|
||||
app "Core IDE.app"
|
||||
end
|
||||
CASK
|
||||
sed -i 's/^ //' /tmp/tap/Casks/core-ide.rb
|
||||
fi
|
||||
|
||||
cd /tmp/tap
|
||||
git config user.name "github-actions[bot]"
|
||||
git config user.email "github-actions[bot]@users.noreply.github.com"
|
||||
git add .
|
||||
git diff --cached --quiet && echo "No changes to tap" && exit 0
|
||||
git commit -m "Update core to ${FORMULA_VERSION}"
|
||||
git push
|
||||
|
||||
update-scoop:
|
||||
needs: release
|
||||
runs-on: ubuntu-latest
|
||||
steps:
|
||||
- name: Download artifacts
|
||||
uses: actions/download-artifact@v7
|
||||
with:
|
||||
path: dist
|
||||
merge-multiple: true
|
||||
|
||||
- name: Generate checksums
|
||||
run: |
|
||||
cd dist
|
||||
for f in *.zip; do
|
||||
[ -f "$f" ] || continue
|
||||
sha256sum "$f" | awk '{print $1}' > "${f}.sha256"
|
||||
done
|
||||
echo "=== Checksums ==="
|
||||
cat *.sha256 2>/dev/null || echo "No zip checksums"
|
||||
|
||||
- name: Update Scoop manifests
|
||||
env:
|
||||
GH_TOKEN: ${{ secrets.HOMEBREW_TAP_TOKEN }}
|
||||
VERSION: ${{ needs.release.outputs.version }}
|
||||
run: |
|
||||
# Strip leading 'v' for manifest version
|
||||
MANIFEST_VERSION="${VERSION#v}"
|
||||
|
||||
# Read checksums
|
||||
WIN_AMD64=$(cat dist/core-windows-amd64.zip.sha256 2>/dev/null || echo "")
|
||||
IDE_WIN_AMD64=$(cat dist/core-ide-windows-amd64.zip.sha256 2>/dev/null || echo "")
|
||||
|
||||
# Clone scoop bucket
|
||||
gh repo clone host-uk/scoop-bucket /tmp/scoop -- --depth=1
|
||||
cd /tmp/scoop
|
||||
git remote set-url origin "https://x-access-token:${GH_TOKEN}@github.com/host-uk/scoop-bucket.git"
|
||||
|
||||
# Write core.json manifest
|
||||
cat > core.json << 'MANIFEST'
|
||||
{
|
||||
"version": "VERSION_PLACEHOLDER",
|
||||
"description": "Host UK development CLI",
|
||||
"homepage": "https://github.com/host-uk/core",
|
||||
"license": "EUPL-1.2",
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "URL_PLACEHOLDER",
|
||||
"hash": "HASH_PLACEHOLDER",
|
||||
"bin": "core.exe"
|
||||
}
|
||||
},
|
||||
"checkver": "github",
|
||||
"autoupdate": {
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "https://github.com/host-uk/core/releases/download/v$version/core-windows-amd64.zip"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
MANIFEST
|
||||
|
||||
sed -i "s|VERSION_PLACEHOLDER|${MANIFEST_VERSION}|g" core.json
|
||||
sed -i "s|URL_PLACEHOLDER|https://github.com/host-uk/core/releases/download/${VERSION}/core-windows-amd64.zip|g" core.json
|
||||
sed -i "s|HASH_PLACEHOLDER|${WIN_AMD64}|g" core.json
|
||||
sed -i 's/^ //' core.json
|
||||
|
||||
# Write core-ide.json manifest
|
||||
if [ -n "${IDE_WIN_AMD64}" ]; then
|
||||
cat > core-ide.json << 'MANIFEST'
|
||||
{
|
||||
"version": "VERSION_PLACEHOLDER",
|
||||
"description": "Host UK desktop development environment",
|
||||
"homepage": "https://github.com/host-uk/core",
|
||||
"license": "EUPL-1.2",
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "URL_PLACEHOLDER",
|
||||
"hash": "HASH_PLACEHOLDER",
|
||||
"bin": "core-ide.exe"
|
||||
}
|
||||
},
|
||||
"checkver": "github",
|
||||
"autoupdate": {
|
||||
"architecture": {
|
||||
"64bit": {
|
||||
"url": "https://github.com/host-uk/core/releases/download/v$version/core-ide-windows-amd64.zip"
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
MANIFEST
|
||||
sed -i "s|VERSION_PLACEHOLDER|${MANIFEST_VERSION}|g" core-ide.json
|
||||
sed -i "s|URL_PLACEHOLDER|https://github.com/host-uk/core/releases/download/${VERSION}/core-ide-windows-amd64.zip|g" core-ide.json
|
||||
sed -i "s|HASH_PLACEHOLDER|${IDE_WIN_AMD64}|g" core-ide.json
|
||||
sed -i 's/^ //' core-ide.json
|
||||
fi
|
||||
|
||||
git config user.name "github-actions[bot]"
|
||||
git config user.email "github-actions[bot]@users.noreply.github.com"
|
||||
git add .
|
||||
git diff --cached --quiet && echo "No changes to scoop bucket" && exit 0
|
||||
git commit -m "Update core to ${MANIFEST_VERSION}"
|
||||
git push
|
||||
|
|
|
|||
67  CLAUDE.md
@@ -38,7 +38,7 @@ Run a single test: `go test -run TestName ./...`

### Core Framework (`core.go`, `interfaces.go`)

The `Core` struct is the central application container managing:
- - **Services**: Named service registry with type-safe retrieval via `ServiceFor[T]()` and `MustServiceFor[T]()`
+ - **Services**: Named service registry with type-safe retrieval via `ServiceFor[T]()`
- **Actions/IPC**: Message-passing system where services communicate via `ACTION(msg Message)` and register handlers via `RegisterAction()`
- **Lifecycle**: Services implementing `Startable` (OnStartup) and/or `Stoppable` (OnShutdown) interfaces are automatically called during app lifecycle

@@ -97,6 +97,69 @@ Tests use `_Good`, `_Bad`, `_Ugly` suffix pattern:

Uses Go 1.25 workspaces. The workspace includes:
- Root module (Core framework)
- `cmd/core-gui` (Wails GUI application)
- `cmd/bugseti` (BugSETI system tray app - distributed bug fixing)
- `cmd/examples/*` (Example applications)

After adding modules: `go work sync`

## Additional Packages

### pkg/ws (WebSocket Hub)

Real-time streaming via WebSocket connections. Implements a hub pattern for managing connections and channel-based subscriptions.

```go
hub := ws.NewHub()
go hub.Run(ctx)

// Register HTTP handler
http.HandleFunc("/ws", hub.Handler())

// Send process output to subscribers
hub.SendProcessOutput(processID, "output line")
```

Message types: `process_output`, `process_status`, `event`, `error`, `ping/pong`, `subscribe/unsubscribe`

### pkg/webview (Browser Automation)

Chrome DevTools Protocol (CDP) client for browser automation, testing, and scraping.

```go
wv, err := webview.New(webview.WithDebugURL("http://localhost:9222"))
defer wv.Close()

wv.Navigate("https://example.com")
wv.Click("#submit-button")
wv.Type("#input", "text")
screenshot, _ := wv.Screenshot()
```

Features: Navigation, DOM queries, console capture, screenshots, JavaScript evaluation, Angular helpers

### pkg/mcp (MCP Server)

Model Context Protocol server with tools for:
- **File operations**: file_read, file_write, file_edit, file_delete, file_rename, file_exists, dir_list, dir_create
- **RAG**: rag_query, rag_ingest, rag_collections (Qdrant + Ollama)
- **Metrics**: metrics_record, metrics_query (JSONL storage)
- **Language detection**: lang_detect, lang_list
- **Process management**: process_start, process_stop, process_kill, process_list, process_output, process_input
- **WebSocket**: ws_start, ws_info
- **Webview/CDP**: webview_connect, webview_navigate, webview_click, webview_type, webview_query, webview_console, webview_eval, webview_screenshot, webview_wait, webview_disconnect

Run server: `core mcp serve` (stdio) or `MCP_ADDR=:9000 core mcp serve` (TCP)

## BugSETI Application

System tray application for distributed bug fixing - "like SETI@home but for code".

Features:
- Fetches OSS issues from GitHub
- AI-powered context preparation via seeder
- Issue queue management
- Automated PR submission
- Stats tracking and leaderboard

Build: `task bugseti:build`
Run: `task bugseti:dev`
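The `pkg/mcp` section above lists the available tool names and notes the TCP transport (`MCP_ADDR=:9000 core mcp serve`). A rough sketch of invoking `rag_query` over that transport: the JSON-RPC `tools/call` envelope follows the usual MCP convention, while the argument keys and the newline-delimited framing are assumptions for illustration.

```go
// Rough sketch of calling the rag_query tool over the TCP transport mentioned
// above. The "tools/call" envelope is standard MCP; the argument keys
// ("query", "collection", "top_k") and the line-based framing are assumed.
package main

import (
	"bufio"
	"encoding/json"
	"fmt"
	"net"
)

func main() {
	conn, err := net.Dial("tcp", "localhost:9000")
	if err != nil {
		panic(err)
	}
	defer conn.Close()

	req := map[string]any{
		"jsonrpc": "2.0",
		"id":      1,
		"method":  "tools/call",
		"params": map[string]any{
			"name": "rag_query",
			"arguments": map[string]any{
				"query":      "how do I configure release publishers?",
				"collection": "hostuk-docs", // default collection per the commit message
				"top_k":      5,             // assumed key name for topK
			},
		},
	}

	payload, _ := json.Marshal(req)
	if _, err := conn.Write(append(payload, '\n')); err != nil {
		panic(err)
	}

	// Read one newline-delimited response (framing assumed).
	line, err := bufio.NewReader(conn).ReadString('\n')
	if err != nil {
		panic(err)
	}
	fmt.Println(line)
}
```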
||||
163  README.md
@@ -22,12 +22,31 @@ Core is an **opinionated Web3 desktop application framework** providing:

**Mental model:** A secure, encrypted workspace manager where each "workspace" is a cryptographically isolated environment. The framework handles windows, menus, trays, config, and i18n.

- ## Quick Start
+ ## CLI Quick Start

```bash
# 1. Install Core
go install github.com/host-uk/core/cmd/core@latest

# 2. Verify environment
core doctor

# 3. Run tests in any Go/PHP project
core go test   # or core php test

# 4. Build and preview release
core build
core ci
```

For more details, see the [User Guide](docs/user-guide.md).

## Framework Quick Start (Go)

```go
- import core "github.com/host-uk/core"
+ import core "github.com/host-uk/core/pkg/framework/core"

- app := core.New(
+ app, err := core.New(
    core.WithServiceLock(),
)
```
||||
|
|
@ -118,7 +137,7 @@ Any configuration value can be overridden using environment variables with the `
|
|||
| `task test-gen` | Generate test stubs for public API |
|
||||
| `task check` | go mod tidy + tests + review |
|
||||
| `task review` | CodeRabbit review |
|
||||
| `task cov` | Generate coverage.txt |
|
||||
| `task cov` | Run tests with coverage report |
|
||||
| `task cov-view` | Open HTML coverage report |
|
||||
| `task sync` | Update public API Go files |
|
||||
|
||||
|
|
@ -130,21 +149,20 @@ Any configuration value can be overridden using environment variables with the `
|
|||
|
||||
```
|
||||
.
|
||||
├── core.go # Facade re-exporting pkg/core
|
||||
├── main.go # CLI application entry point
|
||||
├── pkg/
|
||||
│ ├── core/ # Service container, DI, Runtime[T]
|
||||
│ ├── config/ # JSON persistence, XDG paths
|
||||
│ ├── display/ # Windows, tray, menus (Wails)
|
||||
│ ├── framework/core/ # Service container, DI, Runtime[T]
|
||||
│ ├── crypt/ # Hashing, checksums, PGP
|
||||
│ │ └── openpgp/ # Full PGP implementation
|
||||
│ ├── io/ # Medium interface + backends
|
||||
│ ├── workspace/ # Encrypted workspace management
|
||||
│ ├── help/ # In-app documentation
|
||||
│ └── i18n/ # Internationalization
|
||||
├── cmd/
|
||||
│ ├── core/ # CLI application
|
||||
│ └── core-gui/ # Wails GUI application
|
||||
└── go.work # Links root, cmd/core, cmd/core-gui
|
||||
│ ├── i18n/ # Internationalization
|
||||
│ ├── repos/ # Multi-repo registry & management
|
||||
│ ├── agentic/ # AI agent task management
|
||||
│ └── mcp/ # Model Context Protocol service
|
||||
├── internal/
|
||||
│ ├── cmd/ # CLI command implementations
|
||||
│ └── variants/ # Build variants (full, minimal, etc.)
|
||||
└── go.mod # Go module definition
|
||||
```
|
||||
|
||||
### Service Pattern (Dual-Constructor DI)
|
||||
|
|
@ -201,6 +219,40 @@ Service("workspace") // Get service by name (returns any)
|
|||
|
||||
**NOT exposed:** Direct calls like `workspace.CreateWorkspace()` or `crypt.Hash()`.
|
||||
|
||||
## Configuration Management

Core uses a **centralized configuration service** implemented in `pkg/config`, with YAML-based persistence and layered overrides.

The `pkg/config` package provides:

- YAML-backed persistence at `~/.core/config.yaml`
- Dot-notation key access (for example: `cfg.Set("dev.editor", "vim")`, `cfg.GetString("dev.editor")`; see the sketch below)
- Environment variable overlay support (env vars can override persisted values)
- Thread-safe operations for concurrent reads/writes

Application code should treat `pkg/config` as the **primary configuration mechanism**. Direct reads/writes to YAML files should generally be avoided from application logic in favour of using this centralized service.
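A minimal sketch of the dot-notation API described above. The `Set`/`GetString` calls mirror the examples in this section; the `config.Load` constructor name and the error handling are assumptions for illustration.

```go
// Minimal sketch of the pkg/config dot-notation API. The Load constructor
// name is assumed; Set and GetString follow the examples given above.
package main

import (
	"fmt"

	"github.com/host-uk/core/pkg/config"
)

func main() {
	cfg, err := config.Load() // assumed constructor; backed by ~/.core/config.yaml
	if err != nil {
		panic(err)
	}

	cfg.Set("dev.editor", "vim")             // dot-notation write
	fmt.Println(cfg.GetString("dev.editor")) // read; env-var overlays take precedence
}
```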
|
||||
|
||||
### Project and Service Configuration Files

In addition to the centralized configuration service, Core uses several YAML files for project-specific build/CI and service configuration. These live alongside (but are distinct from) the centralized configuration:

- **Project Configuration** (in the `.core/` directory of the project root):
  - `build.yaml`: Build targets, flags, and project metadata.
  - `release.yaml`: Release automation, changelog settings, and publishing targets.
  - `ci.yaml`: CI pipeline configuration.
- **Global Configuration** (in the `~/.core/` directory):
  - `config.yaml`: Centralized user/framework settings and defaults, managed via `pkg/config`.
  - `agentic.yaml`: Configuration for agentic services (BaseURL, Token, etc.).
- **Registry Configuration** (`repos.yaml`, auto-discovered; a discovery sketch follows this list):
  - Multi-repo registry definition.
  - Searched in the current directory and its parent directories (walking up).
  - Then in `~/Code/host-uk/repos.yaml`.
  - Finally in `~/.config/core/repos.yaml`.
|
||||
|
||||
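The discovery sketch referenced above: a standard-library-only illustration of the documented search order. The helper name `findReposConfig` is hypothetical and not part of `pkg/repos`.

```go
package main

import (
	"os"
	"path/filepath"
)

// findReposConfig walks the documented locations in order and returns the
// first repos.yaml that exists, or "" if none is found. Illustrative only.
func findReposConfig() string {
	// 1. Current directory and its parent directories.
	dir, _ := os.Getwd()
	for {
		candidate := filepath.Join(dir, "repos.yaml")
		if _, err := os.Stat(candidate); err == nil {
			return candidate
		}
		parent := filepath.Dir(dir)
		if parent == dir {
			break // reached the filesystem root
		}
		dir = parent
	}

	// 2. ~/Code/host-uk/repos.yaml, then 3. ~/.config/core/repos.yaml.
	home, _ := os.UserHomeDir()
	for _, candidate := range []string{
		filepath.Join(home, "Code", "host-uk", "repos.yaml"),
		filepath.Join(home, ".config", "core", "repos.yaml"),
	} {
		if _, err := os.Stat(candidate); err == nil {
			return candidate
		}
	}
	return ""
}
```
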
### Format

All persisted configuration files described above use **YAML** format for readability and nested structure support.

### The IPC Bridge Pattern (Chosen Architecture)

Sub-services are accessed via Core's **IPC/ACTION system**, not direct Wails bindings:

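As a rough orientation, a sketch of what such an IPC handler can look like, built only around the signature visible in the hunk below; the `Action`/`Payload` fields and the action names are assumptions about `core.Message`, not its confirmed shape:

```go
// Sketch only: the fields on core.Message and the helper methods are assumed.
func (s *Service) HandleIPCEvents(c *core.Core, msg core.Message) error {
	switch msg.Action { // assumed field
	case "workspace.create":
		// Delegate to the service's own logic instead of exposing the
		// method directly to the frontend via Wails bindings.
		return s.createWorkspace(msg.Payload) // hypothetical helper and field
	case "workspace.list":
		return s.sendWorkspaceList(c) // hypothetical helper
	default:
		return nil // unknown actions are ignored by this service
	}
}
```
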
@@ -241,16 +293,15 @@ func (s *Service) HandleIPCEvents(c *core.Core, msg core.Message) error {

### Generating Bindings

Wails v3 bindings are typically generated in the GUI repository (e.g., `core-gui`).

```bash
cd cmd/core-gui
wails3 generate bindings   # Regenerate after Go changes
```

Bindings output to `cmd/core-gui/public/bindings/github.com/host-uk/core/` mirroring Go package structure.

---

### Service Interfaces (`pkg/core/interfaces.go`)
### Service Interfaces (`pkg/framework/core/interfaces.go`)

```go
type Config interface {
```

@@ -283,54 +334,27 @@ type Crypt interface {

| Package | Notes |
|---------|-------|
| `pkg/core` | Service container, DI, thread-safe - solid |
| `pkg/config` | JSON persistence, XDG paths - solid |
| `pkg/crypt` | Hashing, checksums, PGP - solid, well-tested |
| `pkg/help` | Embedded docs, Show/ShowAt - solid |
| `pkg/framework/core` | Service container, DI, thread-safe - solid |
| `pkg/config` | Layered YAML configuration, XDG paths - solid |
| `pkg/crypt` | Hashing, checksums, symmetric/asymmetric - solid, well-tested |
| `pkg/help` | Embedded docs, full-text search - solid |
| `pkg/i18n` | Multi-language with go-i18n - solid |
| `pkg/io` | Medium interface + local backend - solid |
| `pkg/workspace` | Workspace creation, switching, file ops - functional |

### Partial

| Package | Issues |
|---------|--------|
| `pkg/display` | Window creation works; menu/tray handlers are TODOs |

---

## Priority Work Items

### 1. IMPLEMENT: System Tray Brand Support

`pkg/display/tray.go:52-63` - Commented brand-specific menu items need implementation.

### 2. ADD: Integration Tests

| Package | Notes |
|---------|-------|
| `pkg/display` | Integration tests requiring Wails runtime (27% unit coverage) |
| `pkg/repos` | Multi-repo registry & management - solid |
| `pkg/agentic` | AI agent task management - solid |
| `pkg/mcp` | Model Context Protocol service - solid |

---

## Package Deep Dives

### pkg/workspace - The Core Feature
### pkg/crypt

Each workspace:

1. Is identified by an LTHN hash of the user identifier
2. Has the directory structure `config/`, `log/`, `data/`, `files/`, `keys/`
3. Gets a PGP keypair generated on creation
4. Accesses files via obfuscated paths

The `workspaceList` maps workspace IDs to public keys.

### pkg/crypt/openpgp

Full PGP using `github.com/ProtonMail/go-crypto`:
- `CreateKeyPair(name, passphrase)` - RSA-4096 with revocation cert
- `EncryptPGP()` - Encrypt + optional signing
- `DecryptPGP()` - Decrypt + optional signature verification

The crypt package provides a comprehensive suite of cryptographic primitives (see the sketch after this list):
- **Hashing & Checksums**: SHA-256, SHA-512, and CRC32 support.
- **Symmetric Encryption**: AES-GCM and ChaCha20-Poly1305 for secure data at rest.
- **Key Derivation**: Argon2id for secure password hashing.
- **Asymmetric Encryption**: PGP implementation in the `pkg/crypt/openpgp` subpackage using `github.com/ProtonMail/go-crypto`.

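The sketch referenced above: a self-contained illustration of two of the listed primitives (Argon2id key derivation feeding AES-GCM) using the standard library and `golang.org/x/crypto`. This is not the `pkg/crypt` API itself, only the building blocks it wraps.

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"fmt"

	"golang.org/x/crypto/argon2"
)

func main() {
	password := []byte("correct horse battery staple")

	// Argon2id: derive a 32-byte key from the password and a random salt.
	salt := make([]byte, 16)
	rand.Read(salt) // error handling omitted for brevity in this sketch

	key := argon2.IDKey(password, salt, 1, 64*1024, 4, 32)

	// AES-GCM: authenticated encryption with the derived key.
	block, err := aes.NewCipher(key)
	if err != nil {
		panic(err)
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		panic(err)
	}
	nonce := make([]byte, gcm.NonceSize())
	rand.Read(nonce)

	ciphertext := gcm.Seal(nil, nonce, []byte("secret data"), nil)
	plaintext, err := gcm.Open(nil, nonce, ciphertext, nil)
	if err != nil {
		panic(err)
	}
	fmt.Printf("%s\n", plaintext)
}
```
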
### pkg/io - Storage Abstraction

@@ -393,10 +417,27 @@ Implementations: `local/`, `sftp/`, `webdav/`

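For illustration, a sketch of what a storage `Medium` interface along these lines can look like; the method set is inferred from the backend list (`local/`, `sftp/`, `webdav/`) rather than copied from `pkg/io`, so treat every method name as an assumption.

```go
// Medium sketches a pluggable storage backend. The real pkg/io interface
// may differ; this only illustrates the shape of the abstraction.
type Medium interface {
	Read(path string) ([]byte, error)
	Write(path string, data []byte) error
	List(dir string) ([]string, error)
	Delete(path string) error
}
```
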
---

## Getting Help

- **[User Guide](docs/user-guide.md)**: Detailed usage and concepts.
- **[FAQ](docs/faq.md)**: Frequently asked questions.
- **[Workflows](docs/workflows.md)**: Common task sequences.
- **[Troubleshooting](docs/troubleshooting.md)**: Solving common issues.
- **[Configuration](docs/configuration.md)**: Config file reference.

```bash
# Check environment
core doctor

# Command help
core <command> --help
```

---

## For New Contributors

1. Run `task test` to verify all tests pass
2. Follow TDD: `task test-gen` creates stubs, implement to pass
3. The dual-constructor pattern is intentional: `New(deps)` for tests, `Register()` for runtime (see the sketch after this list)
4. See `cmd/core-gui/main.go` for how services wire together
5. IPC handlers in each service's `HandleIPCEvents()` are the frontend bridge

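The sketch referenced in point 3, showing the dual-constructor shape in miniature; the `core.Registry` type, the `MustServiceFor` usage, and the `Add` call are illustrative assumptions rather than the framework's confirmed API:

```go
// Service declares its dependencies explicitly so tests can inject fakes.
type Service struct {
	cfg Config // dependency interface, injected
}

// New is the test-friendly constructor: dependencies are passed directly.
func New(cfg Config) *Service {
	return &Service{cfg: cfg}
}

// Register is the runtime constructor: it resolves dependencies from the
// service container and registers the service under a name. The container
// type and method names here are assumptions for illustration.
func Register(r *core.Registry) *Service {
	s := New(r.MustServiceFor("config").(Config))
	r.Add("workspace", s)
	return s
}
```
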
@@ -53,6 +53,11 @@ tasks:
    cmds:
      - core go cov

  cov-view:
    desc: "Open HTML coverage report"
    cmds:
      - core go cov --open

  fmt:
    desc: "Format Go code"
    cmds:

cmd/bugseti/.gitignore (new file, 31 lines, vendored)
@@ -0,0 +1,31 @@
# Build output
bin/
frontend/dist/
frontend/node_modules/
frontend/.angular/

# IDE
.idea/
.vscode/
*.swp
*.swo
*~

# OS
.DS_Store
Thumbs.db

# Go
*.exe
*.exe~
*.dll
*.so
*.dylib

# Test
*.test
*.out
coverage/

# Wails
wails.json

cmd/bugseti/README.md (new file, 186 lines)
@@ -0,0 +1,186 @@
# BugSETI

**Distributed Bug Fixing - like SETI@home but for code**

BugSETI is a system tray application that helps developers contribute to open source by fixing bugs in their spare CPU cycles. It fetches issues from GitHub repositories, prepares context using AI, and guides you through the fix-and-submit workflow.

## Features

- **System Tray Integration**: Runs quietly in the background, ready when you are
- **Issue Queue**: Automatically fetches and queues issues from configured repositories
- **AI Context Seeding**: Prepares relevant code context for each issue using pattern matching
- **Workbench UI**: Full-featured interface for reviewing issues and submitting fixes
- **Automated PR Submission**: Streamlined workflow from fix to pull request
- **Stats & Leaderboard**: Track your contributions and compete with the community

## Installation

### From Source

```bash
# Clone the repository
git clone https://github.com/host-uk/core.git
cd core

# Build BugSETI
task bugseti:build

# The binary will be in build/bin/bugseti
```

### Prerequisites

- Go 1.25 or later
- Node.js 18+ and npm (for frontend)
- GitHub CLI (`gh`) authenticated
- Chrome/Chromium (optional, for webview features)

## Configuration

On first launch, BugSETI will show an onboarding wizard to configure:

1. **GitHub Token**: For fetching issues and submitting PRs
2. **Repositories**: Which repos to fetch issues from
3. **Filters**: Issue labels, difficulty levels, languages
4. **Notifications**: How to alert you about new issues

### Configuration File

Settings are stored in `~/.config/bugseti/config.json`:

```json
{
  "github_token": "ghp_...",
  "repositories": [
    "host-uk/core",
    "example/repo"
  ],
  "filters": {
    "labels": ["good first issue", "help wanted", "bug"],
    "languages": ["go", "typescript"],
    "max_age_days": 30
  },
  "notifications": {
    "enabled": true,
    "sound": true
  },
  "fetch_interval_minutes": 30
}
```
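For reference, a sketch of how this file can be loaded on the Go side. The struct fields are inferred from the JSON keys above; the Go type names and the `LoadConfig` helper are illustrative, and the actual `internal/bugseti/config.go` may differ.

```go
package bugseti

import (
	"encoding/json"
	"os"
	"path/filepath"
)

// Config mirrors the JSON example above; only the JSON tags are taken from
// the documented file, the Go field names are a sketch.
type Config struct {
	GitHubToken  string   `json:"github_token"`
	Repositories []string `json:"repositories"`
	Filters      struct {
		Labels     []string `json:"labels"`
		Languages  []string `json:"languages"`
		MaxAgeDays int      `json:"max_age_days"`
	} `json:"filters"`
	Notifications struct {
		Enabled bool `json:"enabled"`
		Sound   bool `json:"sound"`
	} `json:"notifications"`
	FetchIntervalMinutes int `json:"fetch_interval_minutes"`
}

// LoadConfig reads ~/.config/bugseti/config.json. Helper name is illustrative.
func LoadConfig() (*Config, error) {
	home, err := os.UserHomeDir()
	if err != nil {
		return nil, err
	}
	data, err := os.ReadFile(filepath.Join(home, ".config", "bugseti", "config.json"))
	if err != nil {
		return nil, err
	}
	var cfg Config
	if err := json.Unmarshal(data, &cfg); err != nil {
		return nil, err
	}
	return &cfg, nil
}
```
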
## Usage

### Starting BugSETI

```bash
# Run the application
./bugseti

# Or use task runner
task bugseti:run
```

The app will appear in your system tray. Click the icon to see the quick menu or open the workbench.

### Workflow

1. **Browse Issues**: Click the tray icon to see available issues
2. **Select an Issue**: Choose one to work on from the queue
3. **Review Context**: BugSETI shows relevant files and patterns
4. **Fix the Bug**: Make your changes in your preferred editor
5. **Submit PR**: Use the workbench to create and submit your pull request

### Keyboard Shortcuts

| Shortcut | Action |
|----------|--------|
| `Ctrl+Shift+B` | Open workbench |
| `Ctrl+Shift+N` | Next issue |
| `Ctrl+Shift+S` | Submit PR |

## Architecture

```
cmd/bugseti/
  main.go            # Application entry point
  tray.go            # System tray service
  icons/             # Tray icons (light/dark/template)
  frontend/          # Angular frontend
    src/
      app/
        tray/        # Tray panel component
        workbench/   # Main workbench
        settings/    # Settings panel
        onboarding/  # First-run wizard

internal/bugseti/
  config.go          # Configuration service
  fetcher.go         # GitHub issue fetcher
  queue.go           # Issue queue management
  seeder.go          # Context seeding via AI
  submit.go          # PR submission
  notify.go          # Notification service
  stats.go           # Statistics tracking
```

## Contributing

We welcome contributions! Here's how to get involved:

### Development Setup

```bash
# Install dependencies
cd cmd/bugseti/frontend
npm install

# Run in development mode
task bugseti:dev
```

### Running Tests

```bash
# Go tests
go test ./cmd/bugseti/... ./internal/bugseti/...

# Frontend tests
cd cmd/bugseti/frontend
npm test
```

### Submitting Changes

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/my-feature`
3. Make your changes and add tests
4. Run the test suite: `task test`
5. Submit a pull request

### Code Style

- Go: Follow standard Go conventions, run `go fmt`
- TypeScript/Angular: Follow the Angular style guide
- Commits: Use conventional commit messages

## Roadmap

- [ ] Auto-update mechanism
- [ ] Team/organization support
- [ ] Integration with more issue trackers (GitLab, Jira)
- [ ] AI-assisted code review
- [ ] Mobile companion app

## License

MIT License - see [LICENSE](../../LICENSE) for details.

## Acknowledgments

- Inspired by SETI@home and distributed computing projects
- Built with [Wails v3](https://wails.io/) for native desktop integration
- Uses [Angular](https://angular.io/) for the frontend

---

**Happy Bug Hunting!**

cmd/bugseti/Taskfile.yml (new file, 134 lines)
@@ -0,0 +1,134 @@
version: '3'
|
||||
|
||||
includes:
|
||||
common: ./build/Taskfile.yml
|
||||
windows: ./build/windows/Taskfile.yml
|
||||
darwin: ./build/darwin/Taskfile.yml
|
||||
linux: ./build/linux/Taskfile.yml
|
||||
|
||||
vars:
|
||||
APP_NAME: "bugseti"
|
||||
BIN_DIR: "bin"
|
||||
VITE_PORT: '{{.WAILS_VITE_PORT | default 9246}}'
|
||||
|
||||
tasks:
|
||||
build:
|
||||
summary: Builds the application
|
||||
cmds:
|
||||
- task: "{{OS}}:build"
|
||||
|
||||
package:
|
||||
summary: Packages a production build of the application
|
||||
cmds:
|
||||
- task: "{{OS}}:package"
|
||||
|
||||
run:
|
||||
summary: Runs the application
|
||||
cmds:
|
||||
- task: "{{OS}}:run"
|
||||
|
||||
dev:
|
||||
summary: Runs the application in development mode
|
||||
cmds:
|
||||
- wails3 dev -config ./build/config.yml -port {{.VITE_PORT}}
|
||||
|
||||
build:all:
|
||||
summary: Builds for all platforms
|
||||
cmds:
|
||||
- task: darwin:build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
- task: linux:build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
- task: windows:build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
|
||||
package:all:
|
||||
summary: Packages for all platforms
|
||||
cmds:
|
||||
- task: darwin:package
|
||||
- task: linux:package
|
||||
- task: windows:package
|
||||
|
||||
clean:
|
||||
summary: Cleans build artifacts
|
||||
cmds:
|
||||
- rm -rf bin/
|
||||
- rm -rf frontend/dist/
|
||||
- rm -rf frontend/node_modules/
|
||||
|
||||
# Release targets
|
||||
release:stable:
|
||||
summary: Creates a stable release tag
|
||||
desc: |
|
||||
Creates a stable release tag (bugseti-vX.Y.Z).
|
||||
Usage: task release:stable VERSION=1.0.0
|
||||
preconditions:
|
||||
- sh: '[ -n "{{.VERSION}}" ]'
|
||||
msg: "VERSION is required. Usage: task release:stable VERSION=1.0.0"
|
||||
cmds:
|
||||
- git tag -a "bugseti-v{{.VERSION}}" -m "BugSETI v{{.VERSION}} stable release"
|
||||
- echo "Created tag bugseti-v{{.VERSION}}"
|
||||
- echo "To push: git push origin bugseti-v{{.VERSION}}"
|
||||
|
||||
release:beta:
|
||||
summary: Creates a beta release tag
|
||||
desc: |
|
||||
Creates a beta release tag (bugseti-vX.Y.Z-beta.N).
|
||||
Usage: task release:beta VERSION=1.0.0 BETA=1
|
||||
preconditions:
|
||||
- sh: '[ -n "{{.VERSION}}" ]'
|
||||
msg: "VERSION is required. Usage: task release:beta VERSION=1.0.0 BETA=1"
|
||||
- sh: '[ -n "{{.BETA}}" ]'
|
||||
msg: "BETA number is required. Usage: task release:beta VERSION=1.0.0 BETA=1"
|
||||
cmds:
|
||||
- git tag -a "bugseti-v{{.VERSION}}-beta.{{.BETA}}" -m "BugSETI v{{.VERSION}} beta {{.BETA}}"
|
||||
- echo "Created tag bugseti-v{{.VERSION}}-beta.{{.BETA}}"
|
||||
- echo "To push: git push origin bugseti-v{{.VERSION}}-beta.{{.BETA}}"
|
||||
|
||||
release:nightly:
|
||||
summary: Creates a nightly release tag
|
||||
desc: Creates a nightly release tag (bugseti-nightly-YYYYMMDD)
|
||||
vars:
|
||||
DATE:
|
||||
sh: date -u +%Y%m%d
|
||||
cmds:
|
||||
- git tag -a "bugseti-nightly-{{.DATE}}" -m "BugSETI nightly build {{.DATE}}"
|
||||
- echo "Created tag bugseti-nightly-{{.DATE}}"
|
||||
- echo "To push: git push origin bugseti-nightly-{{.DATE}}"
|
||||
|
||||
release:push:
|
||||
summary: Pushes the latest release tag
|
||||
desc: |
|
||||
Pushes the most recent bugseti-* tag to origin.
|
||||
Usage: task release:push
|
||||
vars:
|
||||
TAG:
|
||||
sh: git tag -l 'bugseti-*' | sort -V | tail -1
|
||||
preconditions:
|
||||
- sh: '[ -n "{{.TAG}}" ]'
|
||||
msg: "No bugseti-* tags found"
|
||||
cmds:
|
||||
- echo "Pushing tag {{.TAG}}..."
|
||||
- git push origin {{.TAG}}
|
||||
- echo "Tag {{.TAG}} pushed. GitHub Actions will build and release."
|
||||
|
||||
release:list:
|
||||
summary: Lists all BugSETI release tags
|
||||
cmds:
|
||||
- echo "=== BugSETI Release Tags ==="
|
||||
- git tag -l 'bugseti-*' | sort -V
|
||||
|
||||
version:
|
||||
summary: Shows current version info
|
||||
cmds:
|
||||
- |
|
||||
echo "=== BugSETI Version Info ==="
|
||||
echo "Latest stable tag:"
|
||||
git tag -l 'bugseti-v*' | grep -v beta | sort -V | tail -1 || echo " (none)"
|
||||
echo "Latest beta tag:"
|
||||
git tag -l 'bugseti-v*-beta.*' | sort -V | tail -1 || echo " (none)"
|
||||
echo "Latest nightly tag:"
|
||||
git tag -l 'bugseti-nightly-*' | sort -V | tail -1 || echo " (none)"
|
||||
cmd/bugseti/build/Taskfile.yml (new file, 90 lines)
@@ -0,0 +1,90 @@
version: '3'
|
||||
|
||||
tasks:
|
||||
go:mod:tidy:
|
||||
summary: Runs `go mod tidy`
|
||||
internal: true
|
||||
cmds:
|
||||
- go mod tidy
|
||||
|
||||
install:frontend:deps:
|
||||
summary: Install frontend dependencies
|
||||
dir: frontend
|
||||
sources:
|
||||
- package.json
|
||||
- package-lock.json
|
||||
generates:
|
||||
- node_modules/*
|
||||
preconditions:
|
||||
- sh: npm version
|
||||
msg: "Looks like npm isn't installed. Npm is part of the Node installer: https://nodejs.org/en/download/"
|
||||
cmds:
|
||||
- npm install
|
||||
|
||||
build:frontend:
|
||||
label: build:frontend (PRODUCTION={{.PRODUCTION}})
|
||||
summary: Build the frontend project
|
||||
dir: frontend
|
||||
sources:
|
||||
- "**/*"
|
||||
generates:
|
||||
- dist/**/*
|
||||
deps:
|
||||
- task: install:frontend:deps
|
||||
- task: generate:bindings
|
||||
vars:
|
||||
BUILD_FLAGS:
|
||||
ref: .BUILD_FLAGS
|
||||
cmds:
|
||||
- npm run {{.BUILD_COMMAND}} -q
|
||||
env:
|
||||
PRODUCTION: '{{.PRODUCTION | default "false"}}'
|
||||
vars:
|
||||
BUILD_COMMAND: '{{if eq .PRODUCTION "true"}}build{{else}}build:dev{{end}}'
|
||||
|
||||
generate:bindings:
|
||||
label: generate:bindings (BUILD_FLAGS={{.BUILD_FLAGS}})
|
||||
summary: Generates bindings for the frontend
|
||||
deps:
|
||||
- task: go:mod:tidy
|
||||
sources:
|
||||
- "**/*.[jt]s"
|
||||
- exclude: frontend/**/*
|
||||
- frontend/bindings/**/*
|
||||
- "**/*.go"
|
||||
- go.mod
|
||||
- go.sum
|
||||
generates:
|
||||
- frontend/bindings/**/*
|
||||
cmds:
|
||||
- wails3 generate bindings -f '{{.BUILD_FLAGS}}' -clean=false -ts -i
|
||||
|
||||
generate:icons:
|
||||
summary: Generates Windows `.ico` and Mac `.icns` files from an image
|
||||
dir: build
|
||||
sources:
|
||||
- "appicon.png"
|
||||
generates:
|
||||
- "darwin/icons.icns"
|
||||
- "windows/icon.ico"
|
||||
cmds:
|
||||
- wails3 generate icons -input appicon.png -macfilename darwin/icons.icns -windowsfilename windows/icon.ico
|
||||
|
||||
dev:frontend:
|
||||
summary: Runs the frontend in development mode
|
||||
dir: frontend
|
||||
deps:
|
||||
- task: install:frontend:deps
|
||||
cmds:
|
||||
- npm run dev -- --port {{.VITE_PORT}}
|
||||
vars:
|
||||
VITE_PORT: '{{.VITE_PORT | default "5173"}}'
|
||||
|
||||
update:build-assets:
|
||||
summary: Updates the build assets
|
||||
dir: build
|
||||
preconditions:
|
||||
- sh: '[ -n "{{.APP_NAME}}" ]'
|
||||
msg: "APP_NAME variable is required"
|
||||
cmds:
|
||||
- wails3 update build-assets -name "{{.APP_NAME}}" -binaryname "{{.APP_NAME}}" -config config.yml -dir .
|
||||
cmd/bugseti/build/config.yml (new file, 30 lines)
@@ -0,0 +1,30 @@
# BugSETI Wails v3 Build Configuration

version: "3"

# Application information
name: "BugSETI"
outputfilename: "bugseti"
description: "Distributed Bug Fixing - like SETI@home but for code"
productidentifier: "io.lethean.bugseti"
productname: "BugSETI"
productcompany: "Lethean"
copyright: "Copyright 2026 Lethean"

# Development server
devserver:
  frontend: "http://localhost:9246"

# Frontend configuration
frontend:
  dir: "frontend"
  installcmd: "npm install"
  buildcmd: "npm run build"
  devcmd: "npm run dev"

# Build information
info:
  companyname: "Lethean"
  productversion: "0.1.0"
  fileversion: "0.1.0"
  comments: "Distributed OSS bug fixing application"

cmd/bugseti/build/darwin/Info.dev.plist (new file, 37 lines)
@@ -0,0 +1,37 @@
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
|
||||
<plist version="1.0">
|
||||
<dict>
|
||||
<key>CFBundlePackageType</key>
|
||||
<string>APPL</string>
|
||||
<key>CFBundleName</key>
|
||||
<string>BugSETI (Dev)</string>
|
||||
<key>CFBundleExecutable</key>
|
||||
<string>bugseti</string>
|
||||
<key>CFBundleIdentifier</key>
|
||||
<string>io.lethean.bugseti.dev</string>
|
||||
<key>CFBundleVersion</key>
|
||||
<string>0.1.0-dev</string>
|
||||
<key>CFBundleGetInfoString</key>
|
||||
<string>Distributed Bug Fixing - like SETI@home but for code (Development)</string>
|
||||
<key>CFBundleShortVersionString</key>
|
||||
<string>0.1.0-dev</string>
|
||||
<key>CFBundleIconFile</key>
|
||||
<string>icons.icns</string>
|
||||
<key>LSMinimumSystemVersion</key>
|
||||
<string>10.15.0</string>
|
||||
<key>NSHighResolutionCapable</key>
|
||||
<true/>
|
||||
<key>LSUIElement</key>
|
||||
<true/>
|
||||
<key>LSApplicationCategoryType</key>
|
||||
<string>public.app-category.developer-tools</string>
|
||||
<key>NSAppTransportSecurity</key>
|
||||
<dict>
|
||||
<key>NSAllowsLocalNetworking</key>
|
||||
<true/>
|
||||
<key>NSAllowsArbitraryLoads</key>
|
||||
<true/>
|
||||
</dict>
|
||||
</dict>
|
||||
</plist>
|
||||
cmd/bugseti/build/darwin/Info.plist (new file, 35 lines)
@@ -0,0 +1,35 @@
<?xml version="1.0" encoding="UTF-8"?>
|
||||
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
|
||||
<plist version="1.0">
|
||||
<dict>
|
||||
<key>CFBundlePackageType</key>
|
||||
<string>APPL</string>
|
||||
<key>CFBundleName</key>
|
||||
<string>BugSETI</string>
|
||||
<key>CFBundleExecutable</key>
|
||||
<string>bugseti</string>
|
||||
<key>CFBundleIdentifier</key>
|
||||
<string>io.lethean.bugseti</string>
|
||||
<key>CFBundleVersion</key>
|
||||
<string>0.1.0</string>
|
||||
<key>CFBundleGetInfoString</key>
|
||||
<string>Distributed Bug Fixing - like SETI@home but for code</string>
|
||||
<key>CFBundleShortVersionString</key>
|
||||
<string>0.1.0</string>
|
||||
<key>CFBundleIconFile</key>
|
||||
<string>icons.icns</string>
|
||||
<key>LSMinimumSystemVersion</key>
|
||||
<string>10.15.0</string>
|
||||
<key>NSHighResolutionCapable</key>
|
||||
<true/>
|
||||
<key>LSUIElement</key>
|
||||
<true/>
|
||||
<key>LSApplicationCategoryType</key>
|
||||
<string>public.app-category.developer-tools</string>
|
||||
<key>NSAppTransportSecurity</key>
|
||||
<dict>
|
||||
<key>NSAllowsLocalNetworking</key>
|
||||
<true/>
|
||||
</dict>
|
||||
</dict>
|
||||
</plist>
|
||||
cmd/bugseti/build/darwin/Taskfile.yml (new file, 84 lines)
@@ -0,0 +1,84 @@
version: '3'
|
||||
|
||||
includes:
|
||||
common: ../Taskfile.yml
|
||||
|
||||
tasks:
|
||||
build:
|
||||
summary: Creates a production build of the application
|
||||
deps:
|
||||
- task: common:go:mod:tidy
|
||||
- task: common:build:frontend
|
||||
vars:
|
||||
BUILD_FLAGS:
|
||||
ref: .BUILD_FLAGS
|
||||
PRODUCTION:
|
||||
ref: .PRODUCTION
|
||||
- task: common:generate:icons
|
||||
cmds:
|
||||
- go build {{.BUILD_FLAGS}} -o {{.OUTPUT}}
|
||||
vars:
|
||||
BUILD_FLAGS: '{{if eq .PRODUCTION "true"}}-tags production -trimpath -buildvcs=false -ldflags="-w -s"{{else}}-buildvcs=false -gcflags=all="-l"{{end}}'
|
||||
DEFAULT_OUTPUT: '{{.BIN_DIR}}/{{.APP_NAME}}'
|
||||
OUTPUT: '{{ .OUTPUT | default .DEFAULT_OUTPUT }}'
|
||||
env:
|
||||
GOOS: darwin
|
||||
CGO_ENABLED: 1
|
||||
GOARCH: '{{.ARCH | default ARCH}}'
|
||||
CGO_CFLAGS: "-mmacosx-version-min=10.15"
|
||||
CGO_LDFLAGS: "-mmacosx-version-min=10.15"
|
||||
MACOSX_DEPLOYMENT_TARGET: "10.15"
|
||||
PRODUCTION: '{{.PRODUCTION | default "false"}}'
|
||||
|
||||
build:universal:
|
||||
summary: Builds darwin universal binary (arm64 + amd64)
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
ARCH: amd64
|
||||
OUTPUT: "{{.BIN_DIR}}/{{.APP_NAME}}-amd64"
|
||||
PRODUCTION: '{{.PRODUCTION | default "true"}}'
|
||||
- task: build
|
||||
vars:
|
||||
ARCH: arm64
|
||||
OUTPUT: "{{.BIN_DIR}}/{{.APP_NAME}}-arm64"
|
||||
PRODUCTION: '{{.PRODUCTION | default "true"}}'
|
||||
cmds:
|
||||
- lipo -create -output "{{.BIN_DIR}}/{{.APP_NAME}}" "{{.BIN_DIR}}/{{.APP_NAME}}-amd64" "{{.BIN_DIR}}/{{.APP_NAME}}-arm64"
|
||||
- rm "{{.BIN_DIR}}/{{.APP_NAME}}-amd64" "{{.BIN_DIR}}/{{.APP_NAME}}-arm64"
|
||||
|
||||
package:
|
||||
summary: Packages a production build of the application into a `.app` bundle
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
cmds:
|
||||
- task: create:app:bundle
|
||||
|
||||
package:universal:
|
||||
summary: Packages darwin universal binary (arm64 + amd64)
|
||||
deps:
|
||||
- task: build:universal
|
||||
cmds:
|
||||
- task: create:app:bundle
|
||||
|
||||
create:app:bundle:
|
||||
summary: Creates an `.app` bundle
|
||||
cmds:
|
||||
- mkdir -p {{.BIN_DIR}}/{{.APP_NAME}}.app/Contents/{MacOS,Resources}
|
||||
- cp build/darwin/icons.icns {{.BIN_DIR}}/{{.APP_NAME}}.app/Contents/Resources
|
||||
- cp {{.BIN_DIR}}/{{.APP_NAME}} {{.BIN_DIR}}/{{.APP_NAME}}.app/Contents/MacOS
|
||||
- cp build/darwin/Info.plist {{.BIN_DIR}}/{{.APP_NAME}}.app/Contents
|
||||
- codesign --force --deep --sign - {{.BIN_DIR}}/{{.APP_NAME}}.app
|
||||
|
||||
run:
|
||||
deps:
|
||||
- task: build
|
||||
cmds:
|
||||
- mkdir -p {{.BIN_DIR}}/{{.APP_NAME}}.dev.app/Contents/{MacOS,Resources}
|
||||
- cp build/darwin/icons.icns {{.BIN_DIR}}/{{.APP_NAME}}.dev.app/Contents/Resources
|
||||
- cp {{.BIN_DIR}}/{{.APP_NAME}} {{.BIN_DIR}}/{{.APP_NAME}}.dev.app/Contents/MacOS
|
||||
- cp build/darwin/Info.dev.plist {{.BIN_DIR}}/{{.APP_NAME}}.dev.app/Contents/Info.plist
|
||||
- codesign --force --deep --sign - {{.BIN_DIR}}/{{.APP_NAME}}.dev.app
|
||||
- '{{.BIN_DIR}}/{{.APP_NAME}}.dev.app/Contents/MacOS/{{.APP_NAME}}'
|
||||
cmd/bugseti/build/linux/Taskfile.yml (new file, 103 lines)
@@ -0,0 +1,103 @@
version: '3'
|
||||
|
||||
includes:
|
||||
common: ../Taskfile.yml
|
||||
|
||||
tasks:
|
||||
build:
|
||||
summary: Builds the application for Linux
|
||||
deps:
|
||||
- task: common:go:mod:tidy
|
||||
- task: common:build:frontend
|
||||
vars:
|
||||
BUILD_FLAGS:
|
||||
ref: .BUILD_FLAGS
|
||||
PRODUCTION:
|
||||
ref: .PRODUCTION
|
||||
- task: common:generate:icons
|
||||
cmds:
|
||||
- go build {{.BUILD_FLAGS}} -o {{.BIN_DIR}}/{{.APP_NAME}}
|
||||
vars:
|
||||
BUILD_FLAGS: '{{if eq .PRODUCTION "true"}}-tags production -trimpath -buildvcs=false -ldflags="-w -s"{{else}}-buildvcs=false -gcflags=all="-l"{{end}}'
|
||||
env:
|
||||
GOOS: linux
|
||||
CGO_ENABLED: 1
|
||||
GOARCH: '{{.ARCH | default ARCH}}'
|
||||
PRODUCTION: '{{.PRODUCTION | default "false"}}'
|
||||
|
||||
package:
|
||||
summary: Packages a production build of the application for Linux
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
cmds:
|
||||
- task: create:appimage
|
||||
- task: create:deb
|
||||
- task: create:rpm
|
||||
|
||||
create:appimage:
|
||||
summary: Creates an AppImage
|
||||
dir: build/linux/appimage
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
- task: generate:dotdesktop
|
||||
cmds:
|
||||
- cp {{.APP_BINARY}} {{.APP_NAME}}
|
||||
- cp ../../appicon.png {{.APP_NAME}}.png
|
||||
- wails3 generate appimage -binary {{.APP_NAME}} -icon {{.ICON}} -desktopfile {{.DESKTOP_FILE}} -outputdir {{.OUTPUT_DIR}} -builddir {{.ROOT_DIR}}/build/linux/appimage/build
|
||||
vars:
|
||||
APP_NAME: '{{.APP_NAME}}'
|
||||
APP_BINARY: '../../../bin/{{.APP_NAME}}'
|
||||
ICON: '{{.APP_NAME}}.png'
|
||||
DESKTOP_FILE: '../{{.APP_NAME}}.desktop'
|
||||
OUTPUT_DIR: '../../../bin'
|
||||
|
||||
create:deb:
|
||||
summary: Creates a deb package
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
cmds:
|
||||
- task: generate:dotdesktop
|
||||
- task: generate:deb
|
||||
|
||||
create:rpm:
|
||||
summary: Creates a rpm package
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
cmds:
|
||||
- task: generate:dotdesktop
|
||||
- task: generate:rpm
|
||||
|
||||
generate:deb:
|
||||
summary: Creates a deb package
|
||||
cmds:
|
||||
- wails3 tool package -name {{.APP_NAME}} -format deb -config ./build/linux/nfpm/nfpm.yaml -out {{.ROOT_DIR}}/bin
|
||||
|
||||
generate:rpm:
|
||||
summary: Creates a rpm package
|
||||
cmds:
|
||||
- wails3 tool package -name {{.APP_NAME}} -format rpm -config ./build/linux/nfpm/nfpm.yaml -out {{.ROOT_DIR}}/bin
|
||||
|
||||
generate:dotdesktop:
|
||||
summary: Generates a `.desktop` file
|
||||
dir: build
|
||||
cmds:
|
||||
- mkdir -p {{.ROOT_DIR}}/build/linux/appimage
|
||||
- wails3 generate .desktop -name "{{.APP_NAME}}" -exec "{{.EXEC}}" -icon "{{.ICON}}" -outputfile {{.ROOT_DIR}}/build/linux/{{.APP_NAME}}.desktop -categories "{{.CATEGORIES}}"
|
||||
vars:
|
||||
APP_NAME: 'BugSETI'
|
||||
EXEC: '{{.APP_NAME}}'
|
||||
ICON: 'bugseti'
|
||||
CATEGORIES: 'Development;'
|
||||
OUTPUTFILE: '{{.ROOT_DIR}}/build/linux/{{.APP_NAME}}.desktop'
|
||||
|
||||
run:
|
||||
cmds:
|
||||
- '{{.BIN_DIR}}/{{.APP_NAME}}'
|
||||
cmd/bugseti/build/linux/nfpm/nfpm.yaml (new file, 34 lines)
@@ -0,0 +1,34 @@
# nfpm configuration for BugSETI
name: "bugseti"
arch: "${GOARCH}"
platform: "linux"
version: "0.1.0"
section: "devel"
priority: "optional"
maintainer: "Lethean <developers@lethean.io>"
description: |
  BugSETI - Distributed Bug Fixing
  Like SETI@home but for code. Install the system tray app,
  it pulls OSS issues from GitHub, AI prepares context,
  you fix bugs, and it auto-submits PRs.
vendor: "Lethean"
homepage: "https://github.com/host-uk/core"
license: "MIT"

contents:
  - src: ./bin/bugseti
    dst: /usr/bin/bugseti
  - src: ./build/linux/bugseti.desktop
    dst: /usr/share/applications/bugseti.desktop
  - src: ./build/appicon.png
    dst: /usr/share/icons/hicolor/256x256/apps/bugseti.png

overrides:
  deb:
    dependencies:
      - libwebkit2gtk-4.1-0
      - libgtk-3-0
  rpm:
    dependencies:
      - webkit2gtk4.1
      - gtk3

cmd/bugseti/build/windows/Taskfile.yml (new file, 49 lines)
@@ -0,0 +1,49 @@
version: '3'
|
||||
|
||||
includes:
|
||||
common: ../Taskfile.yml
|
||||
|
||||
tasks:
|
||||
build:
|
||||
summary: Builds the application for Windows
|
||||
deps:
|
||||
- task: common:go:mod:tidy
|
||||
- task: common:build:frontend
|
||||
vars:
|
||||
BUILD_FLAGS:
|
||||
ref: .BUILD_FLAGS
|
||||
PRODUCTION:
|
||||
ref: .PRODUCTION
|
||||
- task: common:generate:icons
|
||||
cmds:
|
||||
- go build {{.BUILD_FLAGS}} -o {{.BIN_DIR}}/{{.APP_NAME}}.exe
|
||||
vars:
|
||||
BUILD_FLAGS: '{{if eq .PRODUCTION "true"}}-tags production -trimpath -buildvcs=false -ldflags="-w -s -H windowsgui"{{else}}-buildvcs=false -gcflags=all="-l"{{end}}'
|
||||
env:
|
||||
GOOS: windows
|
||||
CGO_ENABLED: 1
|
||||
GOARCH: '{{.ARCH | default ARCH}}'
|
||||
PRODUCTION: '{{.PRODUCTION | default "false"}}'
|
||||
|
||||
package:
|
||||
summary: Packages a production build of the application for Windows
|
||||
deps:
|
||||
- task: build
|
||||
vars:
|
||||
PRODUCTION: "true"
|
||||
cmds:
|
||||
- task: create:nsis
|
||||
|
||||
create:nsis:
|
||||
summary: Creates an NSIS installer
|
||||
cmds:
|
||||
- wails3 tool package -name {{.APP_NAME}} -format nsis -config ./build/windows/nsis/installer.nsi -out {{.ROOT_DIR}}/bin
|
||||
|
||||
create:msi:
|
||||
summary: Creates an MSI installer
|
||||
cmds:
|
||||
- wails3 tool package -name {{.APP_NAME}} -format msi -config ./build/windows/wix/main.wxs -out {{.ROOT_DIR}}/bin
|
||||
|
||||
run:
|
||||
cmds:
|
||||
- '{{.BIN_DIR}}/{{.APP_NAME}}.exe'
|
||||
cmd/bugseti/frontend/angular.json (new file, 91 lines)
@@ -0,0 +1,91 @@
{
|
||||
"$schema": "./node_modules/@angular/cli/lib/config/schema.json",
|
||||
"version": 1,
|
||||
"newProjectRoot": "projects",
|
||||
"projects": {
|
||||
"bugseti": {
|
||||
"projectType": "application",
|
||||
"schematics": {
|
||||
"@schematics/angular:component": {
|
||||
"style": "scss",
|
||||
"standalone": true
|
||||
}
|
||||
},
|
||||
"root": "",
|
||||
"sourceRoot": "src",
|
||||
"prefix": "app",
|
||||
"architect": {
|
||||
"build": {
|
||||
"builder": "@angular-devkit/build-angular:application",
|
||||
"options": {
|
||||
"outputPath": "dist/bugseti",
|
||||
"index": "src/index.html",
|
||||
"browser": "src/main.ts",
|
||||
"polyfills": ["zone.js"],
|
||||
"tsConfig": "tsconfig.app.json",
|
||||
"inlineStyleLanguage": "scss",
|
||||
"assets": [
|
||||
"src/favicon.ico",
|
||||
"src/assets"
|
||||
],
|
||||
"styles": [
|
||||
"src/styles.scss"
|
||||
],
|
||||
"scripts": []
|
||||
},
|
||||
"configurations": {
|
||||
"production": {
|
||||
"budgets": [
|
||||
{
|
||||
"type": "initial",
|
||||
"maximumWarning": "500kb",
|
||||
"maximumError": "1mb"
|
||||
},
|
||||
{
|
||||
"type": "anyComponentStyle",
|
||||
"maximumWarning": "2kb",
|
||||
"maximumError": "4kb"
|
||||
}
|
||||
],
|
||||
"outputHashing": "all"
|
||||
},
|
||||
"development": {
|
||||
"optimization": false,
|
||||
"extractLicenses": false,
|
||||
"sourceMap": true
|
||||
}
|
||||
},
|
||||
"defaultConfiguration": "production"
|
||||
},
|
||||
"serve": {
|
||||
"builder": "@angular-devkit/build-angular:dev-server",
|
||||
"configurations": {
|
||||
"production": {
|
||||
"buildTarget": "bugseti:build:production"
|
||||
},
|
||||
"development": {
|
||||
"buildTarget": "bugseti:build:development"
|
||||
}
|
||||
},
|
||||
"defaultConfiguration": "development"
|
||||
},
|
||||
"test": {
|
||||
"builder": "@angular-devkit/build-angular:karma",
|
||||
"options": {
|
||||
"polyfills": ["zone.js", "zone.js/testing"],
|
||||
"tsConfig": "tsconfig.spec.json",
|
||||
"inlineStyleLanguage": "scss",
|
||||
"assets": [
|
||||
"src/favicon.ico",
|
||||
"src/assets"
|
||||
],
|
||||
"styles": [
|
||||
"src/styles.scss"
|
||||
],
|
||||
"scripts": []
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
cmd/bugseti/frontend/package-lock.json (new generated file, 14091 lines; diff suppressed because it is too large)
cmd/bugseti/frontend/package.json (new file, 41 lines)
@@ -0,0 +1,41 @@
{
|
||||
"name": "bugseti",
|
||||
"version": "0.1.0",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
"ng": "ng",
|
||||
"start": "ng serve",
|
||||
"dev": "ng serve --configuration development",
|
||||
"build": "ng build --configuration production",
|
||||
"build:dev": "ng build --configuration development",
|
||||
"watch": "ng build --watch --configuration development",
|
||||
"test": "ng test",
|
||||
"lint": "ng lint"
|
||||
},
|
||||
"dependencies": {
|
||||
"@angular/animations": "^19.1.0",
|
||||
"@angular/common": "^19.1.0",
|
||||
"@angular/compiler": "^19.1.0",
|
||||
"@angular/core": "^19.1.0",
|
||||
"@angular/forms": "^19.1.0",
|
||||
"@angular/platform-browser": "^19.1.0",
|
||||
"@angular/platform-browser-dynamic": "^19.1.0",
|
||||
"@angular/router": "^19.1.0",
|
||||
"rxjs": "~7.8.0",
|
||||
"tslib": "^2.3.0",
|
||||
"zone.js": "~0.15.0"
|
||||
},
|
||||
"devDependencies": {
|
||||
"@angular-devkit/build-angular": "^19.1.0",
|
||||
"@angular/cli": "^19.1.0",
|
||||
"@angular/compiler-cli": "^19.1.0",
|
||||
"@types/jasmine": "~5.1.0",
|
||||
"jasmine-core": "~5.1.0",
|
||||
"karma": "~6.4.0",
|
||||
"karma-chrome-launcher": "~3.2.0",
|
||||
"karma-coverage": "~2.2.0",
|
||||
"karma-jasmine": "~5.1.0",
|
||||
"karma-jasmine-html-reporter": "~2.1.0",
|
||||
"typescript": "~5.5.2"
|
||||
}
|
||||
}
|
||||
cmd/bugseti/frontend/src/app/app.component.ts (new file, 18 lines)
@@ -0,0 +1,18 @@
import { Component } from '@angular/core';
|
||||
import { RouterOutlet } from '@angular/router';
|
||||
|
||||
@Component({
|
||||
selector: 'app-root',
|
||||
standalone: true,
|
||||
imports: [RouterOutlet],
|
||||
template: '<router-outlet></router-outlet>',
|
||||
styles: [`
|
||||
:host {
|
||||
display: block;
|
||||
height: 100%;
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class AppComponent {
|
||||
title = 'BugSETI';
|
||||
}
|
||||
cmd/bugseti/frontend/src/app/app.config.ts (new file, 9 lines)
@@ -0,0 +1,9 @@
import { ApplicationConfig } from '@angular/core';
|
||||
import { provideRouter, withHashLocation } from '@angular/router';
|
||||
import { routes } from './app.routes';
|
||||
|
||||
export const appConfig: ApplicationConfig = {
|
||||
providers: [
|
||||
provideRouter(routes, withHashLocation())
|
||||
]
|
||||
};
|
||||
cmd/bugseti/frontend/src/app/app.routes.ts (new file, 25 lines)
@@ -0,0 +1,25 @@
import { Routes } from '@angular/router';
|
||||
|
||||
export const routes: Routes = [
|
||||
{
|
||||
path: '',
|
||||
redirectTo: 'tray',
|
||||
pathMatch: 'full'
|
||||
},
|
||||
{
|
||||
path: 'tray',
|
||||
loadComponent: () => import('./tray/tray.component').then(m => m.TrayComponent)
|
||||
},
|
||||
{
|
||||
path: 'workbench',
|
||||
loadComponent: () => import('./workbench/workbench.component').then(m => m.WorkbenchComponent)
|
||||
},
|
||||
{
|
||||
path: 'settings',
|
||||
loadComponent: () => import('./settings/settings.component').then(m => m.SettingsComponent)
|
||||
},
|
||||
{
|
||||
path: 'onboarding',
|
||||
loadComponent: () => import('./onboarding/onboarding.component').then(m => m.OnboardingComponent)
|
||||
}
|
||||
];
|
||||
cmd/bugseti/frontend/src/app/onboarding/onboarding.component.ts (new file, 457 lines)
@@ -0,0 +1,457 @@
import { Component } from '@angular/core';
|
||||
import { CommonModule } from '@angular/common';
|
||||
import { FormsModule } from '@angular/forms';
|
||||
|
||||
@Component({
|
||||
selector: 'app-onboarding',
|
||||
standalone: true,
|
||||
imports: [CommonModule, FormsModule],
|
||||
template: `
|
||||
<div class="onboarding">
|
||||
<div class="onboarding-content">
|
||||
<!-- Step 1: Welcome -->
|
||||
<div class="step" *ngIf="step === 1">
|
||||
<div class="step-icon">B</div>
|
||||
<h1>Welcome to BugSETI</h1>
|
||||
<p class="subtitle">Distributed Bug Fixing - like SETI@home but for code</p>
|
||||
|
||||
<div class="feature-list">
|
||||
<div class="feature">
|
||||
<span class="feature-icon">[1]</span>
|
||||
<div>
|
||||
<strong>Find Issues</strong>
|
||||
<p>We pull beginner-friendly issues from OSS projects you care about.</p>
|
||||
</div>
|
||||
</div>
|
||||
<div class="feature">
|
||||
<span class="feature-icon">[2]</span>
|
||||
<div>
|
||||
<strong>Get Context</strong>
|
||||
<p>AI prepares relevant context to help you understand each issue.</p>
|
||||
</div>
|
||||
</div>
|
||||
<div class="feature">
|
||||
<span class="feature-icon">[3]</span>
|
||||
<div>
|
||||
<strong>Submit PRs</strong>
|
||||
<p>Fix bugs and submit PRs with minimal friction.</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<button class="btn btn--primary btn--lg" (click)="nextStep()">Get Started</button>
|
||||
</div>
|
||||
|
||||
<!-- Step 2: GitHub Auth -->
|
||||
<div class="step" *ngIf="step === 2">
|
||||
<h2>Connect GitHub</h2>
|
||||
<p>BugSETI uses the GitHub CLI (gh) to interact with repositories.</p>
|
||||
|
||||
<div class="auth-status" [class.auth-success]="ghAuthenticated">
|
||||
<span class="status-icon">{{ ghAuthenticated ? '[OK]' : '[!]' }}</span>
|
||||
<span>{{ ghAuthenticated ? 'GitHub CLI authenticated' : 'GitHub CLI not detected' }}</span>
|
||||
</div>
|
||||
|
||||
<div class="auth-instructions" *ngIf="!ghAuthenticated">
|
||||
<p>To authenticate with GitHub CLI, run:</p>
|
||||
<code>gh auth login</code>
|
||||
<p class="note">After authenticating, click "Check Again".</p>
|
||||
</div>
|
||||
|
||||
<div class="step-actions">
|
||||
<button class="btn btn--secondary" (click)="checkGhAuth()">Check Again</button>
|
||||
<button class="btn btn--primary" (click)="nextStep()" [disabled]="!ghAuthenticated">Continue</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Step 3: Select Repos -->
|
||||
<div class="step" *ngIf="step === 3">
|
||||
<h2>Choose Repositories</h2>
|
||||
<p>Add repositories you want to contribute to.</p>
|
||||
|
||||
<div class="repo-input">
|
||||
<input type="text" class="form-input" [(ngModel)]="newRepo"
|
||||
placeholder="owner/repo (e.g., facebook/react)">
|
||||
<button class="btn btn--secondary" (click)="addRepo()" [disabled]="!newRepo">Add</button>
|
||||
</div>
|
||||
|
||||
<div class="selected-repos" *ngIf="selectedRepos.length">
|
||||
<h3>Selected Repositories</h3>
|
||||
<div class="repo-chip" *ngFor="let repo of selectedRepos; let i = index">
|
||||
{{ repo }}
|
||||
<button class="repo-remove" (click)="removeRepo(i)">x</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="suggested-repos">
|
||||
<h3>Suggested Repositories</h3>
|
||||
<div class="suggested-list">
|
||||
<button class="suggestion" *ngFor="let repo of suggestedRepos" (click)="addSuggested(repo)">
|
||||
{{ repo }}
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="step-actions">
|
||||
<button class="btn btn--secondary" (click)="prevStep()">Back</button>
|
||||
<button class="btn btn--primary" (click)="nextStep()" [disabled]="selectedRepos.length === 0">Continue</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<!-- Step 4: Complete -->
|
||||
<div class="step" *ngIf="step === 4">
|
||||
<div class="complete-icon">[OK]</div>
|
||||
<h2>You're All Set!</h2>
|
||||
<p>BugSETI is ready to help you contribute to open source.</p>
|
||||
|
||||
<div class="summary">
|
||||
<p><strong>{{ selectedRepos.length }}</strong> repositories selected</p>
|
||||
<p>Looking for issues with these labels:</p>
|
||||
<div class="label-list">
|
||||
<span class="badge badge--primary">good first issue</span>
|
||||
<span class="badge badge--primary">help wanted</span>
|
||||
<span class="badge badge--primary">beginner-friendly</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<button class="btn btn--success btn--lg" (click)="complete()">Start Finding Issues</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="step-indicators">
|
||||
<span class="indicator" [class.active]="step >= 1" [class.current]="step === 1"></span>
|
||||
<span class="indicator" [class.active]="step >= 2" [class.current]="step === 2"></span>
|
||||
<span class="indicator" [class.active]="step >= 3" [class.current]="step === 3"></span>
|
||||
<span class="indicator" [class.active]="step >= 4" [class.current]="step === 4"></span>
|
||||
</div>
|
||||
</div>
|
||||
`,
|
||||
styles: [`
|
||||
.onboarding {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
height: 100%;
|
||||
background-color: var(--bg-primary);
|
||||
}
|
||||
|
||||
.onboarding-content {
|
||||
flex: 1;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
padding: var(--spacing-xl);
|
||||
}
|
||||
|
||||
.step {
|
||||
max-width: 500px;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.step-icon, .complete-icon {
|
||||
width: 80px;
|
||||
height: 80px;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
margin: 0 auto var(--spacing-lg);
|
||||
background: linear-gradient(135deg, var(--accent-primary), var(--accent-success));
|
||||
border-radius: var(--radius-lg);
|
||||
font-size: 32px;
|
||||
font-weight: bold;
|
||||
color: white;
|
||||
}
|
||||
|
||||
.complete-icon {
|
||||
background: var(--accent-success);
|
||||
}
|
||||
|
||||
h1 {
|
||||
font-size: 28px;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
h2 {
|
||||
font-size: 24px;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.subtitle {
|
||||
color: var(--text-secondary);
|
||||
margin-bottom: var(--spacing-xl);
|
||||
}
|
||||
|
||||
.feature-list {
|
||||
text-align: left;
|
||||
margin-bottom: var(--spacing-xl);
|
||||
}
|
||||
|
||||
.feature {
|
||||
display: flex;
|
||||
gap: var(--spacing-md);
|
||||
margin-bottom: var(--spacing-md);
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
}
|
||||
|
||||
.feature-icon {
|
||||
font-family: var(--font-mono);
|
||||
color: var(--accent-primary);
|
||||
font-weight: bold;
|
||||
}
|
||||
|
||||
.feature strong {
|
||||
display: block;
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.feature p {
|
||||
color: var(--text-secondary);
|
||||
font-size: 13px;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.auth-status {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
gap: var(--spacing-sm);
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-tertiary);
|
||||
border-radius: var(--radius-md);
|
||||
margin: var(--spacing-lg) 0;
|
||||
}
|
||||
|
||||
.auth-status.auth-success {
|
||||
background-color: rgba(63, 185, 80, 0.15);
|
||||
color: var(--accent-success);
|
||||
}
|
||||
|
||||
.status-icon {
|
||||
font-family: var(--font-mono);
|
||||
font-weight: bold;
|
||||
}
|
||||
|
||||
.auth-instructions {
|
||||
text-align: left;
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
}
|
||||
|
||||
.auth-instructions code {
|
||||
display: block;
|
||||
margin: var(--spacing-md) 0;
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-tertiary);
|
||||
}
|
||||
|
||||
.auth-instructions .note {
|
||||
color: var(--text-muted);
|
||||
font-size: 13px;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.step-actions {
|
||||
display: flex;
|
||||
gap: var(--spacing-md);
|
||||
justify-content: center;
|
||||
margin-top: var(--spacing-xl);
|
||||
}
|
||||
|
||||
.repo-input {
|
||||
display: flex;
|
||||
gap: var(--spacing-sm);
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.repo-input .form-input {
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.selected-repos, .suggested-repos {
|
||||
text-align: left;
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.selected-repos h3, .suggested-repos h3 {
|
||||
font-size: 12px;
|
||||
text-transform: uppercase;
|
||||
color: var(--text-muted);
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.repo-chip {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-xs);
|
||||
padding: var(--spacing-xs) var(--spacing-sm);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
margin-right: var(--spacing-xs);
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.repo-remove {
|
||||
background: none;
|
||||
border: none;
|
||||
color: var(--text-muted);
|
||||
cursor: pointer;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
.suggested-list {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.suggestion {
|
||||
padding: var(--spacing-xs) var(--spacing-sm);
|
||||
background-color: var(--bg-tertiary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-md);
|
||||
color: var(--text-secondary);
|
||||
cursor: pointer;
|
||||
font-size: 13px;
|
||||
}
|
||||
|
||||
.suggestion:hover {
|
||||
background-color: var(--bg-secondary);
|
||||
border-color: var(--accent-primary);
|
||||
}
|
||||
|
||||
.summary {
|
||||
padding: var(--spacing-lg);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
margin-bottom: var(--spacing-xl);
|
||||
}
|
||||
|
||||
.summary p {
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.label-list {
|
||||
display: flex;
|
||||
gap: var(--spacing-xs);
|
||||
justify-content: center;
|
||||
flex-wrap: wrap;
|
||||
}
|
||||
|
||||
.step-indicators {
|
||||
display: flex;
|
||||
justify-content: center;
|
||||
gap: var(--spacing-sm);
|
||||
padding: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.indicator {
|
||||
width: 8px;
|
||||
height: 8px;
|
||||
border-radius: 50%;
|
||||
background-color: var(--border-color);
|
||||
}
|
||||
|
||||
.indicator.active {
|
||||
background-color: var(--accent-primary);
|
||||
}
|
||||
|
||||
.indicator.current {
|
||||
width: 24px;
|
||||
border-radius: 4px;
|
||||
}
|
||||
|
||||
.btn--lg {
|
||||
padding: var(--spacing-md) var(--spacing-xl);
|
||||
font-size: 16px;
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class OnboardingComponent {
|
||||
step = 1;
|
||||
ghAuthenticated = false;
|
||||
newRepo = '';
|
||||
selectedRepos: string[] = [];
|
||||
suggestedRepos = [
|
||||
'facebook/react',
|
||||
'microsoft/vscode',
|
||||
'golang/go',
|
||||
'kubernetes/kubernetes',
|
||||
'rust-lang/rust',
|
||||
'angular/angular',
|
||||
'nodejs/node',
|
||||
'python/cpython'
|
||||
];
|
||||
|
||||
ngOnInit() {
|
||||
this.checkGhAuth();
|
||||
}
|
||||
|
||||
nextStep() {
|
||||
if (this.step < 4) {
|
||||
this.step++;
|
||||
}
|
||||
}
|
||||
|
||||
prevStep() {
|
||||
if (this.step > 1) {
|
||||
this.step--;
|
||||
}
|
||||
}
|
||||
|
||||
async checkGhAuth() {
|
||||
try {
|
||||
// Check if gh CLI is authenticated
|
||||
// In a real implementation, this would call the backend
|
||||
this.ghAuthenticated = true; // Assume authenticated for demo
|
||||
} catch (err) {
|
||||
this.ghAuthenticated = false;
|
||||
}
|
||||
}
|
||||
|
||||
addRepo() {
|
||||
if (this.newRepo && !this.selectedRepos.includes(this.newRepo)) {
|
||||
this.selectedRepos.push(this.newRepo);
|
||||
this.newRepo = '';
|
||||
}
|
||||
}
|
||||
|
||||
removeRepo(index: number) {
|
||||
this.selectedRepos.splice(index, 1);
|
||||
}
|
||||
|
||||
addSuggested(repo: string) {
|
||||
if (!this.selectedRepos.includes(repo)) {
|
||||
this.selectedRepos.push(repo);
|
||||
}
|
||||
}
|
||||
|
||||
async complete() {
|
||||
try {
|
||||
// Save repos to config
|
||||
if ((window as any).go?.main?.ConfigService?.SetConfig) {
|
||||
const config = await (window as any).go.main.ConfigService.GetConfig() || {};
|
||||
config.watchedRepos = this.selectedRepos;
|
||||
await (window as any).go.main.ConfigService.SetConfig(config);
|
||||
}
|
||||
|
||||
// Mark onboarding as complete
|
||||
if ((window as any).go?.main?.TrayService?.CompleteOnboarding) {
|
||||
await (window as any).go.main.TrayService.CompleteOnboarding();
|
||||
}
|
||||
|
||||
// Close onboarding window and start fetching
|
||||
if ((window as any).wails?.Window) {
|
||||
(window as any).wails.Window.GetByName('onboarding').then((w: any) => w.Hide());
|
||||
}
|
||||
|
||||
// Start fetching
|
||||
if ((window as any).go?.main?.TrayService?.StartFetching) {
|
||||
await (window as any).go.main.TrayService.StartFetching();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to complete onboarding:', err);
|
||||
}
|
||||
}
|
||||
}
|
||||
cmd/bugseti/frontend/src/app/settings/settings.component.ts (new file, 398 lines)
@@ -0,0 +1,398 @@
import { Component, OnInit } from '@angular/core';
|
||||
import { CommonModule } from '@angular/common';
|
||||
import { FormsModule } from '@angular/forms';
|
||||
|
||||
interface Config {
|
||||
watchedRepos: string[];
|
||||
labels: string[];
|
||||
fetchIntervalMinutes: number;
|
||||
notificationsEnabled: boolean;
|
||||
notificationSound: boolean;
|
||||
workspaceDir: string;
|
||||
theme: string;
|
||||
autoSeedContext: boolean;
|
||||
workHours?: {
|
||||
enabled: boolean;
|
||||
startHour: number;
|
||||
endHour: number;
|
||||
days: number[];
|
||||
timezone: string;
|
||||
};
|
||||
}
|
||||
|
||||
@Component({
|
||||
selector: 'app-settings',
|
||||
standalone: true,
|
||||
imports: [CommonModule, FormsModule],
|
||||
template: `
|
||||
<div class="settings">
|
||||
<header class="settings-header">
|
||||
<h1>Settings</h1>
|
||||
<button class="btn btn--primary" (click)="saveSettings()">Save</button>
|
||||
</header>
|
||||
|
||||
<div class="settings-content">
|
||||
<section class="settings-section">
|
||||
<h2>Repositories</h2>
|
||||
<p class="section-description">Add GitHub repositories to watch for issues.</p>
|
||||
|
||||
<div class="repo-list">
|
||||
<div class="repo-item" *ngFor="let repo of config.watchedRepos; let i = index">
|
||||
<span>{{ repo }}</span>
|
||||
<button class="btn btn--danger btn--sm" (click)="removeRepo(i)">Remove</button>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="add-repo">
|
||||
<input type="text" class="form-input" [(ngModel)]="newRepo"
|
||||
placeholder="owner/repo (e.g., facebook/react)">
|
||||
<button class="btn btn--secondary" (click)="addRepo()" [disabled]="!newRepo">Add</button>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Issue Labels</h2>
|
||||
<p class="section-description">Filter issues by these labels.</p>
|
||||
|
||||
<div class="label-list">
|
||||
<span class="label-chip" *ngFor="let label of config.labels; let i = index">
|
||||
{{ label }}
|
||||
<button class="label-remove" (click)="removeLabel(i)">x</button>
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<div class="add-label">
|
||||
<input type="text" class="form-input" [(ngModel)]="newLabel"
|
||||
placeholder="Add label (e.g., good first issue)">
|
||||
<button class="btn btn--secondary" (click)="addLabel()" [disabled]="!newLabel">Add</button>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Fetch Settings</h2>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Fetch Interval (minutes)</label>
|
||||
<input type="number" class="form-input" [(ngModel)]="config.fetchIntervalMinutes" min="5" max="120">
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="checkbox-label">
|
||||
<input type="checkbox" [(ngModel)]="config.autoSeedContext">
|
||||
<span>Auto-prepare AI context for issues</span>
|
||||
</label>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Work Hours</h2>
|
||||
<p class="section-description">Only fetch issues during these hours.</p>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="checkbox-label">
|
||||
<input type="checkbox" [(ngModel)]="config.workHours!.enabled">
|
||||
<span>Enable work hours</span>
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div class="work-hours-config" *ngIf="config.workHours?.enabled">
|
||||
<div class="form-group">
|
||||
<label class="form-label">Start Hour</label>
|
||||
<select class="form-select" [(ngModel)]="config.workHours!.startHour">
|
||||
<option *ngFor="let h of hours" [value]="h">{{ h }}:00</option>
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">End Hour</label>
|
||||
<select class="form-select" [(ngModel)]="config.workHours!.endHour">
|
||||
<option *ngFor="let h of hours" [value]="h">{{ h }}:00</option>
|
||||
</select>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Days</label>
|
||||
<div class="day-checkboxes">
|
||||
<label class="checkbox-label" *ngFor="let day of days; let i = index">
|
||||
<input type="checkbox" [checked]="isDaySelected(i)" (change)="toggleDay(i)">
|
||||
<span>{{ day }}</span>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Notifications</h2>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="checkbox-label">
|
||||
<input type="checkbox" [(ngModel)]="config.notificationsEnabled">
|
||||
<span>Enable desktop notifications</span>
|
||||
</label>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="checkbox-label">
|
||||
<input type="checkbox" [(ngModel)]="config.notificationSound">
|
||||
<span>Play notification sounds</span>
|
||||
</label>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Appearance</h2>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Theme</label>
|
||||
<select class="form-select" [(ngModel)]="config.theme">
|
||||
<option value="dark">Dark</option>
|
||||
<option value="light">Light</option>
|
||||
<option value="system">System</option>
|
||||
</select>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="settings-section">
|
||||
<h2>Storage</h2>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Workspace Directory</label>
|
||||
<input type="text" class="form-input" [(ngModel)]="config.workspaceDir"
|
||||
placeholder="Leave empty for default">
|
||||
</div>
|
||||
</section>
|
||||
</div>
|
||||
</div>
|
||||
`,
|
||||
styles: [`
|
||||
.settings {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
height: 100%;
|
||||
background-color: var(--bg-secondary);
|
||||
}
|
||||
|
||||
.settings-header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
padding: var(--spacing-md) var(--spacing-lg);
|
||||
background-color: var(--bg-primary);
|
||||
border-bottom: 1px solid var(--border-color);
|
||||
}
|
||||
|
||||
.settings-header h1 {
|
||||
font-size: 18px;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.settings-content {
|
||||
flex: 1;
|
||||
overflow-y: auto;
|
||||
padding: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.settings-section {
|
||||
background-color: var(--bg-primary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-lg);
|
||||
padding: var(--spacing-lg);
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.settings-section h2 {
|
||||
font-size: 16px;
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.section-description {
|
||||
color: var(--text-muted);
|
||||
font-size: 13px;
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.repo-list, .label-list {
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.repo-item {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
padding: var(--spacing-sm);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.add-repo, .add-label {
|
||||
display: flex;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.add-repo .form-input, .add-label .form-input {
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.label-list {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.label-chip {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-xs);
|
||||
padding: var(--spacing-xs) var(--spacing-sm);
|
||||
background-color: var(--bg-tertiary);
|
||||
border-radius: 999px;
|
||||
font-size: 13px;
|
||||
}
|
||||
|
||||
.label-remove {
|
||||
background: none;
|
||||
border: none;
|
||||
color: var(--text-muted);
|
||||
cursor: pointer;
|
||||
padding: 0;
|
||||
font-size: 14px;
|
||||
line-height: 1;
|
||||
}
|
||||
|
||||
.label-remove:hover {
|
||||
color: var(--accent-danger);
|
||||
}
|
||||
|
||||
.checkbox-label {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-sm);
|
||||
cursor: pointer;
|
||||
}
|
||||
|
||||
.checkbox-label input[type="checkbox"] {
|
||||
width: 16px;
|
||||
height: 16px;
|
||||
}
|
||||
|
||||
.work-hours-config {
|
||||
display: grid;
|
||||
grid-template-columns: 1fr 1fr;
|
||||
gap: var(--spacing-md);
|
||||
margin-top: var(--spacing-md);
|
||||
}
|
||||
|
||||
.day-checkboxes {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.day-checkboxes .checkbox-label {
|
||||
width: auto;
|
||||
}
|
||||
|
||||
.btn--sm {
|
||||
padding: var(--spacing-xs) var(--spacing-sm);
|
||||
font-size: 12px;
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class SettingsComponent implements OnInit {
|
||||
config: Config = {
|
||||
watchedRepos: [],
|
||||
labels: ['good first issue', 'help wanted'],
|
||||
fetchIntervalMinutes: 15,
|
||||
notificationsEnabled: true,
|
||||
notificationSound: true,
|
||||
workspaceDir: '',
|
||||
theme: 'dark',
|
||||
autoSeedContext: true,
|
||||
workHours: {
|
||||
enabled: false,
|
||||
startHour: 9,
|
||||
endHour: 17,
|
||||
days: [1, 2, 3, 4, 5],
|
||||
timezone: ''
|
||||
}
|
||||
};
|
||||
|
||||
newRepo = '';
|
||||
newLabel = '';
|
||||
hours = Array.from({ length: 24 }, (_, i) => i);
|
||||
days = ['Sun', 'Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat'];
|
||||
|
||||
ngOnInit() {
|
||||
this.loadConfig();
|
||||
}
|
||||
|
||||
async loadConfig() {
|
||||
try {
|
||||
if ((window as any).go?.main?.ConfigService?.GetConfig) {
|
||||
this.config = await (window as any).go.main.ConfigService.GetConfig();
|
||||
if (!this.config.workHours) {
|
||||
this.config.workHours = {
|
||||
enabled: false,
|
||||
startHour: 9,
|
||||
endHour: 17,
|
||||
days: [1, 2, 3, 4, 5],
|
||||
timezone: ''
|
||||
};
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to load config:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async saveSettings() {
|
||||
try {
|
||||
if ((window as any).go?.main?.ConfigService?.SetConfig) {
|
||||
await (window as any).go.main.ConfigService.SetConfig(this.config);
|
||||
alert('Settings saved!');
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to save config:', err);
|
||||
alert('Failed to save settings.');
|
||||
}
|
||||
}
|
||||
|
||||
addRepo() {
|
||||
if (this.newRepo && !this.config.watchedRepos.includes(this.newRepo)) {
|
||||
this.config.watchedRepos.push(this.newRepo);
|
||||
this.newRepo = '';
|
||||
}
|
||||
}
|
||||
|
||||
removeRepo(index: number) {
|
||||
this.config.watchedRepos.splice(index, 1);
|
||||
}
|
||||
|
||||
addLabel() {
|
||||
if (this.newLabel && !this.config.labels.includes(this.newLabel)) {
|
||||
this.config.labels.push(this.newLabel);
|
||||
this.newLabel = '';
|
||||
}
|
||||
}
|
||||
|
||||
removeLabel(index: number) {
|
||||
this.config.labels.splice(index, 1);
|
||||
}
|
||||
|
||||
isDaySelected(day: number): boolean {
|
||||
return this.config.workHours?.days.includes(day) || false;
|
||||
}
|
||||
|
||||
toggleDay(day: number) {
|
||||
if (!this.config.workHours) return;
|
||||
|
||||
const index = this.config.workHours.days.indexOf(day);
|
||||
if (index === -1) {
|
||||
this.config.workHours.days.push(day);
|
||||
} else {
|
||||
this.config.workHours.days.splice(index, 1);
|
||||
}
|
||||
}
|
||||
}
|
||||
556 cmd/bugseti/frontend/src/app/settings/updates.component.ts Normal file
@@ -0,0 +1,556 @@
|
|||
import { Component, OnInit, OnDestroy } from '@angular/core';
|
||||
import { CommonModule } from '@angular/common';
|
||||
import { FormsModule } from '@angular/forms';
|
||||
|
||||
interface UpdateSettings {
|
||||
channel: string;
|
||||
autoUpdate: boolean;
|
||||
checkInterval: number;
|
||||
lastCheck: string;
|
||||
}
|
||||
|
||||
interface VersionInfo {
|
||||
version: string;
|
||||
channel: string;
|
||||
commit: string;
|
||||
buildTime: string;
|
||||
goVersion: string;
|
||||
os: string;
|
||||
arch: string;
|
||||
}
|
||||
|
||||
interface ChannelInfo {
|
||||
id: string;
|
||||
name: string;
|
||||
description: string;
|
||||
}
|
||||
|
||||
interface UpdateCheckResult {
|
||||
available: boolean;
|
||||
currentVersion: string;
|
||||
latestVersion: string;
|
||||
release?: {
|
||||
version: string;
|
||||
channel: string;
|
||||
tag: string;
|
||||
name: string;
|
||||
body: string;
|
||||
publishedAt: string;
|
||||
htmlUrl: string;
|
||||
};
|
||||
error?: string;
|
||||
checkedAt: string;
|
||||
}
|
||||
|
||||
@Component({
|
||||
selector: 'app-updates-settings',
|
||||
standalone: true,
|
||||
imports: [CommonModule, FormsModule],
|
||||
template: `
|
||||
<div class="updates-settings">
|
||||
<div class="current-version">
|
||||
<div class="version-badge">
|
||||
<span class="version-number">{{ versionInfo?.version || 'Unknown' }}</span>
|
||||
<span class="channel-badge" [class]="'channel-' + (versionInfo?.channel || 'dev')">
|
||||
{{ versionInfo?.channel || 'dev' }}
|
||||
</span>
|
||||
</div>
|
||||
<p class="build-info" *ngIf="versionInfo">
|
||||
Built {{ versionInfo.buildTime | date:'medium' }} ({{ versionInfo.commit?.substring(0, 7) }})
|
||||
</p>
|
||||
</div>
|
||||
|
||||
<div class="update-check" *ngIf="checkResult">
|
||||
<div class="update-available" *ngIf="checkResult.available">
|
||||
<div class="update-icon">!</div>
|
||||
<div class="update-info">
|
||||
<h4>Update Available</h4>
|
||||
<p>Version {{ checkResult.latestVersion }} is available</p>
|
||||
<a *ngIf="checkResult.release?.htmlUrl"
|
||||
[href]="checkResult.release.htmlUrl"
|
||||
target="_blank"
|
||||
class="release-link">
|
||||
View Release Notes
|
||||
</a>
|
||||
</div>
|
||||
<button class="btn btn--primary" (click)="installUpdate()" [disabled]="isInstalling">
|
||||
{{ isInstalling ? 'Installing...' : 'Install Update' }}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<div class="up-to-date" *ngIf="!checkResult.available && !checkResult.error">
|
||||
<div class="check-icon">OK</div>
|
||||
<div class="check-info">
|
||||
<h4>Up to Date</h4>
|
||||
<p>You're running the latest version</p>
|
||||
<span class="last-check" *ngIf="checkResult.checkedAt">
|
||||
Last checked: {{ checkResult.checkedAt | date:'short' }}
|
||||
</span>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="check-error" *ngIf="checkResult.error">
|
||||
<div class="error-icon">X</div>
|
||||
<div class="error-info">
|
||||
<h4>Check Failed</h4>
|
||||
<p>{{ checkResult.error }}</p>
|
||||
</div>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="check-button-row">
|
||||
<button class="btn btn--secondary" (click)="checkForUpdates()" [disabled]="isChecking">
|
||||
{{ isChecking ? 'Checking...' : 'Check for Updates' }}
|
||||
</button>
|
||||
</div>
|
||||
|
||||
<div class="settings-section">
|
||||
<h3>Update Channel</h3>
|
||||
<p class="section-description">Choose which release channel to follow for updates.</p>
|
||||
|
||||
<div class="channel-options">
|
||||
<label class="channel-option" *ngFor="let channel of channels"
|
||||
[class.selected]="settings.channel === channel.id">
|
||||
<input type="radio"
|
||||
[name]="'channel'"
|
||||
[value]="channel.id"
|
||||
[(ngModel)]="settings.channel"
|
||||
(change)="onSettingsChange()">
|
||||
<div class="channel-content">
|
||||
<span class="channel-name">{{ channel.name }}</span>
|
||||
<span class="channel-desc">{{ channel.description }}</span>
|
||||
</div>
|
||||
</label>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="settings-section">
|
||||
<h3>Automatic Updates</h3>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="checkbox-label">
|
||||
<input type="checkbox"
|
||||
[(ngModel)]="settings.autoUpdate"
|
||||
(change)="onSettingsChange()">
|
||||
<span>Automatically install updates</span>
|
||||
</label>
|
||||
<p class="setting-hint">When enabled, updates will be installed automatically on app restart.</p>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Check Interval</label>
|
||||
<select class="form-select"
|
||||
[(ngModel)]="settings.checkInterval"
|
||||
(change)="onSettingsChange()">
|
||||
<option [value]="0">Disabled</option>
|
||||
<option [value]="1">Every hour</option>
|
||||
<option [value]="6">Every 6 hours</option>
|
||||
<option [value]="12">Every 12 hours</option>
|
||||
<option [value]="24">Daily</option>
|
||||
<option [value]="168">Weekly</option>
|
||||
</select>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="save-status" *ngIf="saveMessage">
|
||||
<span [class.error]="saveError">{{ saveMessage }}</span>
|
||||
</div>
|
||||
</div>
|
||||
`,
|
||||
styles: [`
|
||||
.updates-settings {
|
||||
padding: var(--spacing-md);
|
||||
}
|
||||
|
||||
.current-version {
|
||||
background: var(--bg-tertiary);
|
||||
border-radius: var(--radius-lg);
|
||||
padding: var(--spacing-lg);
|
||||
margin-bottom: var(--spacing-lg);
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.version-badge {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
gap: var(--spacing-sm);
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.version-number {
|
||||
font-size: 24px;
|
||||
font-weight: 600;
|
||||
}
|
||||
|
||||
.channel-badge {
|
||||
padding: 2px 8px;
|
||||
border-radius: 999px;
|
||||
font-size: 11px;
|
||||
font-weight: 600;
|
||||
text-transform: uppercase;
|
||||
}
|
||||
|
||||
.channel-stable { background: var(--accent-success); color: white; }
|
||||
.channel-beta { background: var(--accent-warning); color: black; }
|
||||
.channel-nightly { background: var(--accent-purple, #8b5cf6); color: white; }
|
||||
.channel-dev { background: var(--text-muted); color: var(--bg-primary); }
|
||||
|
||||
.build-info {
|
||||
color: var(--text-muted);
|
||||
font-size: 12px;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.update-check {
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.update-available, .up-to-date, .check-error {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-md);
|
||||
padding: var(--spacing-md);
|
||||
border-radius: var(--radius-md);
|
||||
}
|
||||
|
||||
.update-available {
|
||||
background: var(--accent-warning-bg, rgba(245, 158, 11, 0.1));
|
||||
border: 1px solid var(--accent-warning);
|
||||
}
|
||||
|
||||
.up-to-date {
|
||||
background: var(--accent-success-bg, rgba(34, 197, 94, 0.1));
|
||||
border: 1px solid var(--accent-success);
|
||||
}
|
||||
|
||||
.check-error {
|
||||
background: var(--accent-danger-bg, rgba(239, 68, 68, 0.1));
|
||||
border: 1px solid var(--accent-danger);
|
||||
}
|
||||
|
||||
.update-icon, .check-icon, .error-icon {
|
||||
width: 40px;
|
||||
height: 40px;
|
||||
border-radius: 50%;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
font-weight: bold;
|
||||
flex-shrink: 0;
|
||||
}
|
||||
|
||||
.update-icon { background: var(--accent-warning); color: black; }
|
||||
.check-icon { background: var(--accent-success); color: white; }
|
||||
.error-icon { background: var(--accent-danger); color: white; }
|
||||
|
||||
.update-info, .check-info, .error-info {
|
||||
flex: 1;
|
||||
}
|
||||
|
||||
.update-info h4, .check-info h4, .error-info h4 {
|
||||
margin: 0 0 var(--spacing-xs) 0;
|
||||
font-size: 14px;
|
||||
}
|
||||
|
||||
.update-info p, .check-info p, .error-info p {
|
||||
margin: 0;
|
||||
font-size: 13px;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
.release-link {
|
||||
color: var(--accent-primary);
|
||||
font-size: 12px;
|
||||
}
|
||||
|
||||
.last-check {
|
||||
font-size: 11px;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
.check-button-row {
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.settings-section {
|
||||
background: var(--bg-primary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-lg);
|
||||
padding: var(--spacing-lg);
|
||||
margin-bottom: var(--spacing-lg);
|
||||
}
|
||||
|
||||
.settings-section h3 {
|
||||
font-size: 14px;
|
||||
margin: 0 0 var(--spacing-xs) 0;
|
||||
}
|
||||
|
||||
.section-description {
|
||||
color: var(--text-muted);
|
||||
font-size: 12px;
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.channel-options {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.channel-option {
|
||||
display: flex;
|
||||
align-items: flex-start;
|
||||
gap: var(--spacing-sm);
|
||||
padding: var(--spacing-md);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-md);
|
||||
cursor: pointer;
|
||||
transition: all 0.15s ease;
|
||||
}
|
||||
|
||||
.channel-option:hover {
|
||||
border-color: var(--accent-primary);
|
||||
}
|
||||
|
||||
.channel-option.selected {
|
||||
border-color: var(--accent-primary);
|
||||
background: var(--accent-primary-bg, rgba(59, 130, 246, 0.1));
|
||||
}
|
||||
|
||||
.channel-option input[type="radio"] {
|
||||
margin-top: 2px;
|
||||
}
|
||||
|
||||
.channel-content {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
gap: 2px;
|
||||
}
|
||||
|
||||
.channel-name {
|
||||
font-weight: 500;
|
||||
font-size: 14px;
|
||||
}
|
||||
|
||||
.channel-desc {
|
||||
font-size: 12px;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
.form-group {
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.form-group:last-child {
|
||||
margin-bottom: 0;
|
||||
}
|
||||
|
||||
.checkbox-label {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-sm);
|
||||
cursor: pointer;
|
||||
}
|
||||
|
||||
.setting-hint {
|
||||
color: var(--text-muted);
|
||||
font-size: 12px;
|
||||
margin: var(--spacing-xs) 0 0 24px;
|
||||
}
|
||||
|
||||
.form-label {
|
||||
display: block;
|
||||
font-size: 13px;
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.form-select {
|
||||
width: 100%;
|
||||
padding: var(--spacing-sm);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-md);
|
||||
background: var(--bg-secondary);
|
||||
color: var(--text-primary);
|
||||
font-size: 14px;
|
||||
}
|
||||
|
||||
.save-status {
|
||||
text-align: center;
|
||||
font-size: 13px;
|
||||
color: var(--accent-success);
|
||||
}
|
||||
|
||||
.save-status .error {
|
||||
color: var(--accent-danger);
|
||||
}
|
||||
|
||||
.btn {
|
||||
padding: var(--spacing-sm) var(--spacing-md);
|
||||
border: none;
|
||||
border-radius: var(--radius-md);
|
||||
font-size: 14px;
|
||||
cursor: pointer;
|
||||
transition: all 0.15s ease;
|
||||
}
|
||||
|
||||
.btn:disabled {
|
||||
opacity: 0.6;
|
||||
cursor: not-allowed;
|
||||
}
|
||||
|
||||
.btn--primary {
|
||||
background: var(--accent-primary);
|
||||
color: white;
|
||||
}
|
||||
|
||||
.btn--primary:hover:not(:disabled) {
|
||||
background: var(--accent-primary-hover, #2563eb);
|
||||
}
|
||||
|
||||
.btn--secondary {
|
||||
background: var(--bg-tertiary);
|
||||
color: var(--text-primary);
|
||||
border: 1px solid var(--border-color);
|
||||
}
|
||||
|
||||
.btn--secondary:hover:not(:disabled) {
|
||||
background: var(--bg-secondary);
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class UpdatesComponent implements OnInit, OnDestroy {
|
||||
settings: UpdateSettings = {
|
||||
channel: 'stable',
|
||||
autoUpdate: false,
|
||||
checkInterval: 6,
|
||||
lastCheck: ''
|
||||
};
|
||||
|
||||
versionInfo: VersionInfo | null = null;
|
||||
checkResult: UpdateCheckResult | null = null;
|
||||
|
||||
channels: ChannelInfo[] = [
|
||||
{ id: 'stable', name: 'Stable', description: 'Production releases - most stable, recommended for most users' },
|
||||
{ id: 'beta', name: 'Beta', description: 'Pre-release builds - new features being tested before stable release' },
|
||||
{ id: 'nightly', name: 'Nightly', description: 'Latest development builds - bleeding edge, may be unstable' }
|
||||
];
|
||||
|
||||
isChecking = false;
|
||||
isInstalling = false;
|
||||
saveMessage = '';
|
||||
saveError = false;
|
||||
|
||||
private saveTimeout: ReturnType<typeof setTimeout> | null = null;
|
||||
|
||||
ngOnInit() {
|
||||
this.loadSettings();
|
||||
this.loadVersionInfo();
|
||||
}
|
||||
|
||||
ngOnDestroy() {
|
||||
if (this.saveTimeout) {
|
||||
clearTimeout(this.saveTimeout);
|
||||
}
|
||||
}
|
||||
|
||||
async loadSettings() {
|
||||
try {
|
||||
const wails = (window as any).go?.main;
|
||||
if (wails?.UpdateService?.GetSettings) {
|
||||
this.settings = await wails.UpdateService.GetSettings();
|
||||
} else if (wails?.ConfigService?.GetUpdateSettings) {
|
||||
this.settings = await wails.ConfigService.GetUpdateSettings();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to load update settings:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async loadVersionInfo() {
|
||||
try {
|
||||
const wails = (window as any).go?.main;
|
||||
if (wails?.VersionService?.GetVersionInfo) {
|
||||
this.versionInfo = await wails.VersionService.GetVersionInfo();
|
||||
} else if (wails?.UpdateService?.GetVersionInfo) {
|
||||
this.versionInfo = await wails.UpdateService.GetVersionInfo();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to load version info:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async checkForUpdates() {
|
||||
this.isChecking = true;
|
||||
this.checkResult = null;
|
||||
|
||||
try {
|
||||
const wails = (window as any).go?.main;
|
||||
if (wails?.UpdateService?.CheckForUpdate) {
|
||||
this.checkResult = await wails.UpdateService.CheckForUpdate();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to check for updates:', err);
|
||||
this.checkResult = {
|
||||
available: false,
|
||||
currentVersion: this.versionInfo?.version || 'unknown',
|
||||
latestVersion: '',
|
||||
error: 'Failed to check for updates',
|
||||
checkedAt: new Date().toISOString()
|
||||
};
|
||||
} finally {
|
||||
this.isChecking = false;
|
||||
}
|
||||
}
|
||||
|
||||
async installUpdate() {
|
||||
if (!this.checkResult?.available || !this.checkResult.release) {
|
||||
return;
|
||||
}
|
||||
|
||||
this.isInstalling = true;
|
||||
|
||||
try {
|
||||
const wails = (window as any).go?.main;
|
||||
if (wails?.UpdateService?.InstallUpdate) {
|
||||
await wails.UpdateService.InstallUpdate();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to install update:', err);
|
||||
alert('Failed to install update. Please try again or download manually.');
|
||||
} finally {
|
||||
this.isInstalling = false;
|
||||
}
|
||||
}
|
||||
|
||||
async onSettingsChange() {
|
||||
// Debounce save
|
||||
if (this.saveTimeout) {
|
||||
clearTimeout(this.saveTimeout);
|
||||
}
|
||||
|
||||
this.saveTimeout = setTimeout(() => this.saveSettings(), 500);
|
||||
}
|
||||
|
||||
async saveSettings() {
|
||||
try {
|
||||
const wails = (window as any).go?.main;
|
||||
if (wails?.UpdateService?.SetSettings) {
|
||||
await wails.UpdateService.SetSettings(this.settings);
|
||||
} else if (wails?.ConfigService?.SetUpdateSettings) {
|
||||
await wails.ConfigService.SetUpdateSettings(this.settings);
|
||||
}
|
||||
this.saveMessage = 'Settings saved';
|
||||
this.saveError = false;
|
||||
} catch (err) {
|
||||
console.error('Failed to save update settings:', err);
|
||||
this.saveMessage = 'Failed to save settings';
|
||||
this.saveError = true;
|
||||
}
|
||||
|
||||
// Clear message after 2 seconds
|
||||
setTimeout(() => {
|
||||
this.saveMessage = '';
|
||||
}, 2000);
|
||||
}
|
||||
}
|
||||
296 cmd/bugseti/frontend/src/app/tray/tray.component.ts Normal file
@@ -0,0 +1,296 @@
|
|||
import { Component, OnInit, OnDestroy } from '@angular/core';
|
||||
import { CommonModule } from '@angular/common';
|
||||
|
||||
interface TrayStatus {
|
||||
running: boolean;
|
||||
currentIssue: string;
|
||||
queueSize: number;
|
||||
issuesFixed: number;
|
||||
prsMerged: number;
|
||||
}
|
||||
|
||||
@Component({
|
||||
selector: 'app-tray',
|
||||
standalone: true,
|
||||
imports: [CommonModule],
|
||||
template: `
|
||||
<div class="tray-panel">
|
||||
<header class="tray-header">
|
||||
<div class="logo">
|
||||
<span class="logo-icon">B</span>
|
||||
<span class="logo-text">BugSETI</span>
|
||||
</div>
|
||||
<span class="badge" [class.badge--success]="status.running" [class.badge--warning]="!status.running">
|
||||
{{ status.running ? 'Running' : 'Paused' }}
|
||||
</span>
|
||||
</header>
|
||||
|
||||
<section class="stats-grid">
|
||||
<div class="stat-card">
|
||||
<span class="stat-value">{{ status.queueSize }}</span>
|
||||
<span class="stat-label">In Queue</span>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<span class="stat-value">{{ status.issuesFixed }}</span>
|
||||
<span class="stat-label">Fixed</span>
|
||||
</div>
|
||||
<div class="stat-card">
|
||||
<span class="stat-value">{{ status.prsMerged }}</span>
|
||||
<span class="stat-label">Merged</span>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="current-issue" *ngIf="status.currentIssue">
|
||||
<h3>Current Issue</h3>
|
||||
<div class="issue-card">
|
||||
<p class="issue-title">{{ status.currentIssue }}</p>
|
||||
<div class="issue-actions">
|
||||
<button class="btn btn--primary btn--sm" (click)="openWorkbench()">
|
||||
Open Workbench
|
||||
</button>
|
||||
<button class="btn btn--secondary btn--sm" (click)="skipIssue()">
|
||||
Skip
|
||||
</button>
|
||||
</div>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<section class="current-issue" *ngIf="!status.currentIssue">
|
||||
<div class="empty-state">
|
||||
<span class="empty-icon">[ ]</span>
|
||||
<p>No issue in progress</p>
|
||||
<button class="btn btn--primary btn--sm" (click)="nextIssue()" [disabled]="status.queueSize === 0">
|
||||
Get Next Issue
|
||||
</button>
|
||||
</div>
|
||||
</section>
|
||||
|
||||
<footer class="tray-footer">
|
||||
<button class="btn btn--secondary btn--sm" (click)="toggleRunning()">
|
||||
{{ status.running ? 'Pause' : 'Start' }}
|
||||
</button>
|
||||
<button class="btn btn--secondary btn--sm" (click)="openSettings()">
|
||||
Settings
|
||||
</button>
|
||||
</footer>
|
||||
</div>
|
||||
`,
|
||||
styles: [`
|
||||
.tray-panel {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
height: 100%;
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-primary);
|
||||
}
|
||||
|
||||
.tray-header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.logo {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.logo-icon {
|
||||
width: 28px;
|
||||
height: 28px;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
background: linear-gradient(135deg, var(--accent-primary), var(--accent-success));
|
||||
border-radius: var(--radius-md);
|
||||
font-weight: bold;
|
||||
color: white;
|
||||
}
|
||||
|
||||
.logo-text {
|
||||
font-weight: 600;
|
||||
font-size: 16px;
|
||||
}
|
||||
|
||||
.stats-grid {
|
||||
display: grid;
|
||||
grid-template-columns: repeat(3, 1fr);
|
||||
gap: var(--spacing-sm);
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.stat-card {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
padding: var(--spacing-sm);
|
||||
background-color: var(--bg-secondary);
|
||||
border-radius: var(--radius-md);
|
||||
}
|
||||
|
||||
.stat-value {
|
||||
font-size: 24px;
|
||||
font-weight: bold;
|
||||
color: var(--accent-primary);
|
||||
}
|
||||
|
||||
.stat-label {
|
||||
font-size: 11px;
|
||||
color: var(--text-muted);
|
||||
text-transform: uppercase;
|
||||
}
|
||||
|
||||
.current-issue {
|
||||
flex: 1;
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.current-issue h3 {
|
||||
font-size: 12px;
|
||||
color: var(--text-muted);
|
||||
text-transform: uppercase;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.issue-card {
|
||||
background-color: var(--bg-secondary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-md);
|
||||
padding: var(--spacing-md);
|
||||
}
|
||||
|
||||
.issue-title {
|
||||
font-size: 13px;
|
||||
line-height: 1.4;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.issue-actions {
|
||||
display: flex;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.empty-state {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
padding: var(--spacing-xl);
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.empty-icon {
|
||||
font-size: 32px;
|
||||
color: var(--text-muted);
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.empty-state p {
|
||||
color: var(--text-muted);
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.tray-footer {
|
||||
display: flex;
|
||||
gap: var(--spacing-sm);
|
||||
justify-content: center;
|
||||
}
|
||||
|
||||
.btn--sm {
|
||||
padding: var(--spacing-xs) var(--spacing-sm);
|
||||
font-size: 12px;
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class TrayComponent implements OnInit, OnDestroy {
|
||||
status: TrayStatus = {
|
||||
running: false,
|
||||
currentIssue: '',
|
||||
queueSize: 0,
|
||||
issuesFixed: 0,
|
||||
prsMerged: 0
|
||||
};
|
||||
|
||||
private refreshInterval?: ReturnType<typeof setInterval>;
|
||||
|
||||
ngOnInit() {
|
||||
this.loadStatus();
|
||||
this.refreshInterval = setInterval(() => this.loadStatus(), 5000);
|
||||
}
|
||||
|
||||
ngOnDestroy() {
|
||||
if (this.refreshInterval) {
|
||||
clearInterval(this.refreshInterval);
|
||||
}
|
||||
}
|
||||
|
||||
async loadStatus() {
|
||||
try {
|
||||
// Call Wails binding when available
|
||||
if ((window as any).go?.main?.TrayService?.GetStatus) {
|
||||
this.status = await (window as any).go.main.TrayService.GetStatus();
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to load status:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async toggleRunning() {
|
||||
try {
|
||||
if (this.status.running) {
|
||||
if ((window as any).go?.main?.TrayService?.PauseFetching) {
|
||||
await (window as any).go.main.TrayService.PauseFetching();
|
||||
}
|
||||
} else {
|
||||
if ((window as any).go?.main?.TrayService?.StartFetching) {
|
||||
await (window as any).go.main.TrayService.StartFetching();
|
||||
}
|
||||
}
|
||||
this.loadStatus();
|
||||
} catch (err) {
|
||||
console.error('Failed to toggle running:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async nextIssue() {
|
||||
try {
|
||||
if ((window as any).go?.main?.TrayService?.NextIssue) {
|
||||
await (window as any).go.main.TrayService.NextIssue();
|
||||
}
|
||||
this.loadStatus();
|
||||
} catch (err) {
|
||||
console.error('Failed to get next issue:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async skipIssue() {
|
||||
try {
|
||||
if ((window as any).go?.main?.TrayService?.SkipIssue) {
|
||||
await (window as any).go.main.TrayService.SkipIssue();
|
||||
}
|
||||
this.loadStatus();
|
||||
} catch (err) {
|
||||
console.error('Failed to skip issue:', err);
|
||||
}
|
||||
}
|
||||
|
||||
openWorkbench() {
|
||||
if ((window as any).wails?.Window) {
|
||||
(window as any).wails.Window.GetByName('workbench').then((w: any) => {
|
||||
w.Show();
|
||||
w.Focus();
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
openSettings() {
|
||||
if ((window as any).wails?.Window) {
|
||||
(window as any).wails.Window.GetByName('settings').then((w: any) => {
|
||||
w.Show();
|
||||
w.Focus();
|
||||
});
|
||||
}
|
||||
}
|
||||
}
|
||||
356 cmd/bugseti/frontend/src/app/workbench/workbench.component.ts Normal file
@@ -0,0 +1,356 @@
|
|||
import { Component, OnInit } from '@angular/core';
|
||||
import { CommonModule } from '@angular/common';
|
||||
import { FormsModule } from '@angular/forms';
|
||||
|
||||
interface Issue {
|
||||
id: string;
|
||||
number: number;
|
||||
repo: string;
|
||||
title: string;
|
||||
body: string;
|
||||
url: string;
|
||||
labels: string[];
|
||||
author: string;
|
||||
context?: IssueContext;
|
||||
}
|
||||
|
||||
interface IssueContext {
|
||||
summary: string;
|
||||
relevantFiles: string[];
|
||||
suggestedFix: string;
|
||||
complexity: string;
|
||||
estimatedTime: string;
|
||||
}
|
||||
|
||||
@Component({
|
||||
selector: 'app-workbench',
|
||||
standalone: true,
|
||||
imports: [CommonModule, FormsModule],
|
||||
template: `
|
||||
<div class="workbench">
|
||||
<header class="workbench-header">
|
||||
<h1>BugSETI Workbench</h1>
|
||||
<div class="header-actions">
|
||||
<button class="btn btn--secondary" (click)="skipIssue()">Skip</button>
|
||||
<button class="btn btn--success" (click)="submitPR()" [disabled]="!canSubmit">Submit PR</button>
|
||||
</div>
|
||||
</header>
|
||||
|
||||
<div class="workbench-content" *ngIf="currentIssue">
|
||||
<aside class="issue-panel">
|
||||
<div class="card">
|
||||
<div class="card__header">
|
||||
<h2 class="card__title">Issue #{{ currentIssue.number }}</h2>
|
||||
<a [href]="currentIssue.url" target="_blank" class="btn btn--secondary btn--sm">View on GitHub</a>
|
||||
</div>
|
||||
|
||||
<h3>{{ currentIssue.title }}</h3>
|
||||
|
||||
<div class="labels">
|
||||
<span class="badge badge--primary" *ngFor="let label of currentIssue.labels">
|
||||
{{ label }}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<div class="issue-meta">
|
||||
<span>{{ currentIssue.repo }}</span>
|
||||
<span>by {{ currentIssue.author }}</span>
|
||||
</div>
|
||||
|
||||
<div class="issue-body">
|
||||
<pre>{{ currentIssue.body }}</pre>
|
||||
</div>
|
||||
</div>
|
||||
|
||||
<div class="card" *ngIf="currentIssue.context">
|
||||
<div class="card__header">
|
||||
<h2 class="card__title">AI Context</h2>
|
||||
<span class="badge" [ngClass]="{
|
||||
'badge--success': currentIssue.context.complexity === 'easy',
|
||||
'badge--warning': currentIssue.context.complexity === 'medium',
|
||||
'badge--danger': currentIssue.context.complexity === 'hard'
|
||||
}">
|
||||
{{ currentIssue.context.complexity }}
|
||||
</span>
|
||||
</div>
|
||||
|
||||
<p class="context-summary">{{ currentIssue.context.summary }}</p>
|
||||
|
||||
<div class="context-section" *ngIf="currentIssue.context.relevantFiles?.length">
|
||||
<h4>Relevant Files</h4>
|
||||
<ul class="file-list">
|
||||
<li *ngFor="let file of currentIssue.context.relevantFiles">
|
||||
<code>{{ file }}</code>
|
||||
</li>
|
||||
</ul>
|
||||
</div>
|
||||
|
||||
<div class="context-section" *ngIf="currentIssue.context.suggestedFix">
|
||||
<h4>Suggested Approach</h4>
|
||||
<p>{{ currentIssue.context.suggestedFix }}</p>
|
||||
</div>
|
||||
|
||||
<div class="context-meta">
|
||||
<span>Est. time: {{ currentIssue.context.estimatedTime || 'Unknown' }}</span>
|
||||
</div>
|
||||
</div>
|
||||
</aside>
|
||||
|
||||
<main class="editor-panel">
|
||||
<div class="card">
|
||||
<div class="card__header">
|
||||
<h2 class="card__title">PR Details</h2>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">PR Title</label>
|
||||
<input type="text" class="form-input" [(ngModel)]="prTitle"
|
||||
[placeholder]="'Fix #' + currentIssue.number + ': ' + currentIssue.title">
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">PR Description</label>
|
||||
<textarea class="form-textarea" [(ngModel)]="prBody" rows="8"
|
||||
placeholder="Describe your changes..."></textarea>
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Branch Name</label>
|
||||
<input type="text" class="form-input" [(ngModel)]="branchName"
|
||||
[placeholder]="'bugseti/issue-' + currentIssue.number">
|
||||
</div>
|
||||
|
||||
<div class="form-group">
|
||||
<label class="form-label">Commit Message</label>
|
||||
<textarea class="form-textarea" [(ngModel)]="commitMessage" rows="3"
|
||||
[placeholder]="'fix: resolve issue #' + currentIssue.number"></textarea>
|
||||
</div>
|
||||
</div>
|
||||
</main>
|
||||
</div>
|
||||
|
||||
<div class="empty-state" *ngIf="!currentIssue">
|
||||
<h2>No Issue Selected</h2>
|
||||
<p>Get an issue from the queue to start working.</p>
|
||||
<button class="btn btn--primary" (click)="nextIssue()">Get Next Issue</button>
|
||||
</div>
|
||||
</div>
|
||||
`,
|
||||
styles: [`
|
||||
.workbench {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
height: 100%;
|
||||
background-color: var(--bg-secondary);
|
||||
}
|
||||
|
||||
.workbench-header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
padding: var(--spacing-md) var(--spacing-lg);
|
||||
background-color: var(--bg-primary);
|
||||
border-bottom: 1px solid var(--border-color);
|
||||
}
|
||||
|
||||
.workbench-header h1 {
|
||||
font-size: 18px;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.header-actions {
|
||||
display: flex;
|
||||
gap: var(--spacing-sm);
|
||||
}
|
||||
|
||||
.workbench-content {
|
||||
display: grid;
|
||||
grid-template-columns: 400px 1fr;
|
||||
flex: 1;
|
||||
overflow: hidden;
|
||||
}
|
||||
|
||||
.issue-panel {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
gap: var(--spacing-md);
|
||||
padding: var(--spacing-md);
|
||||
overflow-y: auto;
|
||||
border-right: 1px solid var(--border-color);
|
||||
}
|
||||
|
||||
.editor-panel {
|
||||
padding: var(--spacing-md);
|
||||
overflow-y: auto;
|
||||
}
|
||||
|
||||
.labels {
|
||||
display: flex;
|
||||
flex-wrap: wrap;
|
||||
gap: var(--spacing-xs);
|
||||
margin: var(--spacing-sm) 0;
|
||||
}
|
||||
|
||||
.issue-meta {
|
||||
display: flex;
|
||||
gap: var(--spacing-md);
|
||||
font-size: 12px;
|
||||
color: var(--text-muted);
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.issue-body {
|
||||
padding: var(--spacing-md);
|
||||
background-color: var(--bg-tertiary);
|
||||
border-radius: var(--radius-md);
|
||||
max-height: 200px;
|
||||
overflow-y: auto;
|
||||
}
|
||||
|
||||
.issue-body pre {
|
||||
white-space: pre-wrap;
|
||||
word-wrap: break-word;
|
||||
font-size: 13px;
|
||||
line-height: 1.5;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.context-summary {
|
||||
color: var(--text-secondary);
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.context-section {
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.context-section h4 {
|
||||
font-size: 12px;
|
||||
text-transform: uppercase;
|
||||
color: var(--text-muted);
|
||||
margin-bottom: var(--spacing-xs);
|
||||
}
|
||||
|
||||
.file-list {
|
||||
list-style: none;
|
||||
padding: 0;
|
||||
margin: 0;
|
||||
}
|
||||
|
||||
.file-list li {
|
||||
padding: var(--spacing-xs) 0;
|
||||
}
|
||||
|
||||
.context-meta {
|
||||
font-size: 12px;
|
||||
color: var(--text-muted);
|
||||
}
|
||||
|
||||
.empty-state {
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
flex: 1;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
.empty-state h2 {
|
||||
color: var(--text-secondary);
|
||||
}
|
||||
|
||||
.empty-state p {
|
||||
color: var(--text-muted);
|
||||
}
|
||||
`]
|
||||
})
|
||||
export class WorkbenchComponent implements OnInit {
|
||||
currentIssue: Issue | null = null;
|
||||
prTitle = '';
|
||||
prBody = '';
|
||||
branchName = '';
|
||||
commitMessage = '';
|
||||
|
||||
get canSubmit(): boolean {
|
||||
return !!this.currentIssue && !!this.prTitle;
|
||||
}
|
||||
|
||||
ngOnInit() {
|
||||
this.loadCurrentIssue();
|
||||
}
|
||||
|
||||
async loadCurrentIssue() {
|
||||
try {
|
||||
if ((window as any).go?.main?.TrayService?.GetCurrentIssue) {
|
||||
this.currentIssue = await (window as any).go.main.TrayService.GetCurrentIssue();
|
||||
if (this.currentIssue) {
|
||||
this.initDefaults();
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to load current issue:', err);
|
||||
}
|
||||
}
|
||||
|
||||
initDefaults() {
|
||||
if (!this.currentIssue) return;
|
||||
|
||||
this.prTitle = `Fix #${this.currentIssue.number}: ${this.currentIssue.title}`;
|
||||
this.branchName = `bugseti/issue-${this.currentIssue.number}`;
|
||||
this.commitMessage = `fix: resolve issue #${this.currentIssue.number}\n\n${this.currentIssue.title}`;
|
||||
}
|
||||
|
||||
async nextIssue() {
|
||||
try {
|
||||
if ((window as any).go?.main?.TrayService?.NextIssue) {
|
||||
this.currentIssue = await (window as any).go.main.TrayService.NextIssue();
|
||||
if (this.currentIssue) {
|
||||
this.initDefaults();
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to get next issue:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async skipIssue() {
|
||||
try {
|
||||
if ((window as any).go?.main?.TrayService?.SkipIssue) {
|
||||
await (window as any).go.main.TrayService.SkipIssue();
|
||||
this.currentIssue = null;
|
||||
this.prTitle = '';
|
||||
this.prBody = '';
|
||||
this.branchName = '';
|
||||
this.commitMessage = '';
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to skip issue:', err);
|
||||
}
|
||||
}
|
||||
|
||||
async submitPR() {
|
||||
if (!this.currentIssue || !this.canSubmit) return;
|
||||
|
||||
try {
|
||||
if ((window as any).go?.main?.SubmitService?.Submit) {
|
||||
const result = await (window as any).go.main.SubmitService.Submit({
|
||||
issue: this.currentIssue,
|
||||
title: this.prTitle,
|
||||
body: this.prBody,
|
||||
branch: this.branchName,
|
||||
commitMsg: this.commitMessage
|
||||
});
|
||||
|
||||
if (result.success) {
|
||||
alert(`PR submitted successfully!\n\n${result.prUrl}`);
|
||||
this.currentIssue = null;
|
||||
} else {
|
||||
alert(`Failed to submit PR: ${result.error}`);
|
||||
}
|
||||
}
|
||||
} catch (err) {
|
||||
console.error('Failed to submit PR:', err);
|
||||
alert('Failed to submit PR. Check console for details.');
|
||||
}
|
||||
}
|
||||
}
|
||||
0 cmd/bugseti/frontend/src/favicon.ico Normal file
13 cmd/bugseti/frontend/src/index.html Normal file
@@ -0,0 +1,13 @@
|
|||
<!doctype html>
|
||||
<html lang="en">
|
||||
<head>
|
||||
<meta charset="utf-8">
|
||||
<title>BugSETI</title>
|
||||
<base href="/">
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1">
|
||||
<link rel="icon" type="image/x-icon" href="favicon.ico">
|
||||
</head>
|
||||
<body>
|
||||
<app-root></app-root>
|
||||
</body>
|
||||
</html>
|
||||
6 cmd/bugseti/frontend/src/main.ts Normal file
@@ -0,0 +1,6 @@
|
|||
import { bootstrapApplication } from '@angular/platform-browser';
|
||||
import { appConfig } from './app/app.config';
|
||||
import { AppComponent } from './app/app.component';
|
||||
|
||||
bootstrapApplication(AppComponent, appConfig)
|
||||
.catch((err) => console.error(err));
|
||||
268 cmd/bugseti/frontend/src/styles.scss Normal file
@@ -0,0 +1,268 @@
|
|||
// BugSETI Global Styles
|
||||
|
||||
// CSS Variables for theming
|
||||
:root {
|
||||
// Dark theme (default)
|
||||
--bg-primary: #161b22;
|
||||
--bg-secondary: #0d1117;
|
||||
--bg-tertiary: #21262d;
|
||||
--text-primary: #c9d1d9;
|
||||
--text-secondary: #8b949e;
|
||||
--text-muted: #6e7681;
|
||||
--border-color: #30363d;
|
||||
--accent-primary: #58a6ff;
|
||||
--accent-success: #3fb950;
|
||||
--accent-warning: #d29922;
|
||||
--accent-danger: #f85149;
|
||||
|
||||
// Spacing
|
||||
--spacing-xs: 4px;
|
||||
--spacing-sm: 8px;
|
||||
--spacing-md: 16px;
|
||||
--spacing-lg: 24px;
|
||||
--spacing-xl: 32px;
|
||||
|
||||
// Border radius
|
||||
--radius-sm: 4px;
|
||||
--radius-md: 6px;
|
||||
--radius-lg: 12px;
|
||||
|
||||
// Font
|
||||
--font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', 'Noto Sans', Helvetica, Arial, sans-serif;
|
||||
--font-mono: ui-monospace, SFMono-Regular, SF Mono, Menlo, Consolas, Liberation Mono, monospace;
|
||||
}
|
||||
|
||||
// Light theme
|
||||
[data-theme="light"] {
|
||||
--bg-primary: #ffffff;
|
||||
--bg-secondary: #f6f8fa;
|
||||
--bg-tertiary: #f0f3f6;
|
||||
--text-primary: #24292f;
|
||||
--text-secondary: #57606a;
|
||||
--text-muted: #8b949e;
|
||||
--border-color: #d0d7de;
|
||||
--accent-primary: #0969da;
|
||||
--accent-success: #1a7f37;
|
||||
--accent-warning: #9a6700;
|
||||
--accent-danger: #cf222e;
|
||||
}
|
||||
|
||||
// Reset
|
||||
*,
|
||||
*::before,
|
||||
*::after {
|
||||
box-sizing: border-box;
|
||||
margin: 0;
|
||||
padding: 0;
|
||||
}
|
||||
|
||||
html, body {
|
||||
height: 100%;
|
||||
width: 100%;
|
||||
}
|
||||
|
||||
body {
|
||||
font-family: var(--font-family);
|
||||
font-size: 14px;
|
||||
line-height: 1.5;
|
||||
color: var(--text-primary);
|
||||
background-color: var(--bg-primary);
|
||||
-webkit-font-smoothing: antialiased;
|
||||
-moz-osx-font-smoothing: grayscale;
|
||||
}
|
||||
|
||||
// Typography
|
||||
h1, h2, h3, h4, h5, h6 {
|
||||
font-weight: 600;
|
||||
line-height: 1.25;
|
||||
margin-bottom: var(--spacing-sm);
|
||||
}
|
||||
|
||||
h1 { font-size: 24px; }
|
||||
h2 { font-size: 20px; }
|
||||
h3 { font-size: 16px; }
|
||||
h4 { font-size: 14px; }
|
||||
|
||||
p {
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
a {
|
||||
color: var(--accent-primary);
|
||||
text-decoration: none;
|
||||
|
||||
&:hover {
|
||||
text-decoration: underline;
|
||||
}
|
||||
}
|
||||
|
||||
code {
|
||||
font-family: var(--font-mono);
|
||||
font-size: 12px;
|
||||
padding: 2px 6px;
|
||||
background-color: var(--bg-tertiary);
|
||||
border-radius: var(--radius-sm);
|
||||
}
|
||||
|
||||
// Buttons
|
||||
.btn {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
justify-content: center;
|
||||
gap: var(--spacing-xs);
|
||||
padding: var(--spacing-sm) var(--spacing-md);
|
||||
font-size: 14px;
|
||||
font-weight: 500;
|
||||
line-height: 1;
|
||||
border: 1px solid transparent;
|
||||
border-radius: var(--radius-md);
|
||||
cursor: pointer;
|
||||
transition: all 0.2s;
|
||||
|
||||
&:disabled {
|
||||
opacity: 0.5;
|
||||
cursor: not-allowed;
|
||||
}
|
||||
|
||||
&--primary {
|
||||
background-color: var(--accent-primary);
|
||||
color: white;
|
||||
|
||||
&:hover:not(:disabled) {
|
||||
opacity: 0.9;
|
||||
}
|
||||
}
|
||||
|
||||
&--secondary {
|
||||
background-color: var(--bg-tertiary);
|
||||
border-color: var(--border-color);
|
||||
color: var(--text-primary);
|
||||
|
||||
&:hover:not(:disabled) {
|
||||
background-color: var(--bg-secondary);
|
||||
}
|
||||
}
|
||||
|
||||
&--success {
|
||||
background-color: var(--accent-success);
|
||||
color: white;
|
||||
}
|
||||
|
||||
&--danger {
|
||||
background-color: var(--accent-danger);
|
||||
color: white;
|
||||
}
|
||||
}
|
||||
|
||||
// Forms
|
||||
.form-group {
|
||||
margin-bottom: var(--spacing-md);
|
||||
}
|
||||
|
||||
.form-label {
|
||||
display: block;
|
||||
margin-bottom: var(--spacing-xs);
|
||||
font-weight: 500;
|
||||
color: var(--text-primary);
|
||||
}
|
||||
|
||||
.form-input,
|
||||
.form-select,
|
||||
.form-textarea {
|
||||
width: 100%;
|
||||
padding: var(--spacing-sm) var(--spacing-md);
|
||||
font-size: 14px;
|
||||
background-color: var(--bg-secondary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-md);
|
||||
color: var(--text-primary);
|
||||
|
||||
&:focus {
|
||||
outline: none;
|
||||
border-color: var(--accent-primary);
|
||||
box-shadow: 0 0 0 3px rgba(88, 166, 255, 0.2);
|
||||
}
|
||||
|
||||
&::placeholder {
|
||||
color: var(--text-muted);
|
||||
}
|
||||
}
|
||||
|
||||
.form-textarea {
|
||||
resize: vertical;
|
||||
min-height: 100px;
|
||||
}
|
||||
|
||||
// Cards
|
||||
.card {
|
||||
background-color: var(--bg-secondary);
|
||||
border: 1px solid var(--border-color);
|
||||
border-radius: var(--radius-lg);
|
||||
padding: var(--spacing-md);
|
||||
|
||||
&__header {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
justify-content: space-between;
|
||||
margin-bottom: var(--spacing-md);
|
||||
padding-bottom: var(--spacing-sm);
|
||||
border-bottom: 1px solid var(--border-color);
|
||||
}
|
||||
|
||||
&__title {
|
||||
font-size: 16px;
|
||||
font-weight: 600;
|
||||
}
|
||||
}
|
||||
|
||||
// Badges
|
||||
.badge {
|
||||
display: inline-flex;
|
||||
align-items: center;
|
||||
padding: 2px 8px;
|
||||
font-size: 12px;
|
||||
font-weight: 500;
|
||||
border-radius: 999px;
|
||||
|
||||
&--primary {
|
||||
background-color: rgba(88, 166, 255, 0.15);
|
||||
color: var(--accent-primary);
|
||||
}
|
||||
|
||||
&--success {
|
||||
background-color: rgba(63, 185, 80, 0.15);
|
||||
color: var(--accent-success);
|
||||
}
|
||||
|
||||
&--warning {
|
||||
background-color: rgba(210, 153, 34, 0.15);
|
||||
color: var(--accent-warning);
|
||||
}
|
||||
|
||||
&--danger {
|
||||
background-color: rgba(248, 81, 73, 0.15);
|
||||
color: var(--accent-danger);
|
||||
}
|
||||
}
|
||||
|
||||
// Utility classes
|
||||
.text-center { text-align: center; }
|
||||
.text-right { text-align: right; }
|
||||
.text-muted { color: var(--text-muted); }
|
||||
.text-success { color: var(--accent-success); }
|
||||
.text-danger { color: var(--accent-danger); }
|
||||
.text-warning { color: var(--accent-warning); }
|
||||
|
||||
.flex { display: flex; }
|
||||
.flex-col { flex-direction: column; }
|
||||
.items-center { align-items: center; }
|
||||
.justify-between { justify-content: space-between; }
|
||||
.gap-sm { gap: var(--spacing-sm); }
|
||||
.gap-md { gap: var(--spacing-md); }
|
||||
|
||||
.mt-sm { margin-top: var(--spacing-sm); }
|
||||
.mt-md { margin-top: var(--spacing-md); }
|
||||
.mb-sm { margin-bottom: var(--spacing-sm); }
|
||||
.mb-md { margin-bottom: var(--spacing-md); }
|
||||
|
||||
.hidden { display: none; }
|
||||
13 cmd/bugseti/frontend/tsconfig.app.json Normal file
@@ -0,0 +1,13 @@
|
|||
{
|
||||
"extends": "./tsconfig.json",
|
||||
"compilerOptions": {
|
||||
"outDir": "./out-tsc/app",
|
||||
"types": []
|
||||
},
|
||||
"files": [
|
||||
"src/main.ts"
|
||||
],
|
||||
"include": [
|
||||
"src/**/*.d.ts"
|
||||
]
|
||||
}
|
||||
35 cmd/bugseti/frontend/tsconfig.json Normal file
@@ -0,0 +1,35 @@
|
|||
{
|
||||
"compileOnSave": false,
|
||||
"compilerOptions": {
|
||||
"baseUrl": "./",
|
||||
"outDir": "./dist/out-tsc",
|
||||
"forceConsistentCasingInFileNames": true,
|
||||
"strict": true,
|
||||
"noImplicitOverride": true,
|
||||
"noPropertyAccessFromIndexSignature": true,
|
||||
"noImplicitReturns": true,
|
||||
"noFallthroughCasesInSwitch": true,
|
||||
"esModuleInterop": true,
|
||||
"sourceMap": true,
|
||||
"declaration": false,
|
||||
"experimentalDecorators": true,
|
||||
"moduleResolution": "bundler",
|
||||
"importHelpers": true,
|
||||
"target": "ES2022",
|
||||
"module": "ES2022",
|
||||
"lib": [
|
||||
"ES2022",
|
||||
"dom"
|
||||
],
|
||||
"paths": {
|
||||
"@app/*": ["src/app/*"],
|
||||
"@shared/*": ["src/app/shared/*"]
|
||||
}
|
||||
},
|
||||
"angularCompilerOptions": {
|
||||
"enableI18nLegacyMessageIdFormat": false,
|
||||
"strictInjectionParameters": true,
|
||||
"strictInputAccessModifiers": true,
|
||||
"strictTemplates": true
|
||||
}
|
||||
}
|
||||
13 cmd/bugseti/frontend/tsconfig.spec.json Normal file
@@ -0,0 +1,13 @@
|
|||
{
|
||||
"extends": "./tsconfig.json",
|
||||
"compilerOptions": {
|
||||
"outDir": "./out-tsc/spec",
|
||||
"types": [
|
||||
"jasmine"
|
||||
]
|
||||
},
|
||||
"include": [
|
||||
"src/**/*.spec.ts",
|
||||
"src/**/*.d.ts"
|
||||
]
|
||||
}
|
||||
56 cmd/bugseti/go.mod Normal file
@@ -0,0 +1,56 @@
|
|||
module github.com/host-uk/core/cmd/bugseti
|
||||
|
||||
go 1.25.5
|
||||
|
||||
require (
|
||||
github.com/host-uk/core/internal/bugseti v0.0.0
|
||||
github.com/host-uk/core/internal/bugseti/updater v0.0.0
|
||||
github.com/wailsapp/wails/v3 v3.0.0-alpha.64
|
||||
)
|
||||
|
||||
replace github.com/host-uk/core/internal/bugseti => ../../internal/bugseti
|
||||
|
||||
replace github.com/host-uk/core/internal/bugseti/updater => ../../internal/bugseti/updater
|
||||
|
||||
require (
|
||||
dario.cat/mergo v1.0.2 // indirect
|
||||
github.com/Microsoft/go-winio v0.6.2 // indirect
|
||||
github.com/ProtonMail/go-crypto v1.3.0 // indirect
|
||||
github.com/adrg/xdg v0.5.3 // indirect
|
||||
github.com/bep/debounce v1.2.1 // indirect
|
||||
github.com/cloudflare/circl v1.6.3 // indirect
|
||||
github.com/coder/websocket v1.8.14 // indirect
|
||||
github.com/cyphar/filepath-securejoin v0.6.1 // indirect
|
||||
github.com/ebitengine/purego v0.9.1 // indirect
|
||||
github.com/emirpasic/gods v1.18.1 // indirect
|
||||
github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect
|
||||
github.com/go-git/go-billy/v5 v5.7.0 // indirect
|
||||
github.com/go-git/go-git/v5 v5.16.4 // indirect
|
||||
github.com/go-ole/go-ole v1.3.0 // indirect
|
||||
github.com/godbus/dbus/v5 v5.2.2 // indirect
|
||||
github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 // indirect
|
||||
github.com/google/uuid v1.6.0 // indirect
|
||||
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect
|
||||
github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1 // indirect
|
||||
github.com/kevinburke/ssh_config v1.4.0 // indirect
|
||||
github.com/klauspost/cpuid/v2 v2.3.0 // indirect
|
||||
github.com/leaanthony/go-ansi-parser v1.6.1 // indirect
|
||||
github.com/leaanthony/u v1.1.1 // indirect
|
||||
github.com/lmittmann/tint v1.1.2 // indirect
|
||||
github.com/mattn/go-colorable v0.1.14 // indirect
|
||||
github.com/mattn/go-isatty v0.0.20 // indirect
|
||||
github.com/pjbgf/sha1cd v0.5.0 // indirect
|
||||
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c // indirect
|
||||
github.com/rivo/uniseg v0.4.7 // indirect
|
||||
github.com/samber/lo v1.52.0 // indirect
|
||||
github.com/sergi/go-diff v1.4.0 // indirect
|
||||
github.com/skeema/knownhosts v1.3.2 // indirect
|
||||
github.com/wailsapp/go-webview2 v1.0.23 // indirect
|
||||
github.com/xanzy/ssh-agent v0.3.3 // indirect
|
||||
golang.org/x/crypto v0.47.0 // indirect
|
||||
golang.org/x/mod v0.32.0 // indirect
|
||||
golang.org/x/net v0.49.0 // indirect
|
||||
golang.org/x/sys v0.40.0 // indirect
|
||||
golang.org/x/text v0.33.0 // indirect
|
||||
gopkg.in/warnings.v0 v0.1.2 // indirect
|
||||
)
|
||||
151 cmd/bugseti/go.sum Normal file
@@ -0,0 +1,151 @@
|
|||
dario.cat/mergo v1.0.2 h1:85+piFYR1tMbRrLcDwR18y4UKJ3aH1Tbzi24VRW1TK8=
|
||||
dario.cat/mergo v1.0.2/go.mod h1:E/hbnu0NxMFBjpMIE34DRGLWqDy0g5FuKDhCb31ngxA=
|
||||
github.com/Microsoft/go-winio v0.5.2/go.mod h1:WpS1mjBmmwHBEWmogvA2mj8546UReBk4v8QkMxJ6pZY=
|
||||
github.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=
|
||||
github.com/Microsoft/go-winio v0.6.2/go.mod h1:yd8OoFMLzJbo9gZq8j5qaps8bJ9aShtEA8Ipt1oGCvU=
|
||||
github.com/ProtonMail/go-crypto v1.3.0 h1:ILq8+Sf5If5DCpHQp4PbZdS1J7HDFRXz/+xKBiRGFrw=
|
||||
github.com/ProtonMail/go-crypto v1.3.0/go.mod h1:9whxjD8Rbs29b4XWbB8irEcE8KHMqaR2e7GWU1R+/PE=
|
||||
github.com/adrg/xdg v0.5.3 h1:xRnxJXne7+oWDatRhR1JLnvuccuIeCoBu2rtuLqQB78=
|
||||
github.com/adrg/xdg v0.5.3/go.mod h1:nlTsY+NNiCBGCK2tpm09vRqfVzrc2fLmXGpBLF0zlTQ=
|
||||
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
|
||||
github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be/go.mod h1:ySMOLuWl6zY27l47sB3qLNK6tF2fkHG55UZxx8oIVo4=
|
||||
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
|
||||
github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5/go.mod h1:wHh0iHkYZB8zMSxRWpUBQtwG5a7fFgvEO+odwuTv2gs=
|
||||
github.com/bep/debounce v1.2.1 h1:v67fRdBA9UQu2NhLFXrSg0Brw7CexQekrBwDMM8bzeY=
|
||||
github.com/bep/debounce v1.2.1/go.mod h1:H8yggRPQKLUhUoqrJC1bO2xNya7vanpDl7xR3ISbCJ0=
|
||||
github.com/cloudflare/circl v1.6.3 h1:9GPOhQGF9MCYUeXyMYlqTR6a5gTrgR/fBLXvUgtVcg8=
|
||||
github.com/cloudflare/circl v1.6.3/go.mod h1:2eXP6Qfat4O/Yhh8BznvKnJ+uzEoTQ6jVKJRn81BiS4=
|
||||
github.com/coder/websocket v1.8.14 h1:9L0p0iKiNOibykf283eHkKUHHrpG7f65OE3BhhO7v9g=
|
||||
github.com/coder/websocket v1.8.14/go.mod h1:NX3SzP+inril6yawo5CQXx8+fk145lPDC6pumgx0mVg=
|
||||
github.com/cyphar/filepath-securejoin v0.6.1 h1:5CeZ1jPXEiYt3+Z6zqprSAgSWiggmpVyciv8syjIpVE=
|
||||
github.com/cyphar/filepath-securejoin v0.6.1/go.mod h1:A8hd4EnAeyujCJRrICiOWqjS1AX0a9kM5XL+NwKoYSc=
|
||||
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
|
||||
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
|
||||
github.com/ebitengine/purego v0.9.1 h1:a/k2f2HQU3Pi399RPW1MOaZyhKJL9w/xFpKAg4q1s0A=
|
||||
github.com/ebitengine/purego v0.9.1/go.mod h1:iIjxzd6CiRiOG0UyXP+V1+jWqUXVjPKLAI0mRfJZTmQ=
|
||||
github.com/elazarl/goproxy v1.7.2 h1:Y2o6urb7Eule09PjlhQRGNsqRfPmYI3KKQLFpCAV3+o=
|
||||
github.com/elazarl/goproxy v1.7.2/go.mod h1:82vkLNir0ALaW14Rc399OTTjyNREgmdL2cVoIbS6XaE=
|
||||
github.com/emirpasic/gods v1.18.1 h1:FXtiHYKDGKCW2KzwZKx0iC0PQmdlorYgdFG9jPXJ1Bc=
|
||||
github.com/emirpasic/gods v1.18.1/go.mod h1:8tpGGwCnJ5H4r6BWwaV6OrWmMoPhUl5jm/FMNAnJvWQ=
|
||||
github.com/gliderlabs/ssh v0.3.8 h1:a4YXD1V7xMF9g5nTkdfnja3Sxy1PVDCj1Zg4Wb8vY6c=
|
||||
github.com/gliderlabs/ssh v0.3.8/go.mod h1:xYoytBv1sV0aL3CavoDuJIQNURXkkfPA/wxQ1pL1fAU=
|
||||
github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 h1:+zs/tPmkDkHx3U66DAb0lQFJrpS6731Oaa12ikc+DiI=
|
||||
github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376/go.mod h1:an3vInlBmSxCcxctByoQdvwPiA7DTK7jaaFDBTtu0ic=
|
||||
github.com/go-git/go-billy/v5 v5.7.0 h1:83lBUJhGWhYp0ngzCMSgllhUSuoHP1iEWYjsPl9nwqM=
|
||||
github.com/go-git/go-billy/v5 v5.7.0/go.mod h1:/1IUejTKH8xipsAcdfcSAlUlo2J7lkYV8GTKxAT/L3E=
|
||||
github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399 h1:eMje31YglSBqCdIqdhKBW8lokaMrL3uTkpGYlE2OOT4=
|
||||
github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399/go.mod h1:1OCfN199q1Jm3HZlxleg+Dw/mwps2Wbk9frAWm+4FII=
|
||||
github.com/go-git/go-git/v5 v5.16.4 h1:7ajIEZHZJULcyJebDLo99bGgS0jRrOxzZG4uCk2Yb2Y=
|
||||
github.com/go-git/go-git/v5 v5.16.4/go.mod h1:4Ge4alE/5gPs30F2H1esi2gPd69R0C39lolkucHBOp8=
|
||||
github.com/go-json-experiment/json v0.0.0-20251027170946-4849db3c2f7e h1:Lf/gRkoycfOBPa42vU2bbgPurFong6zXeFtPoxholzU=
|
||||
github.com/go-json-experiment/json v0.0.0-20251027170946-4849db3c2f7e/go.mod h1:uNVvRXArCGbZ508SxYYTC5v1JWoz2voff5pm25jU1Ok=
|
||||
github.com/go-ole/go-ole v1.3.0 h1:Dt6ye7+vXGIKZ7Xtk4s6/xVdGDQynvom7xCFEdWr6uE=
|
||||
github.com/go-ole/go-ole v1.3.0/go.mod h1:5LS6F96DhAwUc7C+1HLexzMXY1xGRSryjyPPKW6zv78=
|
||||
github.com/godbus/dbus/v5 v5.2.2 h1:TUR3TgtSVDmjiXOgAAyaZbYmIeP3DPkld3jgKGV8mXQ=
|
||||
github.com/godbus/dbus/v5 v5.2.2/go.mod h1:3AAv2+hPq5rdnr5txxxRwiGjPXamgoIHgz9FPBfOp3c=
|
||||
github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 h1:f+oWsMOmNPc8JmEHVZIycC7hBoQxHH9pNKQORJNozsQ=
|
||||
github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8/go.mod h1:wcDNUvekVysuuOpQKo3191zZyTpiI6se1N1ULghS0sw=
|
||||
github.com/google/go-cmp v0.7.0 h1:wk8382ETsv4JYUZwIsn6YpYiWiBsYLSJiTsyBybVuN8=
|
||||
github.com/google/go-cmp v0.7.0/go.mod h1:pXiqmnSA92OHEEa9HXL2W4E7lf9JzCmGVUdgjX3N/iU=
|
||||
github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
|
||||
github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
|
||||
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=
|
||||
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=
|
||||
github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1 h1:njuLRcjAuMKr7kI3D85AXWkw6/+v9PwtV6M6o11sWHQ=
|
||||
github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1/go.mod h1:alcuEEnZsY1WQsagKhZDsoPCRoOijYqhZvPwLG0kzVs=
|
||||
github.com/kevinburke/ssh_config v1.4.0 h1:6xxtP5bZ2E4NF5tuQulISpTO2z8XbtH8cg1PWkxoFkQ=
|
||||
github.com/kevinburke/ssh_config v1.4.0/go.mod h1:q2RIzfka+BXARoNexmF9gkxEX7DmvbW9P4hIVx2Kg4M=
|
||||
github.com/klauspost/cpuid/v2 v2.3.0 h1:S4CRMLnYUhGeDFDqkGriYKdfoFlDnMtqTiI/sFzhA9Y=
|
||||
github.com/klauspost/cpuid/v2 v2.3.0/go.mod h1:hqwkgyIinND0mEev00jJYCxPNVRVXFQeu1XKlok6oO0=
|
||||
github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
|
||||
github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
|
||||
github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
|
||||
github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
|
||||
github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
|
||||
github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
|
||||
github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
|
||||
github.com/leaanthony/go-ansi-parser v1.6.1 h1:xd8bzARK3dErqkPFtoF9F3/HgN8UQk0ed1YDKpEz01A=
|
||||
github.com/leaanthony/go-ansi-parser v1.6.1/go.mod h1:+vva/2y4alzVmmIEpk9QDhA7vLC5zKDTRwfZGOp3IWU=
|
||||
github.com/leaanthony/u v1.1.1 h1:TUFjwDGlNX+WuwVEzDqQwC2lOv0P4uhTQw7CMFdiK7M=
|
||||
github.com/leaanthony/u v1.1.1/go.mod h1:9+o6hejoRljvZ3BzdYlVL0JYCwtnAsVuN9pVTQcaRfI=
|
||||
github.com/lmittmann/tint v1.1.2 h1:2CQzrL6rslrsyjqLDwD11bZ5OpLBPU+g3G/r5LSfS8w=
|
||||
github.com/lmittmann/tint v1.1.2/go.mod h1:HIS3gSy7qNwGCj+5oRjAutErFBl4BzdQP6cJZ0NfMwE=
|
||||
github.com/matryer/is v1.4.0/go.mod h1:8I/i5uYgLzgsgEloJE1U6xx5HkBQpAZvepWuujKwMRU=
|
||||
github.com/matryer/is v1.4.1 h1:55ehd8zaGABKLXQUe2awZ99BD/PTc2ls+KV/dXphgEQ=
|
||||
github.com/matryer/is v1.4.1/go.mod h1:8I/i5uYgLzgsgEloJE1U6xx5HkBQpAZvepWuujKwMRU=
|
||||
github.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=
|
||||
github.com/mattn/go-colorable v0.1.14/go.mod h1:6LmQG8QLFO4G5z1gPvYEzlUgJ2wF+stgPZH1UqBm1s8=
|
||||
github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
|
||||
github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
|
||||
github.com/onsi/gomega v1.34.1 h1:EUMJIKUjM8sKjYbtxQI9A4z2o+rruxnzNvpknOXie6k=
|
||||
github.com/onsi/gomega v1.34.1/go.mod h1:kU1QgUvBDLXBJq618Xvm2LUX6rSAfRaFRTcdOeDLwwY=
|
||||
github.com/pjbgf/sha1cd v0.5.0 h1:a+UkboSi1znleCDUNT3M5YxjOnN1fz2FhN48FlwCxs0=
|
||||
github.com/pjbgf/sha1cd v0.5.0/go.mod h1:lhpGlyHLpQZoxMv8HcgXvZEhcGs0PG/vsZnEJ7H0iCM=
|
||||
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c h1:+mdjkGKdHQG3305AYmdv1U2eRNDiU2ErMBj1gwrq8eQ=
|
||||
github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c/go.mod h1:7rwL4CYBLnjLxUqIJNnCWiEdr3bn6IUYi15bNlnbCCU=
|
||||
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
|
||||
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
|
||||
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
|
||||
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
|
||||
github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
|
||||
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
|
||||
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
||||
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
|
||||
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
|
||||
github.com/samber/lo v1.52.0 h1:Rvi+3BFHES3A8meP33VPAxiBZX/Aws5RxrschYGjomw=
|
||||
github.com/samber/lo v1.52.0/go.mod h1:4+MXEGsJzbKGaUEQFKBq2xtfuznW9oz/WrgyzMzRoM0=
|
||||
github.com/sergi/go-diff v1.4.0 h1:n/SP9D5ad1fORl+llWyN+D6qoUETXNZARKjyY2/KVCw=
|
||||
github.com/sergi/go-diff v1.4.0/go.mod h1:A0bzQcvG0E7Rwjx0REVgAGH58e96+X0MeOfepqsbeW4=
|
||||
github.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
|
||||
github.com/skeema/knownhosts v1.3.2 h1:EDL9mgf4NzwMXCTfaxSD/o/a5fxDw/xL9nkU28JjdBg=
|
||||
github.com/skeema/knownhosts v1.3.2/go.mod h1:bEg3iQAuw+jyiw+484wwFJoKSLwcfd7fqRy+N0QTiow=
|
||||
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
||||
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
|
||||
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
|
||||
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
|
||||
github.com/wailsapp/go-webview2 v1.0.23 h1:jmv8qhz1lHibCc79bMM/a/FqOnnzOGEisLav+a0b9P0=
|
||||
github.com/wailsapp/go-webview2 v1.0.23/go.mod h1:qJmWAmAmaniuKGZPWwne+uor3AHMB5PFhqiK0Bbj8kc=
|
||||
github.com/wailsapp/wails/v3 v3.0.0-alpha.64 h1:xAhLFVfdbg7XdZQ5mMQmBv2BglWu8hMqe50Z+3UJvBs=
|
||||
github.com/wailsapp/wails/v3 v3.0.0-alpha.64/go.mod h1:zvgNL/mlFcX8aRGu6KOz9AHrMmTBD+4hJRQIONqF/Yw=
|
||||
github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
|
||||
github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
|
||||
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
|
||||
golang.org/x/crypto v0.47.0 h1:V6e3FRj+n4dbpw86FJ8Fv7XVOql7TEwpHapKoMJ/GO8=
|
||||
golang.org/x/crypto v0.47.0/go.mod h1:ff3Y9VzzKbwSSEzWqJsJVBnWmRwRSHt/6Op5n9bQc4A=
|
||||
golang.org/x/exp v0.0.0-20260112195511-716be5621a96 h1:Z/6YuSHTLOHfNFdb8zVZomZr7cqNgTJvA8+Qz75D8gU=
|
||||
golang.org/x/exp v0.0.0-20260112195511-716be5621a96/go.mod h1:nzimsREAkjBCIEFtHiYkrJyT+2uy9YZJB7H1k68CXZU=
|
||||
golang.org/x/mod v0.32.0 h1:9F4d3PHLljb6x//jOyokMv3eX+YDeepZSEo3mFJy93c=
|
||||
golang.org/x/mod v0.32.0/go.mod h1:SgipZ/3h2Ci89DlEtEXWUk/HteuRin+HHhN+WbNhguU=
|
||||
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
||||
golang.org/x/net v0.49.0 h1:eeHFmOGUTtaaPSGNmjBKpbng9MulQsJURQUAfUwY++o=
|
||||
golang.org/x/net v0.49.0/go.mod h1:/ysNB2EvaqvesRkuLAyjI1ycPZlQHM3q01F02UY/MV8=
|
||||
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20200810151505-1b9f1253b3ed/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.1.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||
golang.org/x/sys v0.40.0 h1:DBZZqJ2Rkml6QMQsZywtnjnnGvHza6BTfYFWY9kjEWQ=
|
||||
golang.org/x/sys v0.40.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
|
||||
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
|
||||
golang.org/x/term v0.39.0 h1:RclSuaJf32jOqZz74CkPA9qFuVTX7vhLlpfj/IGWlqY=
|
||||
golang.org/x/term v0.39.0/go.mod h1:yxzUCTP/U+FzoxfdKmLaA0RV1WgE0VY7hXBwKtY/4ww=
|
||||
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
||||
golang.org/x/text v0.33.0 h1:B3njUFyqtHDUI5jMn1YIr5B0IE2U0qck04r6d4KPAxE=
|
||||
golang.org/x/text v0.33.0/go.mod h1:LuMebE6+rBincTi9+xWTY8TztLzKHc/9C1uBCG27+q8=
|
||||
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
|
||||
gopkg.in/check.v1 v0.0.0-20161208181325-20d25e280405/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/check.v1 v1.0.0-20190902080502-41f04d3bba15/go.mod h1:Co6ibVJAznAaIkqp8huTwlJQCZ016jof/cbN4VW5Yz0=
|
||||
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c h1:Hei/4ADfdWqJk1ZMxUNpqntNwaWcugrBjAiHlqqRiVk=
|
||||
gopkg.in/check.v1 v1.0.0-20201130134442-10cb98267c6c/go.mod h1:JHkPIbrfpd72SG/EVd6muEfDQjcINNoR0C8j2r3qZ4Q=
|
||||
gopkg.in/warnings.v0 v0.1.2 h1:wFXVbFY8DY5/xOe1ECiWdKCzZlxgshcYVNkBHstARME=
|
||||
gopkg.in/warnings.v0 v0.1.2/go.mod h1:jksf8JmL6Qr/oQM2OXTHunEvvTAsrWBLb6OOjuVWRNI=
|
||||
gopkg.in/yaml.v2 v2.2.2/go.mod h1:hI93XBmqTisBFMUTm0b8Fm+jr3Dg1NNxqwp+5A1VGuI=
|
||||
gopkg.in/yaml.v2 v2.4.0/go.mod h1:RDklbk79AGWmwhnvt/jBztapEOGDOx6ZbXqjP6csGnQ=
|
||||
gopkg.in/yaml.v3 v3.0.1 h1:fxVm/GzAzEWqLHuvctI91KS9hhNmmWOoWu0XTYJS7CA=
|
||||
gopkg.in/yaml.v3 v3.0.1/go.mod h1:K4uyk7z7BCEPqu6E+C64Yfv1cQ7kz7rIZviUmN+EgEM=
|
||||
cmd/bugseti/icons/appicon.png (new binary file, 76 B; not shown)

cmd/bugseti/icons/icons.go (new file, 25 lines)
@@ -0,0 +1,25 @@
// Package icons provides embedded icon assets for the BugSETI application.
package icons

import _ "embed"

// TrayTemplate is the template icon for macOS systray (22x22 PNG, black on transparent).
// Template icons automatically adapt to light/dark mode on macOS.
//
//go:embed tray-template.png
var TrayTemplate []byte

// TrayLight is the light mode icon for Windows/Linux systray.
//
//go:embed tray-light.png
var TrayLight []byte

// TrayDark is the dark mode icon for Windows/Linux systray.
//
//go:embed tray-dark.png
var TrayDark []byte

// AppIcon is the main application icon.
//
//go:embed appicon.png
var AppIcon []byte
cmd/bugseti/icons/tray-dark.png (new binary file, 76 B; not shown)
cmd/bugseti/icons/tray-light.png (new binary file, 76 B; not shown)
cmd/bugseti/icons/tray-template.png (new binary file, 76 B; not shown)

cmd/bugseti/main.go (new file, 242 lines)
@@ -0,0 +1,242 @@
// Package main provides the BugSETI system tray application.
// BugSETI - "Distributed Bug Fixing like SETI@home but for code"
//
// The application runs as a system tray app that:
//   - Pulls OSS issues from GitHub
//   - Uses AI to prepare context for each issue
//   - Presents issues to users for fixing
//   - Automates PR submission
package main

import (
    "embed"
    "io/fs"
    "log"
    "runtime"

    "github.com/host-uk/core/cmd/bugseti/icons"
    "github.com/host-uk/core/internal/bugseti"
    "github.com/host-uk/core/internal/bugseti/updater"
    "github.com/wailsapp/wails/v3/pkg/application"
)

//go:embed all:frontend/dist/bugseti/browser
var assets embed.FS

func main() {
    // Strip the embed path prefix so files are served from root
    staticAssets, err := fs.Sub(assets, "frontend/dist/bugseti/browser")
    if err != nil {
        log.Fatal(err)
    }

    // Initialize the config service
    configService := bugseti.NewConfigService()
    if err := configService.Load(); err != nil {
        log.Printf("Warning: Could not load config: %v", err)
    }

    // Initialize core services
    notifyService := bugseti.NewNotifyService()
    statsService := bugseti.NewStatsService(configService)
    fetcherService := bugseti.NewFetcherService(configService, notifyService)
    queueService := bugseti.NewQueueService(configService)
    seederService := bugseti.NewSeederService(configService)
    submitService := bugseti.NewSubmitService(configService, notifyService, statsService)
    versionService := bugseti.NewVersionService()

    // Initialize update service
    updateService, err := updater.NewService(configService)
    if err != nil {
        log.Printf("Warning: Could not initialize update service: %v", err)
    }

    // Create the tray service (we'll set the app reference later)
    trayService := NewTrayService(nil)

    // Build services list
    services := []application.Service{
        application.NewService(configService),
        application.NewService(notifyService),
        application.NewService(statsService),
        application.NewService(fetcherService),
        application.NewService(queueService),
        application.NewService(seederService),
        application.NewService(submitService),
        application.NewService(versionService),
        application.NewService(trayService),
    }

    // Add update service if available
    if updateService != nil {
        services = append(services, application.NewService(updateService))
    }

    // Create the application
    app := application.New(application.Options{
        Name:        "BugSETI",
        Description: "Distributed Bug Fixing - like SETI@home but for code",
        Services:    services,
        Assets: application.AssetOptions{
            Handler: application.AssetFileServerFS(staticAssets),
        },
        Mac: application.MacOptions{
            ActivationPolicy: application.ActivationPolicyAccessory,
        },
    })

    // Set the app reference and services in tray service
    trayService.app = app
    trayService.SetServices(fetcherService, queueService, configService, statsService)

    // Set up system tray
    setupSystemTray(app, fetcherService, queueService, configService)

    // Start update service background checker
    if updateService != nil {
        updateService.Start()
    }

    log.Println("Starting BugSETI...")
    log.Println(" - System tray active")
    log.Println(" - Waiting for issues...")
    log.Printf(" - Version: %s (%s)", bugseti.GetVersion(), bugseti.GetChannel())

    if err := app.Run(); err != nil {
        log.Fatal(err)
    }

    // Stop update service on exit
    if updateService != nil {
        updateService.Stop()
    }
}

// setupSystemTray configures the system tray icon and menu.
func setupSystemTray(app *application.App, fetcher *bugseti.FetcherService, queue *bugseti.QueueService, config *bugseti.ConfigService) {
    systray := app.SystemTray.New()
    systray.SetTooltip("BugSETI - Distributed Bug Fixing")

    // Set tray icon based on OS
    if runtime.GOOS == "darwin" {
        systray.SetTemplateIcon(icons.TrayTemplate)
    } else {
        systray.SetDarkModeIcon(icons.TrayDark)
        systray.SetIcon(icons.TrayLight)
    }

    // Create tray panel window (workbench preview)
    trayWindow := app.Window.NewWithOptions(application.WebviewWindowOptions{
        Name:             "tray-panel",
        Title:            "BugSETI",
        Width:            420,
        Height:           520,
        URL:              "/tray",
        Hidden:           true,
        Frameless:        true,
        BackgroundColour: application.NewRGB(22, 27, 34),
    })
    systray.AttachWindow(trayWindow).WindowOffset(5)

    // Create main workbench window
    workbenchWindow := app.Window.NewWithOptions(application.WebviewWindowOptions{
        Name:             "workbench",
        Title:            "BugSETI Workbench",
        Width:            1200,
        Height:           800,
        URL:              "/workbench",
        Hidden:           true,
        BackgroundColour: application.NewRGB(22, 27, 34),
    })

    // Create settings window
    settingsWindow := app.Window.NewWithOptions(application.WebviewWindowOptions{
        Name:             "settings",
        Title:            "BugSETI Settings",
        Width:            600,
        Height:           500,
        URL:              "/settings",
        Hidden:           true,
        BackgroundColour: application.NewRGB(22, 27, 34),
    })

    // Create onboarding window
    onboardingWindow := app.Window.NewWithOptions(application.WebviewWindowOptions{
        Name:             "onboarding",
        Title:            "Welcome to BugSETI",
        Width:            700,
        Height:           600,
        URL:              "/onboarding",
        Hidden:           true,
        BackgroundColour: application.NewRGB(22, 27, 34),
    })

    // Build tray menu
    trayMenu := app.Menu.New()

    // Status item (dynamic)
    statusItem := trayMenu.Add("Status: Idle")
    statusItem.SetEnabled(false)

    trayMenu.AddSeparator()

    // Start/Pause toggle
    startPauseItem := trayMenu.Add("Start Fetching")
    startPauseItem.OnClick(func(ctx *application.Context) {
        if fetcher.IsRunning() {
            fetcher.Pause()
            startPauseItem.SetLabel("Start Fetching")
            statusItem.SetLabel("Status: Paused")
        } else {
            fetcher.Start()
            startPauseItem.SetLabel("Pause")
            statusItem.SetLabel("Status: Running")
        }
    })

    trayMenu.AddSeparator()

    // Current Issue
    currentIssueItem := trayMenu.Add("Current Issue: None")
    currentIssueItem.OnClick(func(ctx *application.Context) {
        if issue := queue.CurrentIssue(); issue != nil {
            workbenchWindow.Show()
            workbenchWindow.Focus()
        }
    })

    // Open Workbench
    trayMenu.Add("Open Workbench").OnClick(func(ctx *application.Context) {
        workbenchWindow.Show()
        workbenchWindow.Focus()
    })

    trayMenu.AddSeparator()

    // Settings
    trayMenu.Add("Settings...").OnClick(func(ctx *application.Context) {
        settingsWindow.Show()
        settingsWindow.Focus()
    })

    // Stats submenu
    statsMenu := trayMenu.AddSubmenu("Stats")
    statsMenu.Add("Issues Fixed: 0").SetEnabled(false)
    statsMenu.Add("PRs Merged: 0").SetEnabled(false)
    statsMenu.Add("Repos Contributed: 0").SetEnabled(false)

    trayMenu.AddSeparator()

    // Quit
    trayMenu.Add("Quit BugSETI").OnClick(func(ctx *application.Context) {
        app.Quit()
    })

    systray.SetMenu(trayMenu)

    // Check if onboarding needed
    if !config.IsOnboarded() {
        onboardingWindow.Show()
        onboardingWindow.Focus()
    }
}
cmd/bugseti/tray.go (new file, 158 lines)
@@ -0,0 +1,158 @@
// Package main provides the BugSETI system tray application.
package main

import (
    "context"
    "log"

    "github.com/host-uk/core/internal/bugseti"
    "github.com/wailsapp/wails/v3/pkg/application"
)

// TrayService provides system tray bindings for the frontend.
type TrayService struct {
    app     *application.App
    fetcher *bugseti.FetcherService
    queue   *bugseti.QueueService
    config  *bugseti.ConfigService
    stats   *bugseti.StatsService
}

// NewTrayService creates a new TrayService instance.
func NewTrayService(app *application.App) *TrayService {
    return &TrayService{
        app: app,
    }
}

// SetServices sets the service references after initialization.
func (t *TrayService) SetServices(fetcher *bugseti.FetcherService, queue *bugseti.QueueService, config *bugseti.ConfigService, stats *bugseti.StatsService) {
    t.fetcher = fetcher
    t.queue = queue
    t.config = config
    t.stats = stats
}

// ServiceName returns the service name for Wails.
func (t *TrayService) ServiceName() string {
    return "TrayService"
}

// ServiceStartup is called when the Wails application starts.
func (t *TrayService) ServiceStartup(ctx context.Context, options application.ServiceOptions) error {
    log.Println("TrayService started")
    return nil
}

// ServiceShutdown is called when the Wails application shuts down.
func (t *TrayService) ServiceShutdown() error {
    log.Println("TrayService shutdown")
    return nil
}

// TrayStatus represents the current status of the tray.
type TrayStatus struct {
    Running      bool   `json:"running"`
    CurrentIssue string `json:"currentIssue"`
    QueueSize    int    `json:"queueSize"`
    IssuesFixed  int    `json:"issuesFixed"`
    PRsMerged    int    `json:"prsMerged"`
}

// GetStatus returns the current tray status.
func (t *TrayService) GetStatus() TrayStatus {
    var currentIssue string
    if t.queue != nil {
        if issue := t.queue.CurrentIssue(); issue != nil {
            currentIssue = issue.Title
        }
    }

    var queueSize int
    if t.queue != nil {
        queueSize = t.queue.Size()
    }

    var running bool
    if t.fetcher != nil {
        running = t.fetcher.IsRunning()
    }

    var issuesFixed, prsMerged int
    if t.stats != nil {
        stats := t.stats.GetStats()
        issuesFixed = stats.IssuesAttempted
        prsMerged = stats.PRsMerged
    }

    return TrayStatus{
        Running:      running,
        CurrentIssue: currentIssue,
        QueueSize:    queueSize,
        IssuesFixed:  issuesFixed,
        PRsMerged:    prsMerged,
    }
}

// StartFetching starts the issue fetcher.
func (t *TrayService) StartFetching() error {
    if t.fetcher == nil {
        return nil
    }
    return t.fetcher.Start()
}

// PauseFetching pauses the issue fetcher.
func (t *TrayService) PauseFetching() {
    if t.fetcher != nil {
        t.fetcher.Pause()
    }
}

// GetCurrentIssue returns the current issue being worked on.
func (t *TrayService) GetCurrentIssue() *bugseti.Issue {
    if t.queue == nil {
        return nil
    }
    return t.queue.CurrentIssue()
}

// NextIssue moves to the next issue in the queue.
func (t *TrayService) NextIssue() *bugseti.Issue {
    if t.queue == nil {
        return nil
    }
    return t.queue.Next()
}

// SkipIssue skips the current issue.
func (t *TrayService) SkipIssue() {
    if t.queue == nil {
        return
    }
    t.queue.Skip()
}

// ShowWindow shows a specific window by name.
func (t *TrayService) ShowWindow(name string) {
    if t.app == nil {
        return
    }
    // Window will be shown by the frontend via Wails runtime
}

// IsOnboarded returns whether the user has completed onboarding.
func (t *TrayService) IsOnboarded() bool {
    if t.config == nil {
        return false
    }
    return t.config.IsOnboarded()
}

// CompleteOnboarding marks onboarding as complete.
func (t *TrayService) CompleteOnboarding() error {
    if t.config == nil {
        return nil
    }
    return t.config.CompleteOnboarding()
}
@@ -160,7 +160,10 @@ dev:
 
 test:
   parallel: true
-  coverage: false
+  coverage: true
+  thresholds:
+    statements: 40
+    branches: 35
 
 deploy:
   coolify:
docs/mcp/angular-testing.md (new file, 470 lines)
@@ -0,0 +1,470 @@
# Angular Testing with Webview MCP Tools

This guide explains how to use the webview MCP tools to automate testing of Angular applications via Chrome DevTools Protocol (CDP).

## Prerequisites

1. **Chrome/Chromium Browser**: Installed and accessible
2. **Remote Debugging Port**: Chrome must be started with remote debugging enabled

### Starting Chrome with Remote Debugging

```bash
# Linux
google-chrome --remote-debugging-port=9222

# macOS
/Applications/Google\ Chrome.app/Contents/MacOS/Google\ Chrome --remote-debugging-port=9222

# Windows
"C:\Program Files\Google\Chrome\Application\chrome.exe" --remote-debugging-port=9222

# Headless mode (no visible window)
google-chrome --headless --remote-debugging-port=9222
```

## Available MCP Tools

### Connection Management

#### webview_connect
Connect to Chrome DevTools.

```json
{
  "tool": "webview_connect",
  "arguments": {
    "debug_url": "http://localhost:9222",
    "timeout": 30
  }
}
```

#### webview_disconnect
Disconnect from Chrome DevTools.

```json
{
  "tool": "webview_disconnect",
  "arguments": {}
}
```

### Navigation

#### webview_navigate
Navigate to a URL.

```json
{
  "tool": "webview_navigate",
  "arguments": {
    "url": "http://localhost:4200"
  }
}
```

### DOM Interaction

#### webview_click
Click an element by CSS selector.

```json
{
  "tool": "webview_click",
  "arguments": {
    "selector": "#login-button"
  }
}
```

#### webview_type
Type text into an element.

```json
{
  "tool": "webview_type",
  "arguments": {
    "selector": "#email-input",
    "text": "user@example.com"
  }
}
```

#### webview_query
Query DOM elements.

```json
{
  "tool": "webview_query",
  "arguments": {
    "selector": ".error-message",
    "all": true
  }
}
```

#### webview_wait
Wait for an element to appear.

```json
{
  "tool": "webview_wait",
  "arguments": {
    "selector": ".loading-spinner",
    "timeout": 10
  }
}
```

### JavaScript Evaluation

#### webview_eval
Execute JavaScript in the browser context.

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "document.title"
  }
}
```

### Console & Debugging

#### webview_console
Get browser console output.

```json
{
  "tool": "webview_console",
  "arguments": {
    "clear": false
  }
}
```

#### webview_screenshot
Capture a screenshot.

```json
{
  "tool": "webview_screenshot",
  "arguments": {
    "format": "png"
  }
}
```

## Angular-Specific Testing Patterns

### 1. Waiting for Angular Zone Stability

Before interacting with Angular components, wait for Zone.js to become stable:

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "(function() { const roots = window.getAllAngularRootElements(); if (!roots.length) return true; const injector = window.ng.probe(roots[0]).injector; const zone = injector.get('NgZone'); return zone.isStable; })()"
  }
}
```

### 2. Navigating with Angular Router

Use the Angular Router for client-side navigation:

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "(function() { const roots = window.getAllAngularRootElements(); const injector = window.ng.probe(roots[0]).injector; const router = injector.get('Router'); router.navigateByUrl('/dashboard'); return true; })()"
  }
}
```

### 3. Accessing Component Properties

Read or modify component state:

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "(function() { const el = document.querySelector('app-user-profile'); const component = window.ng.probe(el).componentInstance; return component.user; })()"
  }
}
```

### 4. Triggering Change Detection

Force Angular to update the view:

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "(function() { const roots = window.getAllAngularRootElements(); const injector = window.ng.probe(roots[0]).injector; const appRef = injector.get('ApplicationRef'); appRef.tick(); return true; })()"
  }
}
```

### 5. Testing Form Validation

Check Angular form state:

```json
{
  "tool": "webview_eval",
  "arguments": {
    "script": "(function() { const form = document.querySelector('form'); const component = window.ng.probe(form).componentInstance; return { valid: component.form.valid, errors: component.form.errors }; })()"
  }
}
```

## Complete Test Flow Example

Here's a complete example testing an Angular login flow:

### Step 1: Connect to Chrome

```json
{"tool": "webview_connect", "arguments": {"debug_url": "http://localhost:9222"}}
```

### Step 2: Navigate to the Application

```json
{"tool": "webview_navigate", "arguments": {"url": "http://localhost:4200/login"}}
```

### Step 3: Wait for Angular to Load

```json
{"tool": "webview_wait", "arguments": {"selector": "app-login"}}
```

### Step 4: Fill in Login Form

```json
{"tool": "webview_type", "arguments": {"selector": "#email", "text": "test@example.com"}}
{"tool": "webview_type", "arguments": {"selector": "#password", "text": "password123"}}
```

### Step 5: Submit the Form

```json
{"tool": "webview_click", "arguments": {"selector": "button[type='submit']"}}
```

### Step 6: Wait for Navigation

```json
{"tool": "webview_wait", "arguments": {"selector": "app-dashboard", "timeout": 10}}
```

### Step 7: Verify Success

```json
{"tool": "webview_eval", "arguments": {"script": "window.location.pathname === '/dashboard'"}}
```

### Step 8: Check Console for Errors

```json
{"tool": "webview_console", "arguments": {"clear": true}}
```

### Step 9: Disconnect

```json
{"tool": "webview_disconnect", "arguments": {}}
```

## Debugging Tips

### 1. Check for JavaScript Errors

Always check the console output after operations:

```json
{"tool": "webview_console", "arguments": {}}
```

### 2. Take Screenshots on Failure

Capture the current state when something unexpected happens:

```json
{"tool": "webview_screenshot", "arguments": {"format": "png"}}
```

### 3. Inspect Element State

Query elements to understand their current state:

```json
{"tool": "webview_query", "arguments": {"selector": ".my-component", "all": false}}
```

### 4. Get Page Source

Retrieve the current HTML for debugging:

```json
{"tool": "webview_eval", "arguments": {"script": "document.documentElement.outerHTML"}}
```

## Common Issues

### Element Not Found

If `webview_click` or `webview_type` fails with "element not found":

1. Check that the selector is correct
2. Wait for the element to appear first (see the example below)
3. Verify the element is visible (not hidden)
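
For example, a wait-then-click sequence using the tools above (the `#save-button` selector is illustrative):

```json
{"tool": "webview_wait", "arguments": {"selector": "#save-button", "timeout": 10}}
{"tool": "webview_click", "arguments": {"selector": "#save-button"}}
```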

### Angular Not Detected

If Angular-specific scripts fail:

1. Ensure the Angular app has loaded completely (a quick check is sketched below)
2. Check that you're using Angular 2+ (not AngularJS)
3. Verify the element has an Angular component attached
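
A minimal check for the Angular debug globals used in the patterns above (this assumes the app is running in development mode, where `window.ng` is exposed):

```json
{"tool": "webview_eval", "arguments": {"script": "typeof window.getAllAngularRootElements === 'function' && typeof window.ng !== 'undefined'"}}
```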

### Timeout Errors

If operations time out:

1. Increase the timeout value (see the example below)
2. Check for loading spinners or blocking operations
3. Verify the network is working correctly
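
For example, giving a slow route more time to render (the selector and value are illustrative):

```json
{"tool": "webview_wait", "arguments": {"selector": "app-dashboard", "timeout": 30}}
```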

## Best Practices

1. **Always wait for elements** before interacting with them
2. **Check the console for errors** after each major step
3. **Use explicit selectors** such as IDs or data attributes (see the example below)
4. **Clear the console** at the start of each test
5. **Disconnect** when done to free resources
6. **Take screenshots** at key checkpoints
7. **Handle async operations** by waiting for Angular stability
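
A data-attribute selector tends to be more stable than class names; the `data-testid` attribute here is illustrative and would need to exist in your templates:

```json
{"tool": "webview_click", "arguments": {"selector": "[data-testid='submit-button']"}}
```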

## Go API Usage

For direct Go integration, use the `pkg/webview` package:

```go
package main

import (
    "log"
    "time"

    "github.com/host-uk/core/pkg/webview"
)

func main() {
    // Connect to Chrome
    wv, err := webview.New(
        webview.WithDebugURL("http://localhost:9222"),
        webview.WithTimeout(30*time.Second),
    )
    if err != nil {
        log.Fatal(err)
    }
    defer wv.Close()

    // Navigate
    if err := wv.Navigate("http://localhost:4200"); err != nil {
        log.Fatal(err)
    }

    // Wait for element
    if err := wv.WaitForSelector("app-root"); err != nil {
        log.Fatal(err)
    }

    // Click button
    if err := wv.Click("#login-button"); err != nil {
        log.Fatal(err)
    }

    // Type text
    if err := wv.Type("#email", "test@example.com"); err != nil {
        log.Fatal(err)
    }

    // Get console output
    messages := wv.GetConsole()
    for _, msg := range messages {
        log.Printf("[%s] %s", msg.Type, msg.Text)
    }

    // Take screenshot
    data, err := wv.Screenshot()
    if err != nil {
        log.Fatal(err)
    }
    _ = data // Save the PNG bytes to a file here...
}
```

### Using Angular Helper

For Angular-specific operations:

```go
package main

import (
    "log"

    "github.com/host-uk/core/pkg/webview"
)

func main() {
    wv, err := webview.New(webview.WithDebugURL("http://localhost:9222"))
    if err != nil {
        log.Fatal(err)
    }
    defer wv.Close()

    // Create Angular helper
    angular := webview.NewAngularHelper(wv)

    // Navigate using Angular Router
    if err := angular.NavigateByRouter("/dashboard"); err != nil {
        log.Fatal(err)
    }

    // Wait for Angular to stabilize
    if err := angular.WaitForAngular(); err != nil {
        log.Fatal(err)
    }

    // Get component property
    value, err := angular.GetComponentProperty("app-user-profile", "user")
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("User: %v", value)

    // Call component method
    result, err := angular.CallComponentMethod("app-counter", "increment", 5)
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("Result: %v", result)
}
```

## See Also

- [Chrome DevTools Protocol Documentation](https://chromedevtools.github.io/devtools-protocol/)
- [pkg/webview package documentation](../../pkg/webview/)
- [MCP Tools Reference](../mcp/)
docs/plans/2026-02-05-mcp-integration.md (new file, 849 lines)
@@ -0,0 +1,849 @@
# MCP Integration Implementation Plan

> **For Claude:** REQUIRED SUB-SKILL: Use superpowers:executing-plans to implement this plan task-by-task.

**Goal:** Add a `core mcp serve` command with RAG and metrics tools, then configure the agentic-flows plugin to use it.

**Architecture:** Create a new `mcp` command package that starts the pkg/mcp server with extended tools. RAG tools call the existing exported functions in internal/cmd/rag. Metrics tools call pkg/ai directly. The agentic-flows plugin gets a `.mcp.json` that spawns `core mcp serve`.

**Tech Stack:** Go 1.25, github.com/modelcontextprotocol/go-sdk/mcp, pkg/rag, pkg/ai

---
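
As a rough sketch, the plugin-side `.mcp.json` might look like the following (the server name and the `--workspace` value are illustrative; the exact shape depends on the MCP client loading it):

```json
{
  "mcpServers": {
    "core": {
      "command": "core",
      "args": ["mcp", "serve", "--workspace", "."]
    }
  }
}
```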
|
||||
|
||||
## Task 1: Add RAG tools to pkg/mcp
|
||||
|
||||
**Files:**
|
||||
- Create: `pkg/mcp/tools_rag.go`
|
||||
- Modify: `pkg/mcp/mcp.go:99-101` (registerTools)
|
||||
- Test: `pkg/mcp/tools_rag_test.go`
|
||||
|
||||
**Step 1: Write the failing test**
|
||||
|
||||
Create `pkg/mcp/tools_rag_test.go`:
|
||||
|
||||
```go
|
||||
package mcp
|
||||
|
||||
import (
|
||||
"context"
|
||||
"testing"
|
||||
|
||||
"github.com/modelcontextprotocol/go-sdk/mcp"
|
||||
)
|
||||
|
||||
func TestRAGQueryTool_Good(t *testing.T) {
|
||||
// This test verifies the tool is registered and callable.
|
||||
// It doesn't require Qdrant/Ollama running - just checks structure.
|
||||
s, err := New(WithWorkspaceRoot(""))
|
||||
if err != nil {
|
||||
t.Fatalf("New() error: %v", err)
|
||||
}
|
||||
|
||||
// Check that rag_query tool is registered
|
||||
tools := s.Server().ListTools()
|
||||
found := false
|
||||
for _, tool := range tools {
|
||||
if tool.Name == "rag_query" {
|
||||
found = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !found {
|
||||
t.Error("rag_query tool not registered")
|
||||
}
|
||||
}
|
||||
|
||||
func TestRAGQueryInput_Good(t *testing.T) {
|
||||
input := RAGQueryInput{
|
||||
Question: "how do I deploy?",
|
||||
Collection: "hostuk-docs",
|
||||
TopK: 5,
|
||||
}
|
||||
if input.Question == "" {
|
||||
t.Error("Question should not be empty")
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Step 2: Run test to verify it fails**
|
||||
|
||||
Run: `go test -run TestRAGQueryTool ./pkg/mcp/... -v`
|
||||
Expected: FAIL with "rag_query tool not registered"
|
||||
|
||||
**Step 3: Create tools_rag.go with types and tool registration**
|
||||
|
||||
Create `pkg/mcp/tools_rag.go`:
|
||||
|
||||
```go
|
||||
package mcp
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
|
||||
ragcmd "github.com/host-uk/core/internal/cmd/rag"
|
||||
"github.com/host-uk/core/pkg/rag"
|
||||
"github.com/modelcontextprotocol/go-sdk/mcp"
|
||||
)
|
||||
|
||||
// RAG tool input/output types
|
||||
|
||||
// RAGQueryInput contains parameters for querying the vector database.
|
||||
type RAGQueryInput struct {
|
||||
Question string `json:"question"`
|
||||
Collection string `json:"collection,omitempty"`
|
||||
TopK int `json:"top_k,omitempty"`
|
||||
}
|
||||
|
||||
// RAGQueryOutput contains the query results.
|
||||
type RAGQueryOutput struct {
|
||||
Results []RAGResult `json:"results"`
|
||||
Context string `json:"context"`
|
||||
}
|
||||
|
||||
// RAGResult represents a single search result.
|
||||
type RAGResult struct {
|
||||
Content string `json:"content"`
|
||||
Score float32 `json:"score"`
|
||||
Source string `json:"source"`
|
||||
Metadata map[string]string `json:"metadata,omitempty"`
|
||||
}
|
||||
|
||||
// RAGIngestInput contains parameters for ingesting documents.
|
||||
type RAGIngestInput struct {
|
||||
Path string `json:"path"`
|
||||
Collection string `json:"collection,omitempty"`
|
||||
Recreate bool `json:"recreate,omitempty"`
|
||||
}
|
||||
|
||||
// RAGIngestOutput contains the ingestion results.
|
||||
type RAGIngestOutput struct {
|
||||
Success bool `json:"success"`
|
||||
Path string `json:"path"`
|
||||
Chunks int `json:"chunks"`
|
||||
Message string `json:"message,omitempty"`
|
||||
}
|
||||
|
||||
// RAGCollectionsInput contains parameters for listing collections.
|
||||
type RAGCollectionsInput struct {
|
||||
ShowStats bool `json:"show_stats,omitempty"`
|
||||
}
|
||||
|
||||
// RAGCollectionsOutput contains the list of collections.
|
||||
type RAGCollectionsOutput struct {
|
||||
Collections []CollectionInfo `json:"collections"`
|
||||
}
|
||||
|
||||
// CollectionInfo describes a Qdrant collection.
|
||||
type CollectionInfo struct {
|
||||
Name string `json:"name"`
|
||||
PointsCount uint64 `json:"points_count,omitempty"`
|
||||
Status string `json:"status,omitempty"`
|
||||
}
|
||||
|
||||
// registerRAGTools adds RAG tools to the MCP server.
|
||||
func (s *Service) registerRAGTools(server *mcp.Server) {
|
||||
mcp.AddTool(server, &mcp.Tool{
|
||||
Name: "rag_query",
|
||||
Description: "Query the vector database for relevant documents using semantic search",
|
||||
}, s.ragQuery)
|
||||
|
||||
mcp.AddTool(server, &mcp.Tool{
|
||||
Name: "rag_ingest",
|
||||
Description: "Ingest a file or directory into the vector database",
|
||||
}, s.ragIngest)
|
||||
|
||||
mcp.AddTool(server, &mcp.Tool{
|
||||
Name: "rag_collections",
|
||||
Description: "List available vector database collections",
|
||||
}, s.ragCollections)
|
||||
}
|
||||
|
||||
func (s *Service) ragQuery(ctx context.Context, req *mcp.CallToolRequest, input RAGQueryInput) (*mcp.CallToolResult, RAGQueryOutput, error) {
|
||||
s.logger.Info("MCP tool execution", "tool", "rag_query", "question", input.Question)
|
||||
|
||||
collection := input.Collection
|
||||
if collection == "" {
|
||||
collection = "hostuk-docs"
|
||||
}
|
||||
topK := input.TopK
|
||||
if topK <= 0 {
|
||||
topK = 5
|
||||
}
|
||||
|
||||
results, err := ragcmd.QueryDocs(ctx, input.Question, collection, topK)
|
||||
if err != nil {
|
||||
return nil, RAGQueryOutput{}, fmt.Errorf("query failed: %w", err)
|
||||
}
|
||||
|
||||
// Convert to output format
|
||||
out := RAGQueryOutput{
|
||||
Results: make([]RAGResult, 0, len(results)),
|
||||
Context: rag.FormatResultsContext(results),
|
||||
}
|
||||
for _, r := range results {
|
||||
out.Results = append(out.Results, RAGResult{
|
||||
Content: r.Content,
|
||||
Score: r.Score,
|
||||
Source: r.Source,
|
||||
Metadata: r.Metadata,
|
||||
})
|
||||
}
|
||||
|
||||
return nil, out, nil
|
||||
}
|
||||
|
||||
func (s *Service) ragIngest(ctx context.Context, req *mcp.CallToolRequest, input RAGIngestInput) (*mcp.CallToolResult, RAGIngestOutput, error) {
|
||||
s.logger.Security("MCP tool execution", "tool", "rag_ingest", "path", input.Path)
|
||||
|
||||
collection := input.Collection
|
||||
if collection == "" {
|
||||
collection = "hostuk-docs"
|
||||
}
|
||||
|
||||
// Check if path is a file or directory
|
||||
info, err := s.medium.Stat(input.Path)
|
||||
if err != nil {
|
||||
return nil, RAGIngestOutput{}, fmt.Errorf("path not found: %w", err)
|
||||
}
|
||||
|
||||
if info.IsDir() {
|
||||
err = ragcmd.IngestDirectory(ctx, input.Path, collection, input.Recreate)
|
||||
if err != nil {
|
||||
return nil, RAGIngestOutput{}, fmt.Errorf("ingest directory failed: %w", err)
|
||||
}
|
||||
return nil, RAGIngestOutput{
|
||||
Success: true,
|
||||
Path: input.Path,
|
||||
Message: fmt.Sprintf("Ingested directory into collection %s", collection),
|
||||
}, nil
|
||||
}
|
||||
|
||||
chunks, err := ragcmd.IngestFile(ctx, input.Path, collection)
|
||||
if err != nil {
|
||||
return nil, RAGIngestOutput{}, fmt.Errorf("ingest file failed: %w", err)
|
||||
}
|
||||
|
||||
return nil, RAGIngestOutput{
|
||||
Success: true,
|
||||
Path: input.Path,
|
||||
Chunks: chunks,
|
||||
Message: fmt.Sprintf("Ingested %d chunks into collection %s", chunks, collection),
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (s *Service) ragCollections(ctx context.Context, req *mcp.CallToolRequest, input RAGCollectionsInput) (*mcp.CallToolResult, RAGCollectionsOutput, error) {
|
||||
s.logger.Info("MCP tool execution", "tool", "rag_collections")
|
||||
|
||||
client, err := rag.NewQdrantClient(rag.DefaultQdrantConfig())
|
||||
if err != nil {
|
||||
return nil, RAGCollectionsOutput{}, fmt.Errorf("connect to Qdrant: %w", err)
|
||||
}
|
||||
defer func() { _ = client.Close() }()
|
||||
|
||||
names, err := client.ListCollections(ctx)
|
||||
if err != nil {
|
||||
return nil, RAGCollectionsOutput{}, fmt.Errorf("list collections: %w", err)
|
||||
}
|
||||
|
||||
out := RAGCollectionsOutput{
|
||||
Collections: make([]CollectionInfo, 0, len(names)),
|
||||
}
|
||||
|
||||
for _, name := range names {
|
||||
info := CollectionInfo{Name: name}
|
||||
if input.ShowStats {
|
||||
cinfo, err := client.CollectionInfo(ctx, name)
|
||||
if err == nil {
|
||||
info.PointsCount = cinfo.PointsCount
|
||||
info.Status = cinfo.Status.String()
|
||||
}
|
||||
}
|
||||
out.Collections = append(out.Collections, info)
|
||||
}
|
||||
|
||||
return nil, out, nil
|
||||
}
|
||||
```
|
||||
|
||||
**Step 4: Update mcp.go to call registerRAGTools**
|
||||
|
||||
In `pkg/mcp/mcp.go`, modify the `registerTools` function (around line 104) to add:
|
||||
|
||||
```go
|
||||
func (s *Service) registerTools(server *mcp.Server) {
|
||||
// File operations (existing)
|
||||
// ... existing code ...
|
||||
|
||||
// RAG operations
|
||||
s.registerRAGTools(server)
|
||||
}
|
||||
```
|
||||
|
||||
**Step 5: Run test to verify it passes**
|
||||
|
||||
Run: `go test -run TestRAGQuery ./pkg/mcp/... -v`
|
||||
Expected: PASS
|
||||
|
||||
**Step 6: Commit**
|
||||
|
||||
```bash
|
||||
git add pkg/mcp/tools_rag.go pkg/mcp/tools_rag_test.go pkg/mcp/mcp.go
|
||||
git commit -m "feat(mcp): add RAG tools (query, ingest, collections)"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Task 2: Add metrics tools to pkg/mcp
|
||||
|
||||
**Files:**
|
||||
- Create: `pkg/mcp/tools_metrics.go`
|
||||
- Modify: `pkg/mcp/mcp.go` (registerTools)
|
||||
- Test: `pkg/mcp/tools_metrics_test.go`
|
||||
|
||||
**Step 1: Write the failing test**
|
||||
|
||||
Create `pkg/mcp/tools_metrics_test.go`:
|
||||
|
||||
```go
|
||||
package mcp
|
||||
|
||||
import (
|
||||
"testing"
|
||||
)
|
||||
|
||||
func TestMetricsRecordTool_Good(t *testing.T) {
|
||||
s, err := New(WithWorkspaceRoot(""))
|
||||
if err != nil {
|
||||
t.Fatalf("New() error: %v", err)
|
||||
}
|
||||
|
||||
tools := s.Server().ListTools()
|
||||
found := false
|
||||
for _, tool := range tools {
|
||||
if tool.Name == "metrics_record" {
|
||||
found = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !found {
|
||||
t.Error("metrics_record tool not registered")
|
||||
}
|
||||
}
|
||||
|
||||
func TestMetricsQueryTool_Good(t *testing.T) {
|
||||
s, err := New(WithWorkspaceRoot(""))
|
||||
if err != nil {
|
||||
t.Fatalf("New() error: %v", err)
|
||||
}
|
||||
|
||||
tools := s.Server().ListTools()
|
||||
found := false
|
||||
for _, tool := range tools {
|
||||
if tool.Name == "metrics_query" {
|
||||
found = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !found {
|
||||
t.Error("metrics_query tool not registered")
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Step 2: Run test to verify it fails**
|
||||
|
||||
Run: `go test -run TestMetrics ./pkg/mcp/... -v`
|
||||
Expected: FAIL
|
||||
|
||||
**Step 3: Create tools_metrics.go**
|
||||
|
||||
Create `pkg/mcp/tools_metrics.go`:
|
||||
|
||||
```go
|
||||
package mcp
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
"time"
|
||||
|
||||
"github.com/host-uk/core/pkg/ai"
|
||||
"github.com/modelcontextprotocol/go-sdk/mcp"
|
||||
)
|
||||
|
||||
// Metrics tool input/output types
|
||||
|
||||
// MetricsRecordInput contains parameters for recording a metric event.
|
||||
type MetricsRecordInput struct {
|
||||
Type string `json:"type"`
|
||||
AgentID string `json:"agent_id,omitempty"`
|
||||
Repo string `json:"repo,omitempty"`
|
||||
Data map[string]any `json:"data,omitempty"`
|
||||
}
|
||||
|
||||
// MetricsRecordOutput contains the result of recording.
|
||||
type MetricsRecordOutput struct {
|
||||
Success bool `json:"success"`
|
||||
Timestamp time.Time `json:"timestamp"`
|
||||
}
|
||||
|
||||
// MetricsQueryInput contains parameters for querying metrics.
|
||||
type MetricsQueryInput struct {
|
||||
Since string `json:"since,omitempty"` // e.g., "7d", "24h"
|
||||
}
|
||||
|
||||
// MetricsQueryOutput contains the query results.
|
||||
type MetricsQueryOutput struct {
|
||||
Total int `json:"total"`
|
||||
ByType []MetricCount `json:"by_type"`
|
||||
ByRepo []MetricCount `json:"by_repo"`
|
||||
ByAgent []MetricCount `json:"by_agent"`
|
||||
Events []MetricEventBrief `json:"events,omitempty"`
|
||||
}
|
||||
|
||||
// MetricCount represents a count by key.
|
||||
type MetricCount struct {
|
||||
Key string `json:"key"`
|
||||
Count int `json:"count"`
|
||||
}
|
||||
|
||||
// MetricEventBrief is a simplified event for output.
|
||||
type MetricEventBrief struct {
|
||||
Type string `json:"type"`
|
||||
Timestamp time.Time `json:"timestamp"`
|
||||
AgentID string `json:"agent_id,omitempty"`
|
||||
Repo string `json:"repo,omitempty"`
|
||||
}
|
||||
|
||||
// registerMetricsTools adds metrics tools to the MCP server.
|
||||
func (s *Service) registerMetricsTools(server *mcp.Server) {
|
||||
mcp.AddTool(server, &mcp.Tool{
|
||||
Name: "metrics_record",
|
||||
Description: "Record a metric event (AI task, security scan, job creation, etc.)",
|
||||
}, s.metricsRecord)
|
||||
|
||||
mcp.AddTool(server, &mcp.Tool{
|
||||
Name: "metrics_query",
|
||||
Description: "Query recorded metrics with aggregation by type, repo, and agent",
|
||||
}, s.metricsQuery)
|
||||
}
|
||||
|
||||
func (s *Service) metricsRecord(ctx context.Context, req *mcp.CallToolRequest, input MetricsRecordInput) (*mcp.CallToolResult, MetricsRecordOutput, error) {
|
||||
s.logger.Info("MCP tool execution", "tool", "metrics_record", "type", input.Type)
|
||||
|
||||
if input.Type == "" {
|
||||
return nil, MetricsRecordOutput{}, fmt.Errorf("type is required")
|
||||
}
|
||||
|
||||
event := ai.Event{
|
||||
Type: input.Type,
|
||||
Timestamp: time.Now(),
|
||||
AgentID: input.AgentID,
|
||||
Repo: input.Repo,
|
||||
Data: input.Data,
|
||||
}
|
||||
|
||||
if err := ai.Record(event); err != nil {
|
||||
return nil, MetricsRecordOutput{}, fmt.Errorf("record event: %w", err)
|
||||
}
|
||||
|
||||
return nil, MetricsRecordOutput{
|
||||
Success: true,
|
||||
Timestamp: event.Timestamp,
|
||||
}, nil
|
||||
}
|
||||
|
||||
func (s *Service) metricsQuery(ctx context.Context, req *mcp.CallToolRequest, input MetricsQueryInput) (*mcp.CallToolResult, MetricsQueryOutput, error) {
|
||||
s.logger.Info("MCP tool execution", "tool", "metrics_query", "since", input.Since)
|
||||
|
||||
since := input.Since
|
||||
if since == "" {
|
||||
since = "7d"
|
||||
}
|
||||
|
||||
duration, err := parseDuration(since)
|
||||
if err != nil {
|
||||
return nil, MetricsQueryOutput{}, fmt.Errorf("invalid since value: %w", err)
|
||||
}
|
||||
|
||||
sinceTime := time.Now().Add(-duration)
|
||||
events, err := ai.ReadEvents(sinceTime)
|
||||
if err != nil {
|
||||
return nil, MetricsQueryOutput{}, fmt.Errorf("read events: %w", err)
|
||||
}
|
||||
|
||||
summary := ai.Summary(events)
|
||||
|
||||
out := MetricsQueryOutput{
|
||||
Total: summary["total"].(int),
|
||||
}
|
||||
|
||||
// Convert by_type
|
||||
if byType, ok := summary["by_type"].([]map[string]any); ok {
|
||||
for _, entry := range byType {
|
||||
out.ByType = append(out.ByType, MetricCount{
|
||||
Key: entry["key"].(string),
|
||||
Count: entry["count"].(int),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
// Convert by_repo
|
||||
if byRepo, ok := summary["by_repo"].([]map[string]any); ok {
|
||||
for _, entry := range byRepo {
|
||||
out.ByRepo = append(out.ByRepo, MetricCount{
|
||||
Key: entry["key"].(string),
|
||||
Count: entry["count"].(int),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
// Convert by_agent
|
||||
if byAgent, ok := summary["by_agent"].([]map[string]any); ok {
|
||||
for _, entry := range byAgent {
|
||||
out.ByAgent = append(out.ByAgent, MetricCount{
|
||||
Key: entry["key"].(string),
|
||||
Count: entry["count"].(int),
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
// Include last 10 events for context
|
||||
limit := 10
|
||||
if len(events) < limit {
|
||||
limit = len(events)
|
||||
}
|
||||
for i := len(events) - limit; i < len(events); i++ {
|
||||
ev := events[i]
|
||||
out.Events = append(out.Events, MetricEventBrief{
|
||||
Type: ev.Type,
|
||||
Timestamp: ev.Timestamp,
|
||||
AgentID: ev.AgentID,
|
||||
Repo: ev.Repo,
|
||||
})
|
||||
}
|
||||
|
||||
return nil, out, nil
|
||||
}
|
||||
|
||||
// parseDuration parses a human-friendly duration like "7d", "24h", "30d".
|
||||
func parseDuration(s string) (time.Duration, error) {
|
||||
if len(s) < 2 {
|
||||
return 0, fmt.Errorf("invalid duration: %s", s)
|
||||
}
|
||||
|
||||
unit := s[len(s)-1]
|
||||
value := s[:len(s)-1]
|
||||
|
||||
var n int
|
||||
if _, err := fmt.Sscanf(value, "%d", &n); err != nil {
|
||||
return 0, fmt.Errorf("invalid duration: %s", s)
|
||||
}
|
||||
|
||||
if n <= 0 {
|
||||
return 0, fmt.Errorf("duration must be positive: %s", s)
|
||||
}
|
||||
|
||||
switch unit {
|
||||
case 'd':
|
||||
return time.Duration(n) * 24 * time.Hour, nil
|
||||
case 'h':
|
||||
return time.Duration(n) * time.Hour, nil
|
||||
case 'm':
|
||||
return time.Duration(n) * time.Minute, nil
|
||||
default:
|
||||
return 0, fmt.Errorf("unknown unit %c in duration: %s", unit, s)
|
||||
}
|
||||
}
|
||||
```
|
||||
|
||||
**Step 4: Update mcp.go to call registerMetricsTools**
|
||||
|
||||
In `pkg/mcp/mcp.go`, add to `registerTools`:
|
||||
|
||||
```go
|
||||
func (s *Service) registerTools(server *mcp.Server) {
|
||||
// ... existing file operations ...
|
||||
|
||||
// RAG operations
|
||||
s.registerRAGTools(server)
|
||||
|
||||
// Metrics operations
|
||||
s.registerMetricsTools(server)
|
||||
}
|
||||
```
|
||||
|
||||
**Step 5: Run test to verify it passes**
|
||||
|
||||
Run: `go test -run TestMetrics ./pkg/mcp/... -v`
|
||||
Expected: PASS
|
||||
|
||||
**Step 6: Commit**
|
||||
|
||||
```bash
|
||||
git add pkg/mcp/tools_metrics.go pkg/mcp/tools_metrics_test.go pkg/mcp/mcp.go
|
||||
git commit -m "feat(mcp): add metrics tools (record, query)"
|
||||
```
|
||||
|
||||
---
|
||||
|
||||
## Task 3: Create `core mcp serve` command

**Files:**
- Create: `internal/cmd/mcpcmd/cmd_mcp.go`
- Modify: `internal/variants/full.go` (add import)
- Test: Manual test via `core mcp serve`

**Step 1: Create the mcp command package**

Create `internal/cmd/mcpcmd/cmd_mcp.go`:

```go
package mcpcmd

import (
	"context"
	"os"
	"os/signal"
	"syscall"

	"github.com/host-uk/core/pkg/cli"
	"github.com/host-uk/core/pkg/i18n"
	"github.com/host-uk/core/pkg/mcp"
)

func init() {
	cli.RegisterCommands(AddMCPCommands)
}

var (
	mcpWorkspace string
)

var mcpCmd = &cli.Command{
	Use:   "mcp",
	Short: i18n.T("cmd.mcp.short"),
	Long:  i18n.T("cmd.mcp.long"),
}

var serveCmd = &cli.Command{
	Use:   "serve",
	Short: i18n.T("cmd.mcp.serve.short"),
	Long:  i18n.T("cmd.mcp.serve.long"),
	RunE: func(cmd *cli.Command, args []string) error {
		return runServe()
	},
}

func AddMCPCommands(root *cli.Command) {
	initMCPFlags()
	mcpCmd.AddCommand(serveCmd)
	root.AddCommand(mcpCmd)
}

func initMCPFlags() {
	serveCmd.Flags().StringVar(&mcpWorkspace, "workspace", "", i18n.T("cmd.mcp.serve.flag.workspace"))
}

func runServe() error {
	opts := []mcp.Option{}

	if mcpWorkspace != "" {
		opts = append(opts, mcp.WithWorkspaceRoot(mcpWorkspace))
	} else {
		// Default to unrestricted for MCP server
		opts = append(opts, mcp.WithWorkspaceRoot(""))
	}

	svc, err := mcp.New(opts...)
	if err != nil {
		return cli.Wrap(err, "create MCP service")
	}

	ctx, cancel := context.WithCancel(context.Background())
	defer cancel()

	// Handle shutdown signals
	sigCh := make(chan os.Signal, 1)
	signal.Notify(sigCh, syscall.SIGINT, syscall.SIGTERM)
	go func() {
		<-sigCh
		cancel()
	}()

	return svc.Run(ctx)
}
```

**Step 2: Add i18n strings**

Add the new keys to `pkg/i18n/en.yaml` if it exists, or to whatever i18n mechanism the project already uses:

```yaml
cmd.mcp.short: "MCP (Model Context Protocol) server"
cmd.mcp.long: "Start an MCP server for Claude Code integration with file, RAG, and metrics tools."
cmd.mcp.serve.short: "Start the MCP server"
cmd.mcp.serve.long: "Start the MCP server in stdio mode. Use MCP_ADDR env var for TCP mode."
cmd.mcp.serve.flag.workspace: "Restrict file operations to this directory (empty = unrestricted)"
```

**Step 3: Add import to full.go**

Modify `internal/variants/full.go` to add:

```go
import (
	// ... existing imports ...
	_ "github.com/host-uk/core/internal/cmd/mcpcmd"
)
```

**Step 4: Build and test**

Run: `go build && ./core mcp serve --help`
Expected: Help output showing the serve command

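The `--workspace` flag wired up in Step 1 can be exercised the same way; for example (the path is illustrative, not part of the plan):

```bash
# Restrict the MCP file/dir tools to a single project directory
./core mcp serve --workspace ~/code/my-project
```
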
**Step 5: Test MCP server manually**

Run: `echo '{"jsonrpc":"2.0","method":"tools/list","id":1}' | ./core mcp serve`
Expected: JSON response listing all tools including rag_query, metrics_record, etc.

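The exact payload depends on the MCP SDK in use, but a successful `tools/list` reply generally has this shape (trimmed to two tools, field values illustrative):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      { "name": "rag_query", "description": "Search for relevant documentation", "inputSchema": { "type": "object" } },
      { "name": "metrics_record", "description": "Record an AI/security metrics event", "inputSchema": { "type": "object" } }
    ]
  }
}
```
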
**Step 6: Commit**

```bash
git add internal/cmd/mcpcmd/cmd_mcp.go internal/variants/full.go
git commit -m "feat: add 'core mcp serve' command"
```

---

## Task 4: Configure agentic-flows plugin with .mcp.json

**Files:**
- Create: `/home/shared/hostuk/claude-plugins/plugins/agentic-flows/.mcp.json`
- Modify: `/home/shared/hostuk/claude-plugins/plugins/agentic-flows/.claude-plugin/plugin.json` (optional, add mcpServers)

**Step 1: Create .mcp.json**

Create `/home/shared/hostuk/claude-plugins/plugins/agentic-flows/.mcp.json`:

```json
{
  "core-cli": {
    "command": "core",
    "args": ["mcp", "serve"],
    "env": {
      "MCP_WORKSPACE": ""
    }
  }
}
```

**Step 2: Verify plugin loads**

Restart Claude Code and run `/mcp` to verify the core-cli server appears.

**Step 3: Test MCP tools**

Test that tools are available:
- `mcp__plugin_agentic-flows_core-cli__rag_query`
- `mcp__plugin_agentic-flows_core-cli__rag_ingest`
- `mcp__plugin_agentic-flows_core-cli__rag_collections`
- `mcp__plugin_agentic-flows_core-cli__metrics_record`
- `mcp__plugin_agentic-flows_core-cli__metrics_query`
- `mcp__plugin_agentic-flows_core-cli__file_read`
- etc.

**Step 4: Commit plugin changes**

```bash
cd /home/shared/hostuk/claude-plugins
git add plugins/agentic-flows/.mcp.json
git commit -m "feat(agentic-flows): add MCP server configuration for core-cli"
```

---

## Task 5: Update documentation

**Files:**
- Modify: `/home/claude/.claude/projects/-home-claude/memory/MEMORY.md`
- Modify: `/home/claude/.claude/projects/-home-claude/memory/plugin-dev-notes.md`

**Step 1: Update MEMORY.md**

Add under the "Core CLI MCP Server" section:

```markdown
### Core CLI MCP Server
- **Command:** `core mcp serve` (stdio mode) or `MCP_ADDR=:9000 core mcp serve` (TCP)
- **Tools available:**
  - File ops: file_read, file_write, file_edit, file_delete, file_rename, file_exists, dir_list, dir_create
  - RAG: rag_query, rag_ingest, rag_collections
  - Metrics: metrics_record, metrics_query
  - Language: lang_detect, lang_list
- **Plugin config:** `plugins/agentic-flows/.mcp.json`
```

**Step 2: Update plugin-dev-notes.md**

Add section:

````markdown
## MCP Server (core mcp serve)

### Available Tools
| Tool | Description |
|------|-------------|
| file_read | Read file contents |
| file_write | Write file contents |
| file_edit | Edit file (replace string) |
| file_delete | Delete file |
| file_rename | Rename/move file |
| file_exists | Check if file exists |
| dir_list | List directory contents |
| dir_create | Create directory |
| rag_query | Query vector DB |
| rag_ingest | Ingest file/directory |
| rag_collections | List collections |
| metrics_record | Record event |
| metrics_query | Query events |
| lang_detect | Detect file language |
| lang_list | List supported languages |

### Example .mcp.json
```json
{
  "core-cli": {
    "command": "core",
    "args": ["mcp", "serve"]
  }
}
```
````

**Step 3: Commit documentation**

```bash
git add ~/.claude/projects/-home-claude/memory/*.md
git commit -m "docs: update memory with MCP server tools"
```

---

## Summary

| Task | Files | Purpose |
|------|-------|---------|
| 1 | `pkg/mcp/tools_rag.go` | RAG tools (query, ingest, collections) |
| 2 | `pkg/mcp/tools_metrics.go` | Metrics tools (record, query) |
| 3 | `internal/cmd/mcpcmd/cmd_mcp.go` | `core mcp serve` command |
| 4 | `plugins/agentic-flows/.mcp.json` | Plugin MCP configuration |
| 5 | Memory docs | Documentation updates |

## Services Required

- **Qdrant:** localhost:6333 (verified running)
- **Ollama:** localhost:11434 with nomic-embed-text (verified running)
- **InfluxDB:** localhost:8086 (optional, for future time-series metrics)

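Before exercising the RAG and metrics tools it may help to confirm these services actually respond. A quick check against their standard HTTP endpoints (commands are a sketch, not part of the tasks above) could be:

```bash
# Qdrant: the root endpoint reports name and version
curl -s http://localhost:6333

# Ollama: the root endpoint answers "Ollama is running"; confirm the embedding model is pulled
curl -s http://localhost:11434
ollama list | grep nomic-embed-text

# InfluxDB (optional): health endpoint
curl -s http://localhost:8086/health
```
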
@@ -293,6 +293,30 @@ go mod download

---

## AI and Agentic Issues

### "ANTHROPIC_API_KEY not set"

**Cause:** You're trying to use `core ai` or `core dev commit` (which uses Claude for messages) without an API key.

**Fix:**

```bash
export ANTHROPIC_API_KEY=sk-ant-xxxxxxxxxxxx
```

### "failed to connect to Agentic API"

**Cause:** Network issues or an incorrect `AGENTIC_BASE_URL`.

**Fix:**

1. Check your internet connection
2. If using a custom endpoint, verify `AGENTIC_BASE_URL` (see the sketch below)
3. Ensure you are authenticated if required: `export AGENTIC_TOKEN=xxxx`

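To separate a network problem from a bad endpoint, a couple of generic checks help. These commands are a sketch: they assume `AGENTIC_BASE_URL` is set, and the Bearer header is an assumption about the API's auth scheme.

```bash
# Can the endpoint be reached at all?
curl -sv "$AGENTIC_BASE_URL" -o /dev/null

# If a token is required, check the HTTP status the API returns with it
curl -s -o /dev/null -w "%{http_code}\n" \
  -H "Authorization: Bearer $AGENTIC_TOKEN" "$AGENTIC_BASE_URL"
```
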
---

## Getting More Help

### Enable Verbose Output

@@ -10,8 +10,8 @@ Complete workflow from code to GitHub release.
 # 1. Run tests
 core go test

-# 2. Check coverage
-core go cov --threshold 80
+# 2. Check coverage (Statement and Branch)
+core go cov --threshold 40 --branch-threshold 35

 # 3. Format and lint
 core go fmt --fix

go.mod (+5)

@@ -3,6 +3,7 @@ module github.com/host-uk/core
go 1.25.5
|
||||
|
||||
require (
|
||||
code.gitea.io/sdk/gitea v0.23.2
|
||||
github.com/Snider/Borg v0.2.0
|
||||
github.com/getkin/kin-openapi v0.133.0
|
||||
github.com/host-uk/core/internal/core-ide v0.0.0-20260204004957-989b7e1e6555
|
||||
|
|
@@ -31,17 +32,20 @@ require (
|
|||
aead.dev/minisign v0.3.0 // indirect
|
||||
cloud.google.com/go v0.123.0 // indirect
|
||||
dario.cat/mergo v1.0.2 // indirect
|
||||
github.com/42wim/httpsig v1.2.3 // indirect
|
||||
github.com/Microsoft/go-winio v0.6.2 // indirect
|
||||
github.com/ProtonMail/go-crypto v1.3.0 // indirect
|
||||
github.com/TwiN/go-color v1.4.1 // indirect
|
||||
github.com/adrg/xdg v0.5.3 // indirect
|
||||
github.com/bahlo/generic-list-go v0.2.0 // indirect
|
||||
github.com/bep/debounce v1.2.1 // indirect
|
||||
github.com/brianvoe/gofakeit/v6 v6.28.0 // indirect
|
||||
github.com/buger/jsonparser v1.1.1 // indirect
|
||||
github.com/cloudflare/circl v1.6.3 // indirect
|
||||
github.com/coder/websocket v1.8.14 // indirect
|
||||
github.com/cyphar/filepath-securejoin v0.6.1 // indirect
|
||||
github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
|
||||
github.com/davidmz/go-pageant v1.0.2 // indirect
|
||||
github.com/ebitengine/purego v0.9.1 // indirect
|
||||
github.com/emirpasic/gods v1.18.1 // indirect
|
||||
github.com/fatih/color v1.18.0 // indirect
|
||||
|
|
@@ -60,6 +64,7 @@ require (
|
|||
github.com/google/jsonschema-go v0.4.2 // indirect
|
||||
github.com/google/uuid v1.6.0 // indirect
|
||||
github.com/gorilla/websocket v1.5.3 // indirect
|
||||
github.com/hashicorp/go-version v1.7.0 // indirect
|
||||
github.com/inconshreveable/mousetrap v1.1.0 // indirect
|
||||
github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect
|
||||
github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1 // indirect
|
||||
|
|
|
|||
internal/bugseti/config.go (new file, +504)

@@ -0,0 +1,504 @@
|
|||
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"sync"
|
||||
"time"
|
||||
)
|
||||
|
||||
// ConfigService manages application configuration and persistence.
|
||||
type ConfigService struct {
|
||||
config *Config
|
||||
path string
|
||||
mu sync.RWMutex
|
||||
}
|
||||
|
||||
// Config holds all BugSETI configuration.
|
||||
type Config struct {
|
||||
// Authentication
|
||||
GitHubToken string `json:"githubToken,omitempty"`
|
||||
|
||||
// Repositories
|
||||
WatchedRepos []string `json:"watchedRepos"`
|
||||
Labels []string `json:"labels"`
|
||||
|
||||
// Scheduling
|
||||
WorkHours *WorkHours `json:"workHours,omitempty"`
|
||||
FetchInterval int `json:"fetchIntervalMinutes"`
|
||||
|
||||
// Notifications
|
||||
NotificationsEnabled bool `json:"notificationsEnabled"`
|
||||
NotificationSound bool `json:"notificationSound"`
|
||||
|
||||
// Workspace
|
||||
WorkspaceDir string `json:"workspaceDir,omitempty"`
|
||||
DataDir string `json:"dataDir,omitempty"`
|
||||
|
||||
// Onboarding
|
||||
Onboarded bool `json:"onboarded"`
|
||||
OnboardedAt time.Time `json:"onboardedAt,omitempty"`
|
||||
|
||||
// UI Preferences
|
||||
Theme string `json:"theme"`
|
||||
ShowTrayPanel bool `json:"showTrayPanel"`
|
||||
|
||||
// Advanced
|
||||
MaxConcurrentIssues int `json:"maxConcurrentIssues"`
|
||||
AutoSeedContext bool `json:"autoSeedContext"`
|
||||
|
||||
// Updates
|
||||
UpdateChannel string `json:"updateChannel"` // stable, beta, nightly
|
||||
AutoUpdate bool `json:"autoUpdate"` // Automatically install updates
|
||||
UpdateCheckInterval int `json:"updateCheckInterval"` // Check interval in hours (0 = disabled)
|
||||
LastUpdateCheck time.Time `json:"lastUpdateCheck,omitempty"`
|
||||
}
|
||||
|
||||
// WorkHours defines when BugSETI should actively fetch issues.
|
||||
type WorkHours struct {
|
||||
Enabled bool `json:"enabled"`
|
||||
StartHour int `json:"startHour"` // 0-23
|
||||
EndHour int `json:"endHour"` // 0-23
|
||||
Days []int `json:"days"` // 0=Sunday, 6=Saturday
|
||||
Timezone string `json:"timezone"`
|
||||
}
|
||||
|
||||
// NewConfigService creates a new ConfigService with default values.
|
||||
func NewConfigService() *ConfigService {
|
||||
// Determine config path
|
||||
configDir, err := os.UserConfigDir()
|
||||
if err != nil {
|
||||
configDir = filepath.Join(os.Getenv("HOME"), ".config")
|
||||
}
|
||||
|
||||
bugsetiDir := filepath.Join(configDir, "bugseti")
|
||||
if err := os.MkdirAll(bugsetiDir, 0755); err != nil {
|
||||
log.Printf("Warning: could not create config directory: %v", err)
|
||||
}
|
||||
|
||||
return &ConfigService{
|
||||
path: filepath.Join(bugsetiDir, "config.json"),
|
||||
config: &Config{
|
||||
WatchedRepos: []string{},
|
||||
Labels: []string{
|
||||
"good first issue",
|
||||
"help wanted",
|
||||
"beginner-friendly",
|
||||
},
|
||||
FetchInterval: 15,
|
||||
NotificationsEnabled: true,
|
||||
NotificationSound: true,
|
||||
Theme: "dark",
|
||||
ShowTrayPanel: true,
|
||||
MaxConcurrentIssues: 1,
|
||||
AutoSeedContext: true,
|
||||
DataDir: bugsetiDir,
|
||||
UpdateChannel: "stable",
|
||||
AutoUpdate: false,
|
||||
UpdateCheckInterval: 6, // Check every 6 hours
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (c *ConfigService) ServiceName() string {
|
||||
return "ConfigService"
|
||||
}
|
||||
|
||||
// Load reads the configuration from disk.
|
||||
func (c *ConfigService) Load() error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
|
||||
data, err := os.ReadFile(c.path)
|
||||
if err != nil {
|
||||
if os.IsNotExist(err) {
|
||||
// No config file yet, use defaults
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
var config Config
|
||||
if err := json.Unmarshal(data, &config); err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
// Merge with defaults for any new fields
|
||||
c.mergeDefaults(&config)
|
||||
c.config = &config
|
||||
return nil
|
||||
}
|
||||
|
||||
// Save persists the configuration to disk.
|
||||
func (c *ConfigService) Save() error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// saveUnsafe writes config without acquiring lock.
|
||||
func (c *ConfigService) saveUnsafe() error {
|
||||
data, err := json.MarshalIndent(c.config, "", " ")
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
return os.WriteFile(c.path, data, 0644)
|
||||
}
|
||||
|
||||
// mergeDefaults fills in default values for any unset fields.
|
||||
func (c *ConfigService) mergeDefaults(config *Config) {
|
||||
if config.Labels == nil || len(config.Labels) == 0 {
|
||||
config.Labels = c.config.Labels
|
||||
}
|
||||
if config.FetchInterval == 0 {
|
||||
config.FetchInterval = 15
|
||||
}
|
||||
if config.Theme == "" {
|
||||
config.Theme = "dark"
|
||||
}
|
||||
if config.MaxConcurrentIssues == 0 {
|
||||
config.MaxConcurrentIssues = 1
|
||||
}
|
||||
if config.DataDir == "" {
|
||||
config.DataDir = c.config.DataDir
|
||||
}
|
||||
if config.UpdateChannel == "" {
|
||||
config.UpdateChannel = "stable"
|
||||
}
|
||||
if config.UpdateCheckInterval == 0 {
|
||||
config.UpdateCheckInterval = 6
|
||||
}
|
||||
}
|
||||
|
||||
// GetConfig returns a copy of the current configuration.
|
||||
func (c *ConfigService) GetConfig() Config {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return *c.config
|
||||
}
|
||||
|
||||
// SetConfig updates the configuration and saves it.
|
||||
func (c *ConfigService) SetConfig(config Config) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config = &config
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetWatchedRepos returns the list of watched repositories.
|
||||
func (c *ConfigService) GetWatchedRepos() []string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.WatchedRepos
|
||||
}
|
||||
|
||||
// AddWatchedRepo adds a repository to the watch list.
|
||||
func (c *ConfigService) AddWatchedRepo(repo string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
|
||||
for _, r := range c.config.WatchedRepos {
|
||||
if r == repo {
|
||||
return nil // Already watching
|
||||
}
|
||||
}
|
||||
|
||||
c.config.WatchedRepos = append(c.config.WatchedRepos, repo)
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// RemoveWatchedRepo removes a repository from the watch list.
|
||||
func (c *ConfigService) RemoveWatchedRepo(repo string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
|
||||
for i, r := range c.config.WatchedRepos {
|
||||
if r == repo {
|
||||
c.config.WatchedRepos = append(c.config.WatchedRepos[:i], c.config.WatchedRepos[i+1:]...)
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// GetLabels returns the issue labels to filter by.
|
||||
func (c *ConfigService) GetLabels() []string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.Labels
|
||||
}
|
||||
|
||||
// SetLabels updates the issue labels.
|
||||
func (c *ConfigService) SetLabels(labels []string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.Labels = labels
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetFetchInterval returns the fetch interval as a duration.
|
||||
func (c *ConfigService) GetFetchInterval() time.Duration {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return time.Duration(c.config.FetchInterval) * time.Minute
|
||||
}
|
||||
|
||||
// SetFetchInterval sets the fetch interval in minutes.
|
||||
func (c *ConfigService) SetFetchInterval(minutes int) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.FetchInterval = minutes
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// IsWithinWorkHours checks if the current time is within configured work hours.
|
||||
func (c *ConfigService) IsWithinWorkHours() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
|
||||
if c.config.WorkHours == nil || !c.config.WorkHours.Enabled {
|
||||
return true // No work hours restriction
|
||||
}
|
||||
|
||||
wh := c.config.WorkHours
|
||||
now := time.Now()
|
||||
|
||||
// Check timezone
|
||||
if wh.Timezone != "" {
|
||||
loc, err := time.LoadLocation(wh.Timezone)
|
||||
if err == nil {
|
||||
now = now.In(loc)
|
||||
}
|
||||
}
|
||||
|
||||
// Check day
|
||||
day := int(now.Weekday())
|
||||
dayAllowed := false
|
||||
for _, d := range wh.Days {
|
||||
if d == day {
|
||||
dayAllowed = true
|
||||
break
|
||||
}
|
||||
}
|
||||
if !dayAllowed {
|
||||
return false
|
||||
}
|
||||
|
||||
// Check hour
|
||||
hour := now.Hour()
|
||||
if wh.StartHour <= wh.EndHour {
|
||||
return hour >= wh.StartHour && hour < wh.EndHour
|
||||
}
|
||||
// Handle overnight (e.g., 22:00 - 06:00)
|
||||
return hour >= wh.StartHour || hour < wh.EndHour
|
||||
}
|
||||
|
||||
// GetWorkHours returns the work hours configuration.
|
||||
func (c *ConfigService) GetWorkHours() *WorkHours {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.WorkHours
|
||||
}
|
||||
|
||||
// SetWorkHours updates the work hours configuration.
|
||||
func (c *ConfigService) SetWorkHours(wh *WorkHours) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.WorkHours = wh
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// IsNotificationsEnabled returns whether notifications are enabled.
|
||||
func (c *ConfigService) IsNotificationsEnabled() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.NotificationsEnabled
|
||||
}
|
||||
|
||||
// SetNotificationsEnabled enables or disables notifications.
|
||||
func (c *ConfigService) SetNotificationsEnabled(enabled bool) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.NotificationsEnabled = enabled
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetWorkspaceDir returns the workspace directory.
|
||||
func (c *ConfigService) GetWorkspaceDir() string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.WorkspaceDir
|
||||
}
|
||||
|
||||
// SetWorkspaceDir sets the workspace directory.
|
||||
func (c *ConfigService) SetWorkspaceDir(dir string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.WorkspaceDir = dir
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetDataDir returns the data directory.
|
||||
func (c *ConfigService) GetDataDir() string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.DataDir
|
||||
}
|
||||
|
||||
// IsOnboarded returns whether the user has completed onboarding.
|
||||
func (c *ConfigService) IsOnboarded() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.Onboarded
|
||||
}
|
||||
|
||||
// CompleteOnboarding marks onboarding as complete.
|
||||
func (c *ConfigService) CompleteOnboarding() error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.Onboarded = true
|
||||
c.config.OnboardedAt = time.Now()
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetTheme returns the current theme.
|
||||
func (c *ConfigService) GetTheme() string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.Theme
|
||||
}
|
||||
|
||||
// SetTheme sets the theme.
|
||||
func (c *ConfigService) SetTheme(theme string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.Theme = theme
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// IsAutoSeedEnabled returns whether automatic context seeding is enabled.
|
||||
func (c *ConfigService) IsAutoSeedEnabled() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.AutoSeedContext
|
||||
}
|
||||
|
||||
// SetAutoSeedEnabled enables or disables automatic context seeding.
|
||||
func (c *ConfigService) SetAutoSeedEnabled(enabled bool) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.AutoSeedContext = enabled
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// UpdateSettings holds update-related configuration.
|
||||
type UpdateSettings struct {
|
||||
Channel string `json:"channel"`
|
||||
AutoUpdate bool `json:"autoUpdate"`
|
||||
CheckInterval int `json:"checkInterval"` // Hours
|
||||
LastCheck time.Time `json:"lastCheck"`
|
||||
}
|
||||
|
||||
// GetUpdateSettings returns the update settings.
|
||||
func (c *ConfigService) GetUpdateSettings() UpdateSettings {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return UpdateSettings{
|
||||
Channel: c.config.UpdateChannel,
|
||||
AutoUpdate: c.config.AutoUpdate,
|
||||
CheckInterval: c.config.UpdateCheckInterval,
|
||||
LastCheck: c.config.LastUpdateCheck,
|
||||
}
|
||||
}
|
||||
|
||||
// SetUpdateSettings updates the update settings.
|
||||
func (c *ConfigService) SetUpdateSettings(settings UpdateSettings) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.UpdateChannel = settings.Channel
|
||||
c.config.AutoUpdate = settings.AutoUpdate
|
||||
c.config.UpdateCheckInterval = settings.CheckInterval
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetUpdateChannel returns the update channel.
|
||||
func (c *ConfigService) GetUpdateChannel() string {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.UpdateChannel
|
||||
}
|
||||
|
||||
// SetUpdateChannel sets the update channel.
|
||||
func (c *ConfigService) SetUpdateChannel(channel string) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.UpdateChannel = channel
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// IsAutoUpdateEnabled returns whether automatic updates are enabled.
|
||||
func (c *ConfigService) IsAutoUpdateEnabled() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.AutoUpdate
|
||||
}
|
||||
|
||||
// SetAutoUpdateEnabled enables or disables automatic updates.
|
||||
func (c *ConfigService) SetAutoUpdateEnabled(enabled bool) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.AutoUpdate = enabled
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetUpdateCheckInterval returns the update check interval in hours.
|
||||
func (c *ConfigService) GetUpdateCheckInterval() int {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.UpdateCheckInterval
|
||||
}
|
||||
|
||||
// SetUpdateCheckInterval sets the update check interval in hours.
|
||||
func (c *ConfigService) SetUpdateCheckInterval(hours int) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.UpdateCheckInterval = hours
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// GetLastUpdateCheck returns the last update check time.
|
||||
func (c *ConfigService) GetLastUpdateCheck() time.Time {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
return c.config.LastUpdateCheck
|
||||
}
|
||||
|
||||
// SetLastUpdateCheck sets the last update check time.
|
||||
func (c *ConfigService) SetLastUpdateCheck(t time.Time) error {
|
||||
c.mu.Lock()
|
||||
defer c.mu.Unlock()
|
||||
c.config.LastUpdateCheck = t
|
||||
return c.saveUnsafe()
|
||||
}
|
||||
|
||||
// ShouldCheckForUpdates returns true if it's time to check for updates.
|
||||
func (c *ConfigService) ShouldCheckForUpdates() bool {
|
||||
c.mu.RLock()
|
||||
defer c.mu.RUnlock()
|
||||
|
||||
if c.config.UpdateCheckInterval <= 0 {
|
||||
return false // Updates disabled
|
||||
}
|
||||
|
||||
if c.config.LastUpdateCheck.IsZero() {
|
||||
return true // Never checked
|
||||
}
|
||||
|
||||
interval := time.Duration(c.config.UpdateCheckInterval) * time.Hour
|
||||
return time.Since(c.config.LastUpdateCheck) >= interval
|
||||
}
|
||||
internal/bugseti/fetcher.go (new file, +296)

@@ -0,0 +1,296 @@
|
|||
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"log"
|
||||
"os/exec"
|
||||
"strings"
|
||||
"sync"
|
||||
"time"
|
||||
)
|
||||
|
||||
// FetcherService fetches issues from configured OSS repositories.
|
||||
type FetcherService struct {
|
||||
config *ConfigService
|
||||
notify *NotifyService
|
||||
running bool
|
||||
mu sync.RWMutex
|
||||
stopCh chan struct{}
|
||||
issuesCh chan []*Issue
|
||||
}
|
||||
|
||||
// NewFetcherService creates a new FetcherService.
|
||||
func NewFetcherService(config *ConfigService, notify *NotifyService) *FetcherService {
|
||||
return &FetcherService{
|
||||
config: config,
|
||||
notify: notify,
|
||||
issuesCh: make(chan []*Issue, 10),
|
||||
}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (f *FetcherService) ServiceName() string {
|
||||
return "FetcherService"
|
||||
}
|
||||
|
||||
// Start begins fetching issues from configured repositories.
|
||||
func (f *FetcherService) Start() error {
|
||||
f.mu.Lock()
|
||||
defer f.mu.Unlock()
|
||||
|
||||
if f.running {
|
||||
return nil
|
||||
}
|
||||
|
||||
f.running = true
|
||||
f.stopCh = make(chan struct{})
|
||||
|
||||
go f.fetchLoop()
|
||||
log.Println("FetcherService started")
|
||||
return nil
|
||||
}
|
||||
|
||||
// Pause stops fetching issues.
|
||||
func (f *FetcherService) Pause() {
|
||||
f.mu.Lock()
|
||||
defer f.mu.Unlock()
|
||||
|
||||
if !f.running {
|
||||
return
|
||||
}
|
||||
|
||||
f.running = false
|
||||
close(f.stopCh)
|
||||
log.Println("FetcherService paused")
|
||||
}
|
||||
|
||||
// IsRunning returns whether the fetcher is actively running.
|
||||
func (f *FetcherService) IsRunning() bool {
|
||||
f.mu.RLock()
|
||||
defer f.mu.RUnlock()
|
||||
return f.running
|
||||
}
|
||||
|
||||
// Issues returns a channel that receives batches of fetched issues.
|
||||
func (f *FetcherService) Issues() <-chan []*Issue {
|
||||
return f.issuesCh
|
||||
}
|
||||
|
||||
// fetchLoop periodically fetches issues from all configured repositories.
|
||||
func (f *FetcherService) fetchLoop() {
|
||||
// Initial fetch
|
||||
f.fetchAll()
|
||||
|
||||
// Set up ticker for periodic fetching
|
||||
interval := f.config.GetFetchInterval()
|
||||
if interval < time.Minute {
|
||||
interval = 15 * time.Minute
|
||||
}
|
||||
ticker := time.NewTicker(interval)
|
||||
defer ticker.Stop()
|
||||
|
||||
for {
|
||||
select {
|
||||
case <-f.stopCh:
|
||||
return
|
||||
case <-ticker.C:
|
||||
// Check if within work hours
|
||||
if f.config.IsWithinWorkHours() {
|
||||
f.fetchAll()
|
||||
}
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// fetchAll fetches issues from all configured repositories.
|
||||
func (f *FetcherService) fetchAll() {
|
||||
repos := f.config.GetWatchedRepos()
|
||||
if len(repos) == 0 {
|
||||
log.Println("No repositories configured")
|
||||
return
|
||||
}
|
||||
|
||||
var allIssues []*Issue
|
||||
for _, repo := range repos {
|
||||
issues, err := f.fetchFromRepo(repo)
|
||||
if err != nil {
|
||||
log.Printf("Error fetching from %s: %v", repo, err)
|
||||
continue
|
||||
}
|
||||
allIssues = append(allIssues, issues...)
|
||||
}
|
||||
|
||||
if len(allIssues) > 0 {
|
||||
select {
|
||||
case f.issuesCh <- allIssues:
|
||||
f.notify.Notify("BugSETI", fmt.Sprintf("Found %d new issues", len(allIssues)))
|
||||
default:
|
||||
// Channel full, skip
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// fetchFromRepo fetches issues from a single repository using GitHub CLI.
|
||||
func (f *FetcherService) fetchFromRepo(repo string) ([]*Issue, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
|
||||
defer cancel()
|
||||
|
||||
// Build query for good first issues
|
||||
labels := f.config.GetLabels()
|
||||
if len(labels) == 0 {
|
||||
labels = []string{"good first issue", "help wanted", "beginner-friendly"}
|
||||
}
|
||||
|
||||
labelQuery := strings.Join(labels, ",")
|
||||
|
||||
// Use gh CLI to fetch issues
|
||||
cmd := exec.CommandContext(ctx, "gh", "issue", "list",
|
||||
"--repo", repo,
|
||||
"--label", labelQuery,
|
||||
"--state", "open",
|
||||
"--limit", "20",
|
||||
"--json", "number,title,body,url,labels,createdAt,author")
|
||||
|
||||
output, err := cmd.Output()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("gh issue list failed: %w", err)
|
||||
}
|
||||
|
||||
var ghIssues []struct {
|
||||
Number int `json:"number"`
|
||||
Title string `json:"title"`
|
||||
Body string `json:"body"`
|
||||
URL string `json:"url"`
|
||||
CreatedAt time.Time `json:"createdAt"`
|
||||
Author struct {
|
||||
Login string `json:"login"`
|
||||
} `json:"author"`
|
||||
Labels []struct {
|
||||
Name string `json:"name"`
|
||||
} `json:"labels"`
|
||||
}
|
||||
|
||||
if err := json.Unmarshal(output, &ghIssues); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse gh output: %w", err)
|
||||
}
|
||||
|
||||
issues := make([]*Issue, 0, len(ghIssues))
|
||||
for _, gi := range ghIssues {
|
||||
labels := make([]string, len(gi.Labels))
|
||||
for i, l := range gi.Labels {
|
||||
labels[i] = l.Name
|
||||
}
|
||||
|
||||
issues = append(issues, &Issue{
|
||||
ID: fmt.Sprintf("%s#%d", repo, gi.Number),
|
||||
Number: gi.Number,
|
||||
Repo: repo,
|
||||
Title: gi.Title,
|
||||
Body: gi.Body,
|
||||
URL: gi.URL,
|
||||
Labels: labels,
|
||||
Author: gi.Author.Login,
|
||||
CreatedAt: gi.CreatedAt,
|
||||
Priority: calculatePriority(labels),
|
||||
})
|
||||
}
|
||||
|
||||
return issues, nil
|
||||
}
|
||||
|
||||
// FetchIssue fetches a single issue by repo and number.
|
||||
func (f *FetcherService) FetchIssue(repo string, number int) (*Issue, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 15*time.Second)
|
||||
defer cancel()
|
||||
|
||||
cmd := exec.CommandContext(ctx, "gh", "issue", "view",
|
||||
"--repo", repo,
|
||||
fmt.Sprintf("%d", number),
|
||||
"--json", "number,title,body,url,labels,createdAt,author,comments")
|
||||
|
||||
output, err := cmd.Output()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("gh issue view failed: %w", err)
|
||||
}
|
||||
|
||||
var ghIssue struct {
|
||||
Number int `json:"number"`
|
||||
Title string `json:"title"`
|
||||
Body string `json:"body"`
|
||||
URL string `json:"url"`
|
||||
CreatedAt time.Time `json:"createdAt"`
|
||||
Author struct {
|
||||
Login string `json:"login"`
|
||||
} `json:"author"`
|
||||
Labels []struct {
|
||||
Name string `json:"name"`
|
||||
} `json:"labels"`
|
||||
Comments []struct {
|
||||
Body string `json:"body"`
|
||||
Author struct {
|
||||
Login string `json:"login"`
|
||||
} `json:"author"`
|
||||
} `json:"comments"`
|
||||
}
|
||||
|
||||
if err := json.Unmarshal(output, &ghIssue); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse gh output: %w", err)
|
||||
}
|
||||
|
||||
labels := make([]string, len(ghIssue.Labels))
|
||||
for i, l := range ghIssue.Labels {
|
||||
labels[i] = l.Name
|
||||
}
|
||||
|
||||
comments := make([]Comment, len(ghIssue.Comments))
|
||||
for i, c := range ghIssue.Comments {
|
||||
comments[i] = Comment{
|
||||
Author: c.Author.Login,
|
||||
Body: c.Body,
|
||||
}
|
||||
}
|
||||
|
||||
return &Issue{
|
||||
ID: fmt.Sprintf("%s#%d", repo, ghIssue.Number),
|
||||
Number: ghIssue.Number,
|
||||
Repo: repo,
|
||||
Title: ghIssue.Title,
|
||||
Body: ghIssue.Body,
|
||||
URL: ghIssue.URL,
|
||||
Labels: labels,
|
||||
Author: ghIssue.Author.Login,
|
||||
CreatedAt: ghIssue.CreatedAt,
|
||||
Priority: calculatePriority(labels),
|
||||
Comments: comments,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// calculatePriority assigns a priority score based on labels.
|
||||
func calculatePriority(labels []string) int {
|
||||
priority := 50 // Default priority
|
||||
|
||||
for _, label := range labels {
|
||||
lower := strings.ToLower(label)
|
||||
switch {
|
||||
case strings.Contains(lower, "good first issue"):
|
||||
priority += 30
|
||||
case strings.Contains(lower, "help wanted"):
|
||||
priority += 20
|
||||
case strings.Contains(lower, "beginner"):
|
||||
priority += 25
|
||||
case strings.Contains(lower, "easy"):
|
||||
priority += 20
|
||||
case strings.Contains(lower, "bug"):
|
||||
priority += 10
|
||||
case strings.Contains(lower, "documentation"):
|
||||
priority += 5
|
||||
case strings.Contains(lower, "priority"):
|
||||
priority += 15
|
||||
}
|
||||
}
|
||||
|
||||
return priority
|
||||
}
|
||||
internal/bugseti/go.mod (new file, +3)

@@ -0,0 +1,3 @@
|
|||
module github.com/host-uk/core/internal/bugseti
|
||||
|
||||
go 1.25.5
|
||||
internal/bugseti/notify.go (new file, +236)

@@ -0,0 +1,236 @@
|
|||
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
"log"
|
||||
"os/exec"
|
||||
"runtime"
|
||||
"time"
|
||||
)
|
||||
|
||||
// NotifyService handles desktop notifications.
|
||||
type NotifyService struct {
|
||||
enabled bool
|
||||
sound bool
|
||||
}
|
||||
|
||||
// NewNotifyService creates a new NotifyService.
|
||||
func NewNotifyService() *NotifyService {
|
||||
return &NotifyService{
|
||||
enabled: true,
|
||||
sound: true,
|
||||
}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (n *NotifyService) ServiceName() string {
|
||||
return "NotifyService"
|
||||
}
|
||||
|
||||
// SetEnabled enables or disables notifications.
|
||||
func (n *NotifyService) SetEnabled(enabled bool) {
|
||||
n.enabled = enabled
|
||||
}
|
||||
|
||||
// SetSound enables or disables notification sounds.
|
||||
func (n *NotifyService) SetSound(sound bool) {
|
||||
n.sound = sound
|
||||
}
|
||||
|
||||
// Notify sends a desktop notification.
|
||||
func (n *NotifyService) Notify(title, message string) error {
|
||||
if !n.enabled {
|
||||
return nil
|
||||
}
|
||||
|
||||
log.Printf("Notification: %s - %s", title, message)
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
|
||||
defer cancel()
|
||||
|
||||
var err error
|
||||
switch runtime.GOOS {
|
||||
case "darwin":
|
||||
err = n.notifyMacOS(ctx, title, message)
|
||||
case "linux":
|
||||
err = n.notifyLinux(ctx, title, message)
|
||||
case "windows":
|
||||
err = n.notifyWindows(ctx, title, message)
|
||||
default:
|
||||
err = fmt.Errorf("unsupported platform: %s", runtime.GOOS)
|
||||
}
|
||||
|
||||
if err != nil {
|
||||
log.Printf("Notification error: %v", err)
|
||||
}
|
||||
return err
|
||||
}
|
||||
|
||||
// NotifyIssue sends a notification about a new issue.
|
||||
func (n *NotifyService) NotifyIssue(issue *Issue) error {
|
||||
title := "New Issue Available"
|
||||
message := fmt.Sprintf("%s: %s", issue.Repo, issue.Title)
|
||||
return n.Notify(title, message)
|
||||
}
|
||||
|
||||
// NotifyPRStatus sends a notification about a PR status change.
|
||||
func (n *NotifyService) NotifyPRStatus(repo string, prNumber int, status string) error {
|
||||
title := "PR Status Update"
|
||||
message := fmt.Sprintf("%s #%d: %s", repo, prNumber, status)
|
||||
return n.Notify(title, message)
|
||||
}
|
||||
|
||||
// notifyMacOS sends a notification on macOS using osascript.
|
||||
func (n *NotifyService) notifyMacOS(ctx context.Context, title, message string) error {
|
||||
script := fmt.Sprintf(`display notification "%s" with title "%s"`, message, title)
|
||||
if n.sound {
|
||||
script += ` sound name "Glass"`
|
||||
}
|
||||
cmd := exec.CommandContext(ctx, "osascript", "-e", script)
|
||||
return cmd.Run()
|
||||
}
|
||||
|
||||
// notifyLinux sends a notification on Linux using notify-send.
|
||||
func (n *NotifyService) notifyLinux(ctx context.Context, title, message string) error {
|
||||
args := []string{
|
||||
"--app-name=BugSETI",
|
||||
"--urgency=normal",
|
||||
title,
|
||||
message,
|
||||
}
|
||||
cmd := exec.CommandContext(ctx, "notify-send", args...)
|
||||
return cmd.Run()
|
||||
}
|
||||
|
||||
// notifyWindows sends a notification on Windows using PowerShell.
|
||||
func (n *NotifyService) notifyWindows(ctx context.Context, title, message string) error {
|
||||
script := fmt.Sprintf(`
|
||||
[Windows.UI.Notifications.ToastNotificationManager, Windows.UI.Notifications, ContentType = WindowsRuntime] | Out-Null
|
||||
[Windows.Data.Xml.Dom.XmlDocument, Windows.Data.Xml.Dom.XmlDocument, ContentType = WindowsRuntime] | Out-Null
|
||||
|
||||
$template = @"
|
||||
<toast>
|
||||
<visual>
|
||||
<binding template="ToastText02">
|
||||
<text id="1">%s</text>
|
||||
<text id="2">%s</text>
|
||||
</binding>
|
||||
</visual>
|
||||
</toast>
|
||||
"@
|
||||
|
||||
$xml = New-Object Windows.Data.Xml.Dom.XmlDocument
|
||||
$xml.LoadXml($template)
|
||||
$toast = [Windows.UI.Notifications.ToastNotification]::new($xml)
|
||||
[Windows.UI.Notifications.ToastNotificationManager]::CreateToastNotifier("BugSETI").Show($toast)
|
||||
`, title, message)
|
||||
|
||||
cmd := exec.CommandContext(ctx, "powershell", "-Command", script)
|
||||
return cmd.Run()
|
||||
}
|
||||
|
||||
// NotifyWithAction sends a notification with an action button (platform-specific).
|
||||
func (n *NotifyService) NotifyWithAction(title, message, actionLabel string) error {
|
||||
if !n.enabled {
|
||||
return nil
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
|
||||
defer cancel()
|
||||
|
||||
switch runtime.GOOS {
|
||||
case "darwin":
|
||||
// macOS: Use terminal-notifier if available for actions
|
||||
if _, err := exec.LookPath("terminal-notifier"); err == nil {
|
||||
cmd := exec.CommandContext(ctx, "terminal-notifier",
|
||||
"-title", title,
|
||||
"-message", message,
|
||||
"-appIcon", "NSApplication",
|
||||
"-actions", actionLabel,
|
||||
"-group", "BugSETI")
|
||||
return cmd.Run()
|
||||
}
|
||||
return n.notifyMacOS(ctx, title, message)
|
||||
|
||||
case "linux":
|
||||
// Linux: Use notify-send with action
|
||||
args := []string{
|
||||
"--app-name=BugSETI",
|
||||
"--urgency=normal",
|
||||
"--action=open=" + actionLabel,
|
||||
title,
|
||||
message,
|
||||
}
|
||||
cmd := exec.CommandContext(ctx, "notify-send", args...)
|
||||
return cmd.Run()
|
||||
|
||||
default:
|
||||
return n.Notify(title, message)
|
||||
}
|
||||
}
|
||||
|
||||
// NotifyProgress sends a notification with a progress indicator.
|
||||
func (n *NotifyService) NotifyProgress(title, message string, progress int) error {
|
||||
if !n.enabled {
|
||||
return nil
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Second)
|
||||
defer cancel()
|
||||
|
||||
switch runtime.GOOS {
|
||||
case "linux":
|
||||
// Linux supports progress hints
|
||||
args := []string{
|
||||
"--app-name=BugSETI",
|
||||
"--hint=int:value:" + fmt.Sprintf("%d", progress),
|
||||
title,
|
||||
message,
|
||||
}
|
||||
cmd := exec.CommandContext(ctx, "notify-send", args...)
|
||||
return cmd.Run()
|
||||
|
||||
default:
|
||||
// Other platforms: include progress in message
|
||||
messageWithProgress := fmt.Sprintf("%s (%d%%)", message, progress)
|
||||
return n.Notify(title, messageWithProgress)
|
||||
}
|
||||
}
|
||||
|
||||
// PlaySound plays a notification sound.
|
||||
func (n *NotifyService) PlaySound() error {
|
||||
if !n.sound {
|
||||
return nil
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
|
||||
defer cancel()
|
||||
|
||||
switch runtime.GOOS {
|
||||
case "darwin":
|
||||
cmd := exec.CommandContext(ctx, "afplay", "/System/Library/Sounds/Glass.aiff")
|
||||
return cmd.Run()
|
||||
|
||||
case "linux":
|
||||
// Try paplay (PulseAudio), then aplay (ALSA)
|
||||
if _, err := exec.LookPath("paplay"); err == nil {
|
||||
cmd := exec.CommandContext(ctx, "paplay", "/usr/share/sounds/freedesktop/stereo/complete.oga")
|
||||
return cmd.Run()
|
||||
}
|
||||
if _, err := exec.LookPath("aplay"); err == nil {
|
||||
cmd := exec.CommandContext(ctx, "aplay", "-q", "/usr/share/sounds/alsa/Front_Center.wav")
|
||||
return cmd.Run()
|
||||
}
|
||||
return nil
|
||||
|
||||
case "windows":
|
||||
script := `[console]::beep(800, 200)`
|
||||
cmd := exec.CommandContext(ctx, "powershell", "-Command", script)
|
||||
return cmd.Run()
|
||||
|
||||
default:
|
||||
return nil
|
||||
}
|
||||
}
|
||||
internal/bugseti/queue.go (new file, +308)

@@ -0,0 +1,308 @@
|
|||
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"container/heap"
|
||||
"encoding/json"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"sync"
|
||||
"time"
|
||||
)
|
||||
|
||||
// IssueStatus represents the status of an issue in the queue.
|
||||
type IssueStatus string
|
||||
|
||||
const (
|
||||
StatusPending IssueStatus = "pending"
|
||||
StatusClaimed IssueStatus = "claimed"
|
||||
StatusInProgress IssueStatus = "in_progress"
|
||||
StatusCompleted IssueStatus = "completed"
|
||||
StatusSkipped IssueStatus = "skipped"
|
||||
)
|
||||
|
||||
// Issue represents a GitHub issue in the queue.
|
||||
type Issue struct {
|
||||
ID string `json:"id"`
|
||||
Number int `json:"number"`
|
||||
Repo string `json:"repo"`
|
||||
Title string `json:"title"`
|
||||
Body string `json:"body"`
|
||||
URL string `json:"url"`
|
||||
Labels []string `json:"labels"`
|
||||
Author string `json:"author"`
|
||||
CreatedAt time.Time `json:"createdAt"`
|
||||
Priority int `json:"priority"`
|
||||
Status IssueStatus `json:"status"`
|
||||
ClaimedAt time.Time `json:"claimedAt,omitempty"`
|
||||
Context *IssueContext `json:"context,omitempty"`
|
||||
Comments []Comment `json:"comments,omitempty"`
|
||||
index int // For heap interface
|
||||
}
|
||||
|
||||
// Comment represents a comment on an issue.
|
||||
type Comment struct {
|
||||
Author string `json:"author"`
|
||||
Body string `json:"body"`
|
||||
}
|
||||
|
||||
// IssueContext contains AI-prepared context for an issue.
|
||||
type IssueContext struct {
|
||||
Summary string `json:"summary"`
|
||||
RelevantFiles []string `json:"relevantFiles"`
|
||||
SuggestedFix string `json:"suggestedFix"`
|
||||
RelatedIssues []string `json:"relatedIssues"`
|
||||
Complexity string `json:"complexity"`
|
||||
EstimatedTime string `json:"estimatedTime"`
|
||||
PreparedAt time.Time `json:"preparedAt"`
|
||||
}
|
||||
|
||||
// QueueService manages the priority queue of issues.
|
||||
type QueueService struct {
|
||||
config *ConfigService
|
||||
issues issueHeap
|
||||
seen map[string]bool
|
||||
current *Issue
|
||||
mu sync.RWMutex
|
||||
}
|
||||
|
||||
// issueHeap implements heap.Interface for priority queue.
|
||||
type issueHeap []*Issue
|
||||
|
||||
func (h issueHeap) Len() int { return len(h) }
|
||||
func (h issueHeap) Less(i, j int) bool { return h[i].Priority > h[j].Priority } // Higher priority first
|
||||
func (h issueHeap) Swap(i, j int) {
|
||||
h[i], h[j] = h[j], h[i]
|
||||
h[i].index = i
|
||||
h[j].index = j
|
||||
}
|
||||
|
||||
func (h *issueHeap) Push(x any) {
|
||||
n := len(*h)
|
||||
item := x.(*Issue)
|
||||
item.index = n
|
||||
*h = append(*h, item)
|
||||
}
|
||||
|
||||
func (h *issueHeap) Pop() any {
|
||||
old := *h
|
||||
n := len(old)
|
||||
item := old[n-1]
|
||||
old[n-1] = nil
|
||||
item.index = -1
|
||||
*h = old[0 : n-1]
|
||||
return item
|
||||
}
|
||||
|
||||
// NewQueueService creates a new QueueService.
|
||||
func NewQueueService(config *ConfigService) *QueueService {
|
||||
q := &QueueService{
|
||||
config: config,
|
||||
issues: make(issueHeap, 0),
|
||||
seen: make(map[string]bool),
|
||||
}
|
||||
heap.Init(&q.issues)
|
||||
q.load() // Load persisted queue
|
||||
return q
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (q *QueueService) ServiceName() string {
|
||||
return "QueueService"
|
||||
}
|
||||
|
||||
// Add adds issues to the queue, deduplicating by ID.
|
||||
func (q *QueueService) Add(issues []*Issue) int {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
added := 0
|
||||
for _, issue := range issues {
|
||||
if q.seen[issue.ID] {
|
||||
continue
|
||||
}
|
||||
q.seen[issue.ID] = true
|
||||
issue.Status = StatusPending
|
||||
heap.Push(&q.issues, issue)
|
||||
added++
|
||||
}
|
||||
|
||||
if added > 0 {
|
||||
q.save()
|
||||
}
|
||||
return added
|
||||
}
|
||||
|
||||
// Size returns the number of issues in the queue.
|
||||
func (q *QueueService) Size() int {
|
||||
q.mu.RLock()
|
||||
defer q.mu.RUnlock()
|
||||
return len(q.issues)
|
||||
}
|
||||
|
||||
// CurrentIssue returns the issue currently being worked on.
|
||||
func (q *QueueService) CurrentIssue() *Issue {
|
||||
q.mu.RLock()
|
||||
defer q.mu.RUnlock()
|
||||
return q.current
|
||||
}
|
||||
|
||||
// Next claims and returns the next issue from the queue.
|
||||
func (q *QueueService) Next() *Issue {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
if len(q.issues) == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
// Pop the highest priority issue
|
||||
issue := heap.Pop(&q.issues).(*Issue)
|
||||
issue.Status = StatusClaimed
|
||||
issue.ClaimedAt = time.Now()
|
||||
q.current = issue
|
||||
q.save()
|
||||
return issue
|
||||
}
|
||||
|
||||
// Skip marks the current issue as skipped and moves to the next.
|
||||
func (q *QueueService) Skip() {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
if q.current != nil {
|
||||
q.current.Status = StatusSkipped
|
||||
q.current = nil
|
||||
q.save()
|
||||
}
|
||||
}
|
||||
|
||||
// Complete marks the current issue as completed.
|
||||
func (q *QueueService) Complete() {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
if q.current != nil {
|
||||
q.current.Status = StatusCompleted
|
||||
q.current = nil
|
||||
q.save()
|
||||
}
|
||||
}
|
||||
|
||||
// SetInProgress marks the current issue as in progress.
|
||||
func (q *QueueService) SetInProgress() {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
if q.current != nil {
|
||||
q.current.Status = StatusInProgress
|
||||
q.save()
|
||||
}
|
||||
}
|
||||
|
||||
// SetContext sets the AI-prepared context for the current issue.
|
||||
func (q *QueueService) SetContext(ctx *IssueContext) {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
if q.current != nil {
|
||||
q.current.Context = ctx
|
||||
q.save()
|
||||
}
|
||||
}
|
||||
|
||||
// GetPending returns all pending issues.
|
||||
func (q *QueueService) GetPending() []*Issue {
|
||||
q.mu.RLock()
|
||||
defer q.mu.RUnlock()
|
||||
|
||||
result := make([]*Issue, 0, len(q.issues))
|
||||
for _, issue := range q.issues {
|
||||
if issue.Status == StatusPending {
|
||||
result = append(result, issue)
|
||||
}
|
||||
}
|
||||
return result
|
||||
}
|
||||
|
||||
// Clear removes all issues from the queue.
|
||||
func (q *QueueService) Clear() {
|
||||
q.mu.Lock()
|
||||
defer q.mu.Unlock()
|
||||
|
||||
q.issues = make(issueHeap, 0)
|
||||
q.seen = make(map[string]bool)
|
||||
q.current = nil
|
||||
heap.Init(&q.issues)
|
||||
q.save()
|
||||
}
|
||||
|
||||
// queueState represents the persisted queue state.
|
||||
type queueState struct {
|
||||
Issues []*Issue `json:"issues"`
|
||||
Current *Issue `json:"current"`
|
||||
Seen []string `json:"seen"`
|
||||
}
|
||||
|
||||
// save persists the queue to disk.
|
||||
func (q *QueueService) save() {
|
||||
dataDir := q.config.GetDataDir()
|
||||
if dataDir == "" {
|
||||
return
|
||||
}
|
||||
|
||||
path := filepath.Join(dataDir, "queue.json")
|
||||
|
||||
seen := make([]string, 0, len(q.seen))
|
||||
for id := range q.seen {
|
||||
seen = append(seen, id)
|
||||
}
|
||||
|
||||
state := queueState{
|
||||
Issues: []*Issue(q.issues),
|
||||
Current: q.current,
|
||||
Seen: seen,
|
||||
}
|
||||
|
||||
data, err := json.MarshalIndent(state, "", " ")
|
||||
if err != nil {
|
||||
log.Printf("Failed to marshal queue: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
if err := os.WriteFile(path, data, 0644); err != nil {
|
||||
log.Printf("Failed to save queue: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
// load restores the queue from disk.
|
||||
func (q *QueueService) load() {
|
||||
dataDir := q.config.GetDataDir()
|
||||
if dataDir == "" {
|
||||
return
|
||||
}
|
||||
|
||||
path := filepath.Join(dataDir, "queue.json")
|
||||
data, err := os.ReadFile(path)
|
||||
if err != nil {
|
||||
if !os.IsNotExist(err) {
|
||||
log.Printf("Failed to read queue: %v", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
var state queueState
|
||||
if err := json.Unmarshal(data, &state); err != nil {
|
||||
log.Printf("Failed to unmarshal queue: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
q.issues = state.Issues
|
||||
heap.Init(&q.issues)
|
||||
q.current = state.Current
|
||||
q.seen = make(map[string]bool)
|
||||
for _, id := range state.Seen {
|
||||
q.seen[id] = true
|
||||
}
|
||||
}
|
||||
internal/bugseti/seeder.go (new file, +272)

@@ -0,0 +1,272 @@
|
|||
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"log"
|
||||
"os"
|
||||
"os/exec"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
// SeederService prepares context for issues using the seed-agent-developer skill.
|
||||
type SeederService struct {
|
||||
config *ConfigService
|
||||
}
|
||||
|
||||
// NewSeederService creates a new SeederService.
|
||||
func NewSeederService(config *ConfigService) *SeederService {
|
||||
return &SeederService{
|
||||
config: config,
|
||||
}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (s *SeederService) ServiceName() string {
|
||||
return "SeederService"
|
||||
}
|
||||
|
||||
// SeedIssue prepares context for an issue by calling the seed-agent-developer skill.
|
||||
func (s *SeederService) SeedIssue(issue *Issue) (*IssueContext, error) {
|
||||
if issue == nil {
|
||||
return nil, fmt.Errorf("issue is nil")
|
||||
}
|
||||
|
||||
// Create a temporary workspace for the issue
|
||||
workDir, err := s.prepareWorkspace(issue)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to prepare workspace: %w", err)
|
||||
}
|
||||
|
||||
// Try to use the seed-agent-developer skill via plugin system
|
||||
ctx, err := s.runSeedSkill(issue, workDir)
|
||||
if err != nil {
|
||||
log.Printf("Seed skill failed, using fallback: %v", err)
|
||||
// Fallback to basic context preparation
|
||||
ctx = s.prepareBasicContext(issue)
|
||||
}
|
||||
|
||||
ctx.PreparedAt = time.Now()
|
||||
return ctx, nil
|
||||
}
|
||||
|
||||
// prepareWorkspace creates a temporary workspace and clones the repo.
|
||||
func (s *SeederService) prepareWorkspace(issue *Issue) (string, error) {
|
||||
// Create workspace directory
|
||||
baseDir := s.config.GetWorkspaceDir()
|
||||
if baseDir == "" {
|
||||
baseDir = filepath.Join(os.TempDir(), "bugseti")
|
||||
}
|
||||
|
||||
// Create issue-specific directory
|
||||
workDir := filepath.Join(baseDir, sanitizeRepoName(issue.Repo), fmt.Sprintf("issue-%d", issue.Number))
|
||||
if err := os.MkdirAll(workDir, 0755); err != nil {
|
||||
return "", fmt.Errorf("failed to create workspace: %w", err)
|
||||
}
|
||||
|
||||
// Check if repo already cloned
|
||||
if _, err := os.Stat(filepath.Join(workDir, ".git")); os.IsNotExist(err) {
|
||||
// Clone the repository
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
cmd := exec.CommandContext(ctx, "gh", "repo", "clone", issue.Repo, workDir, "--", "--depth=1")
|
||||
var stderr bytes.Buffer
|
||||
cmd.Stderr = &stderr
|
||||
if err := cmd.Run(); err != nil {
|
||||
return "", fmt.Errorf("failed to clone repo: %s: %w", stderr.String(), err)
|
||||
}
|
||||
}
|
||||
|
||||
return workDir, nil
|
||||
}
|
||||
|
||||
// runSeedSkill executes the seed-agent-developer skill to prepare context.
|
||||
func (s *SeederService) runSeedSkill(issue *Issue, workDir string) (*IssueContext, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
// Look for the plugin script
|
||||
pluginPaths := []string{
|
||||
"/home/shared/hostuk/claude-plugins/agentic-flows/skills/seed-agent-developer/scripts/analyze-issue.sh",
|
||||
filepath.Join(os.Getenv("HOME"), ".claude/plugins/agentic-flows/skills/seed-agent-developer/scripts/analyze-issue.sh"),
|
||||
}
|
||||
|
||||
var scriptPath string
|
||||
for _, p := range pluginPaths {
|
||||
if _, err := os.Stat(p); err == nil {
|
||||
scriptPath = p
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
if scriptPath == "" {
|
||||
return nil, fmt.Errorf("seed-agent-developer skill not found")
|
||||
}
|
||||
|
||||
// Run the analyze-issue script
|
||||
cmd := exec.CommandContext(ctx, "bash", scriptPath)
|
||||
cmd.Dir = workDir
|
||||
cmd.Env = append(os.Environ(),
|
||||
fmt.Sprintf("ISSUE_NUMBER=%d", issue.Number),
|
||||
fmt.Sprintf("ISSUE_REPO=%s", issue.Repo),
|
||||
fmt.Sprintf("ISSUE_TITLE=%s", issue.Title),
|
||||
fmt.Sprintf("ISSUE_URL=%s", issue.URL),
|
||||
)
|
||||
|
||||
var stdout, stderr bytes.Buffer
|
||||
cmd.Stdout = &stdout
|
||||
cmd.Stderr = &stderr
|
||||
|
||||
if err := cmd.Run(); err != nil {
|
||||
return nil, fmt.Errorf("seed skill failed: %s: %w", stderr.String(), err)
|
||||
}
|
||||
|
||||
// Parse the output as JSON
|
||||
var result struct {
|
||||
Summary string `json:"summary"`
|
||||
RelevantFiles []string `json:"relevant_files"`
|
||||
SuggestedFix string `json:"suggested_fix"`
|
||||
RelatedIssues []string `json:"related_issues"`
|
||||
Complexity string `json:"complexity"`
|
||||
EstimatedTime string `json:"estimated_time"`
|
||||
}
|
||||
|
||||
if err := json.Unmarshal(stdout.Bytes(), &result); err != nil {
|
||||
// If not JSON, treat as plain text summary
|
||||
return &IssueContext{
|
||||
Summary: stdout.String(),
|
||||
Complexity: "unknown",
|
||||
}, nil
|
||||
}
|
||||
|
||||
return &IssueContext{
|
||||
Summary: result.Summary,
|
||||
RelevantFiles: result.RelevantFiles,
|
||||
SuggestedFix: result.SuggestedFix,
|
||||
RelatedIssues: result.RelatedIssues,
|
||||
Complexity: result.Complexity,
|
||||
EstimatedTime: result.EstimatedTime,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// prepareBasicContext creates a basic context without the seed skill.
|
||||
func (s *SeederService) prepareBasicContext(issue *Issue) *IssueContext {
|
||||
// Extract potential file references from issue body
|
||||
files := extractFileReferences(issue.Body)
|
||||
|
||||
// Estimate complexity based on labels and body length
|
||||
complexity := estimateComplexity(issue)
|
||||
|
||||
return &IssueContext{
|
||||
Summary: fmt.Sprintf("Issue #%d in %s: %s", issue.Number, issue.Repo, issue.Title),
|
||||
RelevantFiles: files,
|
||||
Complexity: complexity,
|
||||
EstimatedTime: estimateTime(complexity),
|
||||
}
|
||||
}
|
||||
|
||||
// sanitizeRepoName converts owner/repo to a safe directory name.
|
||||
func sanitizeRepoName(repo string) string {
|
||||
return strings.ReplaceAll(repo, "/", "-")
|
||||
}
|
||||
|
||||
// extractFileReferences finds file paths mentioned in text.
|
||||
func extractFileReferences(text string) []string {
|
||||
var files []string
|
||||
seen := make(map[string]bool)
|
||||
|
||||
// Common file patterns
|
||||
patterns := []string{
|
||||
`.go`, `.js`, `.ts`, `.py`, `.rs`, `.java`, `.cpp`, `.c`, `.h`,
|
||||
`.json`, `.yaml`, `.yml`, `.toml`, `.xml`, `.md`,
|
||||
}
|
||||
|
||||
words := strings.Fields(text)
|
||||
for _, word := range words {
|
||||
// Clean up the word
|
||||
word = strings.Trim(word, "`,\"'()[]{}:")
|
||||
|
||||
// Check if it looks like a file path
|
||||
for _, ext := range patterns {
|
||||
if strings.HasSuffix(word, ext) && !seen[word] {
|
||||
files = append(files, word)
|
||||
seen[word] = true
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
return files
|
||||
}
|
||||
|
||||
// estimateComplexity guesses issue complexity from content.
|
||||
func estimateComplexity(issue *Issue) string {
|
||||
bodyLen := len(issue.Body)
|
||||
labelScore := 0
|
||||
|
||||
for _, label := range issue.Labels {
|
||||
lower := strings.ToLower(label)
|
||||
switch {
|
||||
case strings.Contains(lower, "good first issue"), strings.Contains(lower, "beginner"):
|
||||
labelScore -= 2
|
||||
case strings.Contains(lower, "easy"):
|
||||
labelScore -= 1
|
||||
case strings.Contains(lower, "complex"), strings.Contains(lower, "hard"):
|
||||
labelScore += 2
|
||||
case strings.Contains(lower, "refactor"):
|
||||
labelScore += 1
|
||||
}
|
||||
}
|
||||
|
||||
// Combine body length and label score
|
||||
score := labelScore
|
||||
if bodyLen > 2000 {
|
||||
score += 2
|
||||
} else if bodyLen > 500 {
|
||||
score += 1
|
||||
}
|
||||
|
||||
switch {
|
||||
case score <= -1:
|
||||
return "easy"
|
||||
case score <= 1:
|
||||
return "medium"
|
||||
default:
|
||||
return "hard"
|
||||
}
|
||||
}
|
||||
|
||||
// estimateTime suggests time based on complexity.
|
||||
func estimateTime(complexity string) string {
|
||||
switch complexity {
|
||||
case "easy":
|
||||
return "15-30 minutes"
|
||||
case "medium":
|
||||
return "1-2 hours"
|
||||
case "hard":
|
||||
return "2-4 hours"
|
||||
default:
|
||||
return "unknown"
|
||||
}
|
||||
}
|
||||
|
||||
// GetWorkspaceDir returns the workspace directory for an issue.
|
||||
func (s *SeederService) GetWorkspaceDir(issue *Issue) string {
|
||||
baseDir := s.config.GetWorkspaceDir()
|
||||
if baseDir == "" {
|
||||
baseDir = filepath.Join(os.TempDir(), "bugseti")
|
||||
}
|
||||
return filepath.Join(baseDir, sanitizeRepoName(issue.Repo), fmt.Sprintf("issue-%d", issue.Number))
|
||||
}
|
||||
|
||||
// CleanupWorkspace removes the workspace for an issue.
|
||||
func (s *SeederService) CleanupWorkspace(issue *Issue) error {
|
||||
workDir := s.GetWorkspaceDir(issue)
|
||||
return os.RemoveAll(workDir)
|
||||
}
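
// exampleFallbackContext is an illustrative sketch (not wired into the app) of how
// the fallback heuristics above fit together; the issue values are hypothetical.
func exampleFallbackContext() {
	issue := &Issue{
		Number: 42,
		Repo:   "owner/repo",
		Title:  "Fix panic in parser",
		Body:   "The crash happens in parser.go when config.yaml is empty.",
		Labels: []string{"good first issue"},
	}

	files := extractFileReferences(issue.Body) // ["parser.go", "config.yaml"]
	complexity := estimateComplexity(issue)    // "easy": the label outweighs the short body
	fmt.Printf("files=%v complexity=%s estimate=%s\n", files, complexity, estimateTime(complexity))
}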
|
||||
new file: internal/bugseti/stats.go (359 lines)
@@ -0,0 +1,359 @@
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"encoding/json"
|
||||
"log"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"sync"
|
||||
"time"
|
||||
)
|
||||
|
||||
// StatsService tracks user contribution statistics.
|
||||
type StatsService struct {
|
||||
config *ConfigService
|
||||
stats *Stats
|
||||
mu sync.RWMutex
|
||||
}
|
||||
|
||||
// Stats contains all tracked statistics.
|
||||
type Stats struct {
|
||||
// Issue stats
|
||||
IssuesAttempted int `json:"issuesAttempted"`
|
||||
IssuesCompleted int `json:"issuesCompleted"`
|
||||
IssuesSkipped int `json:"issuesSkipped"`
|
||||
|
||||
// PR stats
|
||||
PRsSubmitted int `json:"prsSubmitted"`
|
||||
PRsMerged int `json:"prsMerged"`
|
||||
PRsRejected int `json:"prsRejected"`
|
||||
|
||||
// Repository stats
|
||||
ReposContributed map[string]*RepoStats `json:"reposContributed"`
|
||||
|
||||
// Streaks
|
||||
CurrentStreak int `json:"currentStreak"`
|
||||
LongestStreak int `json:"longestStreak"`
|
||||
LastActivity time.Time `json:"lastActivity"`
|
||||
|
||||
// Time tracking
|
||||
TotalTimeSpent time.Duration `json:"totalTimeSpent"`
|
||||
AverageTimePerPR time.Duration `json:"averageTimePerPR"`
|
||||
|
||||
// Activity history (last 30 days)
|
||||
DailyActivity map[string]*DayStats `json:"dailyActivity"`
|
||||
}
|
||||
|
||||
// RepoStats contains statistics for a single repository.
|
||||
type RepoStats struct {
|
||||
Name string `json:"name"`
|
||||
IssuesFixed int `json:"issuesFixed"`
|
||||
PRsSubmitted int `json:"prsSubmitted"`
|
||||
PRsMerged int `json:"prsMerged"`
|
||||
FirstContrib time.Time `json:"firstContrib"`
|
||||
LastContrib time.Time `json:"lastContrib"`
|
||||
}
|
||||
|
||||
// DayStats contains statistics for a single day.
|
||||
type DayStats struct {
|
||||
Date string `json:"date"`
|
||||
IssuesWorked int `json:"issuesWorked"`
|
||||
PRsSubmitted int `json:"prsSubmitted"`
|
||||
TimeSpent int `json:"timeSpentMinutes"`
|
||||
}
|
||||
|
||||
// NewStatsService creates a new StatsService.
|
||||
func NewStatsService(config *ConfigService) *StatsService {
|
||||
s := &StatsService{
|
||||
config: config,
|
||||
stats: &Stats{
|
||||
ReposContributed: make(map[string]*RepoStats),
|
||||
DailyActivity: make(map[string]*DayStats),
|
||||
},
|
||||
}
|
||||
s.load()
|
||||
return s
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (s *StatsService) ServiceName() string {
|
||||
return "StatsService"
|
||||
}
|
||||
|
||||
// GetStats returns a copy of the current statistics.
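// Note: the returned value is a shallow copy; the ReposContributed and DailyActivity
// maps still point at the service's internal data and should be treated as read-only.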
|
||||
func (s *StatsService) GetStats() Stats {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return *s.stats
|
||||
}
|
||||
|
||||
// RecordIssueAttempted records that an issue was started.
|
||||
func (s *StatsService) RecordIssueAttempted(repo string) {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.IssuesAttempted++
|
||||
s.ensureRepo(repo)
|
||||
s.updateStreak()
|
||||
s.updateDailyActivity("issue")
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordIssueCompleted records that an issue was completed.
|
||||
func (s *StatsService) RecordIssueCompleted(repo string) {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.IssuesCompleted++
|
||||
if rs, ok := s.stats.ReposContributed[repo]; ok {
|
||||
rs.IssuesFixed++
|
||||
rs.LastContrib = time.Now()
|
||||
}
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordIssueSkipped records that an issue was skipped.
|
||||
func (s *StatsService) RecordIssueSkipped() {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.IssuesSkipped++
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordPRSubmitted records that a PR was submitted.
|
||||
func (s *StatsService) RecordPRSubmitted(repo string) {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.PRsSubmitted++
|
||||
if rs, ok := s.stats.ReposContributed[repo]; ok {
|
||||
rs.PRsSubmitted++
|
||||
rs.LastContrib = time.Now()
|
||||
}
|
||||
s.updateDailyActivity("pr")
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordPRMerged records that a PR was merged.
|
||||
func (s *StatsService) RecordPRMerged(repo string) {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.PRsMerged++
|
||||
if rs, ok := s.stats.ReposContributed[repo]; ok {
|
||||
rs.PRsMerged++
|
||||
}
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordPRRejected records that a PR was rejected.
|
||||
func (s *StatsService) RecordPRRejected() {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.PRsRejected++
|
||||
s.save()
|
||||
}
|
||||
|
||||
// RecordTimeSpent adds time spent on an issue.
|
||||
func (s *StatsService) RecordTimeSpent(duration time.Duration) {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats.TotalTimeSpent += duration
|
||||
|
||||
// Recalculate average
|
||||
if s.stats.PRsSubmitted > 0 {
|
||||
s.stats.AverageTimePerPR = s.stats.TotalTimeSpent / time.Duration(s.stats.PRsSubmitted)
|
||||
}
|
||||
|
||||
// Update daily activity
|
||||
today := time.Now().Format("2006-01-02")
|
||||
if day, ok := s.stats.DailyActivity[today]; ok {
|
||||
day.TimeSpent += int(duration.Minutes())
|
||||
}
|
||||
|
||||
s.save()
|
||||
}
|
||||
|
||||
// GetRepoStats returns statistics for a specific repository.
|
||||
func (s *StatsService) GetRepoStats(repo string) *RepoStats {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return s.stats.ReposContributed[repo]
|
||||
}
|
||||
|
||||
// GetTopRepos returns the top N repositories by contributions.
|
||||
func (s *StatsService) GetTopRepos(n int) []*RepoStats {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
|
||||
repos := make([]*RepoStats, 0, len(s.stats.ReposContributed))
|
||||
for _, rs := range s.stats.ReposContributed {
|
||||
repos = append(repos, rs)
|
||||
}
|
||||
|
||||
// Sort by PRs merged (descending)
|
||||
for i := 0; i < len(repos)-1; i++ {
|
||||
for j := i + 1; j < len(repos); j++ {
|
||||
if repos[j].PRsMerged > repos[i].PRsMerged {
|
||||
repos[i], repos[j] = repos[j], repos[i]
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if n > len(repos) {
|
||||
n = len(repos)
|
||||
}
|
||||
return repos[:n]
|
||||
}
|
||||
|
||||
// GetActivityHistory returns the activity for the last N days.
|
||||
func (s *StatsService) GetActivityHistory(days int) []*DayStats {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
|
||||
result := make([]*DayStats, 0, days)
|
||||
now := time.Now()
|
||||
|
||||
for i := 0; i < days; i++ {
|
||||
date := now.AddDate(0, 0, -i).Format("2006-01-02")
|
||||
if day, ok := s.stats.DailyActivity[date]; ok {
|
||||
result = append(result, day)
|
||||
} else {
|
||||
result = append(result, &DayStats{Date: date})
|
||||
}
|
||||
}
|
||||
|
||||
return result
|
||||
}
|
||||
|
||||
// ensureRepo creates a repo stats entry if it doesn't exist.
|
||||
func (s *StatsService) ensureRepo(repo string) {
|
||||
if _, ok := s.stats.ReposContributed[repo]; !ok {
|
||||
s.stats.ReposContributed[repo] = &RepoStats{
|
||||
Name: repo,
|
||||
FirstContrib: time.Now(),
|
||||
LastContrib: time.Now(),
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// updateStreak updates the contribution streak.
|
||||
func (s *StatsService) updateStreak() {
|
||||
now := time.Now()
|
||||
lastActivity := s.stats.LastActivity
|
||||
|
||||
if lastActivity.IsZero() {
|
||||
s.stats.CurrentStreak = 1
|
||||
} else {
|
||||
daysSince := int(now.Sub(lastActivity).Hours() / 24)
|
||||
if daysSince <= 1 {
|
||||
// Same day or next day
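			// A repeat contribution on the same calendar day leaves the streak unchanged;
			// it only increments once activity lands on a new day.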
|
||||
if daysSince == 1 || now.Day() != lastActivity.Day() {
|
||||
s.stats.CurrentStreak++
|
||||
}
|
||||
} else {
|
||||
// Streak broken
|
||||
s.stats.CurrentStreak = 1
|
||||
}
|
||||
}
|
||||
|
||||
if s.stats.CurrentStreak > s.stats.LongestStreak {
|
||||
s.stats.LongestStreak = s.stats.CurrentStreak
|
||||
}
|
||||
|
||||
s.stats.LastActivity = now
|
||||
}
|
||||
|
||||
// updateDailyActivity updates today's activity.
|
||||
func (s *StatsService) updateDailyActivity(activityType string) {
|
||||
today := time.Now().Format("2006-01-02")
|
||||
|
||||
if _, ok := s.stats.DailyActivity[today]; !ok {
|
||||
s.stats.DailyActivity[today] = &DayStats{Date: today}
|
||||
}
|
||||
|
||||
day := s.stats.DailyActivity[today]
|
||||
switch activityType {
|
||||
case "issue":
|
||||
day.IssuesWorked++
|
||||
case "pr":
|
||||
day.PRsSubmitted++
|
||||
}
|
||||
|
||||
// Clean up old entries (keep last 90 days)
|
||||
cutoff := time.Now().AddDate(0, 0, -90).Format("2006-01-02")
|
||||
for date := range s.stats.DailyActivity {
|
||||
if date < cutoff {
|
||||
delete(s.stats.DailyActivity, date)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// save persists stats to disk.
|
||||
func (s *StatsService) save() {
|
||||
dataDir := s.config.GetDataDir()
|
||||
if dataDir == "" {
|
||||
return
|
||||
}
|
||||
|
||||
path := filepath.Join(dataDir, "stats.json")
|
||||
data, err := json.MarshalIndent(s.stats, "", " ")
|
||||
if err != nil {
|
||||
log.Printf("Failed to marshal stats: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
if err := os.WriteFile(path, data, 0644); err != nil {
|
||||
log.Printf("Failed to save stats: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
// load restores stats from disk.
|
||||
func (s *StatsService) load() {
|
||||
dataDir := s.config.GetDataDir()
|
||||
if dataDir == "" {
|
||||
return
|
||||
}
|
||||
|
||||
path := filepath.Join(dataDir, "stats.json")
|
||||
data, err := os.ReadFile(path)
|
||||
if err != nil {
|
||||
if !os.IsNotExist(err) {
|
||||
log.Printf("Failed to read stats: %v", err)
|
||||
}
|
||||
return
|
||||
}
|
||||
|
||||
var stats Stats
|
||||
if err := json.Unmarshal(data, &stats); err != nil {
|
||||
log.Printf("Failed to unmarshal stats: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
// Ensure maps are initialized
|
||||
if stats.ReposContributed == nil {
|
||||
stats.ReposContributed = make(map[string]*RepoStats)
|
||||
}
|
||||
if stats.DailyActivity == nil {
|
||||
stats.DailyActivity = make(map[string]*DayStats)
|
||||
}
|
||||
|
||||
s.stats = &stats
|
||||
}
|
||||
|
||||
// Reset clears all statistics.
|
||||
func (s *StatsService) Reset() error {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.stats = &Stats{
|
||||
ReposContributed: make(map[string]*RepoStats),
|
||||
DailyActivity: make(map[string]*DayStats),
|
||||
}
|
||||
s.save()
|
||||
return nil
|
||||
}
|
||||
new file: internal/bugseti/submit.go (405 lines)
@@ -0,0 +1,405 @@
// Package bugseti provides services for the BugSETI distributed bug fixing application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"bytes"
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"log"
|
||||
"os/exec"
|
||||
"path/filepath"
|
||||
"strings"
|
||||
"time"
|
||||
)
|
||||
|
||||
// SubmitService handles the PR submission flow.
|
||||
type SubmitService struct {
|
||||
config *ConfigService
|
||||
notify *NotifyService
|
||||
stats *StatsService
|
||||
}
|
||||
|
||||
// NewSubmitService creates a new SubmitService.
|
||||
func NewSubmitService(config *ConfigService, notify *NotifyService, stats *StatsService) *SubmitService {
|
||||
return &SubmitService{
|
||||
config: config,
|
||||
notify: notify,
|
||||
stats: stats,
|
||||
}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (s *SubmitService) ServiceName() string {
|
||||
return "SubmitService"
|
||||
}
|
||||
|
||||
// PRSubmission contains the data for a pull request submission.
|
||||
type PRSubmission struct {
|
||||
Issue *Issue `json:"issue"`
|
||||
Title string `json:"title"`
|
||||
Body string `json:"body"`
|
||||
Branch string `json:"branch"`
|
||||
CommitMsg string `json:"commitMsg"`
|
||||
Files []string `json:"files"`
|
||||
WorkDir string `json:"workDir"`
|
||||
}
|
||||
|
||||
// PRResult contains the result of a PR submission.
|
||||
type PRResult struct {
|
||||
Success bool `json:"success"`
|
||||
PRURL string `json:"prUrl,omitempty"`
|
||||
PRNumber int `json:"prNumber,omitempty"`
|
||||
Error string `json:"error,omitempty"`
|
||||
ForkOwner string `json:"forkOwner,omitempty"`
|
||||
}
|
||||
|
||||
// Submit creates a pull request for the given issue.
|
||||
// Flow: Fork -> Branch -> Commit -> PR
|
||||
func (s *SubmitService) Submit(submission *PRSubmission) (*PRResult, error) {
|
||||
if submission == nil || submission.Issue == nil {
|
||||
return nil, fmt.Errorf("invalid submission")
|
||||
}
|
||||
|
||||
issue := submission.Issue
|
||||
workDir := submission.WorkDir
|
||||
if workDir == "" {
|
||||
return nil, fmt.Errorf("work directory not specified")
|
||||
}
|
||||
|
||||
// Step 1: Ensure we have a fork
|
||||
forkOwner, err := s.ensureFork(issue.Repo)
|
||||
if err != nil {
|
||||
return &PRResult{Success: false, Error: fmt.Sprintf("fork failed: %v", err)}, err
|
||||
}
|
||||
|
||||
// Step 2: Create branch
|
||||
branch := submission.Branch
|
||||
if branch == "" {
|
||||
branch = fmt.Sprintf("bugseti/issue-%d", issue.Number)
|
||||
}
|
||||
if err := s.createBranch(workDir, branch); err != nil {
|
||||
return &PRResult{Success: false, Error: fmt.Sprintf("branch creation failed: %v", err)}, err
|
||||
}
|
||||
|
||||
// Step 3: Stage and commit changes
|
||||
commitMsg := submission.CommitMsg
|
||||
if commitMsg == "" {
|
||||
commitMsg = fmt.Sprintf("fix: resolve issue #%d\n\n%s\n\nFixes #%d", issue.Number, issue.Title, issue.Number)
|
||||
}
|
||||
if err := s.commitChanges(workDir, submission.Files, commitMsg); err != nil {
|
||||
return &PRResult{Success: false, Error: fmt.Sprintf("commit failed: %v", err)}, err
|
||||
}
|
||||
|
||||
// Step 4: Push to fork
|
||||
if err := s.pushToFork(workDir, forkOwner, branch); err != nil {
|
||||
return &PRResult{Success: false, Error: fmt.Sprintf("push failed: %v", err)}, err
|
||||
}
|
||||
|
||||
// Step 5: Create PR
|
||||
prTitle := submission.Title
|
||||
if prTitle == "" {
|
||||
prTitle = fmt.Sprintf("Fix #%d: %s", issue.Number, issue.Title)
|
||||
}
|
||||
prBody := submission.Body
|
||||
if prBody == "" {
|
||||
prBody = s.generatePRBody(issue)
|
||||
}
|
||||
|
||||
prURL, prNumber, err := s.createPR(issue.Repo, forkOwner, branch, prTitle, prBody)
|
||||
if err != nil {
|
||||
return &PRResult{Success: false, Error: fmt.Sprintf("PR creation failed: %v", err)}, err
|
||||
}
|
||||
|
||||
// Update stats
|
||||
s.stats.RecordPRSubmitted(issue.Repo)
|
||||
|
||||
// Notify user
|
||||
s.notify.Notify("BugSETI", fmt.Sprintf("PR #%d submitted for issue #%d", prNumber, issue.Number))
|
||||
|
||||
return &PRResult{
|
||||
Success: true,
|
||||
PRURL: prURL,
|
||||
PRNumber: prNumber,
|
||||
ForkOwner: forkOwner,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// ensureFork ensures a fork exists for the repo.
|
||||
func (s *SubmitService) ensureFork(repo string) (string, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
// Check if fork exists
|
||||
parts := strings.Split(repo, "/")
|
||||
if len(parts) != 2 {
|
||||
return "", fmt.Errorf("invalid repo format: %s", repo)
|
||||
}
|
||||
|
||||
// Get current user
|
||||
cmd := exec.CommandContext(ctx, "gh", "api", "user", "--jq", ".login")
|
||||
output, err := cmd.Output()
|
||||
if err != nil {
|
||||
return "", fmt.Errorf("failed to get user: %w", err)
|
||||
}
|
||||
username := strings.TrimSpace(string(output))
|
||||
|
||||
// Check if fork exists
|
||||
forkRepo := fmt.Sprintf("%s/%s", username, parts[1])
|
||||
cmd = exec.CommandContext(ctx, "gh", "repo", "view", forkRepo, "--json", "name")
|
||||
if err := cmd.Run(); err != nil {
|
||||
// Fork doesn't exist, create it
|
||||
log.Printf("Creating fork of %s...", repo)
|
||||
cmd = exec.CommandContext(ctx, "gh", "repo", "fork", repo, "--clone=false")
|
||||
if err := cmd.Run(); err != nil {
|
||||
return "", fmt.Errorf("failed to create fork: %w", err)
|
||||
}
|
||||
// Wait a bit for GitHub to process
|
||||
time.Sleep(2 * time.Second)
|
||||
}
|
||||
|
||||
return username, nil
|
||||
}
|
||||
|
||||
// createBranch creates a new branch in the repository.
|
||||
func (s *SubmitService) createBranch(workDir, branch string) error {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
|
||||
defer cancel()
|
||||
|
||||
// Fetch latest from upstream
|
||||
cmd := exec.CommandContext(ctx, "git", "fetch", "origin")
|
||||
cmd.Dir = workDir
|
||||
cmd.Run() // Ignore errors
|
||||
|
||||
// Create and checkout new branch
|
||||
cmd = exec.CommandContext(ctx, "git", "checkout", "-b", branch)
|
||||
cmd.Dir = workDir
|
||||
var stderr bytes.Buffer
|
||||
cmd.Stderr = &stderr
|
||||
if err := cmd.Run(); err != nil {
|
||||
// Branch might already exist, try to checkout
|
||||
cmd = exec.CommandContext(ctx, "git", "checkout", branch)
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to create/checkout branch: %s: %w", stderr.String(), err)
|
||||
}
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// commitChanges stages and commits the specified files.
|
||||
func (s *SubmitService) commitChanges(workDir string, files []string, message string) error {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
|
||||
defer cancel()
|
||||
|
||||
// Stage files
|
||||
if len(files) == 0 {
|
||||
// Stage all changes
|
||||
cmd := exec.CommandContext(ctx, "git", "add", "-A")
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to stage changes: %w", err)
|
||||
}
|
||||
} else {
|
||||
// Stage specific files
|
||||
args := append([]string{"add"}, files...)
|
||||
cmd := exec.CommandContext(ctx, "git", args...)
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to stage files: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
// Check if there are changes to commit
|
||||
cmd := exec.CommandContext(ctx, "git", "diff", "--cached", "--quiet")
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err == nil {
|
||||
return fmt.Errorf("no changes to commit")
|
||||
}
|
||||
|
||||
// Commit
|
||||
cmd = exec.CommandContext(ctx, "git", "commit", "-m", message)
|
||||
cmd.Dir = workDir
|
||||
var stderr bytes.Buffer
|
||||
cmd.Stderr = &stderr
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to commit: %s: %w", stderr.String(), err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// pushToFork pushes the branch to the user's fork.
|
||||
func (s *SubmitService) pushToFork(workDir, forkOwner, branch string) error {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 2*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
// Add fork as remote if not exists
|
||||
forkRemote := "fork"
|
||||
cmd := exec.CommandContext(ctx, "git", "remote", "get-url", forkRemote)
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err != nil {
|
||||
// Get the origin URL and construct fork URL
|
||||
cmd = exec.CommandContext(ctx, "git", "remote", "get-url", "origin")
|
||||
cmd.Dir = workDir
|
||||
output, err := cmd.Output()
|
||||
if err != nil {
|
||||
return fmt.Errorf("failed to get origin URL: %w", err)
|
||||
}
|
||||
|
||||
originURL := strings.TrimSpace(string(output))
|
||||
// Replace original owner with fork owner
|
||||
var forkURL string
|
||||
if strings.HasPrefix(originURL, "https://") {
|
||||
// https://github.com/owner/repo.git
|
||||
parts := strings.Split(originURL, "/")
|
||||
if len(parts) >= 4 {
|
||||
parts[len(parts)-2] = forkOwner
|
||||
forkURL = strings.Join(parts, "/")
|
||||
}
|
||||
	} else {
		// git@github.com:owner/repo.git -> git@github.com:<forkOwner>/repo.git
		forkURL = fmt.Sprintf("git@github.com:%s/%s", forkOwner, filepath.Base(originURL))
	}
|
||||
|
||||
cmd = exec.CommandContext(ctx, "git", "remote", "add", forkRemote, forkURL)
|
||||
cmd.Dir = workDir
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to add fork remote: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
// Push to fork
|
||||
cmd = exec.CommandContext(ctx, "git", "push", "-u", forkRemote, branch)
|
||||
cmd.Dir = workDir
|
||||
var stderr bytes.Buffer
|
||||
cmd.Stderr = &stderr
|
||||
if err := cmd.Run(); err != nil {
|
||||
return fmt.Errorf("failed to push: %s: %w", stderr.String(), err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// createPR creates a pull request using GitHub CLI.
|
||||
func (s *SubmitService) createPR(repo, forkOwner, branch, title, body string) (string, int, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
|
||||
defer cancel()
|
||||
|
||||
	// Create the PR. Note: `gh pr create` has no --json flag; it prints the new
	// PR's URL to stdout, so the PR number is parsed from that URL instead.
	cmd := exec.CommandContext(ctx, "gh", "pr", "create",
		"--repo", repo,
		"--head", fmt.Sprintf("%s:%s", forkOwner, branch),
		"--title", title,
		"--body", body)

	var stdout, stderr bytes.Buffer
	cmd.Stdout = &stdout
	cmd.Stderr = &stderr

	if err := cmd.Run(); err != nil {
		return "", 0, fmt.Errorf("failed to create PR: %s: %w", stderr.String(), err)
	}

	// Output looks like https://github.com/owner/repo/pull/123
	prURL := strings.TrimSpace(stdout.String())
	var prNumber int
	if idx := strings.LastIndex(prURL, "/"); idx >= 0 && idx+1 < len(prURL) {
		fmt.Sscanf(prURL[idx+1:], "%d", &prNumber)
	}

	return prURL, prNumber, nil
|
||||
}
|
||||
|
||||
// generatePRBody creates a default PR body for an issue.
|
||||
func (s *SubmitService) generatePRBody(issue *Issue) string {
|
||||
var body strings.Builder
|
||||
|
||||
body.WriteString("## Summary\n\n")
|
||||
body.WriteString(fmt.Sprintf("This PR addresses issue #%d.\n\n", issue.Number))
|
||||
|
||||
if issue.Context != nil && issue.Context.Summary != "" {
|
||||
body.WriteString("## Context\n\n")
|
||||
body.WriteString(issue.Context.Summary)
|
||||
body.WriteString("\n\n")
|
||||
}
|
||||
|
||||
body.WriteString("## Changes\n\n")
|
||||
body.WriteString("<!-- Describe your changes here -->\n\n")
|
||||
|
||||
body.WriteString("## Testing\n\n")
|
||||
body.WriteString("<!-- Describe how you tested your changes -->\n\n")
|
||||
|
||||
body.WriteString("---\n\n")
|
||||
body.WriteString("*Submitted via [BugSETI](https://github.com/host-uk/core) - Distributed Bug Fixing*\n")
|
||||
|
||||
return body.String()
|
||||
}
|
||||
|
||||
// GetPRStatus checks the status of a submitted PR.
|
||||
func (s *SubmitService) GetPRStatus(repo string, prNumber int) (*PRStatus, error) {
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 15*time.Second)
|
||||
defer cancel()
|
||||
|
||||
cmd := exec.CommandContext(ctx, "gh", "pr", "view",
|
||||
"--repo", repo,
|
||||
fmt.Sprintf("%d", prNumber),
|
||||
"--json", "state,mergeable,reviews,statusCheckRollup")
|
||||
|
||||
output, err := cmd.Output()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get PR status: %w", err)
|
||||
}
|
||||
|
||||
var result struct {
|
||||
State string `json:"state"`
|
||||
Mergeable string `json:"mergeable"`
|
||||
StatusCheckRollup []struct {
|
||||
State string `json:"state"`
|
||||
} `json:"statusCheckRollup"`
|
||||
Reviews []struct {
|
||||
State string `json:"state"`
|
||||
} `json:"reviews"`
|
||||
}
|
||||
|
||||
if err := json.Unmarshal(output, &result); err != nil {
|
||||
return nil, fmt.Errorf("failed to parse PR status: %w", err)
|
||||
}
|
||||
|
||||
status := &PRStatus{
|
||||
State: result.State,
|
||||
Mergeable: result.Mergeable == "MERGEABLE",
|
||||
}
|
||||
|
||||
// Check CI status
|
||||
status.CIPassing = true
|
||||
for _, check := range result.StatusCheckRollup {
|
||||
if check.State != "SUCCESS" && check.State != "NEUTRAL" {
|
||||
status.CIPassing = false
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
// Check review status
|
||||
for _, review := range result.Reviews {
|
||||
if review.State == "APPROVED" {
|
||||
status.Approved = true
|
||||
break
|
||||
}
|
||||
}
|
||||
|
||||
return status, nil
|
||||
}
|
||||
|
||||
// PRStatus represents the current status of a PR.
|
||||
type PRStatus struct {
|
||||
State string `json:"state"`
|
||||
Mergeable bool `json:"mergeable"`
|
||||
CIPassing bool `json:"ciPassing"`
|
||||
Approved bool `json:"approved"`
|
||||
}
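
// exampleSubmit is an illustrative sketch (not called by the app) showing how the
// pieces above are intended to be used; the work directory and issue are hypothetical.
func exampleSubmit(svc *SubmitService, issue *Issue) {
	result, err := svc.Submit(&PRSubmission{
		Issue:   issue,
		WorkDir: "/tmp/bugseti/owner-repo/issue-42", // e.g. SeederService.GetWorkspaceDir(issue)
		// Branch, CommitMsg, Title and Body are optional; Submit fills in sensible defaults.
	})
	if err != nil {
		log.Printf("submission failed: %v", err)
		return
	}
	log.Printf("opened %s (PR #%d) via fork %s", result.PRURL, result.PRNumber, result.ForkOwner)
}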
|
||||
new file: internal/bugseti/updater/channels.go (176 lines)
@@ -0,0 +1,176 @@
// Package updater provides auto-update functionality for BugSETI.
|
||||
package updater
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"regexp"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// Channel represents an update channel.
|
||||
type Channel string
|
||||
|
||||
const (
|
||||
// ChannelStable is the production release channel.
|
||||
// Tags: bugseti-vX.Y.Z (e.g., bugseti-v1.0.0)
|
||||
ChannelStable Channel = "stable"
|
||||
|
||||
// ChannelBeta is the pre-release testing channel.
|
||||
// Tags: bugseti-vX.Y.Z-beta.N (e.g., bugseti-v1.0.0-beta.1)
|
||||
ChannelBeta Channel = "beta"
|
||||
|
||||
// ChannelNightly is the latest development builds channel.
|
||||
// Tags: bugseti-nightly-YYYYMMDD (e.g., bugseti-nightly-20260205)
|
||||
ChannelNightly Channel = "nightly"
|
||||
)
|
||||
|
||||
// String returns the string representation of the channel.
|
||||
func (c Channel) String() string {
|
||||
return string(c)
|
||||
}
|
||||
|
||||
// DisplayName returns a human-readable name for the channel.
|
||||
func (c Channel) DisplayName() string {
|
||||
switch c {
|
||||
case ChannelStable:
|
||||
return "Stable"
|
||||
case ChannelBeta:
|
||||
return "Beta"
|
||||
case ChannelNightly:
|
||||
return "Nightly"
|
||||
default:
|
||||
return "Unknown"
|
||||
}
|
||||
}
|
||||
|
||||
// Description returns a description of the channel.
|
||||
func (c Channel) Description() string {
|
||||
switch c {
|
||||
case ChannelStable:
|
||||
return "Production releases - most stable, recommended for most users"
|
||||
case ChannelBeta:
|
||||
return "Pre-release builds - new features being tested before stable release"
|
||||
case ChannelNightly:
|
||||
return "Latest development builds - bleeding edge, may be unstable"
|
||||
default:
|
||||
return "Unknown channel"
|
||||
}
|
||||
}
|
||||
|
||||
// TagPrefix returns the tag prefix used for this channel.
|
||||
func (c Channel) TagPrefix() string {
|
||||
switch c {
|
||||
case ChannelStable:
|
||||
return "bugseti-v"
|
||||
case ChannelBeta:
|
||||
return "bugseti-v"
|
||||
case ChannelNightly:
|
||||
return "bugseti-nightly-"
|
||||
default:
|
||||
return ""
|
||||
}
|
||||
}
|
||||
|
||||
// TagPattern returns a regex pattern to match tags for this channel.
|
||||
func (c Channel) TagPattern() *regexp.Regexp {
|
||||
switch c {
|
||||
case ChannelStable:
|
||||
// Match bugseti-vX.Y.Z but NOT bugseti-vX.Y.Z-beta.N
|
||||
return regexp.MustCompile(`^bugseti-v(\d+\.\d+\.\d+)$`)
|
||||
case ChannelBeta:
|
||||
// Match bugseti-vX.Y.Z-beta.N
|
||||
return regexp.MustCompile(`^bugseti-v(\d+\.\d+\.\d+-beta\.\d+)$`)
|
||||
case ChannelNightly:
|
||||
// Match bugseti-nightly-YYYYMMDD
|
||||
return regexp.MustCompile(`^bugseti-nightly-(\d{8})$`)
|
||||
default:
|
||||
return nil
|
||||
}
|
||||
}
|
||||
|
||||
// MatchesTag returns true if the given tag matches this channel's pattern.
|
||||
func (c Channel) MatchesTag(tag string) bool {
|
||||
pattern := c.TagPattern()
|
||||
if pattern == nil {
|
||||
return false
|
||||
}
|
||||
return pattern.MatchString(tag)
|
||||
}
|
||||
|
||||
// ExtractVersion extracts the version from a tag for this channel.
|
||||
func (c Channel) ExtractVersion(tag string) string {
|
||||
pattern := c.TagPattern()
|
||||
if pattern == nil {
|
||||
return ""
|
||||
}
|
||||
matches := pattern.FindStringSubmatch(tag)
|
||||
if len(matches) < 2 {
|
||||
return ""
|
||||
}
|
||||
return matches[1]
|
||||
}
|
||||
|
||||
// AllChannels returns all available channels.
|
||||
func AllChannels() []Channel {
|
||||
return []Channel{ChannelStable, ChannelBeta, ChannelNightly}
|
||||
}
|
||||
|
||||
// ParseChannel parses a string into a Channel.
|
||||
func ParseChannel(s string) (Channel, error) {
|
||||
switch strings.ToLower(s) {
|
||||
case "stable":
|
||||
return ChannelStable, nil
|
||||
case "beta":
|
||||
return ChannelBeta, nil
|
||||
case "nightly":
|
||||
return ChannelNightly, nil
|
||||
default:
|
||||
return "", fmt.Errorf("unknown channel: %s", s)
|
||||
}
|
||||
}
|
||||
|
||||
// ChannelInfo contains information about an update channel.
|
||||
type ChannelInfo struct {
|
||||
ID string `json:"id"`
|
||||
Name string `json:"name"`
|
||||
Description string `json:"description"`
|
||||
}
|
||||
|
||||
// GetChannelInfo returns information about a channel.
|
||||
func GetChannelInfo(c Channel) ChannelInfo {
|
||||
return ChannelInfo{
|
||||
ID: c.String(),
|
||||
Name: c.DisplayName(),
|
||||
Description: c.Description(),
|
||||
}
|
||||
}
|
||||
|
||||
// GetAllChannelInfo returns information about all channels.
|
||||
func GetAllChannelInfo() []ChannelInfo {
|
||||
channels := AllChannels()
|
||||
info := make([]ChannelInfo, len(channels))
|
||||
for i, c := range channels {
|
||||
info[i] = GetChannelInfo(c)
|
||||
}
|
||||
return info
|
||||
}
|
||||
|
||||
// IncludesPrerelease returns true if the channel includes pre-release versions.
|
||||
func (c Channel) IncludesPrerelease() bool {
|
||||
return c == ChannelBeta || c == ChannelNightly
|
||||
}
|
||||
|
||||
// IncludesChannel returns true if this channel should include releases from the given channel.
|
||||
// For example, beta channel includes stable releases, nightly includes both.
|
||||
func (c Channel) IncludesChannel(other Channel) bool {
|
||||
switch c {
|
||||
case ChannelStable:
|
||||
return other == ChannelStable
|
||||
case ChannelBeta:
|
||||
return other == ChannelStable || other == ChannelBeta
|
||||
case ChannelNightly:
|
||||
return true // Nightly users can see all releases
|
||||
default:
|
||||
return false
|
||||
}
|
||||
}
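
// exampleChannelSelection is an illustrative sketch (not used at runtime) of the
// channel helpers above; the tag values are hypothetical.
func exampleChannelSelection() {
	ch, err := ParseChannel("beta")
	if err != nil {
		fmt.Println("unknown channel:", err)
		return
	}
	for _, tag := range []string{"bugseti-v1.2.0", "bugseti-v1.3.0-beta.2", "bugseti-nightly-20260205"} {
		if ch.MatchesTag(tag) {
			fmt.Printf("%s -> version %s\n", tag, ch.ExtractVersion(tag))
		}
	}
	// Beta users also receive stable releases:
	fmt.Println(ch.IncludesChannel(ChannelStable)) // true
}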
|
||||
new file: internal/bugseti/updater/checker.go (379 lines)
@@ -0,0 +1,379 @@
// Package updater provides auto-update functionality for BugSETI.
|
||||
package updater
|
||||
|
||||
import (
|
||||
"context"
|
||||
"encoding/json"
|
||||
"fmt"
|
||||
"net/http"
|
||||
"runtime"
|
||||
"sort"
|
||||
"strings"
|
||||
"time"
|
||||
|
||||
"golang.org/x/mod/semver"
|
||||
)
|
||||
|
||||
const (
|
||||
// GitHubReleasesAPI is the GitHub API endpoint for releases.
|
||||
GitHubReleasesAPI = "https://api.github.com/repos/%s/%s/releases"
|
||||
|
||||
// DefaultOwner is the default GitHub repository owner.
|
||||
DefaultOwner = "host-uk"
|
||||
|
||||
// DefaultRepo is the default GitHub repository name.
|
||||
DefaultRepo = "core"
|
||||
|
||||
// DefaultCheckInterval is the default interval between update checks.
|
||||
DefaultCheckInterval = 6 * time.Hour
|
||||
)
|
||||
|
||||
// GitHubRelease represents a GitHub release from the API.
|
||||
type GitHubRelease struct {
|
||||
TagName string `json:"tag_name"`
|
||||
Name string `json:"name"`
|
||||
Body string `json:"body"`
|
||||
Draft bool `json:"draft"`
|
||||
Prerelease bool `json:"prerelease"`
|
||||
PublishedAt time.Time `json:"published_at"`
|
||||
Assets []GitHubAsset `json:"assets"`
|
||||
HTMLURL string `json:"html_url"`
|
||||
}
|
||||
|
||||
// GitHubAsset represents a release asset from the GitHub API.
|
||||
type GitHubAsset struct {
|
||||
Name string `json:"name"`
|
||||
Size int64 `json:"size"`
|
||||
BrowserDownloadURL string `json:"browser_download_url"`
|
||||
ContentType string `json:"content_type"`
|
||||
}
|
||||
|
||||
// ReleaseInfo contains information about an available release.
|
||||
type ReleaseInfo struct {
|
||||
Version string `json:"version"`
|
||||
Channel Channel `json:"channel"`
|
||||
Tag string `json:"tag"`
|
||||
Name string `json:"name"`
|
||||
Body string `json:"body"`
|
||||
PublishedAt time.Time `json:"publishedAt"`
|
||||
HTMLURL string `json:"htmlUrl"`
|
||||
BinaryURL string `json:"binaryUrl"`
|
||||
ArchiveURL string `json:"archiveUrl"`
|
||||
ChecksumURL string `json:"checksumUrl"`
|
||||
Size int64 `json:"size"`
|
||||
}
|
||||
|
||||
// UpdateCheckResult contains the result of an update check.
|
||||
type UpdateCheckResult struct {
|
||||
Available bool `json:"available"`
|
||||
CurrentVersion string `json:"currentVersion"`
|
||||
LatestVersion string `json:"latestVersion"`
|
||||
Release *ReleaseInfo `json:"release,omitempty"`
|
||||
Error string `json:"error,omitempty"`
|
||||
CheckedAt time.Time `json:"checkedAt"`
|
||||
}
|
||||
|
||||
// Checker checks for available updates.
|
||||
type Checker struct {
|
||||
owner string
|
||||
repo string
|
||||
httpClient *http.Client
|
||||
}
|
||||
|
||||
// NewChecker creates a new update checker.
|
||||
func NewChecker() *Checker {
|
||||
return &Checker{
|
||||
owner: DefaultOwner,
|
||||
repo: DefaultRepo,
|
||||
httpClient: &http.Client{
|
||||
Timeout: 30 * time.Second,
|
||||
},
|
||||
}
|
||||
}
|
||||
|
||||
// CheckForUpdate checks if a newer version is available.
|
||||
func (c *Checker) CheckForUpdate(ctx context.Context, currentVersion string, channel Channel) (*UpdateCheckResult, error) {
|
||||
result := &UpdateCheckResult{
|
||||
CurrentVersion: currentVersion,
|
||||
CheckedAt: time.Now(),
|
||||
}
|
||||
|
||||
// Fetch releases from GitHub
|
||||
releases, err := c.fetchReleases(ctx)
|
||||
if err != nil {
|
||||
result.Error = err.Error()
|
||||
return result, err
|
||||
}
|
||||
|
||||
// Find the latest release for the channel
|
||||
latest := c.findLatestRelease(releases, channel)
|
||||
if latest == nil {
|
||||
result.LatestVersion = currentVersion
|
||||
return result, nil
|
||||
}
|
||||
|
||||
result.LatestVersion = latest.Version
|
||||
result.Release = latest
|
||||
|
||||
// Compare versions
|
||||
if c.isNewerVersion(currentVersion, latest.Version, channel) {
|
||||
result.Available = true
|
||||
}
|
||||
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// fetchReleases fetches all releases from GitHub.
|
||||
func (c *Checker) fetchReleases(ctx context.Context) ([]GitHubRelease, error) {
|
||||
url := fmt.Sprintf(GitHubReleasesAPI, c.owner, c.repo)
|
||||
|
||||
req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to create request: %w", err)
|
||||
}
|
||||
|
||||
req.Header.Set("Accept", "application/vnd.github.v3+json")
|
||||
req.Header.Set("User-Agent", "BugSETI-Updater")
|
||||
|
||||
resp, err := c.httpClient.Do(req)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to fetch releases: %w", err)
|
||||
}
|
||||
defer resp.Body.Close()
|
||||
|
||||
if resp.StatusCode != http.StatusOK {
|
||||
return nil, fmt.Errorf("GitHub API returned status %d", resp.StatusCode)
|
||||
}
|
||||
|
||||
var releases []GitHubRelease
|
||||
if err := json.NewDecoder(resp.Body).Decode(&releases); err != nil {
|
||||
return nil, fmt.Errorf("failed to decode releases: %w", err)
|
||||
}
|
||||
|
||||
return releases, nil
|
||||
}
|
||||
|
||||
// findLatestRelease finds the latest release for the given channel.
|
||||
func (c *Checker) findLatestRelease(releases []GitHubRelease, channel Channel) *ReleaseInfo {
|
||||
var candidates []ReleaseInfo
|
||||
|
||||
for _, release := range releases {
|
||||
// Skip drafts
|
||||
if release.Draft {
|
||||
continue
|
||||
}
|
||||
|
||||
// Check if the tag matches our BugSETI release pattern
|
||||
if !strings.HasPrefix(release.TagName, "bugseti-") {
|
||||
continue
|
||||
}
|
||||
|
||||
// Determine the channel for this release
|
||||
releaseChannel := c.determineChannel(release.TagName)
|
||||
if releaseChannel == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
// Check if this release should be considered for the requested channel
|
||||
if !channel.IncludesChannel(releaseChannel) {
|
||||
continue
|
||||
}
|
||||
|
||||
// Extract version
|
||||
version := releaseChannel.ExtractVersion(release.TagName)
|
||||
if version == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
// Find the appropriate asset for this platform
|
||||
binaryName := c.getBinaryName()
|
||||
archiveName := c.getArchiveName()
|
||||
checksumName := archiveName + ".sha256"
|
||||
|
||||
var binaryURL, archiveURL, checksumURL string
|
||||
var size int64
|
||||
|
||||
for _, asset := range release.Assets {
|
||||
switch asset.Name {
|
||||
case binaryName:
|
||||
binaryURL = asset.BrowserDownloadURL
|
||||
size = asset.Size
|
||||
case archiveName:
|
||||
archiveURL = asset.BrowserDownloadURL
|
||||
if size == 0 {
|
||||
size = asset.Size
|
||||
}
|
||||
case checksumName:
|
||||
checksumURL = asset.BrowserDownloadURL
|
||||
}
|
||||
}
|
||||
|
||||
// Skip if no binary available for this platform
|
||||
if binaryURL == "" && archiveURL == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
candidates = append(candidates, ReleaseInfo{
|
||||
Version: version,
|
||||
Channel: releaseChannel,
|
||||
Tag: release.TagName,
|
||||
Name: release.Name,
|
||||
Body: release.Body,
|
||||
PublishedAt: release.PublishedAt,
|
||||
HTMLURL: release.HTMLURL,
|
||||
BinaryURL: binaryURL,
|
||||
ArchiveURL: archiveURL,
|
||||
ChecksumURL: checksumURL,
|
||||
Size: size,
|
||||
})
|
||||
}
|
||||
|
||||
if len(candidates) == 0 {
|
||||
return nil
|
||||
}
|
||||
|
||||
// Sort by version (newest first)
|
||||
sort.Slice(candidates, func(i, j int) bool {
|
||||
return c.compareVersions(candidates[i].Version, candidates[j].Version, channel) > 0
|
||||
})
|
||||
|
||||
return &candidates[0]
|
||||
}
|
||||
|
||||
// determineChannel determines the channel from a release tag.
|
||||
func (c *Checker) determineChannel(tag string) Channel {
|
||||
for _, ch := range AllChannels() {
|
||||
if ch.MatchesTag(tag) {
|
||||
return ch
|
||||
}
|
||||
}
|
||||
return ""
|
||||
}
|
||||
|
||||
// getBinaryName returns the binary name for the current platform.
|
||||
func (c *Checker) getBinaryName() string {
|
||||
ext := ""
|
||||
if runtime.GOOS == "windows" {
|
||||
ext = ".exe"
|
||||
}
|
||||
return fmt.Sprintf("bugseti-%s-%s%s", runtime.GOOS, runtime.GOARCH, ext)
|
||||
}
|
||||
|
||||
// getArchiveName returns the archive name for the current platform.
|
||||
func (c *Checker) getArchiveName() string {
|
||||
ext := "tar.gz"
|
||||
if runtime.GOOS == "windows" {
|
||||
ext = "zip"
|
||||
}
|
||||
return fmt.Sprintf("bugseti-%s-%s.%s", runtime.GOOS, runtime.GOARCH, ext)
|
||||
}
|
||||
|
||||
// isNewerVersion returns true if newVersion is newer than currentVersion.
|
||||
func (c *Checker) isNewerVersion(currentVersion, newVersion string, channel Channel) bool {
|
||||
// Handle nightly versions (date-based)
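	// Nightly "versions" are YYYYMMDD dates, so plain string comparison orders them correctly.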
|
||||
if channel == ChannelNightly {
|
||||
return newVersion > currentVersion
|
||||
}
|
||||
|
||||
// Handle dev builds
|
||||
if currentVersion == "dev" {
|
||||
return true
|
||||
}
|
||||
|
||||
// Use semver comparison
|
||||
current := c.normalizeSemver(currentVersion)
|
||||
	next := c.normalizeSemver(newVersion)

	return semver.Compare(next, current) > 0
|
||||
}
|
||||
|
||||
// compareVersions compares two versions.
|
||||
func (c *Checker) compareVersions(v1, v2 string, channel Channel) int {
|
||||
// Handle nightly versions (date-based)
|
||||
if channel == ChannelNightly {
|
||||
if v1 > v2 {
|
||||
return 1
|
||||
} else if v1 < v2 {
|
||||
return -1
|
||||
}
|
||||
return 0
|
||||
}
|
||||
|
||||
// Use semver comparison
|
||||
return semver.Compare(c.normalizeSemver(v1), c.normalizeSemver(v2))
|
||||
}
|
||||
|
||||
// normalizeSemver ensures a version string has the 'v' prefix for semver.
|
||||
func (c *Checker) normalizeSemver(version string) string {
|
||||
if !strings.HasPrefix(version, "v") {
|
||||
return "v" + version
|
||||
}
|
||||
return version
|
||||
}
|
||||
|
||||
// GetAllReleases returns all BugSETI releases from GitHub.
|
||||
func (c *Checker) GetAllReleases(ctx context.Context) ([]ReleaseInfo, error) {
|
||||
releases, err := c.fetchReleases(ctx)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
var result []ReleaseInfo
|
||||
for _, release := range releases {
|
||||
if release.Draft {
|
||||
continue
|
||||
}
|
||||
|
||||
if !strings.HasPrefix(release.TagName, "bugseti-") {
|
||||
continue
|
||||
}
|
||||
|
||||
releaseChannel := c.determineChannel(release.TagName)
|
||||
if releaseChannel == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
version := releaseChannel.ExtractVersion(release.TagName)
|
||||
if version == "" {
|
||||
continue
|
||||
}
|
||||
|
||||
binaryName := c.getBinaryName()
|
||||
archiveName := c.getArchiveName()
|
||||
checksumName := archiveName + ".sha256"
|
||||
|
||||
var binaryURL, archiveURL, checksumURL string
|
||||
var size int64
|
||||
|
||||
for _, asset := range release.Assets {
|
||||
switch asset.Name {
|
||||
case binaryName:
|
||||
binaryURL = asset.BrowserDownloadURL
|
||||
size = asset.Size
|
||||
case archiveName:
|
||||
archiveURL = asset.BrowserDownloadURL
|
||||
if size == 0 {
|
||||
size = asset.Size
|
||||
}
|
||||
case checksumName:
|
||||
checksumURL = asset.BrowserDownloadURL
|
||||
}
|
||||
}
|
||||
|
||||
result = append(result, ReleaseInfo{
|
||||
Version: version,
|
||||
Channel: releaseChannel,
|
||||
Tag: release.TagName,
|
||||
Name: release.Name,
|
||||
Body: release.Body,
|
||||
PublishedAt: release.PublishedAt,
|
||||
HTMLURL: release.HTMLURL,
|
||||
BinaryURL: binaryURL,
|
||||
ArchiveURL: archiveURL,
|
||||
ChecksumURL: checksumURL,
|
||||
Size: size,
|
||||
})
|
||||
}
|
||||
|
||||
return result, nil
|
||||
}
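
// exampleUpdateCheck is an illustrative sketch (not part of the service wiring) of how
// the checker above is used; the current version string is hypothetical.
func exampleUpdateCheck(ctx context.Context) {
	checker := NewChecker()
	result, err := checker.CheckForUpdate(ctx, "1.2.0", ChannelStable)
	if err != nil {
		fmt.Printf("update check failed: %v\n", err)
		return
	}
	if result.Available {
		fmt.Printf("update available: %s -> %s (%s)\n",
			result.CurrentVersion, result.LatestVersion, result.Release.HTMLURL)
	}
}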
|
||||
new file: internal/bugseti/updater/download.go (427 lines)
@@ -0,0 +1,427 @@
// Package updater provides auto-update functionality for BugSETI.
|
||||
package updater
|
||||
|
||||
import (
|
||||
"archive/tar"
|
||||
"archive/zip"
|
||||
"compress/gzip"
|
||||
"context"
|
||||
"crypto/sha256"
|
||||
"encoding/hex"
|
||||
"fmt"
|
||||
"io"
|
||||
"net/http"
|
||||
"os"
|
||||
"path/filepath"
|
||||
"runtime"
|
||||
"strings"
|
||||
)
|
||||
|
||||
// DownloadProgress reports download progress.
|
||||
type DownloadProgress struct {
|
||||
BytesDownloaded int64 `json:"bytesDownloaded"`
|
||||
TotalBytes int64 `json:"totalBytes"`
|
||||
Percent float64 `json:"percent"`
|
||||
}
|
||||
|
||||
// DownloadResult contains the result of a download operation.
|
||||
type DownloadResult struct {
|
||||
BinaryPath string `json:"binaryPath"`
|
||||
Version string `json:"version"`
|
||||
Checksum string `json:"checksum"`
|
||||
VerifiedOK bool `json:"verifiedOK"`
|
||||
}
|
||||
|
||||
// Downloader handles downloading and verifying updates.
|
||||
type Downloader struct {
|
||||
httpClient *http.Client
|
||||
stagingDir string
|
||||
onProgress func(DownloadProgress)
|
||||
}
|
||||
|
||||
// NewDownloader creates a new update downloader.
|
||||
func NewDownloader() (*Downloader, error) {
|
||||
// Create staging directory in user's temp dir
|
||||
stagingDir := filepath.Join(os.TempDir(), "bugseti-updates")
|
||||
if err := os.MkdirAll(stagingDir, 0755); err != nil {
|
||||
return nil, fmt.Errorf("failed to create staging directory: %w", err)
|
||||
}
|
||||
|
||||
return &Downloader{
|
||||
httpClient: &http.Client{},
|
||||
stagingDir: stagingDir,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// SetProgressCallback sets a callback for download progress updates.
|
||||
func (d *Downloader) SetProgressCallback(cb func(DownloadProgress)) {
|
||||
d.onProgress = cb
|
||||
}
|
||||
|
||||
// Download downloads a release and stages it for installation.
|
||||
func (d *Downloader) Download(ctx context.Context, release *ReleaseInfo) (*DownloadResult, error) {
|
||||
result := &DownloadResult{
|
||||
Version: release.Version,
|
||||
}
|
||||
|
||||
// Prefer archive download for extraction
|
||||
downloadURL := release.ArchiveURL
|
||||
if downloadURL == "" {
|
||||
downloadURL = release.BinaryURL
|
||||
}
|
||||
if downloadURL == "" {
|
||||
return nil, fmt.Errorf("no download URL available for release %s", release.Version)
|
||||
}
|
||||
|
||||
// Download the checksum first if available
|
||||
var expectedChecksum string
|
||||
if release.ChecksumURL != "" {
|
||||
checksum, err := d.downloadChecksum(ctx, release.ChecksumURL)
|
||||
if err != nil {
|
||||
// Log but don't fail - checksum verification is optional
|
||||
fmt.Printf("Warning: could not download checksum: %v\n", err)
|
||||
} else {
|
||||
expectedChecksum = checksum
|
||||
}
|
||||
}
|
||||
|
||||
// Download the file
|
||||
downloadedPath, err := d.downloadFile(ctx, downloadURL, release.Size)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to download update: %w", err)
|
||||
}
|
||||
|
||||
// Verify checksum if available
|
||||
actualChecksum, err := d.calculateChecksum(downloadedPath)
|
||||
if err != nil {
|
||||
os.Remove(downloadedPath)
|
||||
return nil, fmt.Errorf("failed to calculate checksum: %w", err)
|
||||
}
|
||||
result.Checksum = actualChecksum
|
||||
|
||||
if expectedChecksum != "" {
|
||||
if actualChecksum != expectedChecksum {
|
||||
os.Remove(downloadedPath)
|
||||
return nil, fmt.Errorf("checksum mismatch: expected %s, got %s", expectedChecksum, actualChecksum)
|
||||
}
|
||||
result.VerifiedOK = true
|
||||
}
|
||||
|
||||
// Extract if it's an archive
|
||||
var binaryPath string
|
||||
if strings.HasSuffix(downloadURL, ".tar.gz") {
|
||||
binaryPath, err = d.extractTarGz(downloadedPath)
|
||||
} else if strings.HasSuffix(downloadURL, ".zip") {
|
||||
binaryPath, err = d.extractZip(downloadedPath)
|
||||
} else {
|
||||
// It's a raw binary
|
||||
binaryPath = downloadedPath
|
||||
}
|
||||
|
||||
if err != nil {
|
||||
os.Remove(downloadedPath)
|
||||
return nil, fmt.Errorf("failed to extract archive: %w", err)
|
||||
}
|
||||
|
||||
// Make the binary executable (Unix only)
|
||||
if runtime.GOOS != "windows" {
|
||||
if err := os.Chmod(binaryPath, 0755); err != nil {
|
||||
return nil, fmt.Errorf("failed to make binary executable: %w", err)
|
||||
}
|
||||
}
|
||||
|
||||
result.BinaryPath = binaryPath
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// downloadChecksum downloads and parses a checksum file.
|
||||
func (d *Downloader) downloadChecksum(ctx context.Context, url string) (string, error) {
|
||||
req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
req.Header.Set("User-Agent", "BugSETI-Updater")
|
||||
|
||||
resp, err := d.httpClient.Do(req)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer resp.Body.Close()
|
||||
|
||||
if resp.StatusCode != http.StatusOK {
|
||||
return "", fmt.Errorf("HTTP %d", resp.StatusCode)
|
||||
}
|
||||
|
||||
data, err := io.ReadAll(resp.Body)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
// Checksum file format: "hash filename" or just "hash"
|
||||
parts := strings.Fields(strings.TrimSpace(string(data)))
|
||||
if len(parts) == 0 {
|
||||
return "", fmt.Errorf("empty checksum file")
|
||||
}
|
||||
|
||||
return parts[0], nil
|
||||
}
|
||||
|
||||
// downloadFile downloads a file with progress reporting.
|
||||
func (d *Downloader) downloadFile(ctx context.Context, url string, expectedSize int64) (string, error) {
|
||||
req, err := http.NewRequestWithContext(ctx, "GET", url, nil)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
req.Header.Set("User-Agent", "BugSETI-Updater")
|
||||
|
||||
resp, err := d.httpClient.Do(req)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer resp.Body.Close()
|
||||
|
||||
if resp.StatusCode != http.StatusOK {
|
||||
return "", fmt.Errorf("HTTP %d", resp.StatusCode)
|
||||
}
|
||||
|
||||
// Get total size from response or use expected size
|
||||
totalSize := resp.ContentLength
|
||||
if totalSize <= 0 {
|
||||
totalSize = expectedSize
|
||||
}
|
||||
|
||||
// Create output file
|
||||
filename := filepath.Base(url)
|
||||
outPath := filepath.Join(d.stagingDir, filename)
|
||||
out, err := os.Create(outPath)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer out.Close()
|
||||
|
||||
// Download with progress
|
||||
var downloaded int64
|
||||
buf := make([]byte, 32*1024) // 32KB buffer
|
||||
|
||||
for {
|
||||
select {
|
||||
case <-ctx.Done():
|
||||
os.Remove(outPath)
|
||||
return "", ctx.Err()
|
||||
default:
|
||||
}
|
||||
|
||||
n, readErr := resp.Body.Read(buf)
|
||||
if n > 0 {
|
||||
_, writeErr := out.Write(buf[:n])
|
||||
if writeErr != nil {
|
||||
os.Remove(outPath)
|
||||
return "", writeErr
|
||||
}
|
||||
downloaded += int64(n)
|
||||
|
||||
// Report progress
|
||||
if d.onProgress != nil && totalSize > 0 {
|
||||
d.onProgress(DownloadProgress{
|
||||
BytesDownloaded: downloaded,
|
||||
TotalBytes: totalSize,
|
||||
Percent: float64(downloaded) / float64(totalSize) * 100,
|
||||
})
|
||||
}
|
||||
}
|
||||
|
||||
if readErr == io.EOF {
|
||||
break
|
||||
}
|
||||
if readErr != nil {
|
||||
os.Remove(outPath)
|
||||
return "", readErr
|
||||
}
|
||||
}
|
||||
|
||||
return outPath, nil
|
||||
}
|
||||
|
||||
// calculateChecksum calculates the SHA256 checksum of a file.
|
||||
func (d *Downloader) calculateChecksum(path string) (string, error) {
|
||||
f, err := os.Open(path)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer f.Close()
|
||||
|
||||
h := sha256.New()
|
||||
if _, err := io.Copy(h, f); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
return hex.EncodeToString(h.Sum(nil)), nil
|
||||
}
|
||||
|
||||
// extractTarGz extracts a .tar.gz archive and returns the path to the binary.
|
||||
func (d *Downloader) extractTarGz(archivePath string) (string, error) {
|
||||
f, err := os.Open(archivePath)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer f.Close()
|
||||
|
||||
gzr, err := gzip.NewReader(f)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer gzr.Close()
|
||||
|
||||
tr := tar.NewReader(gzr)
|
||||
|
||||
extractDir := filepath.Join(d.stagingDir, "extracted")
|
||||
os.RemoveAll(extractDir)
|
||||
if err := os.MkdirAll(extractDir, 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
var binaryPath string
|
||||
binaryName := "bugseti"
|
||||
if runtime.GOOS == "windows" {
|
||||
binaryName = "bugseti.exe"
|
||||
}
|
||||
|
||||
for {
|
||||
header, err := tr.Next()
|
||||
if err == io.EOF {
|
||||
break
|
||||
}
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
target := filepath.Join(extractDir, header.Name)
|
||||
|
||||
		// Prevent directory traversal: the cleaned target must stay inside extractDir
		if !strings.HasPrefix(filepath.Clean(target), filepath.Clean(extractDir)+string(os.PathSeparator)) {
			return "", fmt.Errorf("invalid file path in archive: %s", header.Name)
		}
|
||||
|
||||
switch header.Typeflag {
|
||||
case tar.TypeDir:
|
||||
if err := os.MkdirAll(target, 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
case tar.TypeReg:
|
||||
// Create parent directory
|
||||
if err := os.MkdirAll(filepath.Dir(target), 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
outFile, err := os.OpenFile(target, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, os.FileMode(header.Mode))
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
if _, err := io.Copy(outFile, tr); err != nil {
|
||||
outFile.Close()
|
||||
return "", err
|
||||
}
|
||||
outFile.Close()
|
||||
|
||||
// Check if this is the binary we're looking for
|
||||
if filepath.Base(header.Name) == binaryName {
|
||||
binaryPath = target
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up archive
|
||||
os.Remove(archivePath)
|
||||
|
||||
if binaryPath == "" {
|
||||
return "", fmt.Errorf("binary not found in archive")
|
||||
}
|
||||
|
||||
return binaryPath, nil
|
||||
}
|
||||
|
||||
// extractZip extracts a .zip archive and returns the path to the binary.
|
||||
func (d *Downloader) extractZip(archivePath string) (string, error) {
|
||||
r, err := zip.OpenReader(archivePath)
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
defer r.Close()
|
||||
|
||||
extractDir := filepath.Join(d.stagingDir, "extracted")
|
||||
os.RemoveAll(extractDir)
|
||||
if err := os.MkdirAll(extractDir, 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
var binaryPath string
|
||||
binaryName := "bugseti"
|
||||
if runtime.GOOS == "windows" {
|
||||
binaryName = "bugseti.exe"
|
||||
}
|
||||
|
||||
for _, f := range r.File {
|
||||
target := filepath.Join(extractDir, f.Name)
|
||||
|
||||
// Prevent directory traversal
|
||||
if !strings.HasPrefix(filepath.Clean(target), filepath.Clean(extractDir)) {
|
||||
return "", fmt.Errorf("invalid file path in archive: %s", f.Name)
|
||||
}
|
||||
|
||||
if f.FileInfo().IsDir() {
|
||||
if err := os.MkdirAll(target, 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
continue
|
||||
}
|
||||
|
||||
// Create parent directory
|
||||
if err := os.MkdirAll(filepath.Dir(target), 0755); err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
rc, err := f.Open()
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
outFile, err := os.OpenFile(target, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, f.Mode())
|
||||
if err != nil {
|
||||
rc.Close()
|
||||
return "", err
|
||||
}
|
||||
|
||||
_, err = io.Copy(outFile, rc)
|
||||
rc.Close()
|
||||
outFile.Close()
|
||||
|
||||
if err != nil {
|
||||
return "", err
|
||||
}
|
||||
|
||||
// Check if this is the binary we're looking for
|
||||
if filepath.Base(f.Name) == binaryName {
|
||||
binaryPath = target
|
||||
}
|
||||
}
|
||||
|
||||
// Clean up archive
|
||||
os.Remove(archivePath)
|
||||
|
||||
if binaryPath == "" {
|
||||
return "", fmt.Errorf("binary not found in archive")
|
||||
}
|
||||
|
||||
return binaryPath, nil
|
||||
}
|
||||
|
||||
// Cleanup removes all staged files.
|
||||
func (d *Downloader) Cleanup() error {
|
||||
return os.RemoveAll(d.stagingDir)
|
||||
}
|
||||
|
||||
// GetStagingDir returns the staging directory path.
|
||||
func (d *Downloader) GetStagingDir() string {
|
||||
return d.stagingDir
|
||||
}
|
||||
10
internal/bugseti/updater/go.mod
Normal file
10
internal/bugseti/updater/go.mod
Normal file
|
|
@ -0,0 +1,10 @@
|
|||
module github.com/host-uk/core/internal/bugseti/updater
|
||||
|
||||
go 1.25.5
|
||||
|
||||
require (
|
||||
github.com/host-uk/core/internal/bugseti v0.0.0
|
||||
golang.org/x/mod v0.25.0
|
||||
)
|
||||
|
||||
replace github.com/host-uk/core/internal/bugseti => ../
|
||||
2
internal/bugseti/updater/go.sum
Normal file
2
internal/bugseti/updater/go.sum
Normal file
|
|
@ -0,0 +1,2 @@
|
|||
golang.org/x/mod v0.25.0 h1:n7a+ZbQKQA/Ysbyb0/6IbB1H/X41mKgbhfv7AfG/44w=
|
||||
golang.org/x/mod v0.25.0/go.mod h1:IXM97Txy2VM4PJ3gI61r1YEk/gAj6zAHN3AdZt6S9Ww=
|
||||
284
internal/bugseti/updater/install.go
Normal file
284
internal/bugseti/updater/install.go
Normal file
|
|
@ -0,0 +1,284 @@
|
|||
// Package updater provides auto-update functionality for BugSETI.
|
||||
package updater
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"os"
|
||||
"os/exec"
|
||||
"path/filepath"
|
||||
"runtime"
|
||||
"syscall"
|
||||
)
|
||||
|
||||
// InstallResult contains the result of an installation.
|
||||
type InstallResult struct {
|
||||
Success bool `json:"success"`
|
||||
OldPath string `json:"oldPath"`
|
||||
NewPath string `json:"newPath"`
|
||||
BackupPath string `json:"backupPath"`
|
||||
RestartNeeded bool `json:"restartNeeded"`
|
||||
Error string `json:"error,omitempty"`
|
||||
}
|
||||
|
||||
// Installer handles installing updates and restarting the application.
|
||||
type Installer struct {
|
||||
executablePath string
|
||||
}
|
||||
|
||||
// NewInstaller creates a new installer.
|
||||
func NewInstaller() (*Installer, error) {
|
||||
execPath, err := os.Executable()
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to get executable path: %w", err)
|
||||
}
|
||||
|
||||
// Resolve symlinks to get the real path
|
||||
execPath, err = filepath.EvalSymlinks(execPath)
|
||||
if err != nil {
|
||||
return nil, fmt.Errorf("failed to resolve executable path: %w", err)
|
||||
}
|
||||
|
||||
return &Installer{
|
||||
executablePath: execPath,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// Install replaces the current binary with the new one.
|
||||
func (i *Installer) Install(newBinaryPath string) (*InstallResult, error) {
|
||||
result := &InstallResult{
|
||||
OldPath: i.executablePath,
|
||||
NewPath: newBinaryPath,
|
||||
RestartNeeded: true,
|
||||
}
|
||||
|
||||
// Verify the new binary exists and is executable
|
||||
if _, err := os.Stat(newBinaryPath); err != nil {
|
||||
result.Error = fmt.Sprintf("new binary not found: %v", err)
|
||||
return result, fmt.Errorf("new binary not found: %w", err)
|
||||
}
|
||||
|
||||
// Create backup of current binary
|
||||
backupPath := i.executablePath + ".bak"
|
||||
result.BackupPath = backupPath
|
||||
|
||||
// Platform-specific installation
|
||||
var err error
|
||||
switch runtime.GOOS {
|
||||
case "windows":
|
||||
err = i.installWindows(newBinaryPath, backupPath)
|
||||
default:
|
||||
err = i.installUnix(newBinaryPath, backupPath)
|
||||
}
|
||||
|
||||
if err != nil {
|
||||
result.Error = err.Error()
|
||||
return result, err
|
||||
}
|
||||
|
||||
result.Success = true
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// installUnix performs the installation on Unix-like systems.
|
||||
func (i *Installer) installUnix(newBinaryPath, backupPath string) error {
|
||||
// Remove old backup if exists
|
||||
os.Remove(backupPath)
|
||||
|
||||
// Rename current binary to backup
|
||||
if err := os.Rename(i.executablePath, backupPath); err != nil {
|
||||
return fmt.Errorf("failed to backup current binary: %w", err)
|
||||
}
|
||||
|
||||
// Copy new binary to target location
|
||||
// We use copy instead of rename in case they're on different filesystems
|
||||
if err := copyFile(newBinaryPath, i.executablePath); err != nil {
|
||||
// Try to restore backup
|
||||
os.Rename(backupPath, i.executablePath)
|
||||
return fmt.Errorf("failed to install new binary: %w", err)
|
||||
}
|
||||
|
||||
// Make executable
|
||||
if err := os.Chmod(i.executablePath, 0755); err != nil {
|
||||
// Try to restore backup
|
||||
os.Remove(i.executablePath)
|
||||
os.Rename(backupPath, i.executablePath)
|
||||
return fmt.Errorf("failed to make binary executable: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// installWindows performs the installation on Windows.
|
||||
// On Windows, we can't replace a running executable, so we use a different approach:
|
||||
// 1. Rename current executable to .old
|
||||
// 2. Copy new executable to target location
|
||||
// 3. On next start, clean up the .old file
|
||||
func (i *Installer) installWindows(newBinaryPath, backupPath string) error {
|
||||
// Remove old backup if exists
|
||||
os.Remove(backupPath)
|
||||
|
||||
// On Windows, we can rename the running executable
|
||||
if err := os.Rename(i.executablePath, backupPath); err != nil {
|
||||
return fmt.Errorf("failed to backup current binary: %w", err)
|
||||
}
|
||||
|
||||
// Copy new binary to target location
|
||||
if err := copyFile(newBinaryPath, i.executablePath); err != nil {
|
||||
// Try to restore backup
|
||||
os.Rename(backupPath, i.executablePath)
|
||||
return fmt.Errorf("failed to install new binary: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// Restart restarts the application with the new binary.
|
||||
func (i *Installer) Restart() error {
|
||||
args := os.Args
|
||||
env := os.Environ()
|
||||
|
||||
switch runtime.GOOS {
|
||||
case "windows":
|
||||
return i.restartWindows(args, env)
|
||||
default:
|
||||
return i.restartUnix(args, env)
|
||||
}
|
||||
}
|
||||
|
||||
// restartUnix restarts the application on Unix-like systems using exec.
|
||||
func (i *Installer) restartUnix(args []string, env []string) error {
|
||||
// Use syscall.Exec to replace the current process
|
||||
// This is the cleanest way to restart on Unix
|
||||
return syscall.Exec(i.executablePath, args, env)
|
||||
}
|
||||
|
||||
// restartWindows restarts the application on Windows.
|
||||
func (i *Installer) restartWindows(args []string, env []string) error {
|
||||
// On Windows, we can't use exec to replace the process
|
||||
// Instead, we start a new process and exit the current one
|
||||
cmd := exec.Command(i.executablePath, args[1:]...)
|
||||
cmd.Env = env
|
||||
cmd.Stdout = os.Stdout
|
||||
cmd.Stderr = os.Stderr
|
||||
cmd.Stdin = os.Stdin
|
||||
|
||||
if err := cmd.Start(); err != nil {
|
||||
return fmt.Errorf("failed to start new process: %w", err)
|
||||
}
|
||||
|
||||
// Exit current process
|
||||
os.Exit(0)
|
||||
return nil // Never reached
|
||||
}
|
||||
|
||||
// RestartLater schedules a restart for when the app next starts.
|
||||
// This is useful when the user wants to continue working and restart later.
|
||||
func (i *Installer) RestartLater() error {
|
||||
// Create a marker file that indicates a restart is pending
|
||||
markerPath := filepath.Join(filepath.Dir(i.executablePath), ".bugseti-restart-pending")
|
||||
return os.WriteFile(markerPath, []byte("restart"), 0644)
|
||||
}
|
||||
|
||||
// CheckPendingRestart checks if a restart was scheduled.
|
||||
func (i *Installer) CheckPendingRestart() bool {
|
||||
markerPath := filepath.Join(filepath.Dir(i.executablePath), ".bugseti-restart-pending")
|
||||
_, err := os.Stat(markerPath)
|
||||
return err == nil
|
||||
}
|
||||
|
||||
// ClearPendingRestart clears the pending restart marker.
|
||||
func (i *Installer) ClearPendingRestart() error {
|
||||
markerPath := filepath.Join(filepath.Dir(i.executablePath), ".bugseti-restart-pending")
|
||||
return os.Remove(markerPath)
|
||||
}
|
||||
|
||||
// CleanupBackup removes the backup binary after a successful update.
|
||||
func (i *Installer) CleanupBackup() error {
|
||||
backupPath := i.executablePath + ".bak"
|
||||
if _, err := os.Stat(backupPath); err == nil {
|
||||
return os.Remove(backupPath)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
||||
// Rollback restores the previous version from backup.
|
||||
func (i *Installer) Rollback() error {
|
||||
backupPath := i.executablePath + ".bak"
|
||||
|
||||
// Check if backup exists
|
||||
if _, err := os.Stat(backupPath); err != nil {
|
||||
return fmt.Errorf("backup not found: %w", err)
|
||||
}
|
||||
|
||||
// Remove current binary
|
||||
if err := os.Remove(i.executablePath); err != nil {
|
||||
return fmt.Errorf("failed to remove current binary: %w", err)
|
||||
}
|
||||
|
||||
// Restore backup
|
||||
if err := os.Rename(backupPath, i.executablePath); err != nil {
|
||||
return fmt.Errorf("failed to restore backup: %w", err)
|
||||
}
|
||||
|
||||
return nil
|
||||
}
|
||||
|
||||
// GetExecutablePath returns the path to the current executable.
|
||||
func (i *Installer) GetExecutablePath() string {
|
||||
return i.executablePath
|
||||
}
|
||||
|
||||
// copyFile copies a file from src to dst.
|
||||
func copyFile(src, dst string) error {
|
||||
sourceFile, err := os.Open(src)
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer sourceFile.Close()
|
||||
|
||||
// Get source file info for permissions
|
||||
sourceInfo, err := sourceFile.Stat()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
destFile, err := os.OpenFile(dst, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, sourceInfo.Mode())
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
defer destFile.Close()
|
||||
|
||||
_, err = destFile.ReadFrom(sourceFile)
|
||||
return err
|
||||
}
|
||||
|
||||
// CanSelfUpdate checks if the application has permission to update itself.
|
||||
func CanSelfUpdate() bool {
|
||||
execPath, err := os.Executable()
|
||||
if err != nil {
|
||||
return false
|
||||
}
|
||||
|
||||
execPath, err = filepath.EvalSymlinks(execPath)
|
||||
if err != nil {
|
||||
return false
|
||||
}
|
||||
|
||||
// Check if we can write to the executable's directory
|
||||
dir := filepath.Dir(execPath)
|
||||
testFile := filepath.Join(dir, ".bugseti-update-test")
|
||||
|
||||
f, err := os.Create(testFile)
|
||||
if err != nil {
|
||||
return false
|
||||
}
|
||||
f.Close()
|
||||
os.Remove(testFile)
|
||||
|
||||
return true
|
||||
}
|
||||
|
||||
// NeedsElevation returns true if the update requires elevated privileges.
|
||||
func NeedsElevation() bool {
|
||||
return !CanSelfUpdate()
|
||||
}
|
||||
322
internal/bugseti/updater/service.go
Normal file
322
internal/bugseti/updater/service.go
Normal file
|
|
@ -0,0 +1,322 @@
|
|||
// Package updater provides auto-update functionality for BugSETI.
|
||||
package updater
|
||||
|
||||
import (
|
||||
"context"
|
||||
"log"
|
||||
"sync"
|
||||
"time"
|
||||
|
||||
"github.com/host-uk/core/internal/bugseti"
|
||||
)
|
||||
|
||||
// Service provides update functionality and Wails bindings.
|
||||
type Service struct {
|
||||
config *bugseti.ConfigService
|
||||
checker *Checker
|
||||
downloader *Downloader
|
||||
installer *Installer
|
||||
|
||||
mu sync.RWMutex
|
||||
lastResult *UpdateCheckResult
|
||||
pendingUpdate *DownloadResult
|
||||
|
||||
// Background check
|
||||
stopCh chan struct{}
|
||||
running bool
|
||||
}
|
||||
|
||||
// NewService creates a new update service.
|
||||
func NewService(config *bugseti.ConfigService) (*Service, error) {
|
||||
downloader, err := NewDownloader()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
installer, err := NewInstaller()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
return &Service{
|
||||
config: config,
|
||||
checker: NewChecker(),
|
||||
downloader: downloader,
|
||||
installer: installer,
|
||||
}, nil
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (s *Service) ServiceName() string {
|
||||
return "UpdateService"
|
||||
}
|
||||
|
||||
// Start begins the background update checker.
|
||||
func (s *Service) Start() {
|
||||
s.mu.Lock()
|
||||
if s.running {
|
||||
s.mu.Unlock()
|
||||
return
|
||||
}
|
||||
s.running = true
|
||||
s.stopCh = make(chan struct{})
|
||||
s.mu.Unlock()
|
||||
|
||||
go s.runBackgroundChecker()
|
||||
}
|
||||
|
||||
// Stop stops the background update checker.
|
||||
func (s *Service) Stop() {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
if !s.running {
|
||||
return
|
||||
}
|
||||
|
||||
s.running = false
|
||||
close(s.stopCh)
|
||||
}
|
||||
|
||||
// runBackgroundChecker runs periodic update checks.
|
||||
func (s *Service) runBackgroundChecker() {
|
||||
// Initial check after a short delay
|
||||
time.Sleep(30 * time.Second)
|
||||
|
||||
for {
|
||||
select {
|
||||
case <-s.stopCh:
|
||||
return
|
||||
default:
|
||||
}
|
||||
|
||||
if s.config.ShouldCheckForUpdates() {
|
||||
log.Println("Checking for updates...")
|
||||
_, err := s.CheckForUpdate()
|
||||
if err != nil {
|
||||
log.Printf("Update check failed: %v", err)
|
||||
}
|
||||
}
|
||||
|
||||
// Check interval from config (minimum 1 hour)
|
||||
interval := time.Duration(s.config.GetUpdateCheckInterval()) * time.Hour
|
||||
if interval < time.Hour {
|
||||
interval = time.Hour
|
||||
}
|
||||
|
||||
select {
|
||||
case <-s.stopCh:
|
||||
return
|
||||
case <-time.After(interval):
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// GetSettings returns the update settings.
|
||||
func (s *Service) GetSettings() bugseti.UpdateSettings {
|
||||
return s.config.GetUpdateSettings()
|
||||
}
|
||||
|
||||
// SetSettings updates the update settings.
|
||||
func (s *Service) SetSettings(settings bugseti.UpdateSettings) error {
|
||||
return s.config.SetUpdateSettings(settings)
|
||||
}
|
||||
|
||||
// GetVersionInfo returns the current version information.
|
||||
func (s *Service) GetVersionInfo() bugseti.VersionInfo {
|
||||
return bugseti.GetVersionInfo()
|
||||
}
|
||||
|
||||
// GetChannels returns all available update channels.
|
||||
func (s *Service) GetChannels() []ChannelInfo {
|
||||
return GetAllChannelInfo()
|
||||
}
|
||||
|
||||
// CheckForUpdate checks if an update is available.
|
||||
func (s *Service) CheckForUpdate() (*UpdateCheckResult, error) {
|
||||
currentVersion := bugseti.GetVersion()
|
||||
channelStr := s.config.GetUpdateChannel()
|
||||
|
||||
channel, err := ParseChannel(channelStr)
|
||||
if err != nil {
|
||||
channel = ChannelStable
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 30*time.Second)
|
||||
defer cancel()
|
||||
|
||||
result, err := s.checker.CheckForUpdate(ctx, currentVersion, channel)
|
||||
if err != nil {
|
||||
return result, err
|
||||
}
|
||||
|
||||
// Update last check time
|
||||
s.config.SetLastUpdateCheck(time.Now())
|
||||
|
||||
// Store result
|
||||
s.mu.Lock()
|
||||
s.lastResult = result
|
||||
s.mu.Unlock()
|
||||
|
||||
// If auto-update is enabled and an update is available, download it
|
||||
if result.Available && s.config.IsAutoUpdateEnabled() {
|
||||
go s.downloadUpdate(result.Release)
|
||||
}
|
||||
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// GetLastCheckResult returns the last update check result.
|
||||
func (s *Service) GetLastCheckResult() *UpdateCheckResult {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return s.lastResult
|
||||
}
|
||||
|
||||
// downloadUpdate downloads an update in the background.
|
||||
func (s *Service) downloadUpdate(release *ReleaseInfo) {
|
||||
if release == nil {
|
||||
return
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
log.Printf("Downloading update %s...", release.Version)
|
||||
|
||||
result, err := s.downloader.Download(ctx, release)
|
||||
if err != nil {
|
||||
log.Printf("Failed to download update: %v", err)
|
||||
return
|
||||
}
|
||||
|
||||
log.Printf("Update %s downloaded and staged at %s", release.Version, result.BinaryPath)
|
||||
|
||||
s.mu.Lock()
|
||||
s.pendingUpdate = result
|
||||
s.mu.Unlock()
|
||||
}
|
||||
|
||||
// DownloadUpdate downloads the latest available update.
|
||||
func (s *Service) DownloadUpdate() (*DownloadResult, error) {
|
||||
s.mu.RLock()
|
||||
lastResult := s.lastResult
|
||||
s.mu.RUnlock()
|
||||
|
||||
if lastResult == nil || !lastResult.Available || lastResult.Release == nil {
|
||||
// Need to check first
|
||||
result, err := s.CheckForUpdate()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if !result.Available {
|
||||
return nil, nil
|
||||
}
|
||||
lastResult = result
|
||||
}
|
||||
|
||||
ctx, cancel := context.WithTimeout(context.Background(), 10*time.Minute)
|
||||
defer cancel()
|
||||
|
||||
downloadResult, err := s.downloader.Download(ctx, lastResult.Release)
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
|
||||
s.mu.Lock()
|
||||
s.pendingUpdate = downloadResult
|
||||
s.mu.Unlock()
|
||||
|
||||
return downloadResult, nil
|
||||
}
|
||||
|
||||
// InstallUpdate installs a previously downloaded update.
|
||||
func (s *Service) InstallUpdate() (*InstallResult, error) {
|
||||
s.mu.RLock()
|
||||
pending := s.pendingUpdate
|
||||
s.mu.RUnlock()
|
||||
|
||||
if pending == nil {
|
||||
// Try to download first
|
||||
downloadResult, err := s.DownloadUpdate()
|
||||
if err != nil {
|
||||
return nil, err
|
||||
}
|
||||
if downloadResult == nil {
|
||||
return &InstallResult{
|
||||
Success: false,
|
||||
Error: "No update available",
|
||||
}, nil
|
||||
}
|
||||
pending = downloadResult
|
||||
}
|
||||
|
||||
result, err := s.installer.Install(pending.BinaryPath)
|
||||
if err != nil {
|
||||
return result, err
|
||||
}
|
||||
|
||||
// Clear pending update
|
||||
s.mu.Lock()
|
||||
s.pendingUpdate = nil
|
||||
s.mu.Unlock()
|
||||
|
||||
return result, nil
|
||||
}
|
||||
|
||||
// InstallAndRestart installs the update and restarts the application.
|
||||
func (s *Service) InstallAndRestart() error {
|
||||
result, err := s.InstallUpdate()
|
||||
if err != nil {
|
||||
return err
|
||||
}
|
||||
|
||||
if !result.Success {
|
||||
return nil
|
||||
}
|
||||
|
||||
return s.installer.Restart()
|
||||
}
|
||||
|
||||
// HasPendingUpdate returns true if there's a downloaded update ready to install.
|
||||
func (s *Service) HasPendingUpdate() bool {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return s.pendingUpdate != nil
|
||||
}
|
||||
|
||||
// GetPendingUpdate returns information about the pending update.
|
||||
func (s *Service) GetPendingUpdate() *DownloadResult {
|
||||
s.mu.RLock()
|
||||
defer s.mu.RUnlock()
|
||||
return s.pendingUpdate
|
||||
}
|
||||
|
||||
// CancelPendingUpdate cancels and removes the pending update.
|
||||
func (s *Service) CancelPendingUpdate() error {
|
||||
s.mu.Lock()
|
||||
defer s.mu.Unlock()
|
||||
|
||||
s.pendingUpdate = nil
|
||||
return s.downloader.Cleanup()
|
||||
}
|
||||
|
||||
// CanSelfUpdate returns true if the application can update itself.
|
||||
func (s *Service) CanSelfUpdate() bool {
|
||||
return CanSelfUpdate()
|
||||
}
|
||||
|
||||
// NeedsElevation returns true if the update requires elevated privileges.
|
||||
func (s *Service) NeedsElevation() bool {
|
||||
return NeedsElevation()
|
||||
}
|
||||
|
||||
// Rollback restores the previous version.
|
||||
func (s *Service) Rollback() error {
|
||||
return s.installer.Rollback()
|
||||
}
|
||||
|
||||
// CleanupAfterUpdate cleans up backup files after a successful update.
|
||||
func (s *Service) CleanupAfterUpdate() error {
|
||||
return s.installer.CleanupBackup()
|
||||
}
|
||||
122
internal/bugseti/version.go
Normal file
122
internal/bugseti/version.go
Normal file
|
|
@ -0,0 +1,122 @@
|
|||
// Package bugseti provides version information for the BugSETI application.
|
||||
package bugseti
|
||||
|
||||
import (
|
||||
"fmt"
|
||||
"runtime"
|
||||
)
|
||||
|
||||
// Version information - these are set at build time via ldflags
|
||||
// Example: go build -ldflags "-X github.com/host-uk/core/internal/bugseti.Version=1.0.0"
|
||||
var (
|
||||
// Version is the semantic version (e.g., "1.0.0", "1.0.0-beta.1", "nightly-20260205")
|
||||
Version = "dev"
|
||||
|
||||
// Channel is the release channel (stable, beta, nightly)
|
||||
Channel = "dev"
|
||||
|
||||
// Commit is the git commit SHA
|
||||
Commit = "unknown"
|
||||
|
||||
// BuildTime is the UTC build timestamp
|
||||
BuildTime = "unknown"
|
||||
)
|
||||
|
||||
// VersionInfo contains all version-related information.
|
||||
type VersionInfo struct {
|
||||
Version string `json:"version"`
|
||||
Channel string `json:"channel"`
|
||||
Commit string `json:"commit"`
|
||||
BuildTime string `json:"buildTime"`
|
||||
GoVersion string `json:"goVersion"`
|
||||
OS string `json:"os"`
|
||||
Arch string `json:"arch"`
|
||||
}
|
||||
|
||||
// GetVersion returns the current version string.
|
||||
func GetVersion() string {
|
||||
return Version
|
||||
}
|
||||
|
||||
// GetChannel returns the release channel.
|
||||
func GetChannel() string {
|
||||
return Channel
|
||||
}
|
||||
|
||||
// GetVersionInfo returns complete version information.
|
||||
func GetVersionInfo() VersionInfo {
|
||||
return VersionInfo{
|
||||
Version: Version,
|
||||
Channel: Channel,
|
||||
Commit: Commit,
|
||||
BuildTime: BuildTime,
|
||||
GoVersion: runtime.Version(),
|
||||
OS: runtime.GOOS,
|
||||
Arch: runtime.GOARCH,
|
||||
}
|
||||
}
|
||||
|
||||
// GetVersionString returns a formatted version string for display.
|
||||
func GetVersionString() string {
|
||||
if Channel == "dev" {
|
||||
return fmt.Sprintf("BugSETI %s (development build)", Version)
|
||||
}
|
||||
if Channel == "nightly" {
|
||||
return fmt.Sprintf("BugSETI %s (nightly)", Version)
|
||||
}
|
||||
if Channel == "beta" {
|
||||
return fmt.Sprintf("BugSETI v%s (beta)", Version)
|
||||
}
|
||||
return fmt.Sprintf("BugSETI v%s", Version)
|
||||
}
|
||||
|
||||
// GetShortCommit returns the first 7 characters of the commit hash.
|
||||
func GetShortCommit() string {
|
||||
if len(Commit) >= 7 {
|
||||
return Commit[:7]
|
||||
}
|
||||
return Commit
|
||||
}
|
||||
|
||||
// IsDevelopment returns true if this is a development build.
|
||||
func IsDevelopment() bool {
|
||||
return Channel == "dev" || Version == "dev"
|
||||
}
|
||||
|
||||
// IsPrerelease returns true if this is a prerelease build (beta or nightly).
|
||||
func IsPrerelease() bool {
|
||||
return Channel == "beta" || Channel == "nightly"
|
||||
}
|
||||
|
||||
// VersionService provides version information to the frontend via Wails.
|
||||
type VersionService struct{}
|
||||
|
||||
// NewVersionService creates a new VersionService.
|
||||
func NewVersionService() *VersionService {
|
||||
return &VersionService{}
|
||||
}
|
||||
|
||||
// ServiceName returns the service name for Wails.
|
||||
func (v *VersionService) ServiceName() string {
|
||||
return "VersionService"
|
||||
}
|
||||
|
||||
// GetVersion returns the version string.
|
||||
func (v *VersionService) GetVersion() string {
|
||||
return GetVersion()
|
||||
}
|
||||
|
||||
// GetChannel returns the release channel.
|
||||
func (v *VersionService) GetChannel() string {
|
||||
return GetChannel()
|
||||
}
|
||||
|
||||
// GetVersionInfo returns complete version information.
|
||||
func (v *VersionService) GetVersionInfo() VersionInfo {
|
||||
return GetVersionInfo()
|
||||
}
|
||||
|
||||
// GetVersionString returns a formatted version string.
|
||||
func (v *VersionService) GetVersionString() string {
|
||||
return GetVersionString()
|
||||
}
|
||||
|
|
@ -1,12 +1,15 @@
|
|||
package gocmd
|
||||
|
||||
import (
|
||||
"bufio"
|
||||
"errors"
|
||||
"fmt"
|
||||
"io"
|
||||
"os"
|
||||
"os/exec"
|
||||
"path/filepath"
|
||||
"regexp"
|
||||
"strconv"
|
||||
"strings"
|
||||
|
||||
"github.com/host-uk/core/pkg/cli"
|
||||
|
|
@ -51,10 +54,16 @@ func runGoTest(coverage bool, pkg, run string, short, race, jsonOut, verbose boo
|
|||
|
||||
args := []string{"test"}
|
||||
|
||||
var covPath string
|
||||
if coverage {
|
||||
args = append(args, "-cover")
|
||||
} else {
|
||||
args = append(args, "-cover")
|
||||
args = append(args, "-cover", "-covermode=atomic")
|
||||
covFile, err := os.CreateTemp("", "coverage-*.out")
|
||||
if err == nil {
|
||||
covPath = covFile.Name()
|
||||
_ = covFile.Close()
|
||||
args = append(args, "-coverprofile="+covPath)
|
||||
defer os.Remove(covPath)
|
||||
}
|
||||
}
|
||||
|
||||
if run != "" {
|
||||
|
|
@ -121,7 +130,15 @@ func runGoTest(coverage bool, pkg, run string, short, race, jsonOut, verbose boo
|
|||
}
|
||||
|
||||
if cov > 0 {
|
||||
cli.Print("\n %s %s\n", cli.KeyStyle.Render(i18n.Label("coverage")), formatCoverage(cov))
|
||||
cli.Print("\n %s %s\n", cli.KeyStyle.Render(i18n.Label("statements")), formatCoverage(cov))
|
||||
if covPath != "" {
|
||||
branchCov, err := calculateBlockCoverage(covPath)
|
||||
if err != nil {
|
||||
cli.Print(" %s %s\n", cli.KeyStyle.Render(i18n.Label("branches")), cli.ErrorStyle.Render("unable to calculate"))
|
||||
} else {
|
||||
cli.Print(" %s %s\n", cli.KeyStyle.Render(i18n.Label("branches")), formatCoverage(branchCov))
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
if err == nil {
|
||||
|
|
@ -161,10 +178,12 @@ func parseOverallCoverage(output string) float64 {
|
|||
}
|
||||
|
||||
var (
|
||||
covPkg string
|
||||
covHTML bool
|
||||
covOpen bool
|
||||
covThreshold float64
|
||||
covPkg string
|
||||
covHTML bool
|
||||
covOpen bool
|
||||
covThreshold float64
|
||||
covBranchThreshold float64
|
||||
covOutput string
|
||||
)
|
||||
|
||||
func addGoCovCommand(parent *cli.Command) {
|
||||
|
|
@ -193,7 +212,21 @@ func addGoCovCommand(parent *cli.Command) {
|
|||
}
|
||||
covPath := covFile.Name()
|
||||
_ = covFile.Close()
|
||||
defer func() { _ = os.Remove(covPath) }()
|
||||
defer func() {
|
||||
if covOutput == "" {
|
||||
_ = os.Remove(covPath)
|
||||
} else {
|
||||
// Copy to output destination before removing
|
||||
src, _ := os.Open(covPath)
|
||||
dst, _ := os.Create(covOutput)
|
||||
if src != nil && dst != nil {
|
||||
_, _ = io.Copy(dst, src)
|
||||
_ = src.Close()
|
||||
_ = dst.Close()
|
||||
}
|
||||
_ = os.Remove(covPath)
|
||||
}
|
||||
}()
|
||||
|
||||
cli.Print("%s %s\n", dimStyle.Render(i18n.Label("coverage")), i18n.ProgressSubject("run", "tests"))
|
||||
// Truncate package list if too long for display
|
||||
|
|
@ -228,7 +261,7 @@ func addGoCovCommand(parent *cli.Command) {
|
|||
|
||||
// Parse total coverage from last line
|
||||
lines := strings.Split(strings.TrimSpace(string(covOutput)), "\n")
|
||||
var totalCov float64
|
||||
var statementCov float64
|
||||
if len(lines) > 0 {
|
||||
lastLine := lines[len(lines)-1]
|
||||
// Format: "total: (statements) XX.X%"
|
||||
|
|
@ -236,14 +269,21 @@ func addGoCovCommand(parent *cli.Command) {
|
|||
parts := strings.Fields(lastLine)
|
||||
if len(parts) >= 3 {
|
||||
covStr := strings.TrimSuffix(parts[len(parts)-1], "%")
|
||||
_, _ = fmt.Sscanf(covStr, "%f", &totalCov)
|
||||
_, _ = fmt.Sscanf(covStr, "%f", &statementCov)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
// Calculate branch coverage (block coverage)
|
||||
branchCov, err := calculateBlockCoverage(covPath)
|
||||
if err != nil {
|
||||
return cli.Wrap(err, "calculate branch coverage")
|
||||
}
|
||||
|
||||
// Print coverage summary
|
||||
cli.Blank()
|
||||
cli.Print(" %s %s\n", cli.KeyStyle.Render(i18n.Label("total")), formatCoverage(totalCov))
|
||||
cli.Print(" %s %s\n", cli.KeyStyle.Render(i18n.Label("statements")), formatCoverage(statementCov))
|
||||
cli.Print(" %s %s\n", cli.KeyStyle.Render(i18n.Label("branches")), formatCoverage(branchCov))
|
||||
|
||||
// Generate HTML if requested
|
||||
if covHTML || covOpen {
|
||||
|
|
@ -271,10 +311,14 @@ func addGoCovCommand(parent *cli.Command) {
|
|||
}
|
||||
}
|
||||
|
||||
// Check threshold
|
||||
if covThreshold > 0 && totalCov < covThreshold {
|
||||
cli.Print("\n%s %.1f%% < %.1f%%\n", errorStyle.Render(i18n.T("i18n.fail.meet", "threshold")), totalCov, covThreshold)
|
||||
return errors.New("coverage below threshold")
|
||||
// Check thresholds
|
||||
if covThreshold > 0 && statementCov < covThreshold {
|
||||
cli.Print("\n%s Statements: %.1f%% < %.1f%%\n", errorStyle.Render(i18n.T("i18n.fail.meet", "threshold")), statementCov, covThreshold)
|
||||
return errors.New("statement coverage below threshold")
|
||||
}
|
||||
if covBranchThreshold > 0 && branchCov < covBranchThreshold {
|
||||
cli.Print("\n%s Branches: %.1f%% < %.1f%%\n", errorStyle.Render(i18n.T("i18n.fail.meet", "threshold")), branchCov, covBranchThreshold)
|
||||
return errors.New("branch coverage below threshold")
|
||||
}
|
||||
|
||||
if testErr != nil {
|
||||
|
|
@ -289,11 +333,66 @@ func addGoCovCommand(parent *cli.Command) {
|
|||
covCmd.Flags().StringVar(&covPkg, "pkg", "", "Package to test")
|
||||
covCmd.Flags().BoolVar(&covHTML, "html", false, "Generate HTML report")
|
||||
covCmd.Flags().BoolVar(&covOpen, "open", false, "Open HTML report in browser")
|
||||
covCmd.Flags().Float64Var(&covThreshold, "threshold", 0, "Minimum coverage percentage")
|
||||
covCmd.Flags().Float64Var(&covThreshold, "threshold", 0, "Minimum statement coverage percentage")
|
||||
covCmd.Flags().Float64Var(&covBranchThreshold, "branch-threshold", 0, "Minimum branch coverage percentage")
|
||||
covCmd.Flags().StringVarP(&covOutput, "output", "o", "", "Output file for coverage profile")
|
||||
|
||||
parent.AddCommand(covCmd)
|
||||
}
|
||||
|
||||
// calculateBlockCoverage parses a Go coverage profile and returns the percentage of basic
|
||||
// blocks that have a non-zero execution count. Go's coverage profile contains one line per
|
||||
// basic block, where the last field is the execution count, not explicit branch coverage.
|
||||
// The resulting block coverage is used here only as a proxy for branch coverage; computing
|
||||
// true branch coverage would require more detailed control-flow analysis.
|
||||
func calculateBlockCoverage(path string) (float64, error) {
|
||||
file, err := os.Open(path)
|
||||
if err != nil {
|
||||
return 0, err
|
||||
}
|
||||
defer file.Close()
|
||||
|
||||
scanner := bufio.NewScanner(file)
|
||||
var totalBlocks, coveredBlocks int
|
||||
|
||||
// Skip the first line (mode: atomic/set/count)
|
||||
if !scanner.Scan() {
|
||||
return 0, nil
|
||||
}
|
||||
|
||||
for scanner.Scan() {
|
||||
line := scanner.Text()
|
||||
if line == "" {
|
||||
continue
|
||||
}
|
||||
fields := strings.Fields(line)
|
||||
if len(fields) < 3 {
|
||||
continue
|
||||
}
|
||||
|
||||
// Last field is the count
|
||||
count, err := strconv.Atoi(fields[len(fields)-1])
|
||||
if err != nil {
|
||||
continue
|
||||
}
|
||||
|
||||
totalBlocks++
|
||||
if count > 0 {
|
||||
coveredBlocks++
|
||||
}
|
||||
}
|
||||
|
||||
if err := scanner.Err(); err != nil {
|
||||
return 0, err
|
||||
}
|
||||
|
||||
if totalBlocks == 0 {
|
||||
return 0, nil
|
||||
}
|
||||
|
||||
return (float64(coveredBlocks) / float64(totalBlocks)) * 100, nil
|
||||
}
|
||||
|
||||
func findTestPackages(root string) ([]string, error) {
|
||||
pkgMap := make(map[string]bool)
|
||||
err := filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
|
||||
|
|
|
|||
|
|
@ -24,6 +24,7 @@ var (
|
|||
qaOnly string
|
||||
qaCoverage bool
|
||||
qaThreshold float64
|
||||
qaBranchThreshold float64
|
||||
qaDocblockThreshold float64
|
||||
qaJSON bool
|
||||
qaVerbose bool
|
||||
|
|
@ -71,7 +72,8 @@ Examples:
|
|||
// Coverage flags
|
||||
qaCmd.PersistentFlags().BoolVar(&qaCoverage, "coverage", false, "Include coverage reporting")
|
||||
qaCmd.PersistentFlags().BoolVarP(&qaCoverage, "cov", "c", false, "Include coverage reporting (shorthand)")
|
||||
qaCmd.PersistentFlags().Float64Var(&qaThreshold, "threshold", 0, "Minimum coverage threshold (0-100), fail if below")
|
||||
qaCmd.PersistentFlags().Float64Var(&qaThreshold, "threshold", 0, "Minimum statement coverage threshold (0-100), fail if below")
|
||||
qaCmd.PersistentFlags().Float64Var(&qaBranchThreshold, "branch-threshold", 0, "Minimum branch coverage threshold (0-100), fail if below")
|
||||
qaCmd.PersistentFlags().Float64Var(&qaDocblockThreshold, "docblock-threshold", 80, "Minimum docblock coverage threshold (0-100)")
|
||||
|
||||
// Test flags
|
||||
|
|
@ -134,11 +136,13 @@ Examples:
|
|||
|
||||
// QAResult holds the result of a QA run for JSON output
|
||||
type QAResult struct {
|
||||
Success bool `json:"success"`
|
||||
Duration string `json:"duration"`
|
||||
Checks []CheckResult `json:"checks"`
|
||||
Coverage *float64 `json:"coverage,omitempty"`
|
||||
Threshold *float64 `json:"threshold,omitempty"`
|
||||
Success bool `json:"success"`
|
||||
Duration string `json:"duration"`
|
||||
Checks []CheckResult `json:"checks"`
|
||||
Coverage *float64 `json:"coverage,omitempty"`
|
||||
BranchCoverage *float64 `json:"branch_coverage,omitempty"`
|
||||
Threshold *float64 `json:"threshold,omitempty"`
|
||||
BranchThreshold *float64 `json:"branch_threshold,omitempty"`
|
||||
}
|
||||
|
||||
// CheckResult holds the result of a single check
|
||||
|
|
@ -254,21 +258,34 @@ func runGoQA(cmd *cli.Command, args []string) error {
|
|||
|
||||
// Run coverage if requested
|
||||
var coverageVal *float64
|
||||
var branchVal *float64
|
||||
if qaCoverage && !qaFailFast || (qaCoverage && failed == 0) {
|
||||
cov, err := runCoverage(ctx, cwd)
|
||||
cov, branch, err := runCoverage(ctx, cwd)
|
||||
if err == nil {
|
||||
coverageVal = &cov
|
||||
branchVal = &branch
|
||||
if !qaJSON && !qaQuiet {
|
||||
cli.Print("\n%s %.1f%%\n", cli.DimStyle.Render("Coverage:"), cov)
|
||||
cli.Print("\n%s %.1f%%\n", cli.DimStyle.Render("Statement Coverage:"), cov)
|
||||
cli.Print("%s %.1f%%\n", cli.DimStyle.Render("Branch Coverage:"), branch)
|
||||
}
|
||||
if qaThreshold > 0 && cov < qaThreshold {
|
||||
failed++
|
||||
if !qaJSON && !qaQuiet {
|
||||
cli.Print(" %s Coverage %.1f%% below threshold %.1f%%\n",
|
||||
cli.Print(" %s Statement coverage %.1f%% below threshold %.1f%%\n",
|
||||
cli.ErrorStyle.Render(cli.Glyph(":cross:")), cov, qaThreshold)
|
||||
cli.Hint("fix", "Run 'core go cov --open' to see uncovered lines, then add tests.")
|
||||
}
|
||||
}
|
||||
if qaBranchThreshold > 0 && branch < qaBranchThreshold {
|
||||
failed++
|
||||
if !qaJSON && !qaQuiet {
|
||||
cli.Print(" %s Branch coverage %.1f%% below threshold %.1f%%\n",
|
||||
cli.ErrorStyle.Render(cli.Glyph(":cross:")), branch, qaBranchThreshold)
|
||||
}
|
||||
}
|
||||
|
||||
if failed > 0 && !qaJSON && !qaQuiet {
|
||||
cli.Hint("fix", "Run 'core go cov --open' to see uncovered lines, then add tests.")
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
|
|
@ -277,14 +294,18 @@ func runGoQA(cmd *cli.Command, args []string) error {
|
|||
// JSON output
|
||||
if qaJSON {
|
||||
qaResult := QAResult{
|
||||
Success: failed == 0,
|
||||
Duration: duration.String(),
|
||||
Checks: results,
|
||||
Coverage: coverageVal,
|
||||
Success: failed == 0,
|
||||
Duration: duration.String(),
|
||||
Checks: results,
|
||||
Coverage: coverageVal,
|
||||
BranchCoverage: branchVal,
|
||||
}
|
||||
if qaThreshold > 0 {
|
||||
qaResult.Threshold = &qaThreshold
|
||||
}
|
||||
if qaBranchThreshold > 0 {
|
||||
qaResult.BranchThreshold = &qaBranchThreshold
|
||||
}
|
||||
enc := json.NewEncoder(os.Stdout)
|
||||
enc.SetIndent("", " ")
|
||||
return enc.Encode(qaResult)
|
||||
|
|
@ -308,7 +329,7 @@ func runGoQA(cmd *cli.Command, args []string) error {
|
|||
}
|
||||
|
||||
if failed > 0 {
|
||||
os.Exit(1)
|
||||
return cli.Err("QA checks failed: %d passed, %d failed", passed, failed)
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
|
@ -525,8 +546,17 @@ func runCheckCapture(ctx context.Context, dir string, check QACheck) (string, er
|
|||
return "", cmd.Run()
|
||||
}
|
||||
|
||||
func runCoverage(ctx context.Context, dir string) (float64, error) {
|
||||
args := []string{"test", "-cover", "-coverprofile=/tmp/coverage.out"}
|
||||
func runCoverage(ctx context.Context, dir string) (float64, float64, error) {
|
||||
// Create temp file for coverage data
|
||||
covFile, err := os.CreateTemp("", "coverage-*.out")
|
||||
if err != nil {
|
||||
return 0, 0, err
|
||||
}
|
||||
covPath := covFile.Name()
|
||||
_ = covFile.Close()
|
||||
defer os.Remove(covPath)
|
||||
|
||||
args := []string{"test", "-cover", "-covermode=atomic", "-coverprofile=" + covPath}
|
||||
if qaShort {
|
||||
args = append(args, "-short")
|
||||
}
|
||||
|
|
@ -540,36 +570,36 @@ func runCoverage(ctx context.Context, dir string) (float64, error) {
|
|||
}
|
||||
|
||||
if err := cmd.Run(); err != nil {
|
||||
return 0, err
|
||||
return 0, 0, err
|
||||
}
|
||||
|
||||
// Parse coverage
|
||||
coverCmd := exec.CommandContext(ctx, "go", "tool", "cover", "-func=/tmp/coverage.out")
|
||||
// Parse statement coverage
|
||||
coverCmd := exec.CommandContext(ctx, "go", "tool", "cover", "-func="+covPath)
|
||||
output, err := coverCmd.Output()
|
||||
if err != nil {
|
||||
return 0, err
|
||||
return 0, 0, err
|
||||
}
|
||||
|
||||
// Parse last line for total coverage
|
||||
lines := strings.Split(strings.TrimSpace(string(output)), "\n")
|
||||
if len(lines) == 0 {
|
||||
return 0, nil
|
||||
var statementPct float64
|
||||
if len(lines) > 0 {
|
||||
lastLine := lines[len(lines)-1]
|
||||
fields := strings.Fields(lastLine)
|
||||
if len(fields) >= 3 {
|
||||
// Parse percentage (e.g., "45.6%")
|
||||
pctStr := strings.TrimSuffix(fields[len(fields)-1], "%")
|
||||
_, _ = fmt.Sscanf(pctStr, "%f", &statementPct)
|
||||
}
|
||||
}
|
||||
|
||||
lastLine := lines[len(lines)-1]
|
||||
fields := strings.Fields(lastLine)
|
||||
if len(fields) < 3 {
|
||||
return 0, nil
|
||||
// Parse branch coverage
|
||||
branchPct, err := calculateBlockCoverage(covPath)
|
||||
if err != nil {
|
||||
return statementPct, 0, err
|
||||
}
|
||||
|
||||
// Parse percentage (e.g., "45.6%")
|
||||
pctStr := strings.TrimSuffix(fields[len(fields)-1], "%")
|
||||
var pct float64
|
||||
if _, err := fmt.Sscanf(pctStr, "%f", &pct); err == nil {
|
||||
return pct, nil
|
||||
}
|
||||
|
||||
return 0, nil
|
||||
return statementPct, branchPct, nil
|
||||
}
|
||||
|
||||
// runInternalCheck runs internal Go-based checks (not external commands).
|
||||
|
|
|
|||
229
internal/cmd/go/coverage_test.go
Normal file
229
internal/cmd/go/coverage_test.go
Normal file
|
|
@ -0,0 +1,229 @@
|
|||
package gocmd
|
||||
|
||||
import (
|
||||
"os"
|
||||
"testing"
|
||||
|
||||
"github.com/host-uk/core/pkg/cli"
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestCalculateBlockCoverage(t *testing.T) {
|
||||
// Create a dummy coverage profile
|
||||
content := `mode: set
|
||||
github.com/host-uk/core/pkg/foo.go:1.2,3.4 5 1
|
||||
github.com/host-uk/core/pkg/foo.go:5.6,7.8 2 0
|
||||
github.com/host-uk/core/pkg/bar.go:10.1,12.20 10 5
|
||||
`
|
||||
tmpfile, err := os.CreateTemp("", "test-coverage-*.out")
|
||||
assert.NoError(t, err)
|
||||
defer os.Remove(tmpfile.Name())
|
||||
|
||||
_, err = tmpfile.Write([]byte(content))
|
||||
assert.NoError(t, err)
|
||||
err = tmpfile.Close()
|
||||
assert.NoError(t, err)
|
||||
|
||||
// Test calculation
|
||||
// 3 blocks total, 2 covered (count > 0)
|
||||
// Expect (2/3) * 100 = 66.666...
|
||||
pct, err := calculateBlockCoverage(tmpfile.Name())
|
||||
assert.NoError(t, err)
|
||||
assert.InDelta(t, 66.67, pct, 0.01)
|
||||
|
||||
// Test empty file (only header)
|
||||
contentEmpty := "mode: atomic\n"
|
||||
tmpfileEmpty, _ := os.CreateTemp("", "test-coverage-empty-*.out")
|
||||
defer os.Remove(tmpfileEmpty.Name())
|
||||
tmpfileEmpty.Write([]byte(contentEmpty))
|
||||
tmpfileEmpty.Close()
|
||||
|
||||
pct, err = calculateBlockCoverage(tmpfileEmpty.Name())
|
||||
assert.NoError(t, err)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
|
||||
// Test non-existent file
|
||||
pct, err = calculateBlockCoverage("non-existent-file")
|
||||
assert.Error(t, err)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
|
||||
// Test malformed file
|
||||
contentMalformed := `mode: set
|
||||
github.com/host-uk/core/pkg/foo.go:1.2,3.4 5
|
||||
github.com/host-uk/core/pkg/foo.go:1.2,3.4 5 notanumber
|
||||
`
|
||||
tmpfileMalformed, _ := os.CreateTemp("", "test-coverage-malformed-*.out")
|
||||
defer os.Remove(tmpfileMalformed.Name())
|
||||
tmpfileMalformed.Write([]byte(contentMalformed))
|
||||
tmpfileMalformed.Close()
|
||||
|
||||
pct, err = calculateBlockCoverage(tmpfileMalformed.Name())
|
||||
assert.NoError(t, err)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
|
||||
// Test malformed file - missing fields
|
||||
contentMalformed2 := `mode: set
|
||||
github.com/host-uk/core/pkg/foo.go:1.2,3.4 5
|
||||
`
|
||||
tmpfileMalformed2, _ := os.CreateTemp("", "test-coverage-malformed2-*.out")
|
||||
defer os.Remove(tmpfileMalformed2.Name())
|
||||
tmpfileMalformed2.Write([]byte(contentMalformed2))
|
||||
tmpfileMalformed2.Close()
|
||||
|
||||
pct, err = calculateBlockCoverage(tmpfileMalformed2.Name())
|
||||
assert.NoError(t, err)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
|
||||
// Test completely empty file
|
||||
tmpfileEmpty2, _ := os.CreateTemp("", "test-coverage-empty2-*.out")
|
||||
defer os.Remove(tmpfileEmpty2.Name())
|
||||
tmpfileEmpty2.Close()
|
||||
pct, err = calculateBlockCoverage(tmpfileEmpty2.Name())
|
||||
assert.NoError(t, err)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
}
|
||||
|
||||
func TestParseOverallCoverage(t *testing.T) {
|
||||
output := `ok github.com/host-uk/core/pkg/foo 0.100s coverage: 50.0% of statements
|
||||
ok github.com/host-uk/core/pkg/bar 0.200s coverage: 100.0% of statements
|
||||
`
|
||||
pct := parseOverallCoverage(output)
|
||||
assert.Equal(t, 75.0, pct)
|
||||
|
||||
outputNoCov := "ok github.com/host-uk/core/pkg/foo 0.100s"
|
||||
pct = parseOverallCoverage(outputNoCov)
|
||||
assert.Equal(t, 0.0, pct)
|
||||
}
|
||||
|
||||
func TestFormatCoverage(t *testing.T) {
|
||||
assert.Contains(t, formatCoverage(85.0), "85.0%")
|
||||
assert.Contains(t, formatCoverage(65.0), "65.0%")
|
||||
assert.Contains(t, formatCoverage(25.0), "25.0%")
|
||||
}
|
||||
|
||||
func TestAddGoCovCommand(t *testing.T) {
|
||||
cmd := &cli.Command{Use: "test"}
|
||||
addGoCovCommand(cmd)
|
||||
assert.True(t, cmd.HasSubCommands())
|
||||
sub := cmd.Commands()[0]
|
||||
assert.Equal(t, "cov", sub.Name())
|
||||
}
|
||||
|
||||
func TestAddGoQACommand(t *testing.T) {
|
||||
cmd := &cli.Command{Use: "test"}
|
||||
addGoQACommand(cmd)
|
||||
assert.True(t, cmd.HasSubCommands())
|
||||
sub := cmd.Commands()[0]
|
||||
assert.Equal(t, "qa", sub.Name())
|
||||
}
|
||||
|
||||
func TestDetermineChecks(t *testing.T) {
|
||||
// Default checks
|
||||
qaOnly = ""
|
||||
qaSkip = ""
|
||||
qaRace = false
|
||||
qaBench = false
|
||||
checks := determineChecks()
|
||||
assert.Contains(t, checks, "fmt")
|
||||
assert.Contains(t, checks, "test")
|
||||
|
||||
// Only
|
||||
qaOnly = "fmt,lint"
|
||||
checks = determineChecks()
|
||||
assert.Equal(t, []string{"fmt", "lint"}, checks)
|
||||
|
||||
// Skip
|
||||
qaOnly = ""
|
||||
qaSkip = "fmt,lint"
|
||||
checks = determineChecks()
|
||||
assert.NotContains(t, checks, "fmt")
|
||||
assert.NotContains(t, checks, "lint")
|
||||
assert.Contains(t, checks, "test")
|
||||
|
||||
// Race
|
||||
qaSkip = ""
|
||||
qaRace = true
|
||||
checks = determineChecks()
|
||||
assert.Contains(t, checks, "race")
|
||||
assert.NotContains(t, checks, "test")
|
||||
|
||||
// Reset
|
||||
qaRace = false
|
||||
}
|
||||
|
||||
func TestBuildCheck(t *testing.T) {
|
||||
qaFix = false
|
||||
c := buildCheck("fmt")
|
||||
assert.Equal(t, "format", c.Name)
|
||||
assert.Equal(t, []string{"-l", "."}, c.Args)
|
||||
|
||||
qaFix = true
|
||||
c = buildCheck("fmt")
|
||||
assert.Equal(t, []string{"-w", "."}, c.Args)
|
||||
|
||||
c = buildCheck("vet")
|
||||
assert.Equal(t, "vet", c.Name)
|
||||
|
||||
c = buildCheck("lint")
|
||||
assert.Equal(t, "lint", c.Name)
|
||||
|
||||
c = buildCheck("test")
|
||||
assert.Equal(t, "test", c.Name)
|
||||
|
||||
c = buildCheck("race")
|
||||
assert.Equal(t, "race", c.Name)
|
||||
|
||||
c = buildCheck("bench")
|
||||
assert.Equal(t, "bench", c.Name)
|
||||
|
||||
c = buildCheck("vuln")
|
||||
assert.Equal(t, "vuln", c.Name)
|
||||
|
||||
c = buildCheck("sec")
|
||||
assert.Equal(t, "sec", c.Name)
|
||||
|
||||
c = buildCheck("fuzz")
|
||||
assert.Equal(t, "fuzz", c.Name)
|
||||
|
||||
c = buildCheck("docblock")
|
||||
assert.Equal(t, "docblock", c.Name)
|
||||
|
||||
c = buildCheck("unknown")
|
||||
assert.Equal(t, "", c.Name)
|
||||
}
|
||||
|
||||
func TestBuildChecks(t *testing.T) {
|
||||
checks := buildChecks([]string{"fmt", "vet", "unknown"})
|
||||
assert.Equal(t, 2, len(checks))
|
||||
assert.Equal(t, "format", checks[0].Name)
|
||||
assert.Equal(t, "vet", checks[1].Name)
|
||||
}
|
||||
|
||||
func TestFixHintFor(t *testing.T) {
|
||||
assert.Contains(t, fixHintFor("format", ""), "core go qa fmt --fix")
|
||||
assert.Contains(t, fixHintFor("vet", ""), "go vet")
|
||||
assert.Contains(t, fixHintFor("lint", ""), "core go qa lint --fix")
|
||||
assert.Contains(t, fixHintFor("test", "--- FAIL: TestFoo"), "TestFoo")
|
||||
assert.Contains(t, fixHintFor("race", ""), "Data race")
|
||||
assert.Contains(t, fixHintFor("bench", ""), "Benchmark regression")
|
||||
assert.Contains(t, fixHintFor("vuln", ""), "govulncheck")
|
||||
assert.Contains(t, fixHintFor("sec", ""), "gosec")
|
||||
assert.Contains(t, fixHintFor("fuzz", ""), "crashing input")
|
||||
assert.Contains(t, fixHintFor("docblock", ""), "doc comments")
|
||||
assert.Equal(t, "", fixHintFor("unknown", ""))
|
||||
}
|
||||
|
||||
func TestRunGoQA_NoGoMod(t *testing.T) {
|
||||
// runGoQA should fail if go.mod is not present in CWD
|
||||
// We run it in a temp dir without go.mod
|
||||
tmpDir, _ := os.MkdirTemp("", "test-qa-*")
|
||||
defer os.RemoveAll(tmpDir)
|
||||
cwd, _ := os.Getwd()
|
||||
os.Chdir(tmpDir)
|
||||
defer os.Chdir(cwd)
|
||||
|
||||
cmd := &cli.Command{Use: "qa"}
|
||||
err := runGoQA(cmd, []string{})
|
||||
assert.Error(t, err)
|
||||
assert.Contains(t, err.Error(), "no go.mod found")
|
||||
}
|
||||
96
internal/cmd/mcpcmd/cmd_mcp.go
Normal file
96
internal/cmd/mcpcmd/cmd_mcp.go
Normal file
|
|
@ -0,0 +1,96 @@
|
|||
// Package mcpcmd provides the MCP server command.
|
||||
//
|
||||
// Commands:
|
||||
// - mcp serve: Start the MCP server for AI tool integration
|
||||
package mcpcmd
|
||||
|
||||
import (
|
||||
"context"
|
||||
"os"
|
||||
"os/signal"
|
||||
"syscall"
|
||||
|
||||
"github.com/host-uk/core/pkg/cli"
|
||||
"github.com/host-uk/core/pkg/mcp"
|
||||
)
|
||||
|
||||
func init() {
|
||||
cli.RegisterCommands(AddMCPCommands)
|
||||
}
|
||||
|
||||
var workspaceFlag string
|
||||
|
||||
var mcpCmd = &cli.Command{
|
||||
Use: "mcp",
|
||||
Short: "MCP server for AI tool integration",
|
||||
Long: "Model Context Protocol (MCP) server providing file operations, RAG, and metrics tools.",
|
||||
}
|
||||
|
||||
var serveCmd = &cli.Command{
|
||||
Use: "serve",
|
||||
Short: "Start the MCP server",
|
||||
Long: `Start the MCP server on stdio (default) or TCP.
|
||||
|
||||
The server provides file operations, RAG tools, and metrics tools for AI assistants.
|
||||
|
||||
Environment variables:
|
||||
MCP_ADDR TCP address to listen on (e.g., "localhost:9999")
|
||||
If not set, uses stdio transport.
|
||||
|
||||
Examples:
|
||||
# Start with stdio transport (for Claude Code integration)
|
||||
core mcp serve
|
||||
|
||||
# Start with workspace restriction
|
||||
core mcp serve --workspace /path/to/project
|
||||
|
||||
# Start TCP server
|
||||
MCP_ADDR=localhost:9999 core mcp serve`,
|
||||
RunE: func(cmd *cli.Command, args []string) error {
|
||||
return runServe()
|
||||
},
|
||||
}
|
||||
|
||||
func initFlags() {
|
||||
cli.StringFlag(serveCmd, &workspaceFlag, "workspace", "w", "", "Restrict file operations to this directory (empty = unrestricted)")
|
||||
}
|
||||
|
||||
// AddMCPCommands registers the 'mcp' command and all subcommands.
|
||||
func AddMCPCommands(root *cli.Command) {
|
||||
initFlags()
|
||||
mcpCmd.AddCommand(serveCmd)
|
||||
root.AddCommand(mcpCmd)
|
||||
}
|
||||
|
||||
func runServe() error {
|
||||
// Build MCP service options
|
||||
var opts []mcp.Option
|
||||
|
||||
if workspaceFlag != "" {
|
||||
opts = append(opts, mcp.WithWorkspaceRoot(workspaceFlag))
|
||||
} else {
|
||||
// Explicitly unrestricted when no workspace specified
|
||||
opts = append(opts, mcp.WithWorkspaceRoot(""))
|
||||
}
|
||||
|
||||
// Create the MCP service
|
||||
svc, err := mcp.New(opts...)
|
||||
if err != nil {
|
||||
return cli.Wrap(err, "create MCP service")
|
||||
}
|
||||
|
||||
// Set up signal handling for clean shutdown
|
||||
ctx, cancel := context.WithCancel(context.Background())
|
||||
defer cancel()
|
||||
|
||||
sigCh := make(chan os.Signal, 1)
|
||||
signal.Notify(sigCh, syscall.SIGINT, syscall.SIGTERM)
|
||||
|
||||
go func() {
|
||||
<-sigCh
|
||||
cancel()
|
||||
}()
|
||||
|
||||
// Run the server (blocks until context cancelled or error)
|
||||
return svc.Run(ctx)
|
||||
}
|
||||
|
|
@ -189,7 +189,7 @@ func runPHPCI() error {
|
|||
return err
|
||||
}
|
||||
if !result.Passed {
|
||||
os.Exit(result.ExitCode)
|
||||
return cli.Exit(result.ExitCode, cli.Err("CI pipeline failed"))
|
||||
}
|
||||
return nil
|
||||
}
|
||||
|
|
|
|||
|
|
@ -167,7 +167,7 @@ func CheckDocblockCoverage(patterns []string) (*DocblockResult, error) {
|
|||
}, parser.ParseComments)
|
||||
if err != nil {
|
||||
// Log parse errors but continue to check other directories
|
||||
fmt.Fprintf(os.Stderr, "warning: failed to parse %s: %v\n", dir, err)
|
||||
cli.Warnf("failed to parse %s: %v", dir, err)
|
||||
continue
|
||||
}
|
||||
|
||||
|
|
|
|||
|
|
@ -96,8 +96,7 @@ func runSDKDiff(basePath, specPath string) error {
|
|||
|
||||
result, err := Diff(basePath, specPath)
|
||||
if err != nil {
|
||||
fmt.Printf("%s %v\n", sdkErrorStyle.Render(i18n.Label("error")), err)
|
||||
os.Exit(2)
|
||||
return cli.Exit(2, cli.Wrap(err, i18n.Label("error")))
|
||||
}
|
||||
|
||||
if result.Breaking {
|
||||
|
|
@ -105,7 +104,7 @@ func runSDKDiff(basePath, specPath string) error {
|
|||
for _, change := range result.Changes {
|
||||
fmt.Printf(" - %s\n", change)
|
||||
}
|
||||
os.Exit(1)
|
||||
return cli.Exit(1, cli.Err("%s", result.Summary))
|
||||
}
|
||||
|
||||
fmt.Printf("%s %s\n", sdkSuccessStyle.Render(i18n.T("cmd.sdk.label.ok")), result.Summary)
|
||||
|
|
|
|||
|
|
@ -138,7 +138,11 @@ func printCoverageSummary(results testResults) {
|
|||
continue
|
||||
}
|
||||
name := shortenPackageName(pkg.name)
|
||||
padding := strings.Repeat(" ", maxLen-len(name)+2)
|
||||
padLen := maxLen - len(name) + 2
|
||||
if padLen < 0 {
|
||||
padLen = 2
|
||||
}
|
||||
padding := strings.Repeat(" ", padLen)
|
||||
fmt.Printf(" %s%s%s\n", name, padding, formatCoverage(pkg.coverage))
|
||||
}
|
||||
|
||||
|
|
@ -146,7 +150,11 @@ func printCoverageSummary(results testResults) {
|
|||
if results.covCount > 0 {
|
||||
avgCov := results.totalCov / float64(results.covCount)
|
||||
avgLabel := i18n.T("cmd.test.label.average")
|
||||
padding := strings.Repeat(" ", maxLen-len(avgLabel)+2)
|
||||
padLen := maxLen - len(avgLabel) + 2
|
||||
if padLen < 0 {
|
||||
padLen = 2
|
||||
}
|
||||
padding := strings.Repeat(" ", padLen)
|
||||
fmt.Printf("\n %s%s%s\n", testHeaderStyle.Render(avgLabel), padding, formatCoverage(avgCov))
|
||||
}
|
||||
}
|
||||
|
|
|
|||
52
internal/cmd/test/output_test.go
Normal file
52
internal/cmd/test/output_test.go
Normal file
|
|
@ -0,0 +1,52 @@
|
|||
package testcmd
|
||||
|
||||
import (
|
||||
"testing"
|
||||
|
||||
"github.com/stretchr/testify/assert"
|
||||
)
|
||||
|
||||
func TestShortenPackageName(t *testing.T) {
|
||||
assert.Equal(t, "pkg/foo", shortenPackageName("github.com/host-uk/core/pkg/foo"))
|
||||
assert.Equal(t, "core-php", shortenPackageName("github.com/host-uk/core-php"))
|
||||
assert.Equal(t, "bar", shortenPackageName("github.com/other/bar"))
|
||||
}
|
||||
|
||||
func TestFormatCoverageTest(t *testing.T) {
|
||||
assert.Contains(t, formatCoverage(85.0), "85.0%")
|
||||
assert.Contains(t, formatCoverage(65.0), "65.0%")
|
||||
assert.Contains(t, formatCoverage(25.0), "25.0%")
|
||||
}
|
||||
|
||||
func TestParseTestOutput(t *testing.T) {
|
||||
output := `ok github.com/host-uk/core/pkg/foo 0.100s coverage: 50.0% of statements
|
||||
FAIL github.com/host-uk/core/pkg/bar
|
||||
? github.com/host-uk/core/pkg/baz [no test files]
|
||||
`
|
||||
results := parseTestOutput(output)
|
||||
assert.Equal(t, 1, results.passed)
|
||||
assert.Equal(t, 1, results.failed)
|
||||
assert.Equal(t, 1, results.skipped)
|
||||
assert.Equal(t, 1, len(results.failedPkgs))
|
||||
assert.Equal(t, "github.com/host-uk/core/pkg/bar", results.failedPkgs[0])
|
||||
assert.Equal(t, 1, len(results.packages))
|
||||
assert.Equal(t, 50.0, results.packages[0].coverage)
|
||||
}
|
||||
|
||||
func TestPrintCoverageSummarySafe(t *testing.T) {
|
||||
// This tests the bug fix for long package names causing negative Repeat count
|
||||
results := testResults{
|
||||
packages: []packageCoverage{
|
||||
{name: "github.com/host-uk/core/pkg/short", coverage: 100, hasCov: true},
|
||||
{name: "github.com/host-uk/core/pkg/a-very-very-very-very-very-long-package-name-that-might-cause-issues", coverage: 80, hasCov: true},
|
||||
},
|
||||
passed: 2,
|
||||
totalCov: 180,
|
||||
covCount: 2,
|
||||
}
|
||||
|
||||
// Should not panic
|
||||
assert.NotPanics(t, func() {
|
||||
printCoverageSummary(results)
|
||||
})
|
||||
}
|
||||
|
|
@ -3,7 +3,6 @@ package updater
|
|||
import (
|
||||
"context"
|
||||
"fmt"
|
||||
"os"
|
||||
"runtime"
|
||||
|
||||
"github.com/host-uk/core/pkg/cli"
|
||||
|
|
@ -133,8 +132,6 @@ func runUpdate(cmd *cobra.Command, args []string) error {
|
|||
cli.Print("%s Updated to %s\n", cli.SuccessStyle.Render(cli.Glyph(":check:")), release.TagName)
|
||||
cli.Print("%s Restarting...\n", cli.DimStyle.Render("→"))
|
||||
|
||||
// Exit so the watcher can restart us
|
||||
os.Exit(0)
|
||||
return nil
|
||||
}
|
||||
|
||||
|
|
@ -179,7 +176,6 @@ func handleDevUpdate(currentVersion string) error {
|
|||
cli.Print("%s Updated to %s\n", cli.SuccessStyle.Render(cli.Glyph(":check:")), release.TagName)
|
||||
cli.Print("%s Restarting...\n", cli.DimStyle.Render("→"))
|
||||
|
||||
os.Exit(0)
|
||||
return nil
|
||||
}
|
||||
|
||||
|
|
@ -216,6 +212,5 @@ func handleDevTagUpdate(currentVersion string) error {
|
|||
cli.Print("%s Updated to latest dev build\n", cli.SuccessStyle.Render(cli.Glyph(":check:")))
|
||||
cli.Print("%s Restarting...\n", cli.DimStyle.Render("→"))
|
||||
|
||||
os.Exit(0)
|
||||
return nil
|
||||
}
|
||||
|
|
|
|||
|
|
@@ -20,6 +20,8 @@
// - test: Test runner with coverage
// - qa: Quality assurance workflows
// - monitor: Security monitoring aggregation
// - gitea: Gitea instance management (repos, issues, PRs, mirrors)
// - unifi: UniFi network management (sites, devices, clients)

package variants

@@ -35,8 +37,10 @@ import (
    _ "github.com/host-uk/core/internal/cmd/docs"
    _ "github.com/host-uk/core/internal/cmd/doctor"
    _ "github.com/host-uk/core/internal/cmd/gitcmd"
    _ "github.com/host-uk/core/internal/cmd/gitea"
    _ "github.com/host-uk/core/internal/cmd/go"
    _ "github.com/host-uk/core/internal/cmd/help"
    _ "github.com/host-uk/core/internal/cmd/mcpcmd"
    _ "github.com/host-uk/core/internal/cmd/monitor"
    _ "github.com/host-uk/core/internal/cmd/php"
    _ "github.com/host-uk/core/internal/cmd/pkgcmd"

@@ -46,6 +50,7 @@ import (
    _ "github.com/host-uk/core/internal/cmd/security"
    _ "github.com/host-uk/core/internal/cmd/setup"
    _ "github.com/host-uk/core/internal/cmd/test"
    _ "github.com/host-uk/core/internal/cmd/unifi"
    _ "github.com/host-uk/core/internal/cmd/updater"
    _ "github.com/host-uk/core/internal/cmd/vm"
    _ "github.com/host-uk/core/internal/cmd/workspace"
mkdocs.yml (31 lines changed)

@@ -43,6 +43,26 @@ markdown_extensions:

nav:
  - Home: index.md
  - User Documentation:
    - User Guide: user-guide.md
    - FAQ: faq.md
    - Troubleshooting: troubleshooting.md
    - Workflows: workflows.md
  - CLI Reference:
    - Overview: cmd/index.md
    - AI: cmd/ai/index.md
    - Build: cmd/build/index.md
    - CI: cmd/ci/index.md
    - Dev: cmd/dev/index.md
    - Go: cmd/go/index.md
    - PHP: cmd/php/index.md
    - SDK: cmd/sdk/index.md
    - Setup: cmd/setup/index.md
    - Doctor: cmd/doctor/index.md
    - Test: cmd/test/index.md
    - VM: cmd/vm/index.md
    - Pkg: cmd/pkg/index.md
    - Docs: cmd/docs/index.md
  - Getting Started:
    - Installation: getting-started/installation.md
    - Quick Start: getting-started/quickstart.md

@@ -71,3 +91,14 @@ nav:
  - API Reference:
    - Core: api/core.md
    - Display: api/display.md
  - Development:
    - Package Standards: pkg/PACKAGE_STANDARDS.md
    - Internationalization:
      - Overview: pkg/i18n/README.md
      - Grammar: pkg/i18n/GRAMMAR.md
      - Extending: pkg/i18n/EXTENDING.md
    - Claude Skill: skill/index.md
  - Reference:
    - Configuration: configuration.md
    - Migration: migration.md
    - Glossary: glossary.md
@@ -7,6 +7,7 @@ import (
    "strings"

    "github.com/host-uk/core/pkg/framework"
    "github.com/host-uk/core/pkg/log"
)

// Tasks for AI service

@@ -68,10 +69,16 @@ func (s *Service) handleTask(c *framework.Core, t framework.Task) (any, bool, er
    switch m := t.(type) {
    case TaskCommit:
        err := s.doCommit(m)
        if err != nil {
            log.Error("agentic: commit task failed", "err", err, "path", m.Path)
        }
        return nil, true, err

    case TaskPrompt:
        err := s.doPrompt(m)
        if err != nil {
            log.Error("agentic: prompt task failed", "err", err)
        }
        return nil, true, err
    }
    return nil, false, nil
@@ -120,7 +120,7 @@ func (e *Executor) runPlay(ctx context.Context, play *Play) error {
        if err := e.gatherFacts(ctx, host, play); err != nil {
            // Non-fatal
            if e.Verbose > 0 {
                fmt.Fprintf(os.Stderr, "Warning: gather facts failed for %s: %v\n", host, err)
                log.Warn("gather facts failed", "host", host, "err", err)
            }
        }
    }
@@ -30,7 +30,6 @@ type SSHClient struct {
    becomeUser string
    becomePass string
    timeout    time.Duration
    insecure   bool
}

// SSHConfig holds SSH connection configuration.

@@ -44,7 +43,6 @@ type SSHConfig struct {
    BecomeUser string
    BecomePass string
    Timeout    time.Duration
    Insecure   bool
}

// NewSSHClient creates a new SSH client.

@@ -69,7 +67,6 @@ func NewSSHClient(cfg SSHConfig) (*SSHClient, error) {
        becomeUser: cfg.BecomeUser,
        becomePass: cfg.BecomePass,
        timeout:    cfg.Timeout,
        insecure:   cfg.Insecure,
    }

    return client, nil

@@ -137,21 +134,27 @@ func (c *SSHClient) Connect(ctx context.Context) error {
    // Host key verification
    var hostKeyCallback ssh.HostKeyCallback

    if c.insecure {
        hostKeyCallback = ssh.InsecureIgnoreHostKey()
    } else {
        home, err := os.UserHomeDir()
        if err != nil {
            return log.E("ssh.Connect", "failed to get user home dir", err)
        }
        knownHostsPath := filepath.Join(home, ".ssh", "known_hosts")

        cb, err := knownhosts.New(knownHostsPath)
        if err != nil {
            return log.E("ssh.Connect", "failed to load known_hosts (use Insecure=true to bypass)", err)
        }
        hostKeyCallback = cb
    home, err := os.UserHomeDir()
    if err != nil {
        return log.E("ssh.Connect", "failed to get user home dir", err)
    }
    knownHostsPath := filepath.Join(home, ".ssh", "known_hosts")

    // Ensure known_hosts file exists
    if _, err := os.Stat(knownHostsPath); os.IsNotExist(err) {
        if err := os.MkdirAll(filepath.Dir(knownHostsPath), 0700); err != nil {
            return log.E("ssh.Connect", "failed to create .ssh dir", err)
        }
        if err := os.WriteFile(knownHostsPath, nil, 0600); err != nil {
            return log.E("ssh.Connect", "failed to create known_hosts file", err)
        }
    }

    cb, err := knownhosts.New(knownHostsPath)
    if err != nil {
        return log.E("ssh.Connect", "failed to load known_hosts", err)
    }
    hostKeyCallback = cb

    config := &ssh.ClientConfig{
        User: c.user,
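For context, the known_hosts-backed verification the new Connect code relies on can be reproduced with golang.org/x/crypto/ssh directly. This standalone sketch (not the package's actual API; the function name and timeout are illustrative) fails the handshake on unknown or changed host keys instead of skipping verification:

package example

import (
    "fmt"
    "os"
    "path/filepath"
    "time"

    "golang.org/x/crypto/ssh"
    "golang.org/x/crypto/ssh/knownhosts"
)

// dialVerified mirrors the verification flow above: host keys are checked
// against ~/.ssh/known_hosts rather than being ignored.
func dialVerified(addr, user string, auth ssh.AuthMethod) (*ssh.Client, error) {
    home, err := os.UserHomeDir()
    if err != nil {
        return nil, err
    }
    cb, err := knownhosts.New(filepath.Join(home, ".ssh", "known_hosts"))
    if err != nil {
        return nil, fmt.Errorf("load known_hosts: %w", err)
    }
    cfg := &ssh.ClientConfig{
        User:            user,
        Auth:            []ssh.AuthMethod{auth},
        HostKeyCallback: cb, // fails the handshake on unknown or mismatched keys
        Timeout:         10 * time.Second,
    }
    return ssh.Dial("tcp", addr, cfg)
}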
@@ -1,10 +1,14 @@
package cli

import (
    "fmt"
    "os"
    "runtime/debug"

    "github.com/host-uk/core/pkg/crypt/openpgp"
    "github.com/host-uk/core/pkg/framework"
    "github.com/host-uk/core/pkg/log"
    "github.com/host-uk/core/pkg/workspace"
    "github.com/spf13/cobra"
)

@@ -20,8 +24,17 @@ var AppVersion = "dev"

// Main initialises and runs the CLI application.
// This is the main entry point for the CLI.
// Exits with code 1 on error.
// Exits with code 1 on error or panic.
func Main() {
    // Recovery from panics
    defer func() {
        if r := recover(); r != nil {
            log.Error("recovered from panic", "error", r, "stack", string(debug.Stack()))
            Shutdown()
            Fatal(fmt.Errorf("panic: %v", r))
        }
    }()

    // Initialise CLI runtime with services
    if err := Init(Options{
        AppName: AppName,

@@ -31,16 +44,27 @@ func Main() {
            framework.WithName("log", NewLogService(log.Options{
                Level: log.LevelInfo,
            })),
            framework.WithName("crypt", openpgp.New),
            framework.WithName("workspace", workspace.New),
        },
    }); err != nil {
        Fatal(err)
        Error(err.Error())
        os.Exit(1)
    }
    defer Shutdown()

    // Add completion command to the CLI's root
    RootCmd().AddCommand(completionCmd)

    Fatal(Execute())
    if err := Execute(); err != nil {
        code := 1
        var exitErr *ExitError
        if As(err, &exitErr) {
            code = exitErr.Code
        }
        Error(err.Error())
        os.Exit(code)
    }
}

// completionCmd generates shell completion scripts.
@@ -219,7 +219,7 @@ func (h *HealthServer) Start() error {

    go func() {
        if err := h.server.Serve(listener); err != http.ErrServerClosed {
            LogError(fmt.Sprintf("health server error: %v", err))
            LogError("health server error", "err", err)
        }
    }()
@@ -77,48 +77,86 @@ func Join(errs ...error) error {
    return errors.Join(errs...)
}

// ExitError represents an error that should cause the CLI to exit with a specific code.
type ExitError struct {
    Code int
    Err  error
}

func (e *ExitError) Error() string {
    if e.Err == nil {
        return ""
    }
    return e.Err.Error()
}

func (e *ExitError) Unwrap() error {
    return e.Err
}

// Exit creates a new ExitError with the given code and error.
// Use this to return an error from a command with a specific exit code.
func Exit(code int, err error) error {
    if err == nil {
        return nil
    }
    return &ExitError{Code: code, Err: err}
}

// ─────────────────────────────────────────────────────────────────────────────
// Fatal Functions (print and exit)
// Fatal Functions (Deprecated - return error from command instead)
// ─────────────────────────────────────────────────────────────────────────────

// Fatal prints an error message and exits with code 1.
// Fatal prints an error message to stderr, logs it, and exits with code 1.
//
// Deprecated: return an error from the command instead.
func Fatal(err error) {
    if err != nil {
        fmt.Println(ErrorStyle.Render(Glyph(":cross:") + " " + err.Error()))
        LogError("Fatal error", "err", err)
        fmt.Fprintln(os.Stderr, ErrorStyle.Render(Glyph(":cross:")+" "+err.Error()))
        os.Exit(1)
    }
}

// Fatalf prints a formatted error message and exits with code 1.
// Fatalf prints a formatted error message to stderr, logs it, and exits with code 1.
//
// Deprecated: return an error from the command instead.
func Fatalf(format string, args ...any) {
    msg := fmt.Sprintf(format, args...)
    fmt.Println(ErrorStyle.Render(Glyph(":cross:") + " " + msg))
    LogError("Fatal error", "msg", msg)
    fmt.Fprintln(os.Stderr, ErrorStyle.Render(Glyph(":cross:")+" "+msg))
    os.Exit(1)
}

// FatalWrap prints a wrapped error message and exits with code 1.
// FatalWrap prints a wrapped error message to stderr, logs it, and exits with code 1.
// Does nothing if err is nil.
//
// Deprecated: return an error from the command instead.
//
// cli.FatalWrap(err, "load config") // Prints "✗ load config: <error>" and exits
func FatalWrap(err error, msg string) {
    if err == nil {
        return
    }
    LogError("Fatal error", "msg", msg, "err", err)
    fullMsg := fmt.Sprintf("%s: %v", msg, err)
    fmt.Println(ErrorStyle.Render(Glyph(":cross:") + " " + fullMsg))
    fmt.Fprintln(os.Stderr, ErrorStyle.Render(Glyph(":cross:")+" "+fullMsg))
    os.Exit(1)
}

// FatalWrapVerb prints a wrapped error using i18n grammar and exits with code 1.
// FatalWrapVerb prints a wrapped error using i18n grammar to stderr, logs it, and exits with code 1.
// Does nothing if err is nil.
//
// Deprecated: return an error from the command instead.
//
// cli.FatalWrapVerb(err, "load", "config") // Prints "✗ Failed to load config: <error>" and exits
func FatalWrapVerb(err error, verb, subject string) {
    if err == nil {
        return
    }
    msg := i18n.ActionFailed(verb, subject)
    LogError("Fatal error", "msg", msg, "err", err, "verb", verb, "subject", subject)
    fullMsg := fmt.Sprintf("%s: %v", msg, err)
    fmt.Println(ErrorStyle.Render(Glyph(":cross:") + " " + fullMsg))
    fmt.Fprintln(os.Stderr, ErrorStyle.Render(Glyph(":cross:")+" "+fullMsg))
    os.Exit(1)
}
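With Exit and ExitError in place, a command can pick its own exit code instead of calling the deprecated Fatal helpers; Main() unwraps the ExitError, prints the message to stderr, and exits with that code. A hypothetical command wiring (the package, command name, and check are illustrative only):

package mycmd // hypothetical package, for illustration only

import (
    "errors"

    "github.com/host-uk/core/pkg/cli"
    "github.com/spf13/cobra"
)

var checkCmd = &cobra.Command{
    Use: "check",
    RunE: func(cmd *cobra.Command, args []string) error {
        ok := false // stand-in for a real check
        if !ok {
            // Main() sees the wrapped ExitError, prints the message to
            // stderr via Error(), and exits with code 2 instead of 1.
            return cli.Exit(2, errors.New("check failed"))
        }
        return nil
    },
}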
@@ -68,31 +68,31 @@ func Log() *LogService {
    return svc
}

// LogDebug logs a debug message if log service is available.
func LogDebug(msg string) {
// LogDebug logs a debug message with optional key-value pairs if log service is available.
func LogDebug(msg string, keyvals ...any) {
    if l := Log(); l != nil {
        l.Debug(msg)
        l.Debug(msg, keyvals...)
    }
}

// LogInfo logs an info message if log service is available.
func LogInfo(msg string) {
// LogInfo logs an info message with optional key-value pairs if log service is available.
func LogInfo(msg string, keyvals ...any) {
    if l := Log(); l != nil {
        l.Info(msg)
        l.Info(msg, keyvals...)
    }
}

// LogWarn logs a warning message if log service is available.
func LogWarn(msg string) {
// LogWarn logs a warning message with optional key-value pairs if log service is available.
func LogWarn(msg string, keyvals ...any) {
    if l := Log(); l != nil {
        l.Warn(msg)
        l.Warn(msg, keyvals...)
    }
}

// LogError logs an error message if log service is available.
func LogError(msg string) {
// LogError logs an error message with optional key-value pairs if log service is available.
func LogError(msg string, keyvals ...any) {
    if l := Log(); l != nil {
        l.Error(msg)
        l.Error(msg, keyvals...)
    }
}
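The variadic keyvals are forwarded untouched to the underlying log service, so call sites can attach structured context without formatting strings. A small illustrative caller (package and function names are hypothetical):

package example

import "github.com/host-uk/core/pkg/cli"

func reportReload(path string, err error) {
    if err != nil {
        // keyvals are passed straight through to the log service
        cli.LogError("reload failed", "err", err, "path", path)
        return
    }
    cli.LogInfo("configuration reloaded", "path", path)
}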
@@ -2,6 +2,7 @@ package cli

import (
    "fmt"
    "os"
    "strings"

    "github.com/host-uk/core/pkg/i18n"

@@ -45,22 +46,50 @@ func Successf(format string, args ...any) {
    Success(fmt.Sprintf(format, args...))
}

// Error prints an error message with cross (red).
// Error prints an error message with cross (red) to stderr and logs it.
func Error(msg string) {
    fmt.Println(ErrorStyle.Render(Glyph(":cross:") + " " + msg))
    LogError(msg)
    fmt.Fprintln(os.Stderr, ErrorStyle.Render(Glyph(":cross:")+" "+msg))
}

// Errorf prints a formatted error message.
// Errorf prints a formatted error message to stderr and logs it.
func Errorf(format string, args ...any) {
    Error(fmt.Sprintf(format, args...))
}

// Warn prints a warning message with warning symbol (amber).
func Warn(msg string) {
    fmt.Println(WarningStyle.Render(Glyph(":warn:") + " " + msg))
// ErrorWrap prints a wrapped error message to stderr and logs it.
func ErrorWrap(err error, msg string) {
    if err == nil {
        return
    }
    Error(fmt.Sprintf("%s: %v", msg, err))
}

// Warnf prints a formatted warning message.
// ErrorWrapVerb prints a wrapped error using i18n grammar to stderr and logs it.
func ErrorWrapVerb(err error, verb, subject string) {
    if err == nil {
        return
    }
    msg := i18n.ActionFailed(verb, subject)
    Error(fmt.Sprintf("%s: %v", msg, err))
}

// ErrorWrapAction prints a wrapped error using i18n grammar to stderr and logs it.
func ErrorWrapAction(err error, verb string) {
    if err == nil {
        return
    }
    msg := i18n.ActionFailed(verb, "")
    Error(fmt.Sprintf("%s: %v", msg, err))
}

// Warn prints a warning message with warning symbol (amber) to stderr and logs it.
func Warn(msg string) {
    LogWarn(msg)
    fmt.Fprintln(os.Stderr, WarningStyle.Render(Glyph(":warn:")+" "+msg))
}

// Warnf prints a formatted warning message to stderr and logs it.
func Warnf(format string, args ...any) {
    Warn(fmt.Sprintf(format, args...))
}
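The new ErrorWrap helpers mirror the deprecated FatalWrap family but only print and log, leaving exit codes to the caller. A hypothetical usage sketch (the loader and function names are stand-ins):

package example

import "github.com/host-uk/core/pkg/cli"

func loadConfig(path string) error { return nil } // stand-in for a real loader

func report(path string) {
    err := loadConfig(path)
    // No-op when err is nil; otherwise prints "✗ load config: <error>" to stderr and logs it.
    cli.ErrorWrap(err, "load config")
    // i18n variant; the wording comes from i18n.ActionFailed, e.g. "Failed to load config".
    cli.ErrorWrapVerb(err, "load", "config")
}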
@@ -8,14 +8,17 @@ import (
)

func captureOutput(f func()) string {
    old := os.Stdout
    oldOut := os.Stdout
    oldErr := os.Stderr
    r, w, _ := os.Pipe()
    os.Stdout = w
    os.Stderr = w

    f()

    _ = w.Close()
    os.Stdout = old
    os.Stdout = oldOut
    os.Stderr = oldErr

    var buf bytes.Buffer
    _, _ = io.Copy(&buf, r)
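Because captureOutput now redirects both stdout and stderr into the same pipe, tests can assert on helpers that moved to stderr. A hypothetical test in the same package (the test name and message are illustrative):

package cli

import (
    "testing"

    "github.com/stretchr/testify/assert"
)

// Hypothetical test; shows that stderr writers such as Warn are now captured.
func TestCaptureOutput_SeesStderr(t *testing.T) {
    out := captureOutput(func() {
        Warn("disk space low")
    })
    assert.Contains(t, out, "disk space low")
}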
@@ -15,7 +15,6 @@ package cli

import (
    "context"
    "fmt"
    "os"
    "os/signal"
    "sync"

@@ -58,8 +57,10 @@ func Init(opts Options) error {

    // Create root command
    rootCmd := &cobra.Command{
        Use: opts.AppName,
        Version: opts.Version,
        Use:           opts.AppName,
        Version:       opts.Version,
        SilenceErrors: true,
        SilenceUsage:  true,
    }

    // Attach all registered commands

@@ -147,9 +148,10 @@ func Shutdown() {
// --- Signal Service (internal) ---

type signalService struct {
    cancel context.CancelFunc
    sigChan chan os.Signal
    onReload func() error
    cancel       context.CancelFunc
    sigChan      chan os.Signal
    onReload     func() error
    shutdownOnce sync.Once
}

// SignalOption configures signal handling.

@@ -190,7 +192,7 @@ func (s *signalService) OnStartup(ctx context.Context) error {
        case syscall.SIGHUP:
            if s.onReload != nil {
                if err := s.onReload(); err != nil {
                    LogError(fmt.Sprintf("reload failed: %v", err))
                    LogError("reload failed", "err", err)
                } else {
                    LogInfo("configuration reloaded")
                }

@@ -209,7 +211,9 @@ func (s *signalService) OnStartup(ctx context.Context) error {
}

func (s *signalService) OnShutdown(ctx context.Context) error {
    signal.Stop(s.sigChan)
    close(s.sigChan)
    s.shutdownOnce.Do(func() {
        signal.Stop(s.sigChan)
        close(s.sigChan)
    })
    return nil
}
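The shutdownOnce guard exists because OnShutdown can be reached more than once and closing an already-closed channel panics. The pattern in isolation, as a distilled sketch rather than the actual service:

package example

import (
    "os"
    "os/signal"
    "sync"
)

type signals struct {
    ch   chan os.Signal
    once sync.Once
}

// shutdown is safe to call repeatedly: Stop and close run exactly once.
func (s *signals) shutdown() {
    s.once.Do(func() {
        signal.Stop(s.ch)
        close(s.ch)
    })
}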
@@ -436,7 +436,7 @@ func (m *LinuxKitManager) Exec(ctx context.Context, id string, cmd []string) err
    // Build SSH command
    sshArgs := []string{
        "-p", fmt.Sprintf("%d", sshPort),
        "-o", "StrictHostKeyChecking=accept-new",
        "-o", "StrictHostKeyChecking=yes",
        "-o", "UserKnownHostsFile=~/.core/known_hosts",
        "-o", "LogLevel=ERROR",
        "root@localhost",
@@ -70,11 +70,11 @@ func (d *DevOps) Claude(ctx context.Context, projectDir string, opts ClaudeOptio

    // Build SSH command with agent forwarding
    args := []string{
        "-o", "StrictHostKeyChecking=accept-new",
        "-o", "StrictHostKeyChecking=yes",
        "-o", "UserKnownHostsFile=~/.core/known_hosts",
        "-o", "LogLevel=ERROR",
        "-A", // SSH agent forwarding
        "-p", "2222",
        "-p", fmt.Sprintf("%d", DefaultSSHPort),
    }

    args = append(args, "root@localhost")

@@ -132,10 +132,10 @@ func (d *DevOps) CopyGHAuth(ctx context.Context) error {

    // Use scp to copy gh config
    cmd := exec.CommandContext(ctx, "scp",
        "-o", "StrictHostKeyChecking=accept-new",
        "-o", "StrictHostKeyChecking=yes",
        "-o", "UserKnownHostsFile=~/.core/known_hosts",
        "-o", "LogLevel=ERROR",
        "-P", "2222",
        "-P", fmt.Sprintf("%d", DefaultSSHPort),
        "-r", ghConfigDir,
        "root@localhost:/root/.config/",
    )
@@ -13,6 +13,11 @@ import (
    "github.com/host-uk/core/pkg/io"
)

const (
    // DefaultSSHPort is the default port for SSH connections to the dev environment.
    DefaultSSHPort = 2222
)

// DevOps manages the portable development environment.
type DevOps struct {
    medium io.Medium

@@ -137,12 +142,32 @@ func (d *DevOps) Boot(ctx context.Context, opts BootOptions) error {
        Name:    opts.Name,
        Memory:  opts.Memory,
        CPUs:    opts.CPUs,
        SSHPort: 2222,
        SSHPort: DefaultSSHPort,
        Detach:  true,
    }

    _, err = d.container.Run(ctx, imagePath, runOpts)
    return err
    if err != nil {
        return err
    }

    // Wait for SSH to be ready and scan host key
    // We try for up to 60 seconds as the VM takes a moment to boot
    var lastErr error
    for i := 0; i < 30; i++ {
        select {
        case <-ctx.Done():
            return ctx.Err()
        case <-time.After(2 * time.Second):
            if err := ensureHostKey(ctx, runOpts.SSHPort); err == nil {
                return nil
            } else {
                lastErr = err
            }
        }
    }

    return fmt.Errorf("failed to verify host key after boot: %w", lastErr)
}

// Stop stops the dev environment.

@@ -196,7 +221,7 @@ type DevStatus struct {
func (d *DevOps) Status(ctx context.Context) (*DevStatus, error) {
    status := &DevStatus{
        Installed: d.images.IsInstalled(),
        SSHPort: 2222,
        SSHPort:   DefaultSSHPort,
    }

    if info, ok := d.images.manifest.Images[ImageName()]; ok {
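Boot now polls until the VM's host key can be scanned rather than returning as soon as the container starts. The retry loop, distilled into a reusable sketch (waitReady and its 60-second budget are illustrative, not part of the package):

package example

import (
    "context"
    "fmt"
    "time"
)

// waitReady polls a readiness check every 2s, up to 30 attempts,
// while honouring context cancellation.
func waitReady(ctx context.Context, check func() error) error {
    var lastErr error
    for i := 0; i < 30; i++ {
        select {
        case <-ctx.Done():
            return ctx.Err()
        case <-time.After(2 * time.Second):
            if err := check(); err == nil {
                return nil
            } else {
                lastErr = err
            }
        }
    }
    return fmt.Errorf("not ready after 60s: %w", lastErr)
}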
@@ -616,6 +616,7 @@ func TestDevOps_IsRunning_Bad_DifferentContainerName(t *testing.T) {
}

func TestDevOps_Boot_Good_FreshFlag(t *testing.T) {
    t.Setenv("CORE_SKIP_SSH_SCAN", "true")
    tempDir, err := os.MkdirTemp("", "devops-test-*")
    require.NoError(t, err)
    t.Cleanup(func() { _ = os.RemoveAll(tempDir) })

@@ -700,6 +701,7 @@ func TestDevOps_Stop_Bad_ContainerNotRunning(t *testing.T) {
}

func TestDevOps_Boot_Good_FreshWithNoExisting(t *testing.T) {
    t.Setenv("CORE_SKIP_SSH_SCAN", "true")
    tempDir, err := os.MkdirTemp("", "devops-boot-fresh-*")
    require.NoError(t, err)
    t.Cleanup(func() { _ = os.RemoveAll(tempDir) })

@@ -782,6 +784,7 @@ func TestDevOps_CheckUpdate_Delegates(t *testing.T) {
}

func TestDevOps_Boot_Good_Success(t *testing.T) {
    t.Setenv("CORE_SKIP_SSH_SCAN", "true")
    tempDir, err := os.MkdirTemp("", "devops-boot-success-*")
    require.NoError(t, err)
    t.Cleanup(func() { _ = os.RemoveAll(tempDir) })
@@ -59,11 +59,11 @@ func (d *DevOps) mountProject(ctx context.Context, path string) error {
    // Use reverse SSHFS mount
    // The VM connects back to host to mount the directory
    cmd := exec.CommandContext(ctx, "ssh",
        "-o", "StrictHostKeyChecking=accept-new",
        "-o", "StrictHostKeyChecking=yes",
        "-o", "UserKnownHostsFile=~/.core/known_hosts",
        "-o", "LogLevel=ERROR",
        "-R", "10000:localhost:22", // Reverse tunnel for SSHFS
        "-p", "2222",
        "-p", fmt.Sprintf("%d", DefaultSSHPort),
        "root@localhost",
        fmt.Sprintf("mkdir -p /app && sshfs -p 10000 %s@localhost:%s /app -o allow_other", os.Getenv("USER"), absPath),
    )
@@ -33,11 +33,11 @@ func (d *DevOps) Shell(ctx context.Context, opts ShellOptions) error {
// sshShell connects via SSH.
func (d *DevOps) sshShell(ctx context.Context, command []string) error {
    args := []string{
        "-o", "StrictHostKeyChecking=accept-new",
        "-o", "StrictHostKeyChecking=yes",
        "-o", "UserKnownHostsFile=~/.core/known_hosts",
        "-o", "LogLevel=ERROR",
        "-A", // Agent forwarding
        "-p", "2222",
        "-p", fmt.Sprintf("%d", DefaultSSHPort),
        "root@localhost",
    }
@@ -335,14 +335,22 @@ func ClearInstance() {

// Config returns the registered Config service.
func (c *Core) Config() Config {
    cfg := MustServiceFor[Config](c, "config")
    return cfg
    return MustServiceFor[Config](c, "config")
}

// Display returns the registered Display service.
func (c *Core) Display() Display {
    d := MustServiceFor[Display](c, "display")
    return d
    return MustServiceFor[Display](c, "display")
}

// Workspace returns the registered Workspace service.
func (c *Core) Workspace() Workspace {
    return MustServiceFor[Workspace](c, "workspace")
}

// Crypt returns the registered Crypt service.
func (c *Core) Crypt() Crypt {
    return MustServiceFor[Crypt](c, "crypt")
}

// Core returns self, implementing the CoreProvider interface.
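The accessors above are thin wrappers over the generic MustServiceFor, so any registered service can be fetched the same way. A hypothetical caller, assuming MustServiceFor is exported from the same package as Core (imported here as framework; the Cache service and "cache" key are illustrative only):

package example

import "github.com/host-uk/core/pkg/framework"

// Cache is a hypothetical service interface used only for illustration.
type Cache interface {
    Get(key string) (string, bool)
}

func lookup(c *framework.Core, key string) (string, bool) {
    // Panics if "cache" was never registered or was registered with a
    // different type, matching the behaviour of Config() and Display().
    cache := framework.MustServiceFor[Cache](c, "cache")
    return cache.Get(key)
}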
@@ -68,17 +68,23 @@ func TestCore_Services_Good(t *testing.T) {
    err = c.RegisterService("display", &MockDisplayService{})
    assert.NoError(t, err)

    assert.NotNil(t, c.Config())
    assert.NotNil(t, c.Display())
    cfg := c.Config()
    assert.NotNil(t, cfg)

    d := c.Display()
    assert.NotNil(t, d)
}

func TestCore_Services_Ugly(t *testing.T) {
    c, err := New()
    assert.NoError(t, err)

    // Config panics when service not registered
    assert.Panics(t, func() {
        c.Config()
    })

    // Display panics when service not registered
    assert.Panics(t, func() {
        c.Display()
    })

@@ -122,6 +128,15 @@ func TestFeatures_IsEnabled_Good(t *testing.T) {
    assert.True(t, c.Features.IsEnabled("feature1"))
    assert.True(t, c.Features.IsEnabled("feature2"))
    assert.False(t, c.Features.IsEnabled("feature3"))
    assert.False(t, c.Features.IsEnabled(""))
}

func TestFeatures_IsEnabled_Edge(t *testing.T) {
    c, _ := New()
    c.Features.Flags = []string{" ", "foo"}
    assert.True(t, c.Features.IsEnabled(" "))
    assert.True(t, c.Features.IsEnabled("foo"))
    assert.False(t, c.Features.IsEnabled("FOO")) // Case sensitive check
}

func TestCore_ServiceLifecycle_Good(t *testing.T) {

@@ -231,11 +246,16 @@ func TestCore_MustServiceFor_Good(t *testing.T) {
func TestCore_MustServiceFor_Ugly(t *testing.T) {
    c, err := New()
    assert.NoError(t, err)

    // MustServiceFor panics on missing service
    assert.Panics(t, func() {
        MustServiceFor[*MockService](c, "nonexistent")
    })

    err = c.RegisterService("test", "not a service")
    assert.NoError(t, err)

    // MustServiceFor panics on type mismatch
    assert.Panics(t, func() {
        MustServiceFor[*MockService](c, "test")
    })
@@ -144,3 +144,33 @@ func TestMessageBus_ConcurrentAccess_Good(t *testing.T) {

    wg.Wait()
}

func TestMessageBus_Action_NoHandlers(t *testing.T) {
    c, _ := New()
    // Should not error if no handlers are registered
    err := c.bus.action("no one listening")
    assert.NoError(t, err)
}

func TestMessageBus_Query_NoHandlers(t *testing.T) {
    c, _ := New()
    result, handled, err := c.bus.query(TestQuery{})
    assert.NoError(t, err)
    assert.False(t, handled)
    assert.Nil(t, result)
}

func TestMessageBus_QueryAll_NoHandlers(t *testing.T) {
    c, _ := New()
    results, err := c.bus.queryAll(TestQuery{})
    assert.NoError(t, err)
    assert.Empty(t, results)
}

func TestMessageBus_Perform_NoHandlers(t *testing.T) {
    c, _ := New()
    result, handled, err := c.bus.perform(TestTask{})
    assert.NoError(t, err)
    assert.False(t, handled)
    assert.Nil(t, result)
}
Some files were not shown because too many files have changed in this diff.