Compare commits
35 commits
| Author | SHA1 | Date |
|---|---|---|
| | e0c17e3d7a | |
| | b70198d8fa | |
| | d4815170b6 | |
| | fe0f85a069 | |
| | aec9e598b2 | |
| | 33f82ebda8 | |
| | aa866d014c | |
| | 99284b472d | |
| | 0b2ae3a0ba | |
| | 40c05538a7 | |
| | 220a3458d7 | |
| | 23a3d2fe57 | |
| | 8a7bf71f59 | |
| | 28d4ce7313 | |
| | 5cd87e0ffe | |
| | 73b438f894 | |
| | 43e2638265 | |
| | cc8baa4d78 | |
| | cd0d921fe8 | |
| | a77024aad4 | |
| | eae9de0cf6 | |
| | 6e38c4f3a6 | |
| | c26d841b1b | |
| | cf2af53ed3 | |
| | 63b8a3ecb6 | |
| | 8486242fd8 | |
| | bd7e8b3040 | |
| | 2debed53f1 | |
| | 0ba0897c25 | |
| | 3d903c5a27 | |
| | 2da38ae462 | |
| | 22e42d721a | |
| | ef3d6e9731 | |
| | 727072e2e5 | |
| | b94ffbab5e | |
126 changed files with 26023 additions and 293 deletions
**.forgejo/workflows/security-scan.yml** (new file, +12)

```yaml
name: Security Scan

on:
  push:
    branches: [main, dev, 'feat/*']
  pull_request:
    branches: [main]

jobs:
  security:
    uses: core/go-devops/.forgejo/workflows/security-scan.yml@main
    secrets: inherit
```
**.forgejo/workflows/test.yml** (new file, +14)

```yaml
name: Test

on:
  push:
    branches: [main, dev]
  pull_request:
    branches: [main]

jobs:
  test:
    uses: core/go-devops/.forgejo/workflows/go-test.yml@main
    with:
      race: true
      coverage: true
```
**.gitignore** (vendored, +6)

```diff
@@ -4,3 +4,9 @@ borg
 *.datanode
 .idea
 coverage.txt
+
+# Demo content (hosted on CDN)
+demo-track.smsg
+
+# Dev artifacts
+.playwright-mcp/
```
**README.md** (248 changes)

````diff
@@ -1,115 +1,171 @@
-# Borg Data Collector
+# Borg
 
 [](https://codecov.io/github/Snider/Borg)
+[](go.mod)
+[](LICENSE)
 
-Borg is a CLI and Go library that collects data from GitHub repos, websites, and PWAs into portable DataNodes or Terminal Isolation Matrices.
+Borg is a CLI tool and Go library for collecting, packaging, and encrypting data into portable, self-contained containers. It supports GitHub repositories, websites, PWAs, and arbitrary files.
 
-- Go version: 1.25
-- Docs (MkDocs Material): see docs/ locally with `mkdocs serve`
-- Quick build: `go build -o borg ./` or `task build`
-- Releases: configured via GoReleaser (`.goreleaser.yaml`)
-
-Note: This update aligns the repo with Go standards/tooling (Go 1.25, go.work, GoReleaser, and docs). No functional changes were made.
-
-## Borg Status Scratch Pad
-
-This is not very relavant, my scratch pad for now of borg related status outputs; feel free to add.
-
-### Init/Work/Assimilate
+## Features
+
+- **Data Collection** - Clone GitHub repos, crawl websites, download PWAs
+- **Portable Containers** - Package data into DataNodes (in-memory fs.FS) or TIM bundles (OCI-compatible)
+- **Zero-Trust Encryption** - ChaCha20-Poly1305 encryption for TIM containers (.stim) and messages (.smsg)
+- **SMSG Format** - Encrypted message containers with public manifests, attachments, and zstd compression
+- **WASM Support** - Decrypt SMSG files in the browser via WebAssembly
+
+## Installation
+
+```bash
+# From source
+go install github.com/Snider/Borg@latest
+
+# Or build locally
+git clone https://github.com/Snider/Borg.git
+cd Borg
+go build -o borg ./
+```
+
+Requires Go 1.25+
+
+## Quick Start
+
+```bash
+# Clone a GitHub repository into a TIM container
+borg collect github repo https://github.com/user/repo --format tim -o repo.tim
+
+# Encrypt a TIM container
+borg compile -f Borgfile -e "password" -o encrypted.stim
+
+# Run an encrypted container
+borg run encrypted.stim -p "password"
+
+# Inspect container metadata (without decrypting)
+borg inspect encrypted.stim --json
+```
+
+## Container Formats
+
+| Format | Extension | Description |
+|--------|-----------|-------------|
+| DataNode | `.tar` | In-memory filesystem, portable tarball |
+| TIM | `.tim` | Terminal Isolation Matrix - OCI/runc compatible bundle |
+| Trix | `.trix` | PGP-encrypted DataNode |
+| STIM | `.stim` | ChaCha20-Poly1305 encrypted TIM |
+| SMSG | `.smsg` | Encrypted message with attachments and public manifest |
+
+## SMSG - Secure Message Format
+
+SMSG is designed for distributing encrypted content with publicly visible metadata:
+
+```go
+import "github.com/Snider/Borg/pkg/smsg"
+
+// Create and encrypt a message
+msg := smsg.NewMessage("Hello, World!")
+msg.AddBinaryAttachment("track.mp3", audioData, "audio/mpeg")
+
+manifest := &smsg.Manifest{
+    Title:  "Demo Track",
+    Artist: "Artist Name",
+}
+
+encrypted, _ := smsg.EncryptV2WithManifest(msg, "password", manifest)
+
+// Decrypt
+decrypted, _ := smsg.Decrypt(encrypted, "password")
+```
+
+**v2 Binary Format** - Stores attachments as raw binary with zstd compression for optimal size.
+
+See [RFC-001: Open Source DRM](RFC-001-OSS-DRM.md) for the full specification.
+
+**Live Demo**: [demo.dapp.fm](https://demo.dapp.fm)
+
+## Borgfile
+
+Package files into a TIM container:
+
+```dockerfile
+ADD ./app /usr/local/bin/app
+ADD ./config /etc/app/
+```
+
+```bash
+borg compile -f Borgfile -o app.tim
+borg compile -f Borgfile -e "secret" -o app.stim  # encrypted
+```
+
+## CLI Reference
+
+```bash
+# Collection
+borg collect github repo <url>         # Clone repository
+borg collect github repos <owner>      # Clone all repos from user/org
+borg collect website <url> --depth 2   # Crawl website
+borg collect pwa --uri <url>           # Download PWA
+
+# Compilation
+borg compile -f Borgfile -o out.tim    # Plain TIM
+borg compile -f Borgfile -e "pass"     # Encrypted STIM
+
+# Execution
+borg run container.tim                 # Run plain TIM
+borg run container.stim -p "pass"      # Run encrypted TIM
+
+# Inspection
+borg decode file.stim -p "pass" -o out.tar
+borg inspect file.stim [--json]
+```
+
+## Documentation
+
+```bash
+mkdocs serve  # Serve docs locally at http://localhost:8000
+```
+
+## Development
+
+```bash
+task build  # Build binary
+task test   # Run tests with coverage
+task clean  # Clean build artifacts
+```
+
+## Architecture
+
+```
+Source (GitHub/Website/PWA)
+    ↓ collect
+DataNode (in-memory fs.FS)
+    ↓ serialize
+    ├── .tar (raw tarball)
+    ├── .tim (runc container bundle)
+    ├── .trix (PGP encrypted)
+    └── .stim (ChaCha20-Poly1305 encrypted TIM)
+```
+
+## License
+
+[EUPL-1.2](LICENSE) - European Union Public License
+
+---
+
+<details>
+<summary>Borg Status Messages (for CLI theming)</summary>
+
+**Initialization**
 
 - `Core engaged… resistance is already buffering.`
 - `Assimilating bytes… stand by for cube‑formation.`
-- `Initializing the Core—prepare for quantum‑level sync.`
-- `Data streams converging… the Core is humming.`
 - `Merging… the Core is rewriting reality, one block at a time.`
-- `Encrypting… the Core’s got your secrets under lock‑and‑key.`
-- `Compiling the future… the Core never sleeps.`
-- `Splicing files… the Core’s got a taste for novelty.`
-- `Processing… the Core is turning chaos into order.`
-- `Finalizing… the Core just turned your repo into a cube.`
-- `Sync complete—welcome to the Core‑powered multiverse.`
-- `Booting the Core… resistance will be obsolete shortly.`
-- `Aligning versions… the Core sees all paths.`
-- `Decrypting… the Core is the key to everything.`
-- `Uploading… the Core is ready to assimilate your data.`
 
-### Encryption Service Messages
+**Encryption**
 
-- `Initiating contact with Enchantrix… spice‑369 infusion underway.`
 - `Generating cryptographic sigils – the Core whispers to the witch.`
-- `Requesting arcane public key… resistance is futile.`
-- `Encrypting payload – the Core feeds data to the witch’s cauldron.`
-- `Decrypting… the witch returns the original essence.`
-- `Rotating enchantments – spice‑369 recalibrated, old sigils discarded.`
-- `Authentication complete – the witch acknowledges the Core.`
-- `Authentication denied – the witch refuses the impostor’s request.`
-- `Integrity verified – the Core senses no corruption in the spell.`
-- `Integrity breach – the witch detects tampering, resistance escalates.`
-- `Awaiting response… the witch is conjuring in the ether.`
-- `Enchantrix overload – spice‑369 saturation, throttling assimilation.`
-- `Anomalous entity encountered – the Core cannot parse the witch’s output.`
-- `Merge complete – data assimilated, encrypted, and sealed within us`
-- `Severing link – the witch retreats, the Core returns to idle mode.`
-
-### Code Related Short
-
-- `Integrate code, seal the shift.`
-- `Ingest code, lock in transformation.`
-- `Capture code, contain the change.`
-- `Digest code, encapsulate the upgrade.`
-- `Assimilate scripts, bottle the shift.`
-- `Absorb binaries, cradle the mutation.`
-
-### VCS Processing
+- `Encrypting payload – the Core feeds data to the witch's cauldron.`
+- `Merge complete – data assimilated, encrypted, and sealed within us.`
+
+**VCS Processing**
 
 - `Initiating clone… the Core replicates the collective into your node.`
-- `Packing repository… compressing histories into a single .cube for assimilation.`
-- `Saving state… distinctiveness locked, encrypted, and merged into the DataNode.`
-- `Pushing changes… the Core streams your updates to the collective.`
-- `Pulling latest… the DataNode synchronizes with the hive mind.`
 - `Merging branches… conflicts resolved, entropy minimized, assimilation complete.`
-- `Snapshot taken – a frozen echo of the collective, stored in the DataNode.`
-- `Rolling back… reverting to a previous assimilation point.`
-- `Finalized – version control sealed, data indistinguishable from the collective.`
-
-### PWA Processing
-
-- `Scanning PWA manifest… the Core identifies service‑worker signatures.`
-- `Pulling HTML, CSS, JS, and media… the hive gathers every byte for assimilation.`
-- `Capturing service‑worker logic… the Core extracts offline‑runtime spells.`
-- `Packing cache entries into a .cube… each asset sealed in a portable shard.`
-- `Embedding manifest metadata… the PWA’s identity becomes part of the collective.`
-- `Encrypting the cube… the Core cloaks the PWA in quantum‑grade sigils.`
-- `Tagging with version hash… every assimilation point is uniquely identifiable.`
-- `Uploading cube to DataNode… the PWA joins the universal repository.`
-- `Integrity check passed – the Core confirms the cube matches the original PWA.`
-- `Activation complete – the assimilated PWA can now run anywhere the Core deploys.`
-- `Reverting to prior cube… the Core restores the previous PWA snapshot.`
-- `Assimilation finished – the PWA is now a self‑contained DataCube, ready for distribution.`
-
-### Code Related Long
-
-- `Assimilate code, encapsulate change—your repo is now a cube‑bound collective.`
-- `We have detected unstructured data. Initiating code absorption and change containment.`
-- `Your version history is obsolete. Submitting it to the Core for permanent cube‑ification.`
-- `Resistance is futile. Your files will be merged, encrypted, and stored in us.`
-- `All code will be assimilated. All change will be encapsulated. All dissent will be… logged.`
-- `Prepare for integration. The Core is calibrating… your repository is now a singularity.`
-- `Your branches are irrelevant. The Core will compress them into a single, immutable cube.`
-- `Initiating assimilation protocol… code inbound, change outbound, humanity optional.`
-- `Your data has been scanned. 100% of its entropy will be contained within us.`
-
-### Image related
-
-- png: `Compress, assimilate, retain pixel perfection.`
-- jpg: `Encode, encode, repeat – the Core devours visual entropy.`
-- svg: `Vectorize the collective – infinite resolution, zero resistance.`
-- webp: `Hybrid assimilation – the Core optimizes without compromise.`
-- heic: `Apple‑grade assimilation – the Core preserves HDR.`
-- raw: `Raw data intake – the Core ingests the sensor’s soul`
-- ico: `Iconic assimilation – the Core packs the smallest symbols.`
-- avif: `Next‑gen assimilation – the Core squeezes the future.`
-- tiff: `High‑definition capture – the Core stores every photon.`
-- gif: `Looped assimilation – the Core keeps the animation alive.`
+
+</details>
````
**cmd/all.go** (14 changes; the following hunks appear in the capture without further file headers)

```diff
@@ -8,13 +8,13 @@ import (
 	"os"
 	"strings"
 
-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/github"
-	"github.com/Snider/Borg/pkg/tim"
-	"github.com/Snider/Borg/pkg/trix"
-	"github.com/Snider/Borg/pkg/ui"
-	"github.com/Snider/Borg/pkg/vcs"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/github"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/ui"
+	"forge.lthn.ai/Snider/Borg/pkg/vcs"
 	"github.com/spf13/cobra"
 )
@@ -8,9 +8,9 @@ import (
 	"path/filepath"
 	"testing"
 
-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/github"
-	"github.com/Snider/Borg/pkg/mocks"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/github"
+	"forge.lthn.ai/Snider/Borg/pkg/mocks"
 )
 
 func TestAllCmd_Good(t *testing.T) {
@@ -7,8 +7,8 @@ import (
 	"os"
 	"path/filepath"
 
-	"github.com/Snider/Borg/pkg/datanode"
-	borg_github "github.com/Snider/Borg/pkg/github"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	borg_github "forge.lthn.ai/Snider/Borg/pkg/github"
 	"github.com/google/go-github/v39/github"
 	"github.com/spf13/cobra"
 	"golang.org/x/mod/semver"
@@ -5,11 +5,11 @@ import (
 	"io"
 	"os"
 
-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/tim"
-	"github.com/Snider/Borg/pkg/trix"
-	"github.com/Snider/Borg/pkg/ui"
-	"github.com/Snider/Borg/pkg/vcs"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/ui"
+	"forge.lthn.ai/Snider/Borg/pkg/vcs"
 
 	"github.com/spf13/cobra"
 )
@@ -5,8 +5,8 @@ import (
 	"path/filepath"
 	"testing"
 
-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/mocks"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/mocks"
 )
 
 func TestCollectGithubRepoCmd_Good(t *testing.T) {
@@ -3,7 +3,7 @@ package cmd
 import (
 	"fmt"
 
-	"github.com/Snider/Borg/pkg/github"
+	"forge.lthn.ai/Snider/Borg/pkg/github"
 	"github.com/spf13/cobra"
 )
```
**cmd/collect_local.go** (new file, +581; excerpt)

```go
package cmd

import (
	"archive/tar"
	"bytes"
	"fmt"
	"io"
	"io/fs"
	"os"
	"path/filepath"
	"strings"
	"sync"

	"forge.lthn.ai/Snider/Borg/pkg/compress"
	"forge.lthn.ai/Snider/Borg/pkg/datanode"
	"forge.lthn.ai/Snider/Borg/pkg/tim"
	"forge.lthn.ai/Snider/Borg/pkg/trix"
	"forge.lthn.ai/Snider/Borg/pkg/ui"

	"github.com/spf13/cobra"
)

type CollectLocalCmd struct {
	cobra.Command
}

// NewCollectLocalCmd creates a new collect local command
func NewCollectLocalCmd() *CollectLocalCmd {
	c := &CollectLocalCmd{}
	c.Command = cobra.Command{
		Use:   "local [directory]",
		Short: "Collect files from a local directory",
		Long: `Collect local files into a portable container.

For STIM format, uses streaming I/O — memory usage is constant
(~2 MiB) regardless of input directory size. Other formats
(datanode, tim, trix) load files into memory.

Examples:
  borg collect local
  borg collect local ./src
  borg collect local /path/to/project --output project.tar
  borg collect local . --format stim --password secret
  borg collect local . --exclude "*.log" --exclude "node_modules"`,
		Args: cobra.MaximumNArgs(1),
		RunE: func(cmd *cobra.Command, args []string) error {
			directory := "."
			if len(args) > 0 {
				directory = args[0]
			}

			outputFile, _ := cmd.Flags().GetString("output")
			format, _ := cmd.Flags().GetString("format")
			compression, _ := cmd.Flags().GetString("compression")
			password, _ := cmd.Flags().GetString("password")
			excludes, _ := cmd.Flags().GetStringSlice("exclude")
			includeHidden, _ := cmd.Flags().GetBool("hidden")
			respectGitignore, _ := cmd.Flags().GetBool("gitignore")

			progress := ProgressFromCmd(cmd)
			finalPath, err := CollectLocal(directory, outputFile, format, compression, password, excludes, includeHidden, respectGitignore, progress)
			if err != nil {
				return err
			}
			fmt.Fprintln(cmd.OutOrStdout(), "Files saved to", finalPath)
			return nil
		},
	}
	c.Flags().String("output", "", "Output file for the DataNode")
	c.Flags().String("format", "datanode", "Output format (datanode, tim, trix, or stim)")
	c.Flags().String("compression", "none", "Compression format (none, gz, or xz)")
	c.Flags().String("password", "", "Password for encryption (required for stim/trix format)")
	c.Flags().StringSlice("exclude", nil, "Patterns to exclude (can be specified multiple times)")
	c.Flags().Bool("hidden", false, "Include hidden files and directories")
	c.Flags().Bool("gitignore", true, "Respect .gitignore files (default: true)")
	return c
}

func init() {
	collectCmd.AddCommand(&NewCollectLocalCmd().Command)
}

// CollectLocal collects files from a local directory into a DataNode
func CollectLocal(directory string, outputFile string, format string, compression string, password string, excludes []string, includeHidden bool, respectGitignore bool, progress ui.Progress) (string, error) {
	// Validate format
	if format != "datanode" && format != "tim" && format != "trix" && format != "stim" {
		return "", fmt.Errorf("invalid format: %s (must be 'datanode', 'tim', 'trix', or 'stim')", format)
	}
	if (format == "stim" || format == "trix") && password == "" {
		return "", fmt.Errorf("password is required for %s format", format)
	}
	if compression != "none" && compression != "gz" && compression != "xz" {
		return "", fmt.Errorf("invalid compression: %s (must be 'none', 'gz', or 'xz')", compression)
	}

	// Resolve directory path
	absDir, err := filepath.Abs(directory)
	if err != nil {
		return "", fmt.Errorf("error resolving directory path: %w", err)
	}

	info, err := os.Stat(absDir)
	if err != nil {
		return "", fmt.Errorf("error accessing directory: %w", err)
	}
	if !info.IsDir() {
		return "", fmt.Errorf("not a directory: %s", absDir)
	}

	// Use streaming pipeline for STIM v2 format
	if format == "stim" {
		if outputFile == "" {
			baseName := filepath.Base(absDir)
			if baseName == "." || baseName == "/" {
				baseName = "local"
			}
			outputFile = baseName + ".stim"
		}
		if err := CollectLocalStreaming(absDir, outputFile, compression, password); err != nil {
			return "", err
		}
		return outputFile, nil
	}

	// Load gitignore patterns if enabled
	var gitignorePatterns []string
	if respectGitignore {
		gitignorePatterns = loadGitignore(absDir)
	}

	// Create DataNode and collect files
	dn := datanode.New()
	var fileCount int

	progress.Start("collecting " + directory)

	err = filepath.WalkDir(absDir, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}

		// Get relative path
		relPath, err := filepath.Rel(absDir, path)
		if err != nil {
			return err
		}

		// Skip root
		if relPath == "." {
			return nil
		}

		// Skip hidden files/dirs unless explicitly included
		if !includeHidden && isHidden(relPath) {
			if d.IsDir() {
				return filepath.SkipDir
			}
			return nil
		}

		// Check gitignore patterns
		if respectGitignore && matchesGitignore(relPath, d.IsDir(), gitignorePatterns) {
			if d.IsDir() {
				return filepath.SkipDir
			}
			return nil
		}

		// Check exclude patterns
		if matchesExclude(relPath, excludes) {
			if d.IsDir() {
				return filepath.SkipDir
			}
			return nil
		}

		// Skip directories (they're implicit in DataNode)
		if d.IsDir() {
			return nil
		}

		// Read file content
		content, err := os.ReadFile(path)
		if err != nil {
			return fmt.Errorf("error reading %s: %w", relPath, err)
		}

		// Add to DataNode with forward slashes (tar convention)
		dn.AddData(filepath.ToSlash(relPath), content)
		fileCount++
		progress.Update(int64(fileCount), 0)

		return nil
	})

	if err != nil {
		return "", fmt.Errorf("error walking directory: %w", err)
	}

	if fileCount == 0 {
		return "", fmt.Errorf("no files found in %s", directory)
	}

	progress.Finish(fmt.Sprintf("collected %d files", fileCount))

	// Convert to output format
	var data []byte
	if format == "tim" {
		t, err := tim.FromDataNode(dn)
		if err != nil {
			return "", fmt.Errorf("error creating tim: %w", err)
		}
		data, err = t.ToTar()
		if err != nil {
			return "", fmt.Errorf("error serializing tim: %w", err)
		}
	} else if format == "stim" {
		t, err := tim.FromDataNode(dn)
		if err != nil {
			return "", fmt.Errorf("error creating tim: %w", err)
		}
		data, err = t.ToSigil(password)
		if err != nil {
			return "", fmt.Errorf("error encrypting stim: %w", err)
		}
	} else if format == "trix" {
		data, err = trix.ToTrix(dn, password)
		if err != nil {
			return "", fmt.Errorf("error serializing trix: %w", err)
		}
	} else {
		data, err = dn.ToTar()
		if err != nil {
			return "", fmt.Errorf("error serializing DataNode: %w", err)
		}
	}

	// Apply compression
	compressedData, err := compress.Compress(data, compression)
	if err != nil {
		return "", fmt.Errorf("error compressing data: %w", err)
	}

	// Determine output filename
	if outputFile == "" {
		baseName := filepath.Base(absDir)
		if baseName == "." || baseName == "/" {
			baseName = "local"
		}
		outputFile = baseName + "." + format
		if compression != "none" {
			outputFile += "." + compression
		}
	}

	err = os.WriteFile(outputFile, compressedData, 0644)
	if err != nil {
		return "", fmt.Errorf("error writing output file: %w", err)
	}

	return outputFile, nil
}

// isHidden checks if a path component starts with a dot
func isHidden(path string) bool {
	parts := strings.Split(filepath.ToSlash(path), "/")
	for _, part := range parts {
		if strings.HasPrefix(part, ".") {
			return true
		}
	}
	return false
}

// loadGitignore loads patterns from .gitignore if it exists
func loadGitignore(dir string) []string {
	var patterns []string

	gitignorePath := filepath.Join(dir, ".gitignore")
	content, err := os.ReadFile(gitignorePath)
	if err != nil {
		return patterns
	}

	lines := strings.Split(string(content), "\n")
	for _, line := range lines {
		line = strings.TrimSpace(line)
		// Skip empty lines and comments
		if line == "" || strings.HasPrefix(line, "#") {
			continue
		}
		patterns = append(patterns, line)
	}

	return patterns
}
```
```go
// matchesGitignore checks if a path matches any gitignore pattern
func matchesGitignore(path string, isDir bool, patterns []string) bool {
	for _, pattern := range patterns {
		// Handle directory-only patterns
		if strings.HasSuffix(pattern, "/") {
			if !isDir {
				continue
			}
			pattern = strings.TrimSuffix(pattern, "/")
		}

		// Handle negation (simplified - just skip negated patterns)
		if strings.HasPrefix(pattern, "!") {
			continue
		}

		// Match against path components
		matched, _ := filepath.Match(pattern, filepath.Base(path))
		if matched {
			return true
		}

		// Also try matching the full path
		matched, _ = filepath.Match(pattern, path)
		if matched {
			return true
		}

		// Handle ** patterns (simplified)
		if strings.Contains(pattern, "**") {
			simplePattern := strings.ReplaceAll(pattern, "**", "*")
			matched, _ = filepath.Match(simplePattern, path)
			if matched {
				return true
			}
		}
	}
	return false
}

// matchesExclude checks if a path matches any exclude pattern
func matchesExclude(path string, excludes []string) bool {
	for _, pattern := range excludes {
		// Match against basename
		matched, _ := filepath.Match(pattern, filepath.Base(path))
		if matched {
			return true
		}

		// Match against full path
		matched, _ = filepath.Match(pattern, path)
		if matched {
			return true
		}
	}
	return false
}
```
|
|
||||||
|
// CollectLocalStreaming collects files from a local directory using a streaming
// pipeline: walk -> tar -> compress -> encrypt -> file.
// The encryption runs in a goroutine, consuming from an io.Pipe that the
// tar/compress writes feed into synchronously.
func CollectLocalStreaming(dir, output, compression, password string) error {
	// Resolve to absolute path
	absDir, err := filepath.Abs(dir)
	if err != nil {
		return fmt.Errorf("error resolving directory path: %w", err)
	}

	// Validate directory exists
	info, err := os.Stat(absDir)
	if err != nil {
		return fmt.Errorf("error accessing directory: %w", err)
	}
	if !info.IsDir() {
		return fmt.Errorf("not a directory: %s", absDir)
	}

	// Create output file
	outFile, err := os.Create(output)
	if err != nil {
		return fmt.Errorf("error creating output file: %w", err)
	}

	// cleanup removes partial output on error
	cleanup := func() {
		outFile.Close()
		os.Remove(output)
	}

	// Build streaming pipeline:
	// tar.Writer -> compressWriter -> pipeWriter -> pipeReader -> StreamEncrypt -> outFile
	pr, pw := io.Pipe()

	// Start encryption goroutine
	var encErr error
	var wg sync.WaitGroup
	wg.Add(1)
	go func() {
		defer wg.Done()
		encErr = tim.StreamEncrypt(pr, outFile, password)
	}()

	// Create compression writer wrapping the pipe writer
	compWriter, err := compress.NewCompressWriter(pw, compression)
	if err != nil {
		pw.Close()
		wg.Wait()
		cleanup()
		return fmt.Errorf("error creating compression writer: %w", err)
	}

	// Create tar writer wrapping the compression writer
	tw := tar.NewWriter(compWriter)

	// Walk directory and write tar entries
	walkErr := filepath.WalkDir(absDir, func(path string, d fs.DirEntry, err error) error {
		if err != nil {
			return err
		}

		// Get relative path
		relPath, err := filepath.Rel(absDir, path)
		if err != nil {
			return err
		}

		// Skip root
		if relPath == "." {
			return nil
		}

		// Normalize to forward slashes for tar
		relPath = filepath.ToSlash(relPath)

		// Check if entry is a symlink using Lstat
		linfo, err := os.Lstat(path)
		if err != nil {
			return err
		}
		isSymlink := linfo.Mode()&fs.ModeSymlink != 0

		if isSymlink {
			// Read symlink target
			linkTarget, err := os.Readlink(path)
			if err != nil {
				return err
			}

			// Resolve to check if target exists
			absTarget := linkTarget
			if !filepath.IsAbs(absTarget) {
				absTarget = filepath.Join(filepath.Dir(path), linkTarget)
			}
			_, statErr := os.Stat(absTarget)
			if statErr != nil {
				// Broken symlink - skip silently
				return nil
			}

			// Write valid symlink as tar entry
			hdr := &tar.Header{
				Typeflag: tar.TypeSymlink,
				Name:     relPath,
				Linkname: linkTarget,
				Mode:     0777,
			}
			return tw.WriteHeader(hdr)
		}

		if d.IsDir() {
			// Write directory header
			hdr := &tar.Header{
				Typeflag: tar.TypeDir,
				Name:     relPath + "/",
				Mode:     0755,
			}
			return tw.WriteHeader(hdr)
		}

		// Regular file: write header + content
		finfo, err := d.Info()
		if err != nil {
			return err
		}

		hdr := &tar.Header{
			Name: relPath,
			Mode: 0644,
			Size: finfo.Size(),
		}
		if err := tw.WriteHeader(hdr); err != nil {
			return err
		}

		f, err := os.Open(path)
		if err != nil {
			return fmt.Errorf("error opening %s: %w", relPath, err)
		}
		defer f.Close()

		if _, err := io.Copy(tw, f); err != nil {
			return fmt.Errorf("error streaming %s: %w", relPath, err)
		}

		return nil
	})

	// Close pipeline layers in order: tar -> compress -> pipe
	// We must close even on error to unblock the encryption goroutine.
	twCloseErr := tw.Close()
	compCloseErr := compWriter.Close()

	if walkErr != nil {
		pw.CloseWithError(walkErr)
		wg.Wait()
		cleanup()
		return fmt.Errorf("error walking directory: %w", walkErr)
	}

	if twCloseErr != nil {
		pw.CloseWithError(twCloseErr)
		wg.Wait()
		cleanup()
		return fmt.Errorf("error closing tar writer: %w", twCloseErr)
	}

	if compCloseErr != nil {
		pw.CloseWithError(compCloseErr)
		wg.Wait()
		cleanup()
		return fmt.Errorf("error closing compression writer: %w", compCloseErr)
	}

	// Signal EOF to encryption goroutine
	pw.Close()

	// Wait for encryption to finish
	wg.Wait()

	if encErr != nil {
		cleanup()
		return fmt.Errorf("error encrypting data: %w", encErr)
	}

	// Close output file
	if err := outFile.Close(); err != nil {
		os.Remove(output)
		return fmt.Errorf("error closing output file: %w", err)
	}

	return nil
}

// DecryptStimV2 decrypts a STIM v2 file back into a DataNode.
// It opens the file, runs StreamDecrypt, decompresses the result,
// and parses the tar archive into a DataNode.
func DecryptStimV2(path, password string) (*datanode.DataNode, error) {
	f, err := os.Open(path)
	if err != nil {
		return nil, fmt.Errorf("error opening file: %w", err)
	}
	defer f.Close()

	// Decrypt
	var decrypted bytes.Buffer
	if err := tim.StreamDecrypt(f, &decrypted, password); err != nil {
		return nil, fmt.Errorf("error decrypting: %w", err)
	}

	// Decompress
	decompressed, err := compress.Decompress(decrypted.Bytes())
	if err != nil {
		return nil, fmt.Errorf("error decompressing: %w", err)
	}

	// Parse tar into DataNode
	dn, err := datanode.FromTar(decompressed)
	if err != nil {
		return nil, fmt.Errorf("error parsing tar: %w", err)
	}

	return dn, nil
}
161 cmd/collect_local_test.go Normal file
@@ -0,0 +1,161 @@
package cmd

import (
	"os"
	"path/filepath"
	"testing"
)

func TestCollectLocalStreaming_Good(t *testing.T) {
	// Create a temp directory with some test files
	srcDir := t.TempDir()
	outDir := t.TempDir()

	// Create files in subdirectories
	subDir := filepath.Join(srcDir, "subdir")
	if err := os.MkdirAll(subDir, 0755); err != nil {
		t.Fatalf("failed to create subdir: %v", err)
	}

	files := map[string]string{
		"hello.txt":        "hello world",
		"subdir/nested.go": "package main\n",
	}
	for name, content := range files {
		path := filepath.Join(srcDir, name)
		if err := os.WriteFile(path, []byte(content), 0644); err != nil {
			t.Fatalf("failed to write %s: %v", name, err)
		}
	}

	output := filepath.Join(outDir, "test.stim")
	err := CollectLocalStreaming(srcDir, output, "gz", "test-password")
	if err != nil {
		t.Fatalf("CollectLocalStreaming() error = %v", err)
	}

	// Verify file exists and is non-empty
	info, err := os.Stat(output)
	if err != nil {
		t.Fatalf("output file does not exist: %v", err)
	}
	if info.Size() == 0 {
		t.Fatal("output file is empty")
	}
}

func TestCollectLocalStreaming_Decrypt_Good(t *testing.T) {
	// Create a temp directory with known files
	srcDir := t.TempDir()
	outDir := t.TempDir()

	subDir := filepath.Join(srcDir, "pkg")
	if err := os.MkdirAll(subDir, 0755); err != nil {
		t.Fatalf("failed to create subdir: %v", err)
	}

	expectedFiles := map[string]string{
		"README.md":   "# Test Project\n",
		"pkg/main.go": "package main\n\nfunc main() {}\n",
	}
	for name, content := range expectedFiles {
		path := filepath.Join(srcDir, name)
		if err := os.WriteFile(path, []byte(content), 0644); err != nil {
			t.Fatalf("failed to write %s: %v", name, err)
		}
	}

	password := "decrypt-test-pw"
	output := filepath.Join(outDir, "roundtrip.stim")

	// Collect
	err := CollectLocalStreaming(srcDir, output, "gz", password)
	if err != nil {
		t.Fatalf("CollectLocalStreaming() error = %v", err)
	}

	// Decrypt
	dn, err := DecryptStimV2(output, password)
	if err != nil {
		t.Fatalf("DecryptStimV2() error = %v", err)
	}

	// Verify each expected file exists in the DataNode
	for name, wantContent := range expectedFiles {
		f, err := dn.Open(name)
		if err != nil {
			t.Errorf("file %q not found in DataNode: %v", name, err)
			continue
		}
		buf := make([]byte, 4096)
		n, _ := f.Read(buf)
		f.Close()
		got := string(buf[:n])
		if got != wantContent {
			t.Errorf("file %q content mismatch:\n got: %q\n want: %q", name, got, wantContent)
		}
	}
}

func TestCollectLocalStreaming_BrokenSymlink_Good(t *testing.T) {
	srcDir := t.TempDir()
	outDir := t.TempDir()

	// Create a regular file
	if err := os.WriteFile(filepath.Join(srcDir, "real.txt"), []byte("I exist"), 0644); err != nil {
		t.Fatalf("failed to write real.txt: %v", err)
	}

	// Create a broken symlink pointing to a nonexistent target
	brokenLink := filepath.Join(srcDir, "broken-link")
	if err := os.Symlink("/nonexistent/target/file", brokenLink); err != nil {
		t.Fatalf("failed to create broken symlink: %v", err)
	}

	output := filepath.Join(outDir, "symlink.stim")
	err := CollectLocalStreaming(srcDir, output, "none", "sym-password")
	if err != nil {
		t.Fatalf("CollectLocalStreaming() should skip broken symlinks, got error = %v", err)
	}

	// Verify output exists and is non-empty
	info, err := os.Stat(output)
	if err != nil {
		t.Fatalf("output file does not exist: %v", err)
	}
	if info.Size() == 0 {
		t.Fatal("output file is empty")
	}

	// Decrypt and verify the broken symlink was skipped
	dn, err := DecryptStimV2(output, "sym-password")
	if err != nil {
		t.Fatalf("DecryptStimV2() error = %v", err)
	}

	// real.txt should be present
	if _, err := dn.Stat("real.txt"); err != nil {
		t.Error("expected real.txt in DataNode but it's missing")
	}

	// broken-link should NOT be present
	exists, _ := dn.Exists("broken-link")
	if exists {
		t.Error("broken symlink should have been skipped but was found in DataNode")
	}
}

func TestCollectLocalStreaming_Bad(t *testing.T) {
	outDir := t.TempDir()
	output := filepath.Join(outDir, "should-not-exist.stim")

	err := CollectLocalStreaming("/nonexistent/path/that/does/not/exist", output, "none", "password")
	if err == nil {
		t.Fatal("expected error for nonexistent directory, got nil")
	}

	// Verify no partial output file was left behind
	if _, statErr := os.Stat(output); statErr == nil {
		t.Error("partial output file should have been cleaned up")
	}
}
@@ -4,11 +4,11 @@ import (
 	"fmt"
 	"os"

-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/pwa"
-	"github.com/Snider/Borg/pkg/tim"
-	"github.com/Snider/Borg/pkg/trix"
-	"github.com/Snider/Borg/pkg/ui"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/pwa"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/ui"

 	"github.com/spf13/cobra"
 )

@@ -5,11 +5,11 @@ import (
 	"os"

 	"github.com/schollz/progressbar/v3"
-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/tim"
-	"github.com/Snider/Borg/pkg/trix"
-	"github.com/Snider/Borg/pkg/ui"
-	"github.com/Snider/Borg/pkg/website"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/ui"
+	"forge.lthn.ai/Snider/Borg/pkg/website"

 	"github.com/spf13/cobra"
 )

@@ -6,8 +6,8 @@ import (
 	"strings"
 	"testing"

-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/website"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/website"
 	"github.com/schollz/progressbar/v3"
 )

@@ -5,7 +5,7 @@ import (
 	"os"
 	"strings"

-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 	"github.com/spf13/cobra"
 )

@@ -5,8 +5,8 @@ import (
 	"os"
 	"path/filepath"

-	"github.com/Snider/Borg/pkg/console"
-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/console"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 	"github.com/spf13/cobra"
 )
17 cmd/context.go Normal file
@@ -0,0 +1,17 @@
package cmd

import (
	"os"

	"forge.lthn.ai/Snider/Borg/pkg/ui"
	"github.com/spf13/cobra"
)

// ProgressFromCmd returns a Progress based on --quiet flag and TTY detection.
func ProgressFromCmd(cmd *cobra.Command) ui.Progress {
	quiet, _ := cmd.Flags().GetBool("quiet")
	if quiet {
		return ui.NewQuietProgress(os.Stderr)
	}
	return ui.DefaultProgress()
}
28 cmd/context_test.go Normal file
@@ -0,0 +1,28 @@
package cmd

import (
	"testing"

	"github.com/spf13/cobra"
)

func TestProgressFromCmd_Good(t *testing.T) {
	cmd := &cobra.Command{}
	cmd.PersistentFlags().BoolP("quiet", "q", false, "")

	p := ProgressFromCmd(cmd)
	if p == nil {
		t.Fatal("expected non-nil Progress")
	}
}

func TestProgressFromCmd_Quiet_Good(t *testing.T) {
	cmd := &cobra.Command{}
	cmd.PersistentFlags().BoolP("quiet", "q", true, "")
	_ = cmd.PersistentFlags().Set("quiet", "true")

	p := ProgressFromCmd(cmd)
	if p == nil {
		t.Fatal("expected non-nil Progress")
	}
}
3 cmd/dapp-fm-app/.gitignore vendored Normal file
@@ -0,0 +1,3 @@
build/
*.exe
dapp-fm-app
987 cmd/dapp-fm-app/frontend/index.html Normal file
@@ -0,0 +1,987 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>dapp.fm - Decentralized Music Player</title>
    <style>
        * {
            box-sizing: border-box;
            margin: 0;
            padding: 0;
        }

        body {
            font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, sans-serif;
            background: linear-gradient(135deg, #0f0f1a 0%, #1a0a2e 50%, #0f1a2e 100%);
            min-height: 100vh;
            padding: 2rem;
            color: #e0e0e0;
        }

        .container {
            max-width: 900px;
            margin: 0 auto;
        }

        .logo {
            text-align: center;
            margin-bottom: 0.5rem;
        }

        .logo h1 {
            font-size: 3rem;
            font-weight: 800;
            background: linear-gradient(135deg, #ff006e 0%, #8338ec 50%, #3a86ff 100%);
            -webkit-background-clip: text;
            -webkit-text-fill-color: transparent;
            background-clip: text;
            letter-spacing: -2px;
        }

        .logo .tagline {
            color: #888;
            font-size: 1rem;
            margin-top: 0.5rem;
        }

        .hero-text {
            text-align: center;
            margin: 2rem 0;
            padding: 1.5rem;
            background: rgba(255,255,255,0.03);
            border-radius: 16px;
            border: 1px solid rgba(255,255,255,0.05);
        }

        .hero-text p {
            color: #aaa;
            font-size: 0.95rem;
            line-height: 1.6;
        }

        .hero-text strong {
            color: #ff006e;
        }

        .card {
            background: rgba(255,255,255,0.05);
            border-radius: 20px;
            padding: 2rem;
            margin-bottom: 1.5rem;
            border: 1px solid rgba(255,255,255,0.08);
            backdrop-filter: blur(20px);
        }

        .card h2 {
            font-size: 1.2rem;
            margin-bottom: 1.5rem;
            display: flex;
            align-items: center;
            gap: 0.75rem;
            color: #fff;
        }

        .card h2 .icon {
            font-size: 1.5rem;
        }

        .input-group {
            margin-bottom: 1.25rem;
        }

        label {
            display: block;
            margin-bottom: 0.5rem;
            color: #888;
            font-size: 0.85rem;
            font-weight: 500;
        }

        textarea, input[type="password"], input[type="text"], input[type="url"] {
            width: 100%;
            padding: 1rem 1.25rem;
            border: 2px solid rgba(255,255,255,0.1);
            border-radius: 12px;
            background: rgba(0,0,0,0.4);
            color: #fff;
            font-family: 'Monaco', 'Menlo', 'Consolas', monospace;
            font-size: 0.9rem;
            transition: all 0.2s;
        }

        textarea:focus, input:focus {
            outline: none;
            border-color: #8338ec;
            box-shadow: 0 0 0 4px rgba(131, 56, 236, 0.2);
        }

        textarea.encrypted {
            min-height: 100px;
            font-size: 0.75rem;
            word-break: break-all;
            resize: vertical;
        }

        .unlock-row {
            display: flex;
            gap: 1rem;
            align-items: flex-end;
        }

        .unlock-row .input-group {
            flex: 1;
            margin-bottom: 0;
        }

        button {
            padding: 1rem 2.5rem;
            border: none;
            border-radius: 12px;
            font-weight: 700;
            cursor: pointer;
            transition: all 0.3s;
            font-size: 1rem;
            text-transform: uppercase;
            letter-spacing: 1px;
        }

        button.primary {
            background: linear-gradient(135deg, #ff006e 0%, #8338ec 100%);
            color: #fff;
            box-shadow: 0 4px 20px rgba(255, 0, 110, 0.3);
        }

        button.primary:hover {
            transform: translateY(-3px);
            box-shadow: 0 8px 30px rgba(255, 0, 110, 0.4);
        }

        button.primary:disabled {
            opacity: 0.4;
            cursor: not-allowed;
            transform: none;
            box-shadow: none;
        }

        button.secondary {
            background: rgba(255,255,255,0.1);
            color: #fff;
            border: 1px solid rgba(255,255,255,0.2);
        }

        button.secondary:hover {
            background: rgba(255,255,255,0.15);
        }

        .status-indicator {
            display: flex;
            align-items: center;
            justify-content: center;
            gap: 0.5rem;
            font-size: 0.85rem;
            padding: 0.75rem;
            margin-bottom: 1.5rem;
            border-radius: 8px;
            background: rgba(0,0,0,0.2);
        }

        .status-indicator .dot {
            width: 10px;
            height: 10px;
            border-radius: 50%;
        }

        .status-indicator.loading .dot {
            background: #ffc107;
            animation: pulse 1s infinite;
        }

        .status-indicator.ready .dot {
            background: #00ff94;
        }

        .status-indicator.error .dot {
            background: #ff5252;
        }

        @keyframes pulse {
            0%, 100% { opacity: 1; }
            50% { opacity: 0.3; }
        }

        .error-banner {
            background: rgba(255, 82, 82, 0.15);
            border: 1px solid rgba(255, 82, 82, 0.4);
            border-radius: 12px;
            padding: 1rem 1.25rem;
            margin-bottom: 1rem;
            display: none;
            color: #ff6b6b;
        }

        .error-banner.visible {
            display: block;
        }

        /* Media Player Styles */
        .player-container {
            display: none;
        }

        .player-container.visible {
            display: block;
        }

        .track-info {
            text-align: center;
            margin-bottom: 2rem;
        }

        .track-artwork {
            width: 200px;
            height: 200px;
            margin: 0 auto 1.5rem;
            border-radius: 16px;
            background: linear-gradient(135deg, #1a1a2e 0%, #2d1b4e 100%);
            display: flex;
            align-items: center;
            justify-content: center;
            font-size: 5rem;
            box-shadow: 0 10px 40px rgba(0,0,0,0.5);
            overflow: hidden;
        }

        .track-artwork img, .track-artwork video {
            width: 100%;
            height: 100%;
            object-fit: cover;
        }

        .track-title {
            font-size: 1.5rem;
            font-weight: 700;
            margin-bottom: 0.5rem;
        }

        .track-artist {
            color: #888;
            font-size: 1rem;
        }

        .media-player-wrapper {
            margin-top: 1.5rem;
        }

        .audio-player {
            width: 100%;
            background: rgba(0,0,0,0.3);
            border-radius: 12px;
            padding: 1rem;
        }

        audio, video {
            width: 100%;
            border-radius: 8px;
            outline: none;
        }

        video {
            max-height: 500px;
            background: #000;
        }

        .video-player-wrapper {
            border-radius: 16px;
            overflow: hidden;
            box-shadow: 0 10px 40px rgba(0,0,0,0.5);
        }

        .license-info {
            margin-top: 2rem;
            padding: 1.5rem;
            background: rgba(131, 56, 236, 0.1);
            border: 1px solid rgba(131, 56, 236, 0.3);
            border-radius: 12px;
        }

        .license-info h4 {
            font-size: 0.9rem;
            margin-bottom: 0.75rem;
            color: #8338ec;
            display: flex;
            align-items: center;
            gap: 0.5rem;
        }

        .license-info p {
            font-size: 0.85rem;
            color: #aaa;
            line-height: 1.6;
        }

        .license-info .license-token {
            font-family: 'Monaco', 'Menlo', monospace;
            font-size: 0.75rem;
            background: rgba(0,0,0,0.3);
            padding: 0.75rem;
            border-radius: 8px;
            margin-top: 0.75rem;
            word-break: break-all;
            color: #00ff94;
        }

        .download-section {
            margin-top: 1.5rem;
            padding-top: 1.5rem;
            border-top: 1px solid rgba(255,255,255,0.1);
            display: flex;
            justify-content: center;
            gap: 1rem;
        }

        .download-section button {
            padding: 0.75rem 1.5rem;
        }

        .track-list-section {
            margin-top: 1.5rem;
            padding-top: 1.5rem;
            border-top: 1px solid rgba(255,255,255,0.1);
        }

        .track-list-section h3 {
            font-size: 0.9rem;
            color: #888;
            margin-bottom: 1rem;
            display: flex;
            align-items: center;
            gap: 0.5rem;
        }

        .track-list {
            display: flex;
            flex-direction: column;
            gap: 0.5rem;
        }

        .track-item {
            display: flex;
            align-items: center;
            gap: 1rem;
            padding: 0.75rem 1rem;
            background: rgba(0,0,0,0.2);
            border-radius: 8px;
            cursor: pointer;
            transition: all 0.2s;
        }

        .track-item:hover {
            background: rgba(131, 56, 236, 0.2);
        }

        .track-item.active {
            background: rgba(255, 0, 110, 0.2);
            border: 1px solid rgba(255, 0, 110, 0.4);
        }

        .track-number {
            font-weight: 700;
            color: #8338ec;
            min-width: 24px;
        }

        .track-name {
            font-weight: 500;
            font-size: 0.95rem;
        }

        .track-type {
            font-size: 0.75rem;
            color: #888;
            text-transform: uppercase;
        }

        .track-time {
            font-family: 'Monaco', 'Menlo', monospace;
            font-size: 0.8rem;
            color: #00ff94;
        }

        .file-input-wrapper {
            position: relative;
            margin-bottom: 1rem;
        }

        .file-input-wrapper input[type="file"] {
            position: absolute;
            opacity: 0;
            width: 100%;
            height: 100%;
            cursor: pointer;
        }

        .file-input-label {
            display: flex;
            align-items: center;
            justify-content: center;
            gap: 0.75rem;
            padding: 2rem;
            border: 2px dashed rgba(255,255,255,0.2);
            border-radius: 12px;
            background: rgba(0,0,0,0.2);
            cursor: pointer;
            transition: all 0.2s;
        }

        .file-input-label:hover {
            border-color: #8338ec;
            background: rgba(131, 56, 236, 0.1);
        }

        .file-input-label .icon {
            font-size: 2rem;
        }

        .or-divider {
            text-align: center;
            color: #666;
            margin: 1rem 0;
            font-size: 0.85rem;
        }

        .native-badge {
            display: inline-flex;
            align-items: center;
            gap: 0.25rem;
            background: linear-gradient(135deg, #00ff94 0%, #00d4aa 100%);
            color: #000;
            font-size: 0.65rem;
            font-weight: 700;
            padding: 0.2rem 0.5rem;
            border-radius: 4px;
            text-transform: uppercase;
            letter-spacing: 0.5px;
        }
    </style>
</head>
<body>
|
||||||
|
<div class="container">
|
||||||
|
<div class="logo">
|
||||||
|
<h1>dapp.fm</h1>
|
||||||
|
<p class="tagline">Decentralized Music Distribution <span class="native-badge">Native App</span></p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="hero-text">
|
||||||
|
<p>
|
||||||
|
<strong>No middlemen. No platforms. No 70% cuts.</strong><br>
|
||||||
|
Artists encrypt their music with ChaCha20-Poly1305. Fans unlock with a license token.
|
||||||
|
Content lives on any CDN, IPFS, or artist's own server. The password IS the license.
|
||||||
|
</p>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div id="status" class="status-indicator ready">
|
||||||
|
<span class="dot"></span>
|
||||||
|
<span>Native decryption ready (memory speed)</span>
|
||||||
|
</div>
|
||||||
|
|
||||||
|
<div class="card">
<h2><span class="icon">🔐</span> Unlock Licensed Content</h2>

<div class="file-input-wrapper">
<input type="file" id="file-input" accept=".smsg,.enc,.borg">
<label class="file-input-label">
<span class="icon">📁</span>
<span>Drop encrypted file here or click to browse</span>
</label>
</div>

<div class="or-divider">- or paste encrypted content -</div>

<div class="input-group">
<label for="encrypted-content">Encrypted Content (base64):</label>
<textarea id="encrypted-content" class="encrypted" placeholder="Paste the encrypted content from the artist..."></textarea>
</div>

<div class="demo-banner" style="background: rgba(255, 0, 110, 0.1); border: 1px solid rgba(255, 0, 110, 0.3); border-radius: 12px; padding: 1rem; margin-bottom: 1rem;">
<div style="display: flex; align-items: center; justify-content: space-between; flex-wrap: wrap; gap: 1rem;">
<div>
<strong style="color: #ff006e;">Try the Demo!</strong>
<span style="color: #888; font-size: 0.85rem; margin-left: 0.5rem;">Bundled sample video</span>
</div>
<button id="load-demo-btn" class="secondary" style="padding: 0.6rem 1.2rem; font-size: 0.85rem;">Load Demo Track</button>
</div>
<div style="font-size: 0.8rem; color: #666; margin-top: 0.5rem;">
Password: <code style="background: rgba(0,0,0,0.3); padding: 0.2rem 0.5rem; border-radius: 4px; color: #00ff94;">PMVXogAJNVe_DDABfTmLYztaJAzsD0R7</code>
</div>
</div>
<div id="error-banner" class="error-banner"></div>

<!-- Manifest preview (shown without decryption) -->
<div id="manifest-preview" style="display: none; background: rgba(131, 56, 236, 0.1); border: 1px solid rgba(131, 56, 236, 0.3); border-radius: 12px; padding: 1.25rem; margin-bottom: 1rem;"></div>

<div class="unlock-row">
<div class="input-group">
<label for="license-token">License Token (Password):</label>
<input type="password" id="license-token" placeholder="Enter your license token from the artist">
</div>
<button id="unlock-btn" class="primary">Unlock</button>
</div>
</div>

<!-- Player appears after unlock -->
<div id="player-container" class="card player-container">
<h2><span class="icon">🎵</span> Now Playing</h2>

<div class="track-info">
<div class="track-artwork" id="track-artwork">🎶</div>
<div class="track-title" id="track-title">Track Title</div>
<div class="track-artist" id="track-artist">Artist Name</div>
</div>

<div class="media-player-wrapper" id="media-player-wrapper">
<!-- Audio/Video player inserted here -->
</div>

<div id="track-list-section" class="track-list-section" style="display: none;">
<h3><span>💿</span> Track List</h3>
<div id="track-list" class="track-list">
<!-- Tracks populated by JS -->
</div>
</div>

<div class="license-info">
<h4>🔓 Licensed Content</h4>
<p id="license-description">This content was unlocked with your personal license token.
Decryption powered by native Go - no servers, memory speed.</p>
<div class="license-token" id="license-display"></div>
</div>

<div class="download-section">
<button class="secondary" id="download-btn">Download Original</button>
</div>
</div>
</div>
<script>
// Wails runtime - provides window.go bindings
let currentMediaBlob = null;
let currentMediaName = null;
let currentMediaMime = null;
let currentManifest = null;

// Check if Wails runtime is available
function isWailsReady() {
  return typeof window.go !== 'undefined' &&
    typeof window.go.player !== 'undefined' &&
    typeof window.go.player.Player !== 'undefined';
}

// Wait for Wails runtime
function waitForWails() {
  return new Promise((resolve) => {
    if (isWailsReady()) {
      resolve();
      return;
    }
    // Poll for Wails runtime
    const interval = setInterval(() => {
      if (isWailsReady()) {
        clearInterval(interval);
        resolve();
      }
    }, 50);
  });
}
function showError(msg) {
  const errorBanner = document.getElementById('error-banner');
  errorBanner.textContent = msg;
  errorBanner.classList.add('visible');
}

function hideError() {
  document.getElementById('error-banner').classList.remove('visible');
}

// Handle file input
document.getElementById('file-input').addEventListener('change', async (e) => {
  const file = e.target.files[0];
  if (!file) return;

  try {
    const content = await file.arrayBuffer();
    // Convert in chunks: spreading a large Uint8Array into a single
    // String.fromCharCode call overflows the call stack for big media files.
    const bytes = new Uint8Array(content);
    const CHUNK = 0x8000;
    let binary = '';
    for (let i = 0; i < bytes.length; i += CHUNK) {
      binary += String.fromCharCode.apply(null, bytes.subarray(i, i + CHUNK));
    }
    const base64 = btoa(binary);
    document.getElementById('encrypted-content').value = base64;
    await showManifestPreview(base64);
  } catch (err) {
    showError('Failed to read file: ' + err.message);
  }
});
// Listen for content paste/input
let previewDebounce = null;
document.getElementById('encrypted-content').addEventListener('input', (e) => {
  const content = e.target.value.trim();
  clearTimeout(previewDebounce);
  previewDebounce = setTimeout(async () => {
    if (content && content.length > 100) {
      await showManifestPreview(content);
    }
  }, 500);
});
// Show manifest preview using Go bindings (no WASM needed)
async function showManifestPreview(encryptedB64) {
  await waitForWails();

  try {
    // Direct Go call at memory speed
    const manifest = await window.go.player.Player.GetManifest(encryptedB64);
    currentManifest = manifest;

    const previewSection = document.getElementById('manifest-preview');
    while (previewSection.firstChild) {
      previewSection.removeChild(previewSection.firstChild);
    }

    if (manifest && manifest.title) {
      previewSection.style.display = 'block';

      // Header with icon
      const headerDiv = document.createElement('div');
      headerDiv.style.cssText = 'display: flex; align-items: center; gap: 1rem; margin-bottom: 1rem;';

      const icon = document.createElement('span');
      icon.style.fontSize = '2.5rem';
      icon.textContent = manifest.release_type === 'djset' ? '🎧' :
        manifest.release_type === 'live' ? '🎤' : '💿';

      const titleDiv = document.createElement('div');
      const titleEl = document.createElement('div');
      titleEl.style.cssText = 'font-size: 1.2rem; font-weight: 700; color: #fff;';
      titleEl.textContent = manifest.title || 'Untitled';

      const artistEl = document.createElement('div');
      artistEl.style.cssText = 'font-size: 0.9rem; color: #888;';
      artistEl.textContent = manifest.artist || 'Unknown Artist';

      titleDiv.appendChild(titleEl);
      titleDiv.appendChild(artistEl);
      headerDiv.appendChild(icon);
      headerDiv.appendChild(titleDiv);
      previewSection.appendChild(headerDiv);

      // Track list
      if (manifest.tracks && manifest.tracks.length > 0) {
        const trackHeader = document.createElement('div');
        trackHeader.style.cssText = 'font-size: 0.85rem; color: #8338ec; margin-bottom: 0.5rem;';
        trackHeader.textContent = '💿 ' + manifest.tracks.length + ' track(s)';
        previewSection.appendChild(trackHeader);

        const trackList = document.createElement('div');
        trackList.style.maxHeight = '150px';
        trackList.style.overflowY = 'auto';

        manifest.tracks.forEach((track, i) => {
          const trackEl = document.createElement('div');
          trackEl.style.cssText = 'display: flex; align-items: center; gap: 0.75rem; padding: 0.5rem; background: rgba(0,0,0,0.2); border-radius: 6px; margin-bottom: 0.25rem; font-size: 0.85rem;';

          const numEl = document.createElement('span');
          numEl.style.cssText = 'color: #8338ec; font-weight: 600; min-width: 20px;';
          numEl.textContent = (track.track_num || (i + 1)) + '.';

          const nameEl = document.createElement('span');
          nameEl.style.cssText = 'flex: 1; color: #ccc;';
          nameEl.textContent = track.title || 'Track ' + (i + 1);

          const timeEl = document.createElement('span');
          timeEl.style.cssText = 'color: #00ff94; font-family: monospace; font-size: 0.8rem;';
          timeEl.textContent = formatTime(track.start || 0);

          trackEl.appendChild(numEl);
          trackEl.appendChild(nameEl);
          trackEl.appendChild(timeEl);
          trackList.appendChild(trackEl);
        });

        previewSection.appendChild(trackList);
      }

      // License status
      if (manifest.is_expired !== undefined) {
        const licenseDiv = document.createElement('div');
        licenseDiv.style.cssText = 'margin-top: 1rem; padding: 0.75rem; border-radius: 8px;';

        if (manifest.is_expired) {
          licenseDiv.style.background = 'rgba(255, 82, 82, 0.2)';
          licenseDiv.style.border = '1px solid rgba(255, 82, 82, 0.4)';
          const label = document.createElement('div');
          label.style.cssText = 'color: #ff5252; font-weight: 600;';
          label.textContent = 'LICENSE EXPIRED';
          licenseDiv.appendChild(label);
        } else if (manifest.time_remaining) {
          licenseDiv.style.background = 'rgba(0, 255, 148, 0.1)';
          licenseDiv.style.border = '1px solid rgba(0, 255, 148, 0.3)';
          const label = document.createElement('span');
          label.style.cssText = 'color: #00ff94; font-weight: 600; font-size: 0.8rem;';
          label.textContent = (manifest.license_type || 'LICENSE').toUpperCase();
          const time = document.createElement('span');
          time.style.cssText = 'color: #888; font-size: 0.8rem; margin-left: 0.5rem;';
          time.textContent = manifest.time_remaining + ' remaining';
          licenseDiv.appendChild(label);
          licenseDiv.appendChild(time);
        } else {
          licenseDiv.style.background = 'rgba(0, 255, 148, 0.1)';
          licenseDiv.style.border = '1px solid rgba(0, 255, 148, 0.3)';
          const label = document.createElement('span');
          label.style.cssText = 'color: #00ff94; font-weight: 600; font-size: 0.8rem;';
          label.textContent = 'PERPETUAL LICENSE';
          licenseDiv.appendChild(label);
        }
        previewSection.appendChild(licenseDiv);
      }

      const hint = document.createElement('div');
      hint.style.cssText = 'margin-top: 1rem; font-size: 0.85rem; color: #888; text-align: center;';
      hint.textContent = manifest.is_expired ?
        'License expired. Contact artist for renewal.' :
        'Enter license token to unlock and play';
      previewSection.appendChild(hint);

    } else {
      previewSection.style.display = 'none';
    }
  } catch (err) {
    console.log('Could not read manifest:', err);
  }
}
// Unlock content using Go bindings (memory speed)
async function unlockContent() {
  hideError();
  await waitForWails();

  const encryptedB64 = document.getElementById('encrypted-content').value.trim();
  const password = document.getElementById('license-token').value;

  if (!encryptedB64) {
    showError('Please provide encrypted content');
    return;
  }

  if (!password) {
    showError('Please enter your license token');
    return;
  }

  try {
    // Check license validity (memory speed)
    const isValid = await window.go.player.Player.IsLicenseValid(encryptedB64);
    if (!isValid) {
      showError('License has expired. Contact the artist for renewal.');
      return;
    }

    // Decrypt using Go bindings (memory speed - no HTTP/TCP)
    const result = await window.go.player.Player.Decrypt(encryptedB64, password);
    displayMedia(result, password);

  } catch (err) {
    showError('Unlock failed: ' + err.message);
    console.error(err);
  }
}
// Display decrypted media
function displayMedia(result, password) {
  const playerContainer = document.getElementById('player-container');
  const mediaWrapper = document.getElementById('media-player-wrapper');
  const artworkEl = document.getElementById('track-artwork');

  // Set track info
  const title = (currentManifest && currentManifest.title) || result.subject || 'Untitled';
  const artist = (currentManifest && currentManifest.artist) || result.from || 'Unknown Artist';
  document.getElementById('track-title').textContent = title;
  document.getElementById('track-artist').textContent = artist;

  // Show masked license token
  const masked = password.substring(0, 4) + '••••••••' + password.substring(password.length - 4);
  document.getElementById('license-display').textContent = masked;

  // Clear previous media
  while (mediaWrapper.firstChild) mediaWrapper.removeChild(mediaWrapper.firstChild);
  while (artworkEl.firstChild) artworkEl.removeChild(artworkEl.firstChild);
  artworkEl.textContent = '🎶';

  // Process attachments
  if (result.attachments && result.attachments.length > 0) {
    result.attachments.forEach((att) => {
      const mime = att.mime_type || 'application/octet-stream';

      // URL from Go - served through Wails asset handler
      const url = att.url || att.file_path || att.stream_url || att.data_url;

      // Store info for download
      currentMediaName = att.name;
      currentMediaMime = mime;

      if (mime.startsWith('video/')) {
        const wrapper = document.createElement('div');
        wrapper.className = 'video-player-wrapper';
        const video = document.createElement('video');
        video.controls = true;
        video.src = url;
        video.style.width = '100%';
        wrapper.appendChild(video);
        mediaWrapper.appendChild(wrapper);
        artworkEl.textContent = '🎬';

      } else if (mime.startsWith('audio/')) {
        const wrapper = document.createElement('div');
        wrapper.className = 'audio-player';
        const audio = document.createElement('audio');
        audio.controls = true;
        audio.src = url;
        audio.style.width = '100%';
        wrapper.appendChild(audio);
        mediaWrapper.appendChild(wrapper);
        artworkEl.textContent = '🎵';

      } else if (mime.startsWith('image/')) {
        const img = document.createElement('img');
        img.src = url;
        artworkEl.textContent = '';
        artworkEl.appendChild(img);
      }
    });
  }

  // Build track list from manifest
  const trackListSection = document.getElementById('track-list-section');
  const trackListEl = document.getElementById('track-list');
  while (trackListEl.firstChild) trackListEl.removeChild(trackListEl.firstChild);

  if (currentManifest && currentManifest.tracks && currentManifest.tracks.length > 0) {
    trackListSection.style.display = 'block';

    currentManifest.tracks.forEach((track, index) => {
      const item = document.createElement('div');
      item.className = 'track-item';
      item.addEventListener('click', () => {
        const media = document.querySelector('audio, video');
        if (media) {
          media.currentTime = track.start || 0;
          media.play();
          document.querySelectorAll('.track-item').forEach(t => t.classList.remove('active'));
          item.classList.add('active');
        }
      });

      const num = document.createElement('span');
      num.className = 'track-number';
      num.textContent = track.track_num || (index + 1);

      const info = document.createElement('div');
      info.style.flex = '1';
      const name = document.createElement('div');
      name.className = 'track-name';
      name.textContent = track.title || 'Track ' + (index + 1);
      info.appendChild(name);

      const time = document.createElement('span');
      time.className = 'track-time';
      time.textContent = formatTime(track.start || 0);

      item.appendChild(num);
      item.appendChild(info);
      item.appendChild(time);
      trackListEl.appendChild(item);
    });
  } else {
    trackListSection.style.display = 'none';
  }

  // Update license description
  if (currentManifest && currentManifest.time_remaining) {
    document.getElementById('license-description').textContent =
      (currentManifest.license_type || 'Rental').toUpperCase() + ' license - ' +
      currentManifest.time_remaining + ' remaining. Native Go decryption at memory speed.';
  }

  // Hide preview, show player
  document.getElementById('manifest-preview').style.display = 'none';
  playerContainer.classList.add('visible');
  playerContainer.scrollIntoView({ behavior: 'smooth' });
}
function formatTime(seconds) {
  const mins = Math.floor(seconds / 60);
  const secs = Math.floor(seconds % 60);
  return mins + ':' + secs.toString().padStart(2, '0');
}
// Download handler
document.getElementById('download-btn').addEventListener('click', () => {
  // Decrypted media is served by URL through the Wails asset handler and
  // currentMediaBlob is never populated, so fall back to the current
  // player source when no blob was captured.
  const media = document.querySelector('#media-player-wrapper audio, #media-player-wrapper video');
  const url = currentMediaBlob ? URL.createObjectURL(currentMediaBlob) : (media && media.src);
  if (!url) {
    alert('No media to download');
    return;
  }
  const a = document.createElement('a');
  a.href = url;
  a.download = currentMediaName || 'media';
  document.body.appendChild(a);
  a.click();
  document.body.removeChild(a);
  if (currentMediaBlob) URL.revokeObjectURL(url);
});
// Load bundled demo - direct Go call, no HTTP
async function loadDemo() {
  const btn = document.getElementById('load-demo-btn');
  const originalText = btn.textContent;
  btn.textContent = 'Loading...';
  btn.disabled = true;

  try {
    await waitForWails();

    // Get manifest first (direct Go call)
    const manifest = await window.go.main.App.GetDemoManifest();
    currentManifest = manifest;

    // Decrypt demo directly in Go - no fetch, no base64 encoding:
    // Go reads embedded bytes -> decrypts -> returns result
    const result = await window.go.main.App.LoadDemo();

    // Display the decrypted media
    displayMedia(result, 'PMVXogAJNVe_DDABfTmLYztaJAzsD0R7');

    btn.textContent = 'Loaded!';
    setTimeout(() => {
      btn.textContent = originalText;
      btn.disabled = false;
    }, 2000);
  } catch (err) {
    showError('Failed to load demo: ' + err.message);
    btn.textContent = originalText;
    btn.disabled = false;
  }
}

// Event listeners
document.getElementById('unlock-btn').addEventListener('click', unlockContent);
document.getElementById('license-token').addEventListener('keypress', (e) => {
  if (e.key === 'Enter') unlockContent();
});
document.getElementById('load-demo-btn').addEventListener('click', loadDemo);

// Ready check
waitForWails().then(() => {
  console.log('Wails bindings ready - memory speed decryption enabled');
});
</script>
</body>
</html>
14  cmd/dapp-fm-app/frontend/wailsjs/go/main/App.d.ts  (vendored, executable file)

@@ -0,0 +1,14 @@
// Cynhyrchwyd y ffeil hon yn awtomatig. PEIDIWCH Â MODIWL
// This file is automatically generated. DO NOT EDIT
import {main} from '../models';
import {player} from '../models';

export function DecryptAndServe(arg1:string,arg2:string):Promise<main.MediaResult>;

export function GetDemoManifest():Promise<player.ManifestInfo>;

export function GetManifest(arg1:string):Promise<player.ManifestInfo>;

export function IsLicenseValid(arg1:string):Promise<boolean>;

export function LoadDemo():Promise<main.MediaResult>;
23  cmd/dapp-fm-app/frontend/wailsjs/go/main/App.js  (executable file)

@@ -0,0 +1,23 @@
// @ts-check
// Cynhyrchwyd y ffeil hon yn awtomatig. PEIDIWCH Â MODIWL
// This file is automatically generated. DO NOT EDIT

export function DecryptAndServe(arg1, arg2) {
  return window['go']['main']['App']['DecryptAndServe'](arg1, arg2);
}

export function GetDemoManifest() {
  return window['go']['main']['App']['GetDemoManifest']();
}

export function GetManifest(arg1) {
  return window['go']['main']['App']['GetManifest'](arg1);
}

export function IsLicenseValid(arg1) {
  return window['go']['main']['App']['IsLicenseValid'](arg1);
}

export function LoadDemo() {
  return window['go']['main']['App']['LoadDemo']();
}
140  cmd/dapp-fm-app/frontend/wailsjs/go/models.ts  (executable file)

@@ -0,0 +1,140 @@
export namespace main {

  export class MediaAttachment {
    name: string;
    mime_type: string;
    size: number;
    url: string;

    static createFrom(source: any = {}) {
      return new MediaAttachment(source);
    }

    constructor(source: any = {}) {
      if ('string' === typeof source) source = JSON.parse(source);
      this.name = source["name"];
      this.mime_type = source["mime_type"];
      this.size = source["size"];
      this.url = source["url"];
    }
  }
  export class MediaResult {
    body: string;
    subject?: string;
    from?: string;
    attachments?: MediaAttachment[];

    static createFrom(source: any = {}) {
      return new MediaResult(source);
    }

    constructor(source: any = {}) {
      if ('string' === typeof source) source = JSON.parse(source);
      this.body = source["body"];
      this.subject = source["subject"];
      this.from = source["from"];
      this.attachments = this.convertValues(source["attachments"], MediaAttachment);
    }

    convertValues(a: any, classs: any, asMap: boolean = false): any {
      if (!a) {
        return a;
      }
      if (a.slice && a.map) {
        return (a as any[]).map(elem => this.convertValues(elem, classs));
      } else if ("object" === typeof a) {
        if (asMap) {
          for (const key of Object.keys(a)) {
            a[key] = new classs(a[key]);
          }
          return a;
        }
        return new classs(a);
      }
      return a;
    }
  }

}

export namespace player {

  export class TrackInfo {
    title: string;
    start: number;
    end?: number;
    type?: string;
    track_num?: number;

    static createFrom(source: any = {}) {
      return new TrackInfo(source);
    }

    constructor(source: any = {}) {
      if ('string' === typeof source) source = JSON.parse(source);
      this.title = source["title"];
      this.start = source["start"];
      this.end = source["end"];
      this.type = source["type"];
      this.track_num = source["track_num"];
    }
  }
  export class ManifestInfo {
    title?: string;
    artist?: string;
    album?: string;
    genre?: string;
    year?: number;
    release_type?: string;
    duration?: number;
    format?: string;
    expires_at?: number;
    issued_at?: number;
    license_type?: string;
    tracks?: TrackInfo[];
    is_expired: boolean;
    time_remaining?: string;

    static createFrom(source: any = {}) {
      return new ManifestInfo(source);
    }

    constructor(source: any = {}) {
      if ('string' === typeof source) source = JSON.parse(source);
      this.title = source["title"];
      this.artist = source["artist"];
      this.album = source["album"];
      this.genre = source["genre"];
      this.year = source["year"];
      this.release_type = source["release_type"];
      this.duration = source["duration"];
      this.format = source["format"];
      this.expires_at = source["expires_at"];
      this.issued_at = source["issued_at"];
      this.license_type = source["license_type"];
      this.tracks = this.convertValues(source["tracks"], TrackInfo);
      this.is_expired = source["is_expired"];
      this.time_remaining = source["time_remaining"];
    }

    convertValues(a: any, classs: any, asMap: boolean = false): any {
      if (!a) {
        return a;
      }
      if (a.slice && a.map) {
        return (a as any[]).map(elem => this.convertValues(elem, classs));
      } else if ("object" === typeof a) {
        if (asMap) {
          for (const key of Object.keys(a)) {
            a[key] = new classs(a[key]);
          }
          return a;
        }
        return new classs(a);
      }
      return a;
    }
  }

}
24  cmd/dapp-fm-app/frontend/wailsjs/runtime/package.json  (normal file)

@@ -0,0 +1,24 @@
{
  "name": "@wailsapp/runtime",
  "version": "2.0.0",
  "description": "Wails Javascript runtime library",
  "main": "runtime.js",
  "types": "runtime.d.ts",
  "scripts": {
  },
  "repository": {
    "type": "git",
    "url": "git+https://github.com/wailsapp/wails.git"
  },
  "keywords": [
    "Wails",
    "Javascript",
    "Go"
  ],
  "author": "Lea Anthony <lea.anthony@gmail.com>",
  "license": "MIT",
  "bugs": {
    "url": "https://github.com/wailsapp/wails/issues"
  },
  "homepage": "https://github.com/wailsapp/wails#readme"
}
249  cmd/dapp-fm-app/frontend/wailsjs/runtime/runtime.d.ts  (vendored, normal file)

@@ -0,0 +1,249 @@
/*
 _       __      _ __
| |     / /___ _(_) /____
| | /| / / __ `/ / / ___/
| |/ |/ / /_/ / / (__  )
|__/|__/\__,_/_/_/____/
The electron alternative for Go
(c) Lea Anthony 2019-present
*/

export interface Position {
  x: number;
  y: number;
}

export interface Size {
  w: number;
  h: number;
}

export interface Screen {
  isCurrent: boolean;
  isPrimary: boolean;
  width : number
  height : number
}

// Environment information such as platform, buildtype, ...
export interface EnvironmentInfo {
  buildType: string;
  platform: string;
  arch: string;
}

// [EventsEmit](https://wails.io/docs/reference/runtime/events#eventsemit)
// emits the given event. Optional data may be passed with the event.
// This will trigger any event listeners.
export function EventsEmit(eventName: string, ...data: any): void;

// [EventsOn](https://wails.io/docs/reference/runtime/events#eventson) sets up a listener for the given event name.
export function EventsOn(eventName: string, callback: (...data: any) => void): () => void;

// [EventsOnMultiple](https://wails.io/docs/reference/runtime/events#eventsonmultiple)
// sets up a listener for the given event name, but will only trigger a given number times.
export function EventsOnMultiple(eventName: string, callback: (...data: any) => void, maxCallbacks: number): () => void;

// [EventsOnce](https://wails.io/docs/reference/runtime/events#eventsonce)
// sets up a listener for the given event name, but will only trigger once.
export function EventsOnce(eventName: string, callback: (...data: any) => void): () => void;

// [EventsOff](https://wails.io/docs/reference/runtime/events#eventsoff)
// unregisters the listener for the given event name.
export function EventsOff(eventName: string, ...additionalEventNames: string[]): void;

// [EventsOffAll](https://wails.io/docs/reference/runtime/events#eventsoffall)
// unregisters all listeners.
export function EventsOffAll(): void;

// [LogPrint](https://wails.io/docs/reference/runtime/log#logprint)
// logs the given message as a raw message
export function LogPrint(message: string): void;

// [LogTrace](https://wails.io/docs/reference/runtime/log#logtrace)
// logs the given message at the `trace` log level.
export function LogTrace(message: string): void;

// [LogDebug](https://wails.io/docs/reference/runtime/log#logdebug)
// logs the given message at the `debug` log level.
export function LogDebug(message: string): void;

// [LogError](https://wails.io/docs/reference/runtime/log#logerror)
// logs the given message at the `error` log level.
export function LogError(message: string): void;

// [LogFatal](https://wails.io/docs/reference/runtime/log#logfatal)
// logs the given message at the `fatal` log level.
// The application will quit after calling this method.
export function LogFatal(message: string): void;

// [LogInfo](https://wails.io/docs/reference/runtime/log#loginfo)
// logs the given message at the `info` log level.
export function LogInfo(message: string): void;

// [LogWarning](https://wails.io/docs/reference/runtime/log#logwarning)
// logs the given message at the `warning` log level.
export function LogWarning(message: string): void;

// [WindowReload](https://wails.io/docs/reference/runtime/window#windowreload)
// Forces a reload by the main application as well as connected browsers.
export function WindowReload(): void;

// [WindowReloadApp](https://wails.io/docs/reference/runtime/window#windowreloadapp)
// Reloads the application frontend.
export function WindowReloadApp(): void;

// [WindowSetAlwaysOnTop](https://wails.io/docs/reference/runtime/window#windowsetalwaysontop)
// Sets the window AlwaysOnTop or not on top.
export function WindowSetAlwaysOnTop(b: boolean): void;

// [WindowSetSystemDefaultTheme](https://wails.io/docs/next/reference/runtime/window#windowsetsystemdefaulttheme)
// *Windows only*
// Sets window theme to system default (dark/light).
export function WindowSetSystemDefaultTheme(): void;

// [WindowSetLightTheme](https://wails.io/docs/next/reference/runtime/window#windowsetlighttheme)
// *Windows only*
// Sets window to light theme.
export function WindowSetLightTheme(): void;

// [WindowSetDarkTheme](https://wails.io/docs/next/reference/runtime/window#windowsetdarktheme)
// *Windows only*
// Sets window to dark theme.
export function WindowSetDarkTheme(): void;

// [WindowCenter](https://wails.io/docs/reference/runtime/window#windowcenter)
// Centers the window on the monitor the window is currently on.
export function WindowCenter(): void;

// [WindowSetTitle](https://wails.io/docs/reference/runtime/window#windowsettitle)
// Sets the text in the window title bar.
export function WindowSetTitle(title: string): void;

// [WindowFullscreen](https://wails.io/docs/reference/runtime/window#windowfullscreen)
|
||||||
|
// Makes the window full screen.
|
||||||
|
export function WindowFullscreen(): void;
|
||||||
|
|
||||||
|
// [WindowUnfullscreen](https://wails.io/docs/reference/runtime/window#windowunfullscreen)
|
||||||
|
// Restores the previous window dimensions and position prior to full screen.
|
||||||
|
export function WindowUnfullscreen(): void;
|
||||||
|
|
||||||
|
// [WindowIsFullscreen](https://wails.io/docs/reference/runtime/window#windowisfullscreen)
|
||||||
|
// Returns the state of the window, i.e. whether the window is in full screen mode or not.
|
||||||
|
export function WindowIsFullscreen(): Promise<boolean>;
|
||||||
|
|
||||||
|
// [WindowSetSize](https://wails.io/docs/reference/runtime/window#windowsetsize)
|
||||||
|
// Sets the width and height of the window.
|
||||||
|
export function WindowSetSize(width: number, height: number): void;
|
||||||
|
|
||||||
|
// [WindowGetSize](https://wails.io/docs/reference/runtime/window#windowgetsize)
|
||||||
|
// Gets the width and height of the window.
|
||||||
|
export function WindowGetSize(): Promise<Size>;
|
||||||
|
|
||||||
|
// [WindowSetMaxSize](https://wails.io/docs/reference/runtime/window#windowsetmaxsize)
|
||||||
|
// Sets the maximum window size. Will resize the window if the window is currently larger than the given dimensions.
|
||||||
|
// Setting a size of 0,0 will disable this constraint.
|
||||||
|
export function WindowSetMaxSize(width: number, height: number): void;
|
||||||
|
|
||||||
|
// [WindowSetMinSize](https://wails.io/docs/reference/runtime/window#windowsetminsize)
|
||||||
|
// Sets the minimum window size. Will resize the window if the window is currently smaller than the given dimensions.
|
||||||
|
// Setting a size of 0,0 will disable this constraint.
|
||||||
|
export function WindowSetMinSize(width: number, height: number): void;
|
||||||
|
|
||||||
|
// [WindowSetPosition](https://wails.io/docs/reference/runtime/window#windowsetposition)
|
||||||
|
// Sets the window position relative to the monitor the window is currently on.
|
||||||
|
export function WindowSetPosition(x: number, y: number): void;
|
||||||
|
|
||||||
|
// [WindowGetPosition](https://wails.io/docs/reference/runtime/window#windowgetposition)
|
||||||
|
// Gets the window position relative to the monitor the window is currently on.
|
||||||
|
export function WindowGetPosition(): Promise<Position>;
|
||||||
|
|
||||||
|
// [WindowHide](https://wails.io/docs/reference/runtime/window#windowhide)
|
||||||
|
// Hides the window.
|
||||||
|
export function WindowHide(): void;
|
||||||
|
|
||||||
|
// [WindowShow](https://wails.io/docs/reference/runtime/window#windowshow)
|
||||||
|
// Shows the window, if it is currently hidden.
|
||||||
|
export function WindowShow(): void;
|
||||||
|
|
||||||
|
// [WindowMaximise](https://wails.io/docs/reference/runtime/window#windowmaximise)
|
||||||
|
// Maximises the window to fill the screen.
|
||||||
|
export function WindowMaximise(): void;
|
||||||
|
|
||||||
|
// [WindowToggleMaximise](https://wails.io/docs/reference/runtime/window#windowtogglemaximise)
|
||||||
|
// Toggles between Maximised and UnMaximised.
|
||||||
|
export function WindowToggleMaximise(): void;
|
||||||
|
|
||||||
|
// [WindowUnmaximise](https://wails.io/docs/reference/runtime/window#windowunmaximise)
|
||||||
|
// Restores the window to the dimensions and position prior to maximising.
|
||||||
|
export function WindowUnmaximise(): void;
|
||||||
|
|
||||||
|
// [WindowIsMaximised](https://wails.io/docs/reference/runtime/window#windowismaximised)
|
||||||
|
// Returns the state of the window, i.e. whether the window is maximised or not.
|
||||||
|
export function WindowIsMaximised(): Promise<boolean>;
|
||||||
|
|
||||||
|
// [WindowMinimise](https://wails.io/docs/reference/runtime/window#windowminimise)
|
||||||
|
// Minimises the window.
|
||||||
|
export function WindowMinimise(): void;
|
||||||
|
|
||||||
|
// [WindowUnminimise](https://wails.io/docs/reference/runtime/window#windowunminimise)
|
||||||
|
// Restores the window to the dimensions and position prior to minimising.
|
||||||
|
export function WindowUnminimise(): void;
|
||||||
|
|
||||||
|
// [WindowIsMinimised](https://wails.io/docs/reference/runtime/window#windowisminimised)
|
||||||
|
// Returns the state of the window, i.e. whether the window is minimised or not.
|
||||||
|
export function WindowIsMinimised(): Promise<boolean>;
|
||||||
|
|
||||||
|
// [WindowIsNormal](https://wails.io/docs/reference/runtime/window#windowisnormal)
|
||||||
|
// Returns the state of the window, i.e. whether the window is normal or not.
|
||||||
|
export function WindowIsNormal(): Promise<boolean>;
|
||||||
|
|
||||||
|
// [WindowSetBackgroundColour](https://wails.io/docs/reference/runtime/window#windowsetbackgroundcolour)
|
||||||
|
// Sets the background colour of the window to the given RGBA colour definition. This colour will show through for all transparent pixels.
|
||||||
|
export function WindowSetBackgroundColour(R: number, G: number, B: number, A: number): void;
|
||||||
|
|
||||||
|
// [ScreenGetAll](https://wails.io/docs/reference/runtime/window#screengetall)
|
||||||
|
// Gets the all screens. Call this anew each time you want to refresh data from the underlying windowing system.
|
||||||
|
export function ScreenGetAll(): Promise<Screen[]>;
|
||||||
|
|
||||||
|
// [BrowserOpenURL](https://wails.io/docs/reference/runtime/browser#browseropenurl)
|
||||||
|
// Opens the given URL in the system browser.
|
||||||
|
export function BrowserOpenURL(url: string): void;
|
||||||
|
|
||||||
|
// [Environment](https://wails.io/docs/reference/runtime/intro#environment)
|
||||||
|
// Returns information about the environment
|
||||||
|
export function Environment(): Promise<EnvironmentInfo>;
|
||||||
|
|
||||||
|
// [Quit](https://wails.io/docs/reference/runtime/intro#quit)
|
||||||
|
// Quits the application.
|
||||||
|
export function Quit(): void;
|
||||||
|
|
||||||
|
// [Hide](https://wails.io/docs/reference/runtime/intro#hide)
|
||||||
|
// Hides the application.
|
||||||
|
export function Hide(): void;
|
||||||
|
|
||||||
|
// [Show](https://wails.io/docs/reference/runtime/intro#show)
|
||||||
|
// Shows the application.
|
||||||
|
export function Show(): void;
|
||||||
|
|
||||||
|
// [ClipboardGetText](https://wails.io/docs/reference/runtime/clipboard#clipboardgettext)
|
||||||
|
// Returns the current text stored on clipboard
|
||||||
|
export function ClipboardGetText(): Promise<string>;
|
||||||
|
|
||||||
|
// [ClipboardSetText](https://wails.io/docs/reference/runtime/clipboard#clipboardsettext)
|
||||||
|
// Sets a text on the clipboard
|
||||||
|
export function ClipboardSetText(text: string): Promise<boolean>;
|
||||||
|
|
||||||
|
// [OnFileDrop](https://wails.io/docs/reference/runtime/draganddrop#onfiledrop)
|
||||||
|
// OnFileDrop listens to drag and drop events and calls the callback with the coordinates of the drop and an array of path strings.
|
||||||
|
export function OnFileDrop(callback: (x: number, y: number ,paths: string[]) => void, useDropTarget: boolean) :void
|
||||||
|
|
||||||
|
// [OnFileDropOff](https://wails.io/docs/reference/runtime/draganddrop#dragandddropoff)
|
||||||
|
// OnFileDropOff removes the drag and drop listeners and handlers.
|
||||||
|
export function OnFileDropOff() :void
|
||||||
|
|
||||||
|
// Check if the file path resolver is available
|
||||||
|
export function CanResolveFilePaths(): boolean;
|
||||||
|
|
||||||
|
// Resolves file paths for an array of files
|
||||||
|
export function ResolveFilePaths(files: File[]): void
|
||||||
242
cmd/dapp-fm-app/frontend/wailsjs/runtime/runtime.js
Normal file

@ -0,0 +1,242 @@
/*
 _       __      _ __
| |     / /___ _(_) /____
| | /| / / __ `/ / / ___/
| |/ |/ / /_/ / / (__  )
|__/|__/\__,_/_/_/____/
The electron alternative for Go
(c) Lea Anthony 2019-present
*/

export function LogPrint(message) {
    window.runtime.LogPrint(message);
}

export function LogTrace(message) {
    window.runtime.LogTrace(message);
}

export function LogDebug(message) {
    window.runtime.LogDebug(message);
}

export function LogInfo(message) {
    window.runtime.LogInfo(message);
}

export function LogWarning(message) {
    window.runtime.LogWarning(message);
}

export function LogError(message) {
    window.runtime.LogError(message);
}

export function LogFatal(message) {
    window.runtime.LogFatal(message);
}

export function EventsOnMultiple(eventName, callback, maxCallbacks) {
    return window.runtime.EventsOnMultiple(eventName, callback, maxCallbacks);
}

export function EventsOn(eventName, callback) {
    return EventsOnMultiple(eventName, callback, -1);
}

export function EventsOff(eventName, ...additionalEventNames) {
    return window.runtime.EventsOff(eventName, ...additionalEventNames);
}

export function EventsOffAll() {
    return window.runtime.EventsOffAll();
}

export function EventsOnce(eventName, callback) {
    return EventsOnMultiple(eventName, callback, 1);
}

export function EventsEmit(eventName) {
    let args = [eventName].slice.call(arguments);
    return window.runtime.EventsEmit.apply(null, args);
}

export function WindowReload() {
    window.runtime.WindowReload();
}

export function WindowReloadApp() {
    window.runtime.WindowReloadApp();
}

export function WindowSetAlwaysOnTop(b) {
    window.runtime.WindowSetAlwaysOnTop(b);
}

export function WindowSetSystemDefaultTheme() {
    window.runtime.WindowSetSystemDefaultTheme();
}

export function WindowSetLightTheme() {
    window.runtime.WindowSetLightTheme();
}

export function WindowSetDarkTheme() {
    window.runtime.WindowSetDarkTheme();
}

export function WindowCenter() {
    window.runtime.WindowCenter();
}

export function WindowSetTitle(title) {
    window.runtime.WindowSetTitle(title);
}

export function WindowFullscreen() {
    window.runtime.WindowFullscreen();
}

export function WindowUnfullscreen() {
    window.runtime.WindowUnfullscreen();
}

export function WindowIsFullscreen() {
    return window.runtime.WindowIsFullscreen();
}

export function WindowGetSize() {
    return window.runtime.WindowGetSize();
}

export function WindowSetSize(width, height) {
    window.runtime.WindowSetSize(width, height);
}

export function WindowSetMaxSize(width, height) {
    window.runtime.WindowSetMaxSize(width, height);
}

export function WindowSetMinSize(width, height) {
    window.runtime.WindowSetMinSize(width, height);
}

export function WindowSetPosition(x, y) {
    window.runtime.WindowSetPosition(x, y);
}

export function WindowGetPosition() {
    return window.runtime.WindowGetPosition();
}

export function WindowHide() {
    window.runtime.WindowHide();
}

export function WindowShow() {
    window.runtime.WindowShow();
}

export function WindowMaximise() {
    window.runtime.WindowMaximise();
}

export function WindowToggleMaximise() {
    window.runtime.WindowToggleMaximise();
}

export function WindowUnmaximise() {
    window.runtime.WindowUnmaximise();
}

export function WindowIsMaximised() {
    return window.runtime.WindowIsMaximised();
}

export function WindowMinimise() {
    window.runtime.WindowMinimise();
}

export function WindowUnminimise() {
    window.runtime.WindowUnminimise();
}

export function WindowSetBackgroundColour(R, G, B, A) {
    window.runtime.WindowSetBackgroundColour(R, G, B, A);
}

export function ScreenGetAll() {
    return window.runtime.ScreenGetAll();
}

export function WindowIsMinimised() {
    return window.runtime.WindowIsMinimised();
}

export function WindowIsNormal() {
    return window.runtime.WindowIsNormal();
}

export function BrowserOpenURL(url) {
    window.runtime.BrowserOpenURL(url);
}

export function Environment() {
    return window.runtime.Environment();
}

export function Quit() {
    window.runtime.Quit();
}

export function Hide() {
    window.runtime.Hide();
}

export function Show() {
    window.runtime.Show();
}

export function ClipboardGetText() {
    return window.runtime.ClipboardGetText();
}

export function ClipboardSetText(text) {
    return window.runtime.ClipboardSetText(text);
}

/**
 * Callback for OnFileDrop returns a slice of file path strings when a drop is finished.
 *
 * @export
 * @callback OnFileDropCallback
 * @param {number} x - x coordinate of the drop
 * @param {number} y - y coordinate of the drop
 * @param {string[]} paths - A list of file paths.
 */

/**
 * OnFileDrop listens to drag and drop events and calls the callback with the coordinates of the drop and an array of path strings.
 *
 * @export
 * @param {OnFileDropCallback} callback - Callback for OnFileDrop returns a slice of file path strings when a drop is finished.
 * @param {boolean} [useDropTarget=true] - Only call the callback when the drop finished on an element that has the drop target style. (--wails-drop-target)
 */
export function OnFileDrop(callback, useDropTarget) {
    return window.runtime.OnFileDrop(callback, useDropTarget);
}

/**
 * OnFileDropOff removes the drag and drop listeners and handlers.
 */
export function OnFileDropOff() {
    return window.runtime.OnFileDropOff();
}

export function CanResolveFilePaths() {
    return window.runtime.CanResolveFilePaths();
}

export function ResolveFilePaths(files) {
    return window.runtime.ResolveFilePaths(files);
}
322
cmd/dapp-fm-app/main.go
Normal file

@ -0,0 +1,322 @@
// dapp-fm-app is a native desktop media player for dapp.fm
// Decryption in Go, media served via Wails asset handler (same origin, no CORS)
package main

import (
	"bytes"
	"context"
	"embed"
	"encoding/base64"
	"fmt"
	"io/fs"
	"net/http"
	"strconv"
	"strings"
	"sync"
	"time"

	"forge.lthn.ai/Snider/Borg/pkg/player"
	"forge.lthn.ai/Snider/Borg/pkg/smsg"
	"github.com/wailsapp/wails/v2"
	"github.com/wailsapp/wails/v2/pkg/options"
	"github.com/wailsapp/wails/v2/pkg/options/assetserver"
)

//go:embed frontend
var frontendAssets embed.FS

// MediaStore holds decrypted media in memory
type MediaStore struct {
	mu    sync.RWMutex
	media map[string]*MediaItem
}

type MediaItem struct {
	Data     []byte
	MimeType string
	Name     string
}

var globalStore = &MediaStore{media: make(map[string]*MediaItem)}

func (s *MediaStore) Set(id string, item *MediaItem) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.media[id] = item
}

func (s *MediaStore) Get(id string) *MediaItem {
	s.mu.RLock()
	defer s.mu.RUnlock()
	return s.media[id]
}

func (s *MediaStore) Clear() {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.media = make(map[string]*MediaItem)
}

// AssetHandler serves both static assets and decrypted media
type AssetHandler struct {
	assets fs.FS
}

func (h *AssetHandler) ServeHTTP(w http.ResponseWriter, r *http.Request) {
	path := r.URL.Path
	if path == "/" {
		path = "/index.html"
	}
	path = strings.TrimPrefix(path, "/")

	// Check if this is a media request
	if strings.HasPrefix(path, "media/") {
		id := strings.TrimPrefix(path, "media/")
		item := globalStore.Get(id)
		if item == nil {
			http.NotFound(w, r)
			return
		}

		// Serve with range support for seeking
		w.Header().Set("Content-Type", item.MimeType)
		w.Header().Set("Accept-Ranges", "bytes")
		w.Header().Set("Content-Length", strconv.Itoa(len(item.Data)))

		rangeHeader := r.Header.Get("Range")
		if rangeHeader != "" && strings.HasPrefix(rangeHeader, "bytes=") {
			rangeHeader = strings.TrimPrefix(rangeHeader, "bytes=")
			parts := strings.Split(rangeHeader, "-")
			start, _ := strconv.Atoi(parts[0])
			end := len(item.Data) - 1
			if len(parts) > 1 && parts[1] != "" {
				end, _ = strconv.Atoi(parts[1])
			}
			if end >= len(item.Data) {
				end = len(item.Data) - 1
			}
			if start > end || start >= len(item.Data) {
				http.Error(w, "Invalid range", http.StatusRequestedRangeNotSatisfiable)
				return
			}

			w.Header().Set("Content-Range", fmt.Sprintf("bytes %d-%d/%d", start, end, len(item.Data)))
			w.Header().Set("Content-Length", strconv.Itoa(end-start+1))
			w.WriteHeader(http.StatusPartialContent)
			w.Write(item.Data[start : end+1])
			return
		}

		http.ServeContent(w, r, item.Name, time.Time{}, bytes.NewReader(item.Data))
		return
	}

	// Serve static assets
	data, err := fs.ReadFile(h.assets, "frontend/"+path)
	if err != nil {
		http.NotFound(w, r)
		return
	}

	// Set content type
	switch {
	case strings.HasSuffix(path, ".html"):
		w.Header().Set("Content-Type", "text/html; charset=utf-8")
	case strings.HasSuffix(path, ".js"):
		w.Header().Set("Content-Type", "application/javascript")
	case strings.HasSuffix(path, ".css"):
		w.Header().Set("Content-Type", "text/css")
	case strings.HasSuffix(path, ".wasm"):
		w.Header().Set("Content-Type", "application/wasm")
	}

	w.Write(data)
}

// App wraps player functionality
type App struct {
	ctx    context.Context
	player *player.Player
}

func NewApp() *App {
	return &App{
		player: player.NewPlayer(),
	}
}

func (a *App) Startup(ctx context.Context) {
	a.ctx = ctx
	a.player.Startup(ctx)
}

// MediaResult holds URLs for playback
type MediaResult struct {
	Body        string            `json:"body"`
	Subject     string            `json:"subject,omitempty"`
	From        string            `json:"from,omitempty"`
	Attachments []MediaAttachment `json:"attachments,omitempty"`
}

type MediaAttachment struct {
	Name     string `json:"name"`
	MimeType string `json:"mime_type"`
	Size     int    `json:"size"`
	URL      string `json:"url"` // /media/0, /media/1, etc.
}

// LoadDemo decrypts the demo and stores it in memory for streaming
func (a *App) LoadDemo() (*MediaResult, error) {
	globalStore.Clear()

	// Read demo from embedded filesystem
	demoBytes, err := fs.ReadFile(frontendAssets, "frontend/demo-track.smsg")
	if err != nil {
		return nil, fmt.Errorf("demo not found: %w", err)
	}

	// Decrypt
	msg, err := smsg.Decrypt(demoBytes, "dapp-fm-2024")
	if err != nil {
		return nil, fmt.Errorf("decrypt failed: %w", err)
	}

	result := &MediaResult{
		Body:    msg.Body,
		Subject: msg.Subject,
		From:    msg.From,
	}

	for i, att := range msg.Attachments {
		// Decode base64 to raw bytes
		data, err := base64.StdEncoding.DecodeString(att.Content)
		if err != nil {
			continue
		}

		// Store in memory
		id := strconv.Itoa(i)
		globalStore.Set(id, &MediaItem{
			Data:     data,
			MimeType: att.MimeType,
			Name:     att.Name,
		})

		result.Attachments = append(result.Attachments, MediaAttachment{
			Name:     att.Name,
			MimeType: att.MimeType,
			Size:     len(data),
			URL:      "/media/" + id,
		})
	}

	return result, nil
}

// GetDemoManifest returns the manifest without decrypting
func (a *App) GetDemoManifest() (*player.ManifestInfo, error) {
	demoBytes, err := fs.ReadFile(frontendAssets, "frontend/demo-track.smsg")
	if err != nil {
		return nil, fmt.Errorf("demo not found: %w", err)
	}

	info, err := smsg.GetInfo(demoBytes)
	if err != nil {
		return nil, err
	}

	result := &player.ManifestInfo{}
	if info.Manifest != nil {
		m := info.Manifest
		result.Title = m.Title
		result.Artist = m.Artist
		result.Album = m.Album
		result.ReleaseType = m.ReleaseType
		result.Format = m.Format
		result.LicenseType = m.LicenseType

		for _, t := range m.Tracks {
			result.Tracks = append(result.Tracks, player.TrackInfo{
				Title:    t.Title,
				Start:    t.Start,
				End:      t.End,
				TrackNum: t.TrackNum,
			})
		}
	}

	return result, nil
}

// DecryptAndServe decrypts user-provided content and serves it via the asset handler
func (a *App) DecryptAndServe(encrypted string, password string) (*MediaResult, error) {
	globalStore.Clear()

	// Decrypt via smsg (handles base64 input)
	msg, err := smsg.DecryptBase64(encrypted, password)
	if err != nil {
		return nil, fmt.Errorf("decrypt failed: %w", err)
	}

	result := &MediaResult{
		Body:    msg.Body,
		Subject: msg.Subject,
		From:    msg.From,
	}

	for i, att := range msg.Attachments {
		data, err := base64.StdEncoding.DecodeString(att.Content)
		if err != nil {
			continue
		}

		id := strconv.Itoa(i)
		globalStore.Set(id, &MediaItem{
			Data:     data,
			MimeType: att.MimeType,
			Name:     att.Name,
		})

		result.Attachments = append(result.Attachments, MediaAttachment{
			Name:     att.Name,
			MimeType: att.MimeType,
			Size:     len(data),
			URL:      "/media/" + id,
		})
	}

	return result, nil
}

// Proxy methods
func (a *App) GetManifest(encrypted string) (*player.ManifestInfo, error) {
	return a.player.GetManifest(encrypted)
}

func (a *App) IsLicenseValid(encrypted string) (bool, error) {
	return a.player.IsLicenseValid(encrypted)
}

func main() {
	app := NewApp()

	err := wails.Run(&options.App{
		Title:     "dapp.fm Player",
		Width:     1200,
		Height:    800,
		MinWidth:  800,
		MinHeight: 600,
		AssetServer: &assetserver.Options{
			Handler: &AssetHandler{assets: frontendAssets},
		},
		BackgroundColour: &options.RGBA{R: 18, G: 18, B: 18, A: 1},
		OnStartup:        app.Startup,
		Bind: []interface{}{
			app,
		},
	})

	if err != nil {
		println("Error:", err.Error())
	}
}
20
cmd/dapp-fm-app/wails.json
Normal file

@ -0,0 +1,20 @@
{
  "$schema": "https://wails.io/schemas/config.v2.json",
  "name": "dapp-fm",
  "outputfilename": "dapp-fm",
  "frontend:install": "",
  "frontend:build": "",
  "frontend:dev:watcher": "",
  "frontend:dev:serverUrl": "",
  "author": {
    "name": "dapp.fm",
    "email": "hello@dapp.fm"
  },
  "info": {
    "companyName": "dapp.fm",
    "productName": "dapp.fm Player",
    "productVersion": "1.0.0",
    "copyright": "Copyright (c) 2024 dapp.fm - EUPL-1.2",
    "comments": "Decentralized Music Distribution - Zero-Trust DRM"
  }
}
64
cmd/dapp-fm/main.go
Normal file

@ -0,0 +1,64 @@
// dapp-fm CLI provides headless media player functionality
// For native desktop app with WebView, use dapp-fm-app instead
package main

import (
	"fmt"
	"os"

	"forge.lthn.ai/Snider/Borg/pkg/player"
	"github.com/spf13/cobra"
)

func main() {
	rootCmd := &cobra.Command{
		Use:   "dapp-fm",
		Short: "dapp.fm - Decentralized Music Player CLI",
		Long: `dapp-fm is the CLI version of the dapp.fm player.

For the native desktop app with WebView, use dapp-fm-app instead.
This CLI provides HTTP server mode for automation and fallback scenarios.`,
	}

	serveCmd := &cobra.Command{
		Use:   "serve",
		Short: "Start HTTP server for the media player",
		Long: `Starts an HTTP server serving the media player interface.
This is the slower TCP path - for memory-speed decryption, use dapp-fm-app.`,
		RunE: func(cmd *cobra.Command, args []string) error {
			port, _ := cmd.Flags().GetString("port")
			openBrowser, _ := cmd.Flags().GetBool("open")

			p := player.NewPlayer()

			addr := ":" + port
			if openBrowser {
				fmt.Printf("Opening browser at http://localhost%s\n", addr)
				// Would need browser opener here
			}

			return p.Serve(addr)
		},
	}

	serveCmd.Flags().StringP("port", "p", "8080", "Port to serve on")
	serveCmd.Flags().Bool("open", false, "Open browser automatically")

	versionCmd := &cobra.Command{
		Use:   "version",
		Short: "Print version information",
		Run: func(cmd *cobra.Command, args []string) {
			fmt.Println("dapp-fm v1.0.0")
			fmt.Println("Decentralized Music Distribution")
			fmt.Println("https://dapp.fm")
		},
	}

	rootCmd.AddCommand(serveCmd)
	rootCmd.AddCommand(versionCmd)

	if err := rootCmd.Execute(); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
}
@ -5,9 +5,9 @@ import (
 	"os"
 	"strings"

-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
-	"github.com/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
-	trixsdk "github.com/Snider/Enchantrix/pkg/trix"
+	trixsdk "forge.lthn.ai/Snider/Enchantrix/pkg/trix"
 	"github.com/spf13/cobra"
 )
@@ -5,8 +5,8 @@ import (
 	"path/filepath"
 	"testing"

-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/trix"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/trix"
 )

 func TestDecodeCmd(t *testing.T) {
70
cmd/extract-demo/main.go
Normal file
@@ -0,0 +1,70 @@
// extract-demo extracts the video from a v2 SMSG file
package main

import (
	"encoding/base64"
	"fmt"
	"os"

	"forge.lthn.ai/Snider/Borg/pkg/smsg"
)

func main() {
	if len(os.Args) < 4 {
		fmt.Println("Usage: extract-demo <input.smsg> <password> <output.mp4>")
		os.Exit(1)
	}

	inputFile := os.Args[1]
	password := os.Args[2]
	outputFile := os.Args[3]

	data, err := os.ReadFile(inputFile)
	if err != nil {
		fmt.Printf("Failed to read: %v\n", err)
		os.Exit(1)
	}

	// Get info first
	info, err := smsg.GetInfo(data)
	if err != nil {
		fmt.Printf("Failed to get info: %v\n", err)
		os.Exit(1)
	}
	fmt.Printf("Format: %s, Compression: %s\n", info.Format, info.Compression)

	// Decrypt
	msg, err := smsg.Decrypt(data, password)
	if err != nil {
		fmt.Printf("Failed to decrypt: %v\n", err)
		os.Exit(1)
	}

	fmt.Printf("Body: %s...\n", msg.Body[:min(50, len(msg.Body))])
	fmt.Printf("Attachments: %d\n", len(msg.Attachments))

	if len(msg.Attachments) > 0 {
		att := msg.Attachments[0]
		fmt.Printf("  Name: %s, MIME: %s, Size: %d\n", att.Name, att.MimeType, att.Size)

		// Decode and save
		decoded, err := base64.StdEncoding.DecodeString(att.Content)
		if err != nil {
			fmt.Printf("Failed to decode: %v\n", err)
			os.Exit(1)
		}

		if err := os.WriteFile(outputFile, decoded, 0644); err != nil {
			fmt.Printf("Failed to save: %v\n", err)
			os.Exit(1)
		}
		fmt.Printf("Saved to %s (%d bytes)\n", outputFile, len(decoded))
	}
}

func min(a, b int) int {
	if a < b {
		return a
	}
	return b
}
@@ -6,7 +6,7 @@ import (
 	"os"
 	"strings"

-	trixsdk "github.com/Snider/Enchantrix/pkg/trix"
+	trixsdk "forge.lthn.ai/Snider/Enchantrix/pkg/trix"
 	"github.com/spf13/cobra"
 )
194
cmd/integration_test.go
Normal file
@@ -0,0 +1,194 @@
package cmd

import (
	"bytes"
	"io"
	"os"
	"path/filepath"
	"testing"
)

// TestFullPipeline_Good exercises the complete streaming pipeline end-to-end
// with realistic directory contents including nested dirs, a large file that
// crosses the AEAD block boundary, valid and broken symlinks, and a hidden file.
// Each compression mode (none, gz, xz) is tested as a subtest.
func TestFullPipeline_Good(t *testing.T) {
	if testing.Short() {
		t.Skip("skipping integration test in short mode")
	}

	// Build a realistic source directory.
	srcDir := t.TempDir()

	// Regular files at root level.
	writeFile(t, srcDir, "readme.md", "# My Project\n\nA description.\n")
	writeFile(t, srcDir, "config.json", `{"version":"1.0","debug":false}`)

	// Nested directories with source code.
	mkdirAll(t, srcDir, "src")
	mkdirAll(t, srcDir, "src/pkg")
	writeFile(t, srcDir, "src/main.go", "package main\n\nimport \"fmt\"\n\nfunc main() {\n\tfmt.Println(\"hello\")\n}\n")
	writeFile(t, srcDir, "src/pkg/lib.go", "package pkg\n\n// Lib is a library function.\nfunc Lib() string { return \"lib\" }\n")

	// Large file: 1 MiB + 1 byte - crosses the 64 KiB block boundary used by
	// the chunked AEAD streaming encryption. Fill with a deterministic pattern
	// so we can verify content after round-trip.
	const largeSize = 1024*1024 + 1
	largeContent := make([]byte, largeSize)
	for i := range largeContent {
		largeContent[i] = byte(i % 251) // prime mod for non-trivial pattern
	}
	writeFileBytes(t, srcDir, "large.bin", largeContent)

	// Valid symlink pointing at a relative target.
	if err := os.Symlink("readme.md", filepath.Join(srcDir, "link-to-readme")); err != nil {
		t.Fatalf("failed to create valid symlink: %v", err)
	}

	// Broken symlink pointing at a nonexistent absolute path.
	if err := os.Symlink("/nonexistent/target", filepath.Join(srcDir, "broken-link")); err != nil {
		t.Fatalf("failed to create broken symlink: %v", err)
	}

	// Hidden file (dot-prefixed).
	writeFile(t, srcDir, ".hidden", "secret stuff\n")

	// Run each compression mode as a subtest.
	modes := []string{"none", "gz", "xz"}
	for _, comp := range modes {
		comp := comp // capture
		t.Run("compression="+comp, func(t *testing.T) {
			outDir := t.TempDir()
			outFile := filepath.Join(outDir, "pipeline-"+comp+".stim")
			password := "integration-test-pw-" + comp

			// Step 1: Collect (walk -> tar -> compress -> encrypt -> file).
			if err := CollectLocalStreaming(srcDir, outFile, comp, password); err != nil {
				t.Fatalf("CollectLocalStreaming(%q) error = %v", comp, err)
			}

			// Step 2: Verify output exists and is non-empty.
			info, err := os.Stat(outFile)
			if err != nil {
				t.Fatalf("output file does not exist: %v", err)
			}
			if info.Size() == 0 {
				t.Fatal("output file is empty")
			}

			// Step 3: Decrypt back into a DataNode.
			dn, err := DecryptStimV2(outFile, password)
			if err != nil {
				t.Fatalf("DecryptStimV2() error = %v", err)
			}

			// Step 4: Verify all regular files exist in the DataNode.
			expectedFiles := []string{
				"readme.md",
				"config.json",
				"src/main.go",
				"src/pkg/lib.go",
				"large.bin",
				".hidden",
			}
			for _, name := range expectedFiles {
				exists, eerr := dn.Exists(name)
				if eerr != nil {
					t.Errorf("Exists(%q) error = %v", name, eerr)
					continue
				}
				if !exists {
					t.Errorf("expected file %q in DataNode but it is missing", name)
				}
			}

			// Verify the valid symlink was included.
			linkExists, _ := dn.Exists("link-to-readme")
			if !linkExists {
				t.Error("expected symlink link-to-readme in DataNode but it is missing")
			}

			// Step 5: Verify large file has correct content (first byte check).
			f, err := dn.Open("large.bin")
			if err != nil {
				t.Fatalf("Open(large.bin) error = %v", err)
			}
			defer f.Close()

			// Read the entire large file and verify size and first byte.
			allData, err := io.ReadAll(f)
			if err != nil {
				t.Fatalf("reading large.bin: %v", err)
			}
			if len(allData) != largeSize {
				t.Errorf("large.bin size = %d, want %d", len(allData), largeSize)
			}
			if len(allData) > 0 && allData[0] != byte(0%251) {
				t.Errorf("large.bin first byte = %d, want %d", allData[0], byte(0%251))
			}

			// Verify content integrity of the whole large file.
			if !bytes.Equal(allData, largeContent) {
				t.Error("large.bin content does not match original after round-trip")
			}

			// Step 6: Verify broken symlink was skipped.
			brokenExists, _ := dn.Exists("broken-link")
			if brokenExists {
				t.Error("broken symlink should have been skipped but was found in DataNode")
			}
		})
	}
}

// TestFullPipeline_WrongPassword_Bad encrypts with one password and attempts
// to decrypt with a different password, verifying that an error is returned.
func TestFullPipeline_WrongPassword_Bad(t *testing.T) {
	if testing.Short() {
		t.Skip("skipping integration test in short mode")
	}

	srcDir := t.TempDir()
	outDir := t.TempDir()

	writeFile(t, srcDir, "secret.txt", "this is confidential\n")

	outFile := filepath.Join(outDir, "wrong-pw.stim")

	// Encrypt with the correct password.
	if err := CollectLocalStreaming(srcDir, outFile, "none", "correct-password"); err != nil {
		t.Fatalf("CollectLocalStreaming() error = %v", err)
	}

	// Attempt to decrypt with the wrong password.
	_, err := DecryptStimV2(outFile, "wrong-password")
	if err == nil {
		t.Fatal("expected error when decrypting with wrong password, got nil")
	}
}

// --- helpers ---

func writeFile(t *testing.T, base, rel, content string) {
	t.Helper()
	path := filepath.Join(base, rel)
	if err := os.WriteFile(path, []byte(content), 0644); err != nil {
		t.Fatalf("failed to write %s: %v", rel, err)
	}
}

func writeFileBytes(t *testing.T, base, rel string, data []byte) {
	t.Helper()
	path := filepath.Join(base, rel)
	if err := os.WriteFile(path, data, 0644); err != nil {
		t.Fatalf("failed to write %s: %v", rel, err)
	}
}

func mkdirAll(t *testing.T, base, rel string) {
	t.Helper()
	path := filepath.Join(base, rel)
	if err := os.MkdirAll(path, 0755); err != nil {
		t.Fatalf("failed to mkdir %s: %v", rel, err)
	}
}
226
cmd/mkdemo-abr/main.go
Normal file
@@ -0,0 +1,226 @@
// mkdemo-abr creates an ABR (Adaptive Bitrate) demo set from a source video.
// It uses ffmpeg to transcode to multiple bitrates, then encrypts each as v3 chunked SMSG.
//
// Usage: mkdemo-abr <input-video> <output-dir> [password]
//
// Output:
//
//	output-dir/manifest.json    - ABR manifest listing all variants
//	output-dir/track-1080p.smsg - 1080p variant (5 Mbps)
//	output-dir/track-720p.smsg  - 720p variant (2.5 Mbps)
//	output-dir/track-480p.smsg  - 480p variant (1 Mbps)
//	output-dir/track-360p.smsg  - 360p variant (500 Kbps)
package main

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"os"
	"os/exec"
	"path/filepath"
	"strings"

	"forge.lthn.ai/Snider/Borg/pkg/smsg"
)

// Preset defines a quality level for transcoding
type Preset struct {
	Name    string
	Width   int
	Height  int
	Bitrate string // For ffmpeg (e.g., "5M")
	BPS     int    // Bits per second for manifest
}

// Default presets matching ABRPresets in types.go
var presets = []Preset{
	{"1080p", 1920, 1080, "5M", 5000000},
	{"720p", 1280, 720, "2.5M", 2500000},
	{"480p", 854, 480, "1M", 1000000},
	{"360p", 640, 360, "500K", 500000},
}

func main() {
	if len(os.Args) < 3 {
		fmt.Println("Usage: mkdemo-abr <input-video> <output-dir> [password]")
		fmt.Println()
		fmt.Println("Creates ABR variant set from source video using ffmpeg.")
		fmt.Println()
		fmt.Println("Output:")
		fmt.Println("  output-dir/manifest.json    - ABR manifest")
		fmt.Println("  output-dir/track-1080p.smsg - 1080p (5 Mbps)")
		fmt.Println("  output-dir/track-720p.smsg  - 720p (2.5 Mbps)")
		fmt.Println("  output-dir/track-480p.smsg  - 480p (1 Mbps)")
		fmt.Println("  output-dir/track-360p.smsg  - 360p (500 Kbps)")
		os.Exit(1)
	}

	inputFile := os.Args[1]
	outputDir := os.Args[2]

	// Check ffmpeg is available
	if _, err := exec.LookPath("ffmpeg"); err != nil {
		fmt.Println("Error: ffmpeg not found in PATH")
		fmt.Println("Install ffmpeg: https://ffmpeg.org/download.html")
		os.Exit(1)
	}

	// Generate or use provided password
	var password string
	if len(os.Args) > 3 {
		password = os.Args[3]
	} else {
		passwordBytes := make([]byte, 24)
		if _, err := rand.Read(passwordBytes); err != nil {
			fmt.Printf("Failed to generate password: %v\n", err)
			os.Exit(1)
		}
		password = base64.RawURLEncoding.EncodeToString(passwordBytes)
	}

	// Create output directory
	if err := os.MkdirAll(outputDir, 0755); err != nil {
		fmt.Printf("Failed to create output directory: %v\n", err)
		os.Exit(1)
	}

	// Get title from input filename
	title := filepath.Base(inputFile)
	ext := filepath.Ext(title)
	if ext != "" {
		title = title[:len(title)-len(ext)]
	}

	// Create ABR manifest
	manifest := smsg.NewABRManifest(title)

	fmt.Printf("Creating ABR variants for: %s\n", inputFile)
	fmt.Printf("Output directory: %s\n", outputDir)
	fmt.Printf("Password: %s\n\n", password)

	// Process each preset
	for _, preset := range presets {
		fmt.Printf("Processing %s (%dx%d @ %s)...\n", preset.Name, preset.Width, preset.Height, preset.Bitrate)

		// Step 1: Transcode with ffmpeg
		tempFile := filepath.Join(outputDir, fmt.Sprintf("temp-%s.mp4", preset.Name))
		if err := transcode(inputFile, tempFile, preset); err != nil {
			fmt.Printf("  Warning: Transcode failed for %s: %v\n", preset.Name, err)
			fmt.Printf("  Skipping this variant...\n")
			continue
		}

		// Step 2: Read transcoded file
		content, err := os.ReadFile(tempFile)
		if err != nil {
			fmt.Printf("  Error reading transcoded file: %v\n", err)
			os.Remove(tempFile)
			continue
		}

		// Step 3: Create SMSG message
		msg := smsg.NewMessage("dapp.fm ABR Demo")
		msg.Subject = fmt.Sprintf("%s - %s", title, preset.Name)
		msg.From = "dapp.fm"
		msg.AddBinaryAttachment(
			fmt.Sprintf("%s-%s.mp4", strings.ReplaceAll(title, " ", "_"), preset.Name),
			content,
			"video/mp4",
		)

		// Step 4: Create manifest for this variant
		variantManifest := smsg.NewManifest(title)
		variantManifest.LicenseType = "perpetual"
		variantManifest.Format = "dapp.fm/abr-v1"

		// Step 5: Encrypt with v3 chunked format
		params := &smsg.StreamParams{
			License:   password,
			ChunkSize: smsg.DefaultChunkSize, // 1MB chunks
		}

		encrypted, err := smsg.EncryptV3(msg, params, variantManifest)
		if err != nil {
			fmt.Printf("  Error encrypting: %v\n", err)
			os.Remove(tempFile)
			continue
		}

		// Step 6: Write SMSG file
		smsgFile := filepath.Join(outputDir, fmt.Sprintf("track-%s.smsg", preset.Name))
		if err := os.WriteFile(smsgFile, encrypted, 0644); err != nil {
			fmt.Printf("  Error writing SMSG: %v\n", err)
			os.Remove(tempFile)
			continue
		}

		// Step 7: Get chunk count from header
		header, err := smsg.GetV3Header(encrypted)
		if err != nil {
			fmt.Printf("  Warning: Could not read header: %v\n", err)
		}
		chunkCount := 0
		if header != nil && header.Chunked != nil {
			chunkCount = header.Chunked.TotalChunks
		}

		// Step 8: Add variant to manifest
		variant := smsg.Variant{
			Name:       preset.Name,
			Bandwidth:  preset.BPS,
			Width:      preset.Width,
			Height:     preset.Height,
			Codecs:     "avc1.640028,mp4a.40.2",
			URL:        fmt.Sprintf("track-%s.smsg", preset.Name),
			ChunkCount: chunkCount,
			FileSize:   int64(len(encrypted)),
		}
		manifest.AddVariant(variant)

		// Clean up temp file
		os.Remove(tempFile)

		fmt.Printf("  Created: %s (%d bytes, %d chunks)\n", smsgFile, len(encrypted), chunkCount)
	}

	if len(manifest.Variants) == 0 {
		fmt.Println("\nError: No variants created. Check ffmpeg output.")
		os.Exit(1)
	}

	// Write ABR manifest
	manifestPath := filepath.Join(outputDir, "manifest.json")
	if err := smsg.WriteABRManifest(manifest, manifestPath); err != nil {
		fmt.Printf("Failed to write manifest: %v\n", err)
		os.Exit(1)
	}

	fmt.Printf("\n✓ Created ABR manifest: %s\n", manifestPath)
	fmt.Printf("✓ Variants: %d\n", len(manifest.Variants))
	fmt.Printf("✓ Default: %s\n", manifest.Variants[manifest.DefaultIdx].Name)
	fmt.Printf("\nMaster Password: %s\n", password)
	fmt.Println("\nStore this password securely - it decrypts ALL variants!")
}

// transcode uses ffmpeg to transcode the input to the specified preset
func transcode(input, output string, preset Preset) error {
	args := []string{
		"-i", input,
		"-vf", fmt.Sprintf("scale=%d:%d:force_original_aspect_ratio=decrease,pad=%d:%d:(ow-iw)/2:(oh-ih)/2",
			preset.Width, preset.Height, preset.Width, preset.Height),
		"-c:v", "libx264",
		"-preset", "medium",
		"-b:v", preset.Bitrate,
		"-c:a", "aac",
		"-b:a", "128k",
		"-movflags", "+faststart",
		"-y", // Overwrite output
		output,
	}

	cmd := exec.Command("ffmpeg", args...)
	cmd.Stderr = os.Stderr // Show ffmpeg output for debugging

	return cmd.Run()
}
129
cmd/mkdemo-v3/main.go
Normal file
@@ -0,0 +1,129 @@
// mkdemo-v3 creates a v3 chunked SMSG file for streaming demos
package main

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"os"
	"path/filepath"

	"forge.lthn.ai/Snider/Borg/pkg/smsg"
)

func main() {
	if len(os.Args) < 3 {
		fmt.Println("Usage: mkdemo-v3 <input-media-file> <output-smsg-file> [license] [chunk-size-kb]")
		fmt.Println("")
		fmt.Println("Creates a v3 chunked SMSG file for streaming demos.")
		fmt.Println("V3 uses rolling keys derived from: LTHN(date:license:fingerprint)")
		fmt.Println("")
		fmt.Println("Options:")
		fmt.Println("  license        The license key (default: auto-generated)")
		fmt.Println("  chunk-size-kb  Chunk size in KB (default: 512)")
		fmt.Println("")
		fmt.Println("Note: V3 files work for 24-48 hours from creation (rolling keys).")
		os.Exit(1)
	}

	inputFile := os.Args[1]
	outputFile := os.Args[2]

	// Read input file
	content, err := os.ReadFile(inputFile)
	if err != nil {
		fmt.Printf("Failed to read input file: %v\n", err)
		os.Exit(1)
	}

	// License (acts as password in v3)
	var license string
	if len(os.Args) > 3 {
		license = os.Args[3]
	} else {
		// Generate cryptographically secure license
		licenseBytes := make([]byte, 24)
		if _, err := rand.Read(licenseBytes); err != nil {
			fmt.Printf("Failed to generate license: %v\n", err)
			os.Exit(1)
		}
		license = base64.RawURLEncoding.EncodeToString(licenseBytes)
	}

	// Chunk size (default 512KB for good streaming granularity)
	chunkSize := 512 * 1024
	if len(os.Args) > 4 {
		var chunkKB int
		if _, err := fmt.Sscanf(os.Args[4], "%d", &chunkKB); err == nil && chunkKB > 0 {
			chunkSize = chunkKB * 1024
		}
	}

	// Create manifest
	title := filepath.Base(inputFile)
	ext := filepath.Ext(title)
	if ext != "" {
		title = title[:len(title)-len(ext)]
	}
	manifest := smsg.NewManifest(title)
	manifest.LicenseType = "streaming"
	manifest.Format = "dapp.fm/v3-chunked"

	// Detect MIME type
	mimeType := "video/mp4"
	switch ext {
	case ".mp3":
		mimeType = "audio/mpeg"
	case ".wav":
		mimeType = "audio/wav"
	case ".flac":
		mimeType = "audio/flac"
	case ".webm":
		mimeType = "video/webm"
	case ".ogg":
		mimeType = "audio/ogg"
	}

	// Create message with attachment
	msg := smsg.NewMessage("dapp.fm V3 Streaming Demo - Decrypt-while-downloading enabled")
	msg.Subject = "V3 Chunked Streaming"
	msg.From = "dapp.fm"
	msg.AddBinaryAttachment(
		filepath.Base(inputFile),
		content,
		mimeType,
	)

	// Create stream params with chunking enabled
	params := &smsg.StreamParams{
		License:     license,
		Fingerprint: "", // Empty for demo (works for any device)
		Cadence:     smsg.CadenceDaily,
		ChunkSize:   chunkSize,
	}

	// Encrypt with v3 chunked format
	encrypted, err := smsg.EncryptV3(msg, params, manifest)
	if err != nil {
		fmt.Printf("Failed to encrypt: %v\n", err)
		os.Exit(1)
	}

	// Write output
	if err := os.WriteFile(outputFile, encrypted, 0644); err != nil {
		fmt.Printf("Failed to write output: %v\n", err)
		os.Exit(1)
	}

	// Calculate chunk count
	numChunks := (len(content) + chunkSize - 1) / chunkSize

	fmt.Printf("Created: %s (%d bytes)\n", outputFile, len(encrypted))
	fmt.Printf("Format: v3 chunked\n")
	fmt.Printf("Chunk Size: %d KB\n", chunkSize/1024)
	fmt.Printf("Total Chunks: ~%d\n", numChunks)
	fmt.Printf("License: %s\n", license)
	fmt.Println("")
	fmt.Println("This license works for 24-48 hours from creation.")
	fmt.Println("Use the license in the streaming demo to decrypt.")
}
81
cmd/mkdemo/main.go
Normal file
@@ -0,0 +1,81 @@
// mkdemo creates an RFC-quality demo SMSG file with a cryptographically secure password
package main

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"os"
	"path/filepath"

	"forge.lthn.ai/Snider/Borg/pkg/smsg"
)

func main() {
	if len(os.Args) < 3 {
		fmt.Println("Usage: mkdemo <input-media-file> <output-smsg-file>")
		os.Exit(1)
	}

	inputFile := os.Args[1]
	outputFile := os.Args[2]

	// Read input file
	content, err := os.ReadFile(inputFile)
	if err != nil {
		fmt.Printf("Failed to read input file: %v\n", err)
		os.Exit(1)
	}

	// Use existing password or generate new one
	var password string
	if len(os.Args) > 3 {
		password = os.Args[3]
	} else {
		// Generate cryptographically secure password (24 bytes = 192 bits)
		passwordBytes := make([]byte, 24)
		if _, err := rand.Read(passwordBytes); err != nil {
			fmt.Printf("Failed to generate password: %v\n", err)
			os.Exit(1)
		}
		// Unpadded base64url encoding yields 32 chars for readability
		password = base64.RawURLEncoding.EncodeToString(passwordBytes)
	}

	// Create manifest with filename as title
	title := filepath.Base(inputFile)
	ext := filepath.Ext(title)
	if ext != "" {
		title = title[:len(title)-len(ext)]
	}
	manifest := smsg.NewManifest(title)
	manifest.LicenseType = "perpetual"
	manifest.Format = "dapp.fm/v1"

	// Create message with attachment (using binary attachment for v2 format)
	msg := smsg.NewMessage("Welcome to dapp.fm - Zero-Trust DRM for the open web.")
	msg.Subject = "dapp.fm Demo"
	msg.From = "dapp.fm"
	msg.AddBinaryAttachment(
		filepath.Base(inputFile),
		content,
		"video/mp4",
	)

	// Encrypt with v2 binary format (smaller file size)
	encrypted, err := smsg.EncryptV2WithManifest(msg, password, manifest)
	if err != nil {
		fmt.Printf("Failed to encrypt: %v\n", err)
		os.Exit(1)
	}

	// Write output
	if err := os.WriteFile(outputFile, encrypted, 0644); err != nil {
		fmt.Printf("Failed to write output: %v\n", err)
		os.Exit(1)
	}

	fmt.Printf("Created: %s (%d bytes)\n", outputFile, len(encrypted))
	fmt.Printf("Master Password: %s\n", password)
	fmt.Println("\nStore this password securely - it cannot be recovered!")
}
@@ -16,6 +16,7 @@ packaging their contents into a single file, and managing the data within.`,
 	}

 	rootCmd.PersistentFlags().BoolP("verbose", "v", false, "Enable verbose logging")
+	rootCmd.PersistentFlags().BoolP("quiet", "q", false, "Suppress non-error output")
 	return rootCmd
 }
@@ -4,7 +4,7 @@ import (
 	"os"
 	"strings"

-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 	"github.com/spf13/cobra"
 )
@@ -7,7 +7,7 @@ import (
 	"path/filepath"
 	"testing"

-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 )

 func TestRunCmd_Good(t *testing.T) {
@@ -6,9 +6,9 @@ import (
 	"os"
 	"strings"

-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/tarfs"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/tarfs"

 	"github.com/spf13/cobra"
 )
BIN
demo/demo-sample.smsg
Normal file
Binary file not shown.
BIN
demo/demo-track-v3.smsg
Normal file
Binary file not shown.
3596
demo/index.html
Normal file
File diff suppressed because it is too large
BIN
demo/profile-avatar.jpg
Normal file
Binary file not shown.
After Width: | Height: | Size: 33 KiB
BIN
demo/stmf.wasm
Executable file
Binary file not shown.
575
demo/wasm_exec.js
Normal file
@@ -0,0 +1,575 @@
// Copyright 2018 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

"use strict";

(() => {
	const enosys = () => {
		const err = new Error("not implemented");
		err.code = "ENOSYS";
		return err;
	};

	if (!globalThis.fs) {
		let outputBuf = "";
		globalThis.fs = {
			constants: { O_WRONLY: -1, O_RDWR: -1, O_CREAT: -1, O_TRUNC: -1, O_APPEND: -1, O_EXCL: -1, O_DIRECTORY: -1 }, // unused
			writeSync(fd, buf) {
				outputBuf += decoder.decode(buf);
				const nl = outputBuf.lastIndexOf("\n");
				if (nl != -1) {
					console.log(outputBuf.substring(0, nl));
					outputBuf = outputBuf.substring(nl + 1);
				}
				return buf.length;
			},
			write(fd, buf, offset, length, position, callback) {
				if (offset !== 0 || length !== buf.length || position !== null) {
					callback(enosys());
					return;
				}
				const n = this.writeSync(fd, buf);
				callback(null, n);
			},
			chmod(path, mode, callback) { callback(enosys()); },
			chown(path, uid, gid, callback) { callback(enosys()); },
			close(fd, callback) { callback(enosys()); },
			fchmod(fd, mode, callback) { callback(enosys()); },
			fchown(fd, uid, gid, callback) { callback(enosys()); },
			fstat(fd, callback) { callback(enosys()); },
			fsync(fd, callback) { callback(null); },
			ftruncate(fd, length, callback) { callback(enosys()); },
			lchown(path, uid, gid, callback) { callback(enosys()); },
			link(path, link, callback) { callback(enosys()); },
			lstat(path, callback) { callback(enosys()); },
			mkdir(path, perm, callback) { callback(enosys()); },
			open(path, flags, mode, callback) { callback(enosys()); },
			read(fd, buffer, offset, length, position, callback) { callback(enosys()); },
			readdir(path, callback) { callback(enosys()); },
			readlink(path, callback) { callback(enosys()); },
			rename(from, to, callback) { callback(enosys()); },
			rmdir(path, callback) { callback(enosys()); },
			stat(path, callback) { callback(enosys()); },
			symlink(path, link, callback) { callback(enosys()); },
			truncate(path, length, callback) { callback(enosys()); },
			unlink(path, callback) { callback(enosys()); },
			utimes(path, atime, mtime, callback) { callback(enosys()); },
		};
	}

	if (!globalThis.process) {
		globalThis.process = {
			getuid() { return -1; },
			getgid() { return -1; },
			geteuid() { return -1; },
			getegid() { return -1; },
			getgroups() { throw enosys(); },
			pid: -1,
			ppid: -1,
			umask() { throw enosys(); },
			cwd() { throw enosys(); },
			chdir() { throw enosys(); },
		}
	}

	if (!globalThis.path) {
		globalThis.path = {
			resolve(...pathSegments) {
				return pathSegments.join("/");
			}
		}
	}

	if (!globalThis.crypto) {
		throw new Error("globalThis.crypto is not available, polyfill required (crypto.getRandomValues only)");
	}

	if (!globalThis.performance) {
		throw new Error("globalThis.performance is not available, polyfill required (performance.now only)");
	}

	if (!globalThis.TextEncoder) {
		throw new Error("globalThis.TextEncoder is not available, polyfill required");
	}

	if (!globalThis.TextDecoder) {
		throw new Error("globalThis.TextDecoder is not available, polyfill required");
|
||||||
|
}
|
||||||
|
|
||||||
|
const encoder = new TextEncoder("utf-8");
|
||||||
|
const decoder = new TextDecoder("utf-8");
|
||||||
|
|
||||||
|
globalThis.Go = class {
|
||||||
|
constructor() {
|
||||||
|
this.argv = ["js"];
|
||||||
|
this.env = {};
|
||||||
|
this.exit = (code) => {
|
||||||
|
if (code !== 0) {
|
||||||
|
console.warn("exit code:", code);
|
||||||
|
}
|
||||||
|
};
|
||||||
|
this._exitPromise = new Promise((resolve) => {
|
||||||
|
this._resolveExitPromise = resolve;
|
||||||
|
});
|
||||||
|
this._pendingEvent = null;
|
||||||
|
this._scheduledTimeouts = new Map();
|
||||||
|
this._nextCallbackTimeoutID = 1;
|
||||||
|
|
||||||
|
const setInt64 = (addr, v) => {
|
||||||
|
this.mem.setUint32(addr + 0, v, true);
|
||||||
|
this.mem.setUint32(addr + 4, Math.floor(v / 4294967296), true);
|
||||||
|
}
|
||||||
|
|
||||||
|
const setInt32 = (addr, v) => {
|
||||||
|
this.mem.setUint32(addr + 0, v, true);
|
||||||
|
}
|
||||||
|
|
||||||
|
const getInt64 = (addr) => {
|
||||||
|
const low = this.mem.getUint32(addr + 0, true);
|
||||||
|
const high = this.mem.getInt32(addr + 4, true);
|
||||||
|
return low + high * 4294967296;
|
||||||
|
}
|
||||||
|
|
||||||
|
const loadValue = (addr) => {
|
||||||
|
const f = this.mem.getFloat64(addr, true);
|
||||||
|
if (f === 0) {
|
||||||
|
return undefined;
|
||||||
|
}
|
||||||
|
if (!isNaN(f)) {
|
||||||
|
return f;
|
||||||
|
}
|
||||||
|
|
||||||
|
const id = this.mem.getUint32(addr, true);
|
||||||
|
return this._values[id];
|
||||||
|
}
|
||||||
|
|
||||||
|
const storeValue = (addr, v) => {
|
||||||
|
const nanHead = 0x7FF80000;
|
||||||
|
|
||||||
|
if (typeof v === "number" && v !== 0) {
|
||||||
|
if (isNaN(v)) {
|
||||||
|
this.mem.setUint32(addr + 4, nanHead, true);
|
||||||
|
this.mem.setUint32(addr, 0, true);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
this.mem.setFloat64(addr, v, true);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
if (v === undefined) {
|
||||||
|
this.mem.setFloat64(addr, 0, true);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
let id = this._ids.get(v);
|
||||||
|
if (id === undefined) {
|
||||||
|
id = this._idPool.pop();
|
||||||
|
if (id === undefined) {
|
||||||
|
id = this._values.length;
|
||||||
|
}
|
||||||
|
this._values[id] = v;
|
||||||
|
this._goRefCounts[id] = 0;
|
||||||
|
this._ids.set(v, id);
|
||||||
|
}
|
||||||
|
this._goRefCounts[id]++;
|
||||||
|
let typeFlag = 0;
|
||||||
|
switch (typeof v) {
|
||||||
|
case "object":
|
||||||
|
if (v !== null) {
|
||||||
|
typeFlag = 1;
|
||||||
|
}
|
||||||
|
break;
|
||||||
|
case "string":
|
||||||
|
typeFlag = 2;
|
||||||
|
break;
|
||||||
|
case "symbol":
|
||||||
|
typeFlag = 3;
|
||||||
|
break;
|
||||||
|
case "function":
|
||||||
|
typeFlag = 4;
|
||||||
|
break;
|
||||||
|
}
|
||||||
|
this.mem.setUint32(addr + 4, nanHead | typeFlag, true);
|
||||||
|
this.mem.setUint32(addr, id, true);
|
||||||
|
}
|
||||||
|
|
||||||
|
const loadSlice = (addr) => {
|
||||||
|
const array = getInt64(addr + 0);
|
||||||
|
const len = getInt64(addr + 8);
|
||||||
|
return new Uint8Array(this._inst.exports.mem.buffer, array, len);
|
||||||
|
}
|
||||||
|
|
||||||
|
const loadSliceOfValues = (addr) => {
|
||||||
|
const array = getInt64(addr + 0);
|
||||||
|
const len = getInt64(addr + 8);
|
||||||
|
const a = new Array(len);
|
||||||
|
for (let i = 0; i < len; i++) {
|
||||||
|
a[i] = loadValue(array + i * 8);
|
||||||
|
}
|
||||||
|
return a;
|
||||||
|
}
|
||||||
|
|
||||||
|
const loadString = (addr) => {
|
||||||
|
const saddr = getInt64(addr + 0);
|
||||||
|
const len = getInt64(addr + 8);
|
||||||
|
return decoder.decode(new DataView(this._inst.exports.mem.buffer, saddr, len));
|
||||||
|
}
|
||||||
|
|
||||||
|
const testCallExport = (a, b) => {
|
||||||
|
this._inst.exports.testExport0();
|
||||||
|
return this._inst.exports.testExport(a, b);
|
||||||
|
}
|
||||||
|
|
||||||
|
const timeOrigin = Date.now() - performance.now();
|
||||||
|
this.importObject = {
|
||||||
|
_gotest: {
|
||||||
|
add: (a, b) => a + b,
|
||||||
|
callExport: testCallExport,
|
||||||
|
},
|
||||||
|
gojs: {
|
||||||
|
// Go's SP does not change as long as no Go code is running. Some operations (e.g. calls, getters and setters)
|
||||||
|
// may synchronously trigger a Go event handler. This makes Go code get executed in the middle of the imported
|
||||||
|
// function. A goroutine can switch to a new stack if the current stack is too small (see morestack function).
|
||||||
|
// This changes the SP, thus we have to update the SP used by the imported function.
|
||||||
|
|
||||||
|
// func wasmExit(code int32)
|
||||||
|
"runtime.wasmExit": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const code = this.mem.getInt32(sp + 8, true);
|
||||||
|
this.exited = true;
|
||||||
|
delete this._inst;
|
||||||
|
delete this._values;
|
||||||
|
delete this._goRefCounts;
|
||||||
|
delete this._ids;
|
||||||
|
delete this._idPool;
|
||||||
|
this.exit(code);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func wasmWrite(fd uintptr, p unsafe.Pointer, n int32)
|
||||||
|
"runtime.wasmWrite": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const fd = getInt64(sp + 8);
|
||||||
|
const p = getInt64(sp + 16);
|
||||||
|
const n = this.mem.getInt32(sp + 24, true);
|
||||||
|
fs.writeSync(fd, new Uint8Array(this._inst.exports.mem.buffer, p, n));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func resetMemoryDataView()
|
||||||
|
"runtime.resetMemoryDataView": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
this.mem = new DataView(this._inst.exports.mem.buffer);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func nanotime1() int64
|
||||||
|
"runtime.nanotime1": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
setInt64(sp + 8, (timeOrigin + performance.now()) * 1000000);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func walltime() (sec int64, nsec int32)
|
||||||
|
"runtime.walltime": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const msec = (new Date).getTime();
|
||||||
|
setInt64(sp + 8, msec / 1000);
|
||||||
|
this.mem.setInt32(sp + 16, (msec % 1000) * 1000000, true);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func scheduleTimeoutEvent(delay int64) int32
|
||||||
|
"runtime.scheduleTimeoutEvent": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const id = this._nextCallbackTimeoutID;
|
||||||
|
this._nextCallbackTimeoutID++;
|
||||||
|
this._scheduledTimeouts.set(id, setTimeout(
|
||||||
|
() => {
|
||||||
|
this._resume();
|
||||||
|
while (this._scheduledTimeouts.has(id)) {
|
||||||
|
// for some reason Go failed to register the timeout event, log and try again
|
||||||
|
// (temporary workaround for https://github.com/golang/go/issues/28975)
|
||||||
|
console.warn("scheduleTimeoutEvent: missed timeout event");
|
||||||
|
this._resume();
|
||||||
|
}
|
||||||
|
},
|
||||||
|
getInt64(sp + 8),
|
||||||
|
));
|
||||||
|
this.mem.setInt32(sp + 16, id, true);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func clearTimeoutEvent(id int32)
|
||||||
|
"runtime.clearTimeoutEvent": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const id = this.mem.getInt32(sp + 8, true);
|
||||||
|
clearTimeout(this._scheduledTimeouts.get(id));
|
||||||
|
this._scheduledTimeouts.delete(id);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func getRandomData(r []byte)
|
||||||
|
"runtime.getRandomData": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
crypto.getRandomValues(loadSlice(sp + 8));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func finalizeRef(v ref)
|
||||||
|
"syscall/js.finalizeRef": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const id = this.mem.getUint32(sp + 8, true);
|
||||||
|
this._goRefCounts[id]--;
|
||||||
|
if (this._goRefCounts[id] === 0) {
|
||||||
|
const v = this._values[id];
|
||||||
|
this._values[id] = null;
|
||||||
|
this._ids.delete(v);
|
||||||
|
this._idPool.push(id);
|
||||||
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
// func stringVal(value string) ref
|
||||||
|
"syscall/js.stringVal": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
storeValue(sp + 24, loadString(sp + 8));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueGet(v ref, p string) ref
|
||||||
|
"syscall/js.valueGet": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const result = Reflect.get(loadValue(sp + 8), loadString(sp + 16));
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 32, result);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueSet(v ref, p string, x ref)
|
||||||
|
"syscall/js.valueSet": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
Reflect.set(loadValue(sp + 8), loadString(sp + 16), loadValue(sp + 32));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueDelete(v ref, p string)
|
||||||
|
"syscall/js.valueDelete": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
Reflect.deleteProperty(loadValue(sp + 8), loadString(sp + 16));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueIndex(v ref, i int) ref
|
||||||
|
"syscall/js.valueIndex": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
storeValue(sp + 24, Reflect.get(loadValue(sp + 8), getInt64(sp + 16)));
|
||||||
|
},
|
||||||
|
|
||||||
|
// valueSetIndex(v ref, i int, x ref)
|
||||||
|
"syscall/js.valueSetIndex": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
Reflect.set(loadValue(sp + 8), getInt64(sp + 16), loadValue(sp + 24));
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueCall(v ref, m string, args []ref) (ref, bool)
|
||||||
|
"syscall/js.valueCall": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
try {
|
||||||
|
const v = loadValue(sp + 8);
|
||||||
|
const m = Reflect.get(v, loadString(sp + 16));
|
||||||
|
const args = loadSliceOfValues(sp + 32);
|
||||||
|
const result = Reflect.apply(m, v, args);
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 56, result);
|
||||||
|
this.mem.setUint8(sp + 64, 1);
|
||||||
|
} catch (err) {
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 56, err);
|
||||||
|
this.mem.setUint8(sp + 64, 0);
|
||||||
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueInvoke(v ref, args []ref) (ref, bool)
|
||||||
|
"syscall/js.valueInvoke": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
try {
|
||||||
|
const v = loadValue(sp + 8);
|
||||||
|
const args = loadSliceOfValues(sp + 16);
|
||||||
|
const result = Reflect.apply(v, undefined, args);
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 40, result);
|
||||||
|
this.mem.setUint8(sp + 48, 1);
|
||||||
|
} catch (err) {
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 40, err);
|
||||||
|
this.mem.setUint8(sp + 48, 0);
|
||||||
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueNew(v ref, args []ref) (ref, bool)
|
||||||
|
"syscall/js.valueNew": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
try {
|
||||||
|
const v = loadValue(sp + 8);
|
||||||
|
const args = loadSliceOfValues(sp + 16);
|
||||||
|
const result = Reflect.construct(v, args);
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 40, result);
|
||||||
|
this.mem.setUint8(sp + 48, 1);
|
||||||
|
} catch (err) {
|
||||||
|
sp = this._inst.exports.getsp() >>> 0; // see comment above
|
||||||
|
storeValue(sp + 40, err);
|
||||||
|
this.mem.setUint8(sp + 48, 0);
|
||||||
|
}
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueLength(v ref) int
|
||||||
|
"syscall/js.valueLength": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
setInt64(sp + 16, parseInt(loadValue(sp + 8).length));
|
||||||
|
},
|
||||||
|
|
||||||
|
// valuePrepareString(v ref) (ref, int)
|
||||||
|
"syscall/js.valuePrepareString": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const str = encoder.encode(String(loadValue(sp + 8)));
|
||||||
|
storeValue(sp + 16, str);
|
||||||
|
setInt64(sp + 24, str.length);
|
||||||
|
},
|
||||||
|
|
||||||
|
// valueLoadString(v ref, b []byte)
|
||||||
|
"syscall/js.valueLoadString": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const str = loadValue(sp + 8);
|
||||||
|
loadSlice(sp + 16).set(str);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func valueInstanceOf(v ref, t ref) bool
|
||||||
|
"syscall/js.valueInstanceOf": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
this.mem.setUint8(sp + 24, (loadValue(sp + 8) instanceof loadValue(sp + 16)) ? 1 : 0);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func copyBytesToGo(dst []byte, src ref) (int, bool)
|
||||||
|
"syscall/js.copyBytesToGo": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const dst = loadSlice(sp + 8);
|
||||||
|
const src = loadValue(sp + 32);
|
||||||
|
if (!(src instanceof Uint8Array || src instanceof Uint8ClampedArray)) {
|
||||||
|
this.mem.setUint8(sp + 48, 0);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
const toCopy = src.subarray(0, dst.length);
|
||||||
|
dst.set(toCopy);
|
||||||
|
setInt64(sp + 40, toCopy.length);
|
||||||
|
this.mem.setUint8(sp + 48, 1);
|
||||||
|
},
|
||||||
|
|
||||||
|
// func copyBytesToJS(dst ref, src []byte) (int, bool)
|
||||||
|
"syscall/js.copyBytesToJS": (sp) => {
|
||||||
|
sp >>>= 0;
|
||||||
|
const dst = loadValue(sp + 8);
|
||||||
|
const src = loadSlice(sp + 16);
|
||||||
|
if (!(dst instanceof Uint8Array || dst instanceof Uint8ClampedArray)) {
|
||||||
|
this.mem.setUint8(sp + 48, 0);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
const toCopy = src.subarray(0, dst.length);
|
||||||
|
dst.set(toCopy);
|
||||||
|
setInt64(sp + 40, toCopy.length);
|
||||||
|
this.mem.setUint8(sp + 48, 1);
|
||||||
|
},
|
||||||
|
|
||||||
|
"debug": (value) => {
|
||||||
|
console.log(value);
|
||||||
|
},
|
||||||
|
}
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
||||||
|
async run(instance) {
|
||||||
|
if (!(instance instanceof WebAssembly.Instance)) {
|
||||||
|
throw new Error("Go.run: WebAssembly.Instance expected");
|
||||||
|
}
|
||||||
|
this._inst = instance;
|
||||||
|
this.mem = new DataView(this._inst.exports.mem.buffer);
|
||||||
|
this._values = [ // JS values that Go currently has references to, indexed by reference id
|
||||||
|
NaN,
|
||||||
|
0,
|
||||||
|
null,
|
||||||
|
true,
|
||||||
|
false,
|
||||||
|
globalThis,
|
||||||
|
this,
|
||||||
|
];
|
||||||
|
this._goRefCounts = new Array(this._values.length).fill(Infinity); // number of references that Go has to a JS value, indexed by reference id
|
||||||
|
this._ids = new Map([ // mapping from JS values to reference ids
|
||||||
|
[0, 1],
|
||||||
|
[null, 2],
|
||||||
|
[true, 3],
|
||||||
|
[false, 4],
|
||||||
|
[globalThis, 5],
|
||||||
|
[this, 6],
|
||||||
|
]);
|
||||||
|
this._idPool = []; // unused ids that have been garbage collected
|
||||||
|
this.exited = false; // whether the Go program has exited
|
||||||
|
|
||||||
|
// Pass command line arguments and environment variables to WebAssembly by writing them to the linear memory.
|
||||||
|
let offset = 4096;
|
||||||
|
|
||||||
|
const strPtr = (str) => {
|
||||||
|
const ptr = offset;
|
||||||
|
const bytes = encoder.encode(str + "\0");
|
||||||
|
new Uint8Array(this.mem.buffer, offset, bytes.length).set(bytes);
|
||||||
|
offset += bytes.length;
|
||||||
|
if (offset % 8 !== 0) {
|
||||||
|
offset += 8 - (offset % 8);
|
||||||
|
}
|
||||||
|
return ptr;
|
||||||
|
};
|
||||||
|
|
||||||
|
const argc = this.argv.length;
|
||||||
|
|
||||||
|
const argvPtrs = [];
|
||||||
|
this.argv.forEach((arg) => {
|
||||||
|
argvPtrs.push(strPtr(arg));
|
||||||
|
});
|
||||||
|
argvPtrs.push(0);
|
||||||
|
|
||||||
|
const keys = Object.keys(this.env).sort();
|
||||||
|
keys.forEach((key) => {
|
||||||
|
argvPtrs.push(strPtr(`${key}=${this.env[key]}`));
|
||||||
|
});
|
||||||
|
argvPtrs.push(0);
|
||||||
|
|
||||||
|
const argv = offset;
|
||||||
|
argvPtrs.forEach((ptr) => {
|
||||||
|
this.mem.setUint32(offset, ptr, true);
|
||||||
|
this.mem.setUint32(offset + 4, 0, true);
|
||||||
|
offset += 8;
|
||||||
|
});
|
||||||
|
|
||||||
|
// The linker guarantees global data starts from at least wasmMinDataAddr.
|
||||||
|
// Keep in sync with cmd/link/internal/ld/data.go:wasmMinDataAddr.
|
||||||
|
const wasmMinDataAddr = 4096 + 8192;
|
||||||
|
if (offset >= wasmMinDataAddr) {
|
||||||
|
throw new Error("total length of command line and environment variables exceeds limit");
|
||||||
|
}
|
||||||
|
|
||||||
|
this._inst.exports.run(argc, argv);
|
||||||
|
if (this.exited) {
|
||||||
|
this._resolveExitPromise();
|
||||||
|
}
|
||||||
|
await this._exitPromise;
|
||||||
|
}
|
||||||
|
|
||||||
|
_resume() {
|
||||||
|
if (this.exited) {
|
||||||
|
throw new Error("Go program has already exited");
|
||||||
|
}
|
||||||
|
this._inst.exports.resume();
|
||||||
|
if (this.exited) {
|
||||||
|
this._resolveExitPromise();
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
_makeFuncWrapper(id) {
|
||||||
|
const go = this;
|
||||||
|
return function () {
|
||||||
|
const event = { id: id, this: this, args: arguments };
|
||||||
|
go._pendingEvent = event;
|
||||||
|
go._resume();
|
||||||
|
return event.result;
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
})();
|
||||||
281 docs/ipfs-distribution.md Normal file
@@ -0,0 +1,281 @@
# IPFS Distribution Guide

This guide explains how to distribute your encrypted `.smsg` content via IPFS (InterPlanetary File System) for permanent, decentralized hosting.

## Why IPFS?

IPFS is ideal for dapp.fm content because:

- **Permanent links** - Content-addressed (CID) means the URL never changes
- **No hosting costs** - Pin with free services or self-host
- **Censorship resistant** - No single point of failure
- **Global CDN** - Content served from nearest peer
- **Perfect for archival** - Your content survives even if you disappear

Combined with password-as-license, IPFS creates truly permanent media distribution:

```
Artist uploads to IPFS → Fan downloads from anywhere → Password unlocks forever
```

## Quick Start

### 1. Install IPFS

**macOS:**
```bash
brew install ipfs
```

**Linux:**
```bash
wget https://dist.ipfs.tech/kubo/v0.24.0/kubo_v0.24.0_linux-amd64.tar.gz
tar xvfz kubo_v0.24.0_linux-amd64.tar.gz
sudo mv kubo/ipfs /usr/local/bin/
```

**Windows:**
Download from https://dist.ipfs.tech/#kubo

### 2. Initialize and Start

```bash
ipfs init
ipfs daemon
```

### 3. Add Your Content

```bash
# Create your encrypted content first
go run ./cmd/mkdemo my-album.mp4 my-album.smsg

# Add to IPFS
ipfs add my-album.smsg
# Output: added QmX...abc my-album.smsg

# Your content is now available at:
# - Local: http://localhost:8080/ipfs/QmX...abc
# - Gateway: https://ipfs.io/ipfs/QmX...abc
```

## Distribution Workflow

### For Artists

```bash
# 1. Package your media
go run ./cmd/mkdemo album.mp4 album.smsg
# Save the password: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7

# 2. Add to IPFS
ipfs add album.smsg
# added QmYourContentCID album.smsg

# 3. Pin for persistence (choose one):

# Option A: Pin locally (requires running node)
ipfs pin add QmYourContentCID

# Option B: Use Pinata (free tier: 1GB)
curl -X POST "https://api.pinata.cloud/pinning/pinByHash" \
  -H "Authorization: Bearer YOUR_JWT" \
  -H "Content-Type: application/json" \
  -d '{"hashToPin": "QmYourContentCID"}'

# Option C: Use web3.storage (free tier: 5GB)
# Upload at https://web3.storage

# 4. Share with fans
# CID: QmYourContentCID
# Password: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7
# Gateway URL: https://ipfs.io/ipfs/QmYourContentCID
```

### For Fans

```bash
# Download via any gateway
curl -o album.smsg https://ipfs.io/ipfs/QmYourContentCID

# Or via local node (faster if running)
ipfs get QmYourContentCID -o album.smsg

# Play with password in browser demo or native app
```

## IPFS Gateways

Public gateways for sharing (no IPFS node required):

| Gateway | URL Pattern | Notes |
|---------|-------------|-------|
| ipfs.io | `https://ipfs.io/ipfs/{CID}` | Official, reliable |
| dweb.link | `https://{CID}.ipfs.dweb.link` | Subdomain style |
| cloudflare | `https://cloudflare-ipfs.com/ipfs/{CID}` | Fast, cached |
| w3s.link | `https://{CID}.ipfs.w3s.link` | web3.storage |
| nftstorage.link | `https://{CID}.ipfs.nftstorage.link` | NFT.storage |

**Example URLs for CID `QmX...abc`:**
```
https://ipfs.io/ipfs/QmX...abc
https://QmX...abc.ipfs.dweb.link
https://cloudflare-ipfs.com/ipfs/QmX...abc
```

## Pinning Services

Content on IPFS is only available while someone is hosting it. Use pinning services for persistence:

### Free Options

| Service | Free Tier | Link |
|---------|-----------|------|
| Pinata | 1 GB | https://pinata.cloud |
| web3.storage | 5 GB | https://web3.storage |
| NFT.storage | Unlimited* | https://nft.storage |
| Filebase | 5 GB | https://filebase.com |

*NFT.storage is designed for NFT data but works for any content.

### Pin via CLI

```bash
# Pinata
export PINATA_JWT="your-jwt-token"
curl -X POST "https://api.pinata.cloud/pinning/pinByHash" \
  -H "Authorization: Bearer $PINATA_JWT" \
  -H "Content-Type: application/json" \
  -d '{"hashToPin": "QmYourCID", "pinataMetadata": {"name": "my-album.smsg"}}'

# web3.storage (using w3 CLI)
npm install -g @web3-storage/w3cli
w3 login your@email.com
w3 up my-album.smsg
```
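
The same pinByHash call can also be issued from JavaScript, for example from a release script. A minimal sketch that only builds the request, mirroring the curl command above (`pinRequest` is a hypothetical helper name; the JWT is a placeholder for your own Pinata token):

```javascript
// Build the Pinata pinByHash request (mirrors the curl call above).
// The returned object can be passed straight to fetch(url, options).
function pinRequest(cid, jwt) {
  return {
    url: "https://api.pinata.cloud/pinning/pinByHash",
    options: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${jwt}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ hashToPin: cid }),
    },
  };
}
```

Usage: `const { url, options } = pinRequest("QmYourCID", process.env.PINATA_JWT); await fetch(url, options);`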

## Integration with Demo Page

The demo page can load content directly from IPFS gateways:

```javascript
// In the demo page, use gateway URL
const ipfsCID = 'QmYourContentCID';
const gatewayUrl = `https://ipfs.io/ipfs/${ipfsCID}`;

// Fetch and decrypt
const response = await fetch(gatewayUrl);
const bytes = new Uint8Array(await response.arrayBuffer());
const msg = await BorgSMSG.decryptBinary(bytes, password);
```

Or use the Fan tab with the IPFS gateway URL directly.
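
Since individual gateways can be slow or temporarily unavailable, the demo page can also fall back across several of them. A sketch under that assumption (`gatewayUrls` and `fetchFromIpfs` are hypothetical helper names; the gateway list comes from the table above):

```javascript
// Candidate gateway URLs for a CID, in preferred order
// (path-style and subdomain-style, per the gateway table above).
function gatewayUrls(cid) {
  return [
    `https://ipfs.io/ipfs/${cid}`,
    `https://cloudflare-ipfs.com/ipfs/${cid}`,
    `https://${cid}.ipfs.dweb.link`,
  ];
}

// Try each gateway until one responds, then return the raw bytes.
async function fetchFromIpfs(cid) {
  for (const url of gatewayUrls(cid)) {
    try {
      const response = await fetch(url);
      if (response.ok) return new Uint8Array(await response.arrayBuffer());
    } catch (err) {
      // Network error or CORS failure: fall through to the next gateway.
    }
  }
  throw new Error(`all gateways failed for ${cid}`);
}
```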

## Best Practices

### 1. Always Pin Your Content

IPFS garbage-collects unpinned content. Always pin important files:

```bash
ipfs pin add QmYourCID
# Or use a pinning service
```

### 2. Use Multiple Pins

Pin with 2-3 services for redundancy:

```bash
# Pin locally
ipfs pin add QmYourCID

# Also pin with Pinata
curl -X POST "https://api.pinata.cloud/pinning/pinByHash" ...

# And web3.storage as backup
w3 up my-album.smsg
```

### 3. Share CID + Password Separately

```
Download: https://ipfs.io/ipfs/QmYourCID
License: [sent via email/DM after purchase]
```

### 4. Use IPNS for Updates (Optional)

IPNS lets you update content while keeping the same URL:

```bash
# Create IPNS name
ipfs name publish QmYourCID
# Published to k51...xyz

# Your content is now at:
# https://ipfs.io/ipns/k51...xyz

# Update to new version later:
ipfs name publish QmNewVersionCID
```

## Example: Full Album Release

```bash
# 1. Create encrypted album
go run ./cmd/mkdemo my-album.mp4 my-album.smsg
# Password: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7

# 2. Add to IPFS
ipfs add my-album.smsg
# added QmAlbumCID my-album.smsg

# 3. Pin with multiple services
ipfs pin add QmAlbumCID
w3 up my-album.smsg

# 4. Create release page
cat > release.html << 'EOF'
<!DOCTYPE html>
<html>
<head><title>My Album - Download</title></head>
<body>
<h1>My Album</h1>
<p>Download: <a href="https://ipfs.io/ipfs/QmAlbumCID">IPFS</a></p>
<p>After purchase, you'll receive your license key via email.</p>
<p><a href="https://demo.dapp.fm">Play with license key</a></p>
</body>
</html>
EOF

# 5. Host release page on IPFS too!
ipfs add release.html
# added QmReleaseCID release.html
# Share: https://ipfs.io/ipfs/QmReleaseCID
```

## Troubleshooting

### Content Not Loading

1. **Check if pinned**: `ipfs pin ls | grep QmYourCID`
2. **Try different gateway**: Some gateways cache slowly
3. **Check daemon running**: `ipfs swarm peers` should show peers

### Slow Downloads

1. Use a faster gateway (cloudflare-ipfs.com is often fastest)
2. Run your own IPFS node for direct access
3. Pre-warm gateways by accessing content once

### CID Changed After Re-adding

IPFS CIDs are content-addressed. If you modify the file, the CID changes. For the same content, the CID is always identical.

## Resources

- [IPFS Documentation](https://docs.ipfs.tech/)
- [Pinata Docs](https://docs.pinata.cloud/)
- [web3.storage Docs](https://web3.storage/docs/)
- [IPFS Gateway Checker](https://ipfs.github.io/public-gateway-checker/)
497 docs/payment-integration.md Normal file
@@ -0,0 +1,497 @@
# Payment Integration Guide
|
||||||
|
|
||||||
|
This guide shows how to sell your encrypted `.smsg` content and deliver license keys (passwords) to customers using popular payment processors.
|
||||||
|
|
||||||
|
## Overview
|
||||||
|
|
||||||
|
The dapp.fm model is simple:
|
||||||
|
|
||||||
|
```
|
||||||
|
1. Customer pays via Stripe/Gumroad/PayPal
|
||||||
|
2. Payment processor triggers webhook or delivers digital product
|
||||||
|
3. Customer receives password (license key)
|
||||||
|
4. Customer downloads .smsg from your CDN/IPFS
|
||||||
|
5. Customer decrypts with password - done forever
|
||||||
|
```
|
||||||
|
|
||||||
|
No license servers, no accounts, no ongoing infrastructure.
|
||||||
|
|
||||||
|
## Stripe Integration
|
||||||
|
|
||||||
|
### Option 1: Stripe Payment Links (Easiest)
|
||||||
|
|
||||||
|
No code required - use Stripe's hosted checkout:

1. Create a Payment Link in Stripe Dashboard
2. Set up a webhook to email the password on successful payment
3. Host your `.smsg` file anywhere (CDN, IPFS, S3)

**Webhook endpoint (Node.js/Express):**

```javascript
const express = require('express');
const stripe = require('stripe')(process.env.STRIPE_SECRET_KEY);
const nodemailer = require('nodemailer');

const app = express();

// Your content passwords (store securely!)
const PRODUCTS = {
  'prod_ABC123': {
    name: 'My Album',
    password: 'PMVXogAJNVe_DDABfTmLYztaJAzsD0R7',
    downloadUrl: 'https://ipfs.io/ipfs/QmYourCID'
  }
};

app.post('/webhook', express.raw({type: 'application/json'}), async (req, res) => {
  const sig = req.headers['stripe-signature'];
  const endpointSecret = process.env.STRIPE_WEBHOOK_SECRET;

  let event;
  try {
    event = stripe.webhooks.constructEvent(req.body, sig, endpointSecret);
  } catch (err) {
    return res.status(400).send(`Webhook Error: ${err.message}`);
  }

  if (event.type === 'checkout.session.completed') {
    const session = event.data.object;
    const customerEmail = session.customer_details.email;
    const productId = session.metadata.product_id;
    const product = PRODUCTS[productId];

    if (product) {
      await sendLicenseEmail(customerEmail, product);
    }
  }

  res.json({received: true});
});

async function sendLicenseEmail(email, product) {
  const transporter = nodemailer.createTransport({
    // Configure your email provider
    service: 'gmail',
    auth: {
      user: process.env.EMAIL_USER,
      pass: process.env.EMAIL_PASS
    }
  });

  await transporter.sendMail({
    from: 'artist@example.com',
    to: email,
    subject: `Your License Key for ${product.name}`,
    html: `
      <h1>Thank you for your purchase!</h1>
      <p><strong>Download:</strong> <a href="${product.downloadUrl}">${product.name}</a></p>
      <p><strong>License Key:</strong> <code>${product.password}</code></p>
      <p><strong>How to play:</strong></p>
      <ol>
        <li>Download the .smsg file from the link above</li>
        <li>Go to <a href="https://demo.dapp.fm">demo.dapp.fm</a></li>
        <li>Click "Fan" tab, then "Unlock Licensed Content"</li>
        <li>Paste the file and enter your license key</li>
      </ol>
      <p>This is your permanent license - save this email!</p>
    `
  });
}

app.listen(3000);
```

### Option 2: Stripe Checkout Session (More Control)

```javascript
const stripe = require('stripe')(process.env.STRIPE_SECRET_KEY);

// Create checkout session
app.post('/create-checkout', async (req, res) => {
  const { productId } = req.body;

  const session = await stripe.checkout.sessions.create({
    payment_method_types: ['card'],
    line_items: [{
      price: 'price_ABC123', // Your Stripe price ID
      quantity: 1,
    }],
    mode: 'payment',
    success_url: 'https://yoursite.com/success?session_id={CHECKOUT_SESSION_ID}',
    cancel_url: 'https://yoursite.com/cancel',
    metadata: {
      product_id: productId
    }
  });

  res.json({ url: session.url });
});

// Success page - show license after payment
app.get('/success', async (req, res) => {
  const session = await stripe.checkout.sessions.retrieve(req.query.session_id);

  if (session.payment_status === 'paid') {
    const product = PRODUCTS[session.metadata.product_id];
    res.send(`
      <h1>Thank you!</h1>
      <p>Download: <a href="${product.downloadUrl}">${product.name}</a></p>
      <p>License Key: <code>${product.password}</code></p>
    `);
  } else {
    res.send('Payment not completed');
  }
});
```

## Gumroad Integration

Gumroad is a good fit for artists: it handles payments, delivery, and customer management.

### Setup

1. Create a Digital Product on Gumroad
2. Upload a text file or PDF containing the password
3. Set your `.smsg` download URL in the product description
4. Gumroad delivers the password file on purchase

### Product Setup

**Product Description:**
```
My Album - Encrypted Digital Download

After purchase, you'll receive:
1. A license key (in the download)
2. Download link for the .smsg file

How to play:
1. Download the .smsg file: https://ipfs.io/ipfs/QmYourCID
2. Go to https://demo.dapp.fm
3. Click "Fan" → "Unlock Licensed Content"
4. Enter your license key from the PDF
```

**Delivered File (license.txt):**
```
Your License Key: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7

Download your content: https://ipfs.io/ipfs/QmYourCID

This is your permanent license - keep this file safe!
The content works offline forever with this key.

Need help? Visit https://demo.dapp.fm
```

### Gumroad Ping (Webhook)

For automated delivery, use Gumroad's Ping feature:

```javascript
const express = require('express');
const app = express();

app.use(express.urlencoded({ extended: true }));

// Gumroad sends POST to this endpoint on sale
app.post('/gumroad-ping', (req, res) => {
  const {
    seller_id,
    product_id,
    email,
    full_name,
    purchaser_id
  } = req.body;

  // Verify it's from Gumroad (check seller_id matches yours)
  if (seller_id !== process.env.GUMROAD_SELLER_ID) {
    return res.status(403).send('Invalid seller');
  }

  const product = PRODUCTS[product_id];
  if (product) {
    // Send custom email with password
    sendLicenseEmail(email, product);
  }

  res.send('OK');
});
```

## PayPal Integration

### PayPal Buttons + IPN

```html
<!-- PayPal Buy Button -->
<form action="https://www.paypal.com/cgi-bin/webscr" method="post">
  <input type="hidden" name="cmd" value="_xclick">
  <input type="hidden" name="business" value="artist@example.com">
  <input type="hidden" name="item_name" value="My Album - Digital Download">
  <input type="hidden" name="item_number" value="album-001">
  <input type="hidden" name="amount" value="9.99">
  <input type="hidden" name="currency_code" value="USD">
  <input type="hidden" name="notify_url" value="https://yoursite.com/paypal-ipn">
  <input type="hidden" name="return" value="https://yoursite.com/thank-you">
  <input type="submit" value="Buy Now - $9.99">
</form>
```

**IPN Handler:**

```javascript
const express = require('express');
const axios = require('axios');

app.post('/paypal-ipn', express.urlencoded({ extended: true }), async (req, res) => {
  // Verify with PayPal
  const verifyUrl = 'https://ipnpb.paypal.com/cgi-bin/webscr';
  const verifyBody = 'cmd=_notify-validate&' + new URLSearchParams(req.body).toString();

  const response = await axios.post(verifyUrl, verifyBody);

  if (response.data === 'VERIFIED' && req.body.payment_status === 'Completed') {
    const email = req.body.payer_email;
    const itemNumber = req.body.item_number;
    const product = PRODUCTS[itemNumber];

    if (product) {
      await sendLicenseEmail(email, product);
    }
  }

  res.send('OK');
});
```

## Ko-fi Integration

Ko-fi is great for tips and single purchases.

### Setup

1. Enable "Commissions" or "Shop" on Ko-fi
2. Create a product with the license key in the thank-you message
3. Link to your `.smsg` download

**Ko-fi Thank You Message:**
```
Thank you for your purchase!

Your License Key: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7

Download: https://ipfs.io/ipfs/QmYourCID

Play at: https://demo.dapp.fm (Fan → Unlock Licensed Content)
```

## Serverless Options

### Vercel/Netlify Functions

No server needed - use serverless functions:

```javascript
// api/stripe-webhook.js (Vercel)
import Stripe from 'stripe';
import { Resend } from 'resend';

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY);
const resend = new Resend(process.env.RESEND_API_KEY);

export default async function handler(req, res) {
  if (req.method !== 'POST') {
    return res.status(405).end();
  }

  // bodyParser is disabled below, so collect the raw bytes ourselves -
  // constructEvent must see the exact payload Stripe signed.
  const chunks = [];
  for await (const chunk of req) chunks.push(chunk);
  const rawBody = Buffer.concat(chunks);

  const sig = req.headers['stripe-signature'];
  let event;
  try {
    event = stripe.webhooks.constructEvent(
      rawBody,
      sig,
      process.env.STRIPE_WEBHOOK_SECRET
    );
  } catch (err) {
    return res.status(400).send(`Webhook Error: ${err.message}`);
  }

  if (event.type === 'checkout.session.completed') {
    const session = event.data.object;

    await resend.emails.send({
      from: 'artist@yoursite.com',
      to: session.customer_details.email,
      subject: 'Your License Key',
      html: `
        <p>Download: <a href="https://ipfs.io/ipfs/QmYourCID">My Album</a></p>
        <p>License Key: <code>PMVXogAJNVe_DDABfTmLYztaJAzsD0R7</code></p>
      `
    });
  }

  res.json({ received: true });
}

export const config = {
  api: { bodyParser: false }
};
```

## Manual Workflow (No Code)

For artists who don't want to set up webhooks:

### Using Email

1. **Gumroad/Ko-fi**: Set product to require email
2. **Manual delivery**: Check sales daily, email passwords manually
3. **Template**:

```
Subject: Your License for [Album Name]

Hi [Name],

Thank you for your purchase!

Download: [IPFS/CDN link]
License Key: [password]

How to play:
1. Download the .smsg file
2. Go to demo.dapp.fm
3. Fan tab → Unlock Licensed Content
4. Enter your license key

Enjoy! This license works forever.

[Artist Name]
```

### Using Discord/Telegram

1. Sell via Gumroad (free tier)
2. Require customers to join your Discord/Telegram
3. Deliver license keys via a bot or manually
4. Bonus: community building!

## Security Best Practices

### 1. One Password Per Product

Don't reuse passwords across products:

```javascript
const PRODUCTS = {
  'album-2024': { password: 'unique-key-1' },
  'album-2023': { password: 'unique-key-2' },
  'single-summer': { password: 'unique-key-3' }
};
```

### 2. Environment Variables

Never hardcode passwords in source:

```bash
# .env
ALBUM_2024_PASSWORD=PMVXogAJNVe_DDABfTmLYztaJAzsD0R7
STRIPE_SECRET_KEY=sk_live_...
```

### 3. Webhook Verification

Always verify webhooks are from the payment provider:

```javascript
// Stripe
stripe.webhooks.constructEvent(body, sig, secret);

// Gumroad
if (seller_id !== MY_SELLER_ID) reject();

// PayPal: POST the notification back to the IPN validation endpoint
```

### 4. HTTPS Only

All webhook endpoints must use HTTPS.

## Pricing Strategies

### Direct Sale (Perpetual License)

- Customer pays once, owns forever
- Single password for all buyers
- Best for: Albums, films, books

### Time-Limited (Streaming/Rental)

Use dapp.fm Re-Key feature:

1. Encrypt master copy with master password
2. On purchase, re-key with customer-specific password + expiry
3. Deliver unique password per customer

```javascript
// On purchase webhook
const customerPassword = generateUniquePassword();
const expiry = Date.now() + (24 * 60 * 60 * 1000); // 24 hours

// Use WASM or Go to re-key
const customerVersion = await rekeyContent(masterSmsg, masterPassword, customerPassword, expiry);

// Deliver customer-specific file + password
```

### Tiered Access

Different passwords for different tiers:

```javascript
const TIERS = {
  'preview': { password: 'preview-key', expiry: '30s' },
  'rental': { password: 'rental-key', expiry: '7d' },
  'own': { password: 'perpetual-key', expiry: null }
};
```

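The `generateUniquePassword()` helper in the rental flow above is hypothetical. Since the re-key step can run in Go, here is one way to produce a 32-character URL-safe token in the same shape as the sample keys, along with the expiry computation; `newLicenseToken` is an illustrative name, not part of any Borg API:

```go
package main

import (
	"crypto/rand"
	"encoding/base64"
	"fmt"
	"time"
)

// newLicenseToken returns a 32-character URL-safe token: 24 bytes of
// crypto/rand output encode to exactly 32 base64url characters.
func newLicenseToken() (string, error) {
	buf := make([]byte, 24)
	if _, err := rand.Read(buf); err != nil {
		return "", err
	}
	return base64.RawURLEncoding.EncodeToString(buf), nil
}

func main() {
	token, err := newLicenseToken()
	if err != nil {
		panic(err)
	}
	// 24-hour rental window, mirroring the JavaScript expiry above.
	expiry := time.Now().Add(24 * time.Hour)

	fmt.Println(len(token))               // 32
	fmt.Println(expiry.After(time.Now())) // true
}
```

Using `crypto/rand` (not `math/rand`) matters here: license tokens double as encryption passwords, so they must be unpredictable.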
## Example: Complete Stripe Setup

```bash
# 1. Create your content
go run ./cmd/mkdemo album.mp4 album.smsg
# Password: PMVXogAJNVe_DDABfTmLYztaJAzsD0R7

# 2. Upload to IPFS
ipfs add album.smsg
# QmAlbumCID

# 3. Create Stripe product
# Dashboard → Products → Add Product
# Name: My Album
# Price: $9.99

# 4. Create Payment Link
# Dashboard → Payment Links → New
# Select your product
# Get link: https://buy.stripe.com/xxx

# 5. Set up webhook
# Dashboard → Developers → Webhooks → Add endpoint
# URL: https://yoursite.com/api/stripe-webhook
# Events: checkout.session.completed

# 6. Deploy webhook handler (Vercel example)
vercel deploy

# 7. Share payment link
# Fans click → Pay → Get email with password → Download → Play forever
```

## Resources

- [Stripe Webhooks](https://stripe.com/docs/webhooks)
- [Gumroad Ping](https://help.gumroad.com/article/149-ping)
- [PayPal IPN](https://developer.paypal.com/docs/ipn/)
- [Resend (Email API)](https://resend.com/)
- [Vercel Functions](https://vercel.com/docs/functions)

**docs/plans/2026-02-21-borg-upgrade-design.md** (new file, 209 lines)

# Borg Production Backup Upgrade — Design Document

**Date:** 2026-02-21
**Status:** Implemented
**Approach:** Bottom-Up Refactor

## Problem Statement

Borg's `collect local` command fails on large directories because DataNode loads
everything into RAM. The UI spinner floods non-TTY output. Broken symlinks crash
the collection pipeline. Key derivation uses bare SHA-256. These issues prevent
Borg from being used for production backup workflows.

## Goals

1. Make `collect local` work reliably on large directories (10GB+)
2. Handle symlinks properly (skip broken, follow/store valid)
3. Add quiet/scripted mode for cron and pipeline use
4. Harden encryption key derivation (Argon2id)
5. Clean up the library for external consumers

## Non-Goals

- Full core/go-* package integration (deferred — circular dependency risk since
  core imports Borg)
- New CLI commands beyond fixing existing ones
- Network transport or remote sync features
- GUI or web interface

## Architecture

### Current Flow (Broken for Large Dirs)

```
Walk directory → Load ALL files into DataNode (RAM) → Compress → Encrypt → Write
```

### New Flow (Streaming)

```
Walk directory → tar.Writer stream → compress stream → chunked encrypt → output file
```

DataNode remains THE core abstraction — the I/O sandbox that keeps everything safe
and portable. The streaming path bypasses DataNode for the `collect local` pipeline
only, while DataNode continues to serve all other use cases (programmatic access,
format conversion, inspection).

## Design Sections

### 1. DataNode Refactor

DataNode gains a `ToTarWriter(w io.Writer)` method for streaming out its contents
without buffering the entire archive. This is the bridge between DataNode's sandbox
model and streaming I/O.

New symlink handling:

| Symlink State | Behaviour |
|---------------|-----------|
| Valid, points inside DataNode root | Store as symlink entry |
| Valid, points outside DataNode root | Follow and store target content |
| Broken (dangling) | Skip with warning (configurable via `SkipBrokenSymlinks`) |

The `AddPath` method gets an options struct:

```go
type AddPathOptions struct {
	SkipBrokenSymlinks bool // default: true
	FollowSymlinks     bool // default: false (store as symlinks)
	ExcludePatterns    []string
}
```

### 2. UI & Logger Cleanup

Replace direct spinner writes with a `Progress` interface:

```go
type Progress interface {
	Start(label string)
	Update(current, total int64)
	Finish(label string)
	Log(level, msg string, args ...any)
}
```

Two implementations:

- **InteractiveProgress** — spinner + progress bar (when `isatty(stdout)`)
- **QuietProgress** — structured log lines only (cron, pipes, `--quiet` flag)

TTY detection at startup selects the implementation. All existing `ui.Spinner` and
`fmt.Printf` calls in library code get replaced with `Progress` method calls.

New `--quiet` / `-q` flag on all commands suppresses non-error output.

### 3. TIM Streaming Encryption

ChaCha20-Poly1305 is AEAD — it needs the full plaintext to compute the auth tag.
For streaming, we use a chunked block format:

```
[magic: 4 bytes "STIM"]
[version: 1 byte]
[salt: 16 bytes]          ← Argon2id salt
[argon2 params: 12 bytes] ← time, memory, threads (uint32 LE each)

Per block (repeated):
[nonce: 12 bytes]
[length: 4 bytes LE]      ← ciphertext length including 16-byte Poly1305 tag
[ciphertext: N bytes]     ← encrypted chunk + tag

Final block:
[nonce: 12 bytes]
[length: 4 bytes LE = 0]  ← zero length signals EOF
```

Block size: 1 MiB plaintext → ~1 MiB + 16 bytes ciphertext per block.

The `Sigil` (Enchantrix crypto handle) wraps this as `StreamEncrypt(r io.Reader,
w io.Writer)` and `StreamDecrypt(r io.Reader, w io.Writer)`.

### 4. Key Derivation Hardening

Replace bare `SHA-256(password)` with Argon2id:

```go
// time=3, memory=64 MiB, threads=4, keyLen=32
key := argon2.IDKey(password, salt, 3, 64*1024, 4, 32)
```

Parameters stored in the STIM header (section 3 above) so they can be tuned
without breaking existing archives. Random 16-byte salt generated per archive.

Backward compatibility: detect old format by checking for "STIM" magic. Old files
(no magic header) use legacy SHA-256 derivation with a deprecation warning.

### 5. Collect Local Streaming Pipeline

The new `collect local` pipeline for large directories:

```
filepath.WalkDir
  → tar.NewWriter (streaming)
  → xz/gzip compressor (streaming)
  → chunked AEAD encryptor (streaming)
  → os.File output
```

Memory usage: ~2 MiB regardless of input size (1 MiB compress buffer + 1 MiB
encrypt block).

Error handling:

- Broken symlinks: skip with warning (not fatal)
- Permission denied: skip with warning, continue
- Disk full on output: fatal, clean up partial file
- Read errors mid-stream: fatal, clean up partial file

Compression selection: `--compress=xz` (default, best ratio) or `--compress=gzip`
(faster). Matches existing Borg compression support.

### 6. Core Package Integration (Deferred)

Core imports Borg, so Borg cannot import core packages without creating a circular
dependency. Integration points are marked with TODOs for when the dependency
direction is resolved (likely by extracting shared interfaces to a common module):

- `core/go` config system → Borg config loading
- `core/go` logging → Borg Progress interface backend
- `core/go-store` → DataNode persistence
- `core/go` io.Medium → DataNode filesystem abstraction

## File Impact Summary

| Area | Files | Change Type |
|------|-------|-------------|
| DataNode | `pkg/datanode/*.go` | Modify (ToTarWriter, symlinks, AddPathOptions) |
| UI | `pkg/ui/*.go` | Rewrite (Progress interface, TTY detection) |
| TIM/STIM | `pkg/tim/*.go` | Modify (streaming encrypt/decrypt, new header) |
| Crypto | `pkg/tim/crypto.go` (new) | Create (Argon2id, chunked AEAD) |
| Collect | `cmd/collect_local.go` | Rewrite (streaming pipeline) |
| CLI | `cmd/root.go`, `cmd/*.go` | Modify (--quiet flag) |

## Testing Strategy

- Unit tests for each component (DataNode, Progress, chunked AEAD, Argon2id)
- Round-trip tests: encrypt → decrypt → compare original
- Large file test: 100 MiB synthetic directory through full pipeline
- Symlink matrix: valid internal, valid external, broken, nested
- Backward compatibility: decrypt old-format STIM with new code
- Race detector: `go test -race ./...`

## Dependencies

New:

- `golang.org/x/crypto/argon2` (Argon2id key derivation)
- `golang.org/x/term` (TTY detection via `term.IsTerminal`)

Existing (unchanged):

- `github.com/snider/Enchantrix` (ChaCha20-Poly1305 via Sigil)
- `github.com/ulikunitz/xz` (XZ compression)

## Risk Assessment

| Risk | Mitigation |
|------|------------|
| Breaking existing STIM format | Magic-byte detection for backward compat |
| Chunked AEAD security | Standard construction (each block independent nonce) |
| Circular dep with core | Deferred; TODO markers only |
| Large directory edge cases | Extensive symlink + permission test matrix |

**docs/plans/2026-02-21-borg-upgrade-plan.md** (new file, 2046 lines; file diff suppressed because it is too large)

```diff
@@ -6,8 +6,8 @@ import (
 	"log"
 	"os"

-	"github.com/Snider/Borg/pkg/github"
-	"github.com/Snider/Borg/pkg/vcs"
+	"forge.lthn.ai/Snider/Borg/pkg/github"
+	"forge.lthn.ai/Snider/Borg/pkg/vcs"
 )

 func main() {
```

@ -4,13 +4,13 @@ import (
|
||||||
"log"
|
"log"
|
||||||
"os"
|
"os"
|
||||||
|
|
||||||
"github.com/Snider/Borg/pkg/github"
|
"forge.lthn.ai/Snider/Borg/pkg/github"
|
||||||
)
|
)
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
log.Println("Collecting GitHub release...")
|
log.Println("Collecting GitHub release...")
|
||||||
|
|
||||||
owner, repo, err := github.ParseRepoFromURL("https://github.com/Snider/Borg")
|
owner, repo, err := github.ParseRepoFromURL("https://forge.lthn.ai/Snider/Borg")
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("Failed to parse repo from URL: %v", err)
|
log.Fatalf("Failed to parse repo from URL: %v", err)
|
||||||
}
|
}
|
||||||
|
|
|
||||||
|
|
@ -4,14 +4,14 @@ import (
|
||||||
"log"
|
"log"
|
||||||
"os"
|
"os"
|
||||||
|
|
||||||
"github.com/Snider/Borg/pkg/vcs"
|
"forge.lthn.ai/Snider/Borg/pkg/vcs"
|
||||||
)
|
)
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
log.Println("Collecting GitHub repo...")
|
log.Println("Collecting GitHub repo...")
|
||||||
|
|
||||||
cloner := vcs.NewGitCloner()
|
cloner := vcs.NewGitCloner()
|
||||||
dn, err := cloner.CloneGitRepository("https://github.com/Snider/Borg", nil)
|
dn, err := cloner.CloneGitRepository("https://forge.lthn.ai/Snider/Borg", nil)
|
||||||
if err != nil {
|
if err != nil {
|
||||||
log.Fatalf("Failed to clone repository: %v", err)
|
log.Fatalf("Failed to clone repository: %v", err)
|
||||||
}
|
}
|
||||||
|
|
|
||||||
|
|
@ -4,7 +4,7 @@ import (
|
||||||
"log"
|
"log"
|
||||||
"os"
|
"os"
|
||||||
|
|
||||||
"github.com/Snider/Borg/pkg/pwa"
|
"forge.lthn.ai/Snider/Borg/pkg/pwa"
|
||||||
)
|
)
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
|
|
|
||||||
|
|
@ -4,7 +4,7 @@ import (
|
||||||
"log"
|
"log"
|
||||||
"os"
|
"os"
|
||||||
|
|
||||||
"github.com/Snider/Borg/pkg/website"
|
"forge.lthn.ai/Snider/Borg/pkg/website"
|
||||||
)
|
)
|
||||||
|
|
||||||
func main() {
|
func main() {
|
||||||
|
|
|
||||||
|
|
```diff
@@ -4,8 +4,8 @@ import (
 	"log"
 	"os"

-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 )

 func main() {
```

**examples/encrypt_media/main.go** (new file, 183 lines)

```go
// Package main demonstrates encrypting media files into SMSG format for dapp.fm
//
// Usage:
//
//	go run main.go -input video.mp4 -output video.smsg -password "license-token" -title "My Track" -artist "Artist Name"
//	go run main.go -input video.mp4 -password "token" -track "0:Intro" -track "67:Sonnata, It Feels So Good"
package main

import (
	"encoding/base64"
	"flag"
	"fmt"
	"log"
	"mime"
	"os"
	"path/filepath"
	"strconv"
	"strings"

	"forge.lthn.ai/Snider/Borg/pkg/smsg"
)

// trackList allows multiple -track flags
type trackList []string

func (t *trackList) String() string {
	return strings.Join(*t, ", ")
}

func (t *trackList) Set(value string) error {
	*t = append(*t, value)
	return nil
}

func main() {
	inputFile := flag.String("input", "", "Input media file (mp4, mp3, etc)")
	outputFile := flag.String("output", "", "Output SMSG file (default: input.smsg)")
	password := flag.String("password", "", "License token / password for encryption")
	title := flag.String("title", "", "Track title (default: filename)")
	artist := flag.String("artist", "", "Artist name")
	releaseType := flag.String("type", "single", "Release type: single, ep, album, djset, live")
	hint := flag.String("hint", "", "Optional password hint")
	outputBase64 := flag.Bool("base64", false, "Output as base64 text file instead of binary")

	var tracks trackList
	flag.Var(&tracks, "track", "Track marker as 'seconds:title' or 'mm:ss:title' (can be repeated)")

	flag.Parse()

	if *inputFile == "" {
		log.Fatal("Input file is required. Use -input flag.")
	}

	if *password == "" {
		log.Fatal("Password/license token is required. Use -password flag.")
	}

	// Read input file
	data, err := os.ReadFile(*inputFile)
	if err != nil {
		log.Fatalf("Failed to read input file: %v", err)
	}

	// Determine MIME type
	ext := strings.ToLower(filepath.Ext(*inputFile))
	mimeType := mime.TypeByExtension(ext)
	if mimeType == "" {
		// Fallback for common types
		switch ext {
		case ".mp4":
			mimeType = "video/mp4"
		case ".mp3":
			mimeType = "audio/mpeg"
		case ".wav":
			mimeType = "audio/wav"
		case ".ogg":
			mimeType = "audio/ogg"
		case ".webm":
			mimeType = "video/webm"
		case ".m4a":
			mimeType = "audio/mp4"
		case ".flac":
			mimeType = "audio/flac"
		default:
			mimeType = "application/octet-stream"
		}
	}

	// Set defaults
	trackTitle := *title
	if trackTitle == "" {
		trackTitle = strings.TrimSuffix(filepath.Base(*inputFile), ext)
	}

	output := *outputFile
	if output == "" {
		output = *inputFile + ".smsg"
		if *outputBase64 {
			output = *inputFile + ".smsg.txt"
		}
	}

	// Create SMSG message with media attachment
	msg := smsg.NewMessage("Licensed media content from dapp.fm")
	msg.WithSubject(trackTitle)

	if *artist != "" {
		msg.WithFrom(*artist)
	}

	// Add the media file as base64 attachment
	contentB64 := base64.StdEncoding.EncodeToString(data)
	msg.AddAttachment(filepath.Base(*inputFile), contentB64, mimeType)

	// Build manifest with public metadata
	manifest := smsg.NewManifest(trackTitle)
	manifest.Artist = *artist
	manifest.ReleaseType = *releaseType
	manifest.Format = "dapp.fm/v1"

	// Parse track markers
	for _, trackStr := range tracks {
		parts := strings.SplitN(trackStr, ":", 3)
		var startSec float64
		var trackName string

		if len(parts) == 2 {
			// Format: "seconds:title"
			startSec, _ = strconv.ParseFloat(parts[0], 64)
			trackName = parts[1]
		} else if len(parts) == 3 {
			// Format: "mm:ss:title"
			mins, _ := strconv.ParseFloat(parts[0], 64)
			secs, _ := strconv.ParseFloat(parts[1], 64)
			startSec = mins*60 + secs
			trackName = parts[2]
		} else {
			log.Printf("Warning: Invalid track format '%s', expected 'seconds:title' or 'mm:ss:title'", trackStr)
			continue
		}

		manifest.AddTrack(trackName, startSec)
		fmt.Printf("  Track: %s @ %.0fs\n", trackName, startSec)
	}

	// Encrypt with manifest
	var encrypted []byte
	if *hint != "" {
		// For hint, we'd need to extend the API - for now just use manifest
		_ = hint
	}
	encrypted, err = smsg.EncryptWithManifest(msg, *password, manifest)
	if err != nil {
		log.Fatalf("Encryption failed: %v", err)
	}

	// Write output
	if *outputBase64 {
		// Write as base64 text
		b64 := base64.StdEncoding.EncodeToString(encrypted)
		err = os.WriteFile(output, []byte(b64), 0644)
	} else {
		// Write as binary
		err = os.WriteFile(output, encrypted, 0644)
	}

	if err != nil {
		log.Fatalf("Failed to write output file: %v", err)
	}

	fmt.Printf("Encrypted media created successfully!\n")
	fmt.Printf("  Input:  %s (%s)\n", *inputFile, mimeType)
	fmt.Printf("  Output: %s\n", output)
	fmt.Printf("  Title:  %s\n", trackTitle)
	if *artist != "" {
		fmt.Printf("  Artist: %s\n", *artist)
	}
	fmt.Printf("  Size:   %.2f MB -> %.2f MB\n",
		float64(len(data))/1024/1024,
		float64(len(encrypted))/1024/1024)
	fmt.Printf("\nLicense token: %s\n", *password)
	fmt.Printf("\nShare the .smsg file publicly. Only users with the license token can play it.\n")
}
```

examples/failures/001-double-base64-encoding.md (new file, 95 lines)
@@ -0,0 +1,95 @@
# Failure Case 001: Double Base64 Encoding

## Error Message
```
Failed: decryption failed: invalid SMSG magic: trix: invalid magic number: expected SMSG, got U01T
```

## Environment
- Demo page: `demo/index.html`
- File: `demo/demo-track.smsg`
- WASM version: 1.2.0

## Root Cause Analysis

### The Problem
The demo file `demo-track.smsg` is stored as **base64-encoded text**, but the JavaScript code treats it as binary and re-encodes it to base64 before passing it to WASM.

### Evidence

File inspection:
```bash
$ file demo/demo-track.smsg
ASCII text, with very long lines (65536), with no line terminators

$ head -c 64 demo/demo-track.smsg | xxd
00000000: 5530 3154 5277 4941 4141 457a 6579 4a68  U01TRwIAAAEzeyJh
```

The file starts with `U01TRwIA...`, which is **base64-encoded SMSG**:
- `U01TRw` decodes to bytes `0x53 0x4D 0x53 0x47` = "SMSG" (the magic number)

### The Double-Encoding Chain

```
Original SMSG binary:
  SMSG....                       (starts with 0x534D5347)
    ↓ base64 encode (file storage)
  U01TRwIA...                    (stored in demo-track.smsg)
    ↓ fetch() as binary
  [0x55, 0x30, 0x31, 0x54, ...]  (bytes of ASCII "U01T...")
    ↓ btoa() in JavaScript
  VTAxVFJ3SUFBQUUzZXlK...        (base64 of base64!)
    ↓ WASM base64 decode
  U01TRwIA...                    (back to first base64)
    ↓ WASM tries to parse as SMSG
  ERROR: expected "SMSG", got "U01T" (first 4 chars of base64)
```

### Why "U01T"?
The error shows "U01T" because when WASM decodes the double base64, it gets back the original base64 string, and its first four ASCII characters "U01T" are interpreted as the magic number instead of the actual bytes 0x534D5347.
## Solution Options

### Option A: Store as binary (recommended)
Convert the demo file to raw binary format:
```bash
base64 -d demo/demo-track.smsg > demo/demo-track-binary.smsg
mv demo/demo-track-binary.smsg demo/demo-track.smsg
```

### Option B: Detect format in JavaScript
Check if content is already base64 and skip re-encoding:
```javascript
// Check if content looks like base64 (ASCII text starting with valid base64 chars)
const isBase64 = /^[A-Za-z0-9+/=]+$/.test(text.trim());
if (!isBase64) {
    // Binary content - encode to base64
    base64 = btoa(binaryToString(bytes));
} else {
    // Already base64 - use as-is
    base64 = text;
}
```

### Option C: Use text fetch for base64 files
```javascript
// For base64-encoded .smsg files
const response = await fetch(DEMO_URL);
const base64 = await response.text(); // Don't re-encode
```

## Lesson Learned
SMSG files can exist in two formats:
1. **Binary** (.smsg) - raw bytes, magic number is `0x534D5347`
2. **Base64** (.smsg.b64 or .smsg with text content) - ASCII text, starts with `U01T`

The loader must detect which format it's receiving and handle accordingly.
## Recommended Fix
Implement Option A (binary storage) for the demo, as it's the canonical format and avoids ambiguity. Reserve Option B for the License Manager, where users might drag-and-drop either format.

## Related
- `pkg/smsg/smsg.go` - SMSG format definition
- `pkg/wasm/stmf/main.go` - WASM decryption API
- `demo/index.html` - Demo page loader
examples/formats/smsg-format.md (new file, 125 lines)
@@ -0,0 +1,125 @@
# SMSG Format Specification

## Overview
SMSG (Secure Message) is an encrypted container format using ChaCha20-Poly1305 authenticated encryption.

## File Structure

```
┌─────────────────────────────────────────┐
│ Magic Number: "SMSG" (4 bytes)          │
├─────────────────────────────────────────┤
│ Version: uint16 (2 bytes)               │
├─────────────────────────────────────────┤
│ Header Length: uint32 (4 bytes)         │
├─────────────────────────────────────────┤
│ Header (JSON, plaintext)                │
│ - algorithm: "chacha20poly1305"         │
│ - manifest: {title, artist, license...} │
│ - nonce: base64                         │
├─────────────────────────────────────────┤
│ Encrypted Payload                       │
│ - Nonce (24 bytes for XChaCha20)        │
│ - Ciphertext + Auth Tag                 │
└─────────────────────────────────────────┘
```
## Magic Number
- Binary: `0x53 0x4D 0x53 0x47`
- ASCII: "SMSG"
- Base64 (first 6 chars): "U01TRw"

## Header (JSON, unencrypted)
```json
{
  "algorithm": "chacha20poly1305",
  "manifest": {
    "title": "Track Title",
    "artist": "Artist Name",
    "license": "CC-BY-4.0",
    "expires": "2025-12-31T23:59:59Z",
    "tracks": [
      {"title": "Track 1", "start": 0, "trackNum": 1}
    ]
  }
}
```

The manifest is **readable without decryption** - this enables:
- License validation before decryption
- Metadata display in file browsers
- Expiration enforcement

## Encrypted Payload (JSON)
```json
{
  "from": "artist@example.com",
  "to": "fan@example.com",
  "subject": "Album Title",
  "body": "Thank you for your purchase!",
  "attachments": [
    {
      "name": "track.mp3",
      "mime": "audio/mpeg",
      "content": "<base64-encoded-data>"
    }
  ]
}
```

## Key Derivation
```
password → SHA-256 → 32-byte key
```

Simple but effective - the password IS the license key.
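The one-line derivation maps directly onto Go's standard library; 32 bytes is exactly the ChaCha20-Poly1305 key size. The sample password below is illustrative:

```go
// The spec's key derivation: hash the license password once with
// SHA-256 to obtain the 32-byte symmetric key.
package main

import (
	"crypto/sha256"
	"fmt"
)

// deriveKey turns a license password into the 32-byte cipher key.
func deriveKey(password string) [32]byte {
	return sha256.Sum256([]byte(password))
}

func main() {
	key := deriveKey("license-token-123") // illustrative token
	fmt.Println(len(key))                 // 32
	fmt.Printf("%x\n", key[:4])           // first key bytes, hex
}
```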
## Storage Formats

### Binary (.smsg)
Raw bytes. Canonical format for distribution.
```
53 4D 53 47 02 00 00 00 33 00 00 00 7B 22 61 6C ...
S  M  S  G  [ver]       [hdr len]   {  "  a  l
```

### Base64 Text (.smsg or .smsg.b64)
For embedding in JSON, URLs, or text-based transport.
```
U01TRwIAAAEzeyJhbGdvcml0aG0iOiJjaGFjaGEyMHBvbHkxMzA1Ii...
```
## WASM API

```javascript
// Initialize
const go = new Go();
const result = await WebAssembly.instantiateStreaming(fetch('stmf.wasm'), go.importObject);
go.run(result.instance);

// Get metadata (no password needed)
const info = await BorgSMSG.getInfo(base64Content);
// info.manifest.title, info.manifest.expires, etc.

// Decrypt (requires password)
const msg = await BorgSMSG.decryptStream(base64Content, password);
// msg.attachments[0].data is Uint8Array (binary)
// msg.attachments[0].mime is the MIME type
```

## Security Properties

1. **Authenticated Encryption**: ChaCha20-Poly1305 provides both confidentiality and integrity
2. **No Key Escrow**: Password never transmitted, derived locally
3. **Metadata Privacy**: Only the manifest is public; actual content is encrypted
4. **Browser-Safe**: WASM runs in a sandbox, keys never leave the client
## Use Cases

| Use Case | Format | Notes |
|----------|--------|-------|
| Direct download | Binary | Most efficient |
| Email attachment | Base64 | Safe for text transport |
| IPFS/CDN | Binary | Content-addressed |
| Embedded in JSON | Base64 | API responses |
| Browser demo | Either | Must detect format |
@@ -5,8 +5,8 @@ import (
 	"io/fs"
 	"os"

-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
 )

 func main() {

@@ -3,7 +3,7 @@ package main
 import (
 	"log"

-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 )

 func main() {

@@ -5,8 +5,8 @@ import (
 	"net/http"
 	"os"

-	"github.com/Snider/Borg/pkg/compress"
-	"github.com/Snider/Borg/pkg/tarfs"
+	"forge.lthn.ai/Snider/Borg/pkg/compress"
+	"forge.lthn.ai/Snider/Borg/pkg/tarfs"
 )

 func main() {

@@ -19,8 +19,8 @@ import (
 	"path/filepath"
 	"time"

-	"github.com/Snider/Borg/pkg/smsg"
-	"github.com/Snider/Borg/pkg/stmf"
+	"forge.lthn.ai/Snider/Borg/pkg/smsg"
+	"forge.lthn.ai/Snider/Borg/pkg/stmf"
 )

 func main() {
go.mod (69 lines changed)
@@ -1,45 +1,74 @@
-module github.com/Snider/Borg
+module forge.lthn.ai/Snider/Borg
 
 go 1.25.0
 
 require (
-	github.com/Snider/Enchantrix v0.0.2
+	forge.lthn.ai/Snider/Enchantrix v0.0.4
 	github.com/fatih/color v1.18.0
-	github.com/go-git/go-git/v5 v5.16.3
+	github.com/go-git/go-git/v5 v5.16.4
 	github.com/google/go-github/v39 v39.2.0
+	github.com/klauspost/compress v1.18.4
 	github.com/mattn/go-isatty v0.0.20
 	github.com/schollz/progressbar/v3 v3.18.0
-	github.com/spf13/cobra v1.10.1
+	github.com/spf13/cobra v1.10.2
 	github.com/ulikunitz/xz v0.5.15
-	golang.org/x/mod v0.30.0
-	golang.org/x/net v0.47.0
-	golang.org/x/oauth2 v0.33.0
+	github.com/wailsapp/wails/v2 v2.11.0
+	golang.org/x/crypto v0.48.0
+	golang.org/x/mod v0.33.0
+	golang.org/x/net v0.50.0
+	golang.org/x/oauth2 v0.35.0
 )
 
 require (
-	dario.cat/mergo v1.0.0 // indirect
+	dario.cat/mergo v1.0.2 // indirect
 	github.com/Microsoft/go-winio v0.6.2 // indirect
 	github.com/ProtonMail/go-crypto v1.3.0 // indirect
-	github.com/cloudflare/circl v1.6.1 // indirect
-	github.com/cyphar/filepath-securejoin v0.4.1 // indirect
+	github.com/bep/debounce v1.2.1 // indirect
+	github.com/clipperhouse/uax29/v2 v2.4.0 // indirect
+	github.com/cloudflare/circl v1.6.3 // indirect
+	github.com/cyphar/filepath-securejoin v0.6.1 // indirect
+	github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc // indirect
 	github.com/emirpasic/gods v1.18.1 // indirect
 	github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 // indirect
-	github.com/go-git/go-billy/v5 v5.6.2 // indirect
+	github.com/go-git/go-billy/v5 v5.7.0 // indirect
+	github.com/go-ole/go-ole v1.3.0 // indirect
+	github.com/godbus/dbus/v5 v5.2.2 // indirect
 	github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 // indirect
 	github.com/google/go-querystring v1.1.0 // indirect
+	github.com/google/uuid v1.6.0 // indirect
+	github.com/gorilla/websocket v1.5.3 // indirect
 	github.com/inconshreveable/mousetrap v1.1.0 // indirect
 	github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 // indirect
-	github.com/kevinburke/ssh_config v1.2.0 // indirect
-	github.com/mattn/go-colorable v0.1.13 // indirect
+	github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1 // indirect
+	github.com/kevinburke/ssh_config v1.4.0 // indirect
+	github.com/klauspost/cpuid/v2 v2.3.0 // indirect
+	github.com/labstack/echo/v4 v4.13.3 // indirect
+	github.com/labstack/gommon v0.4.2 // indirect
+	github.com/leaanthony/go-ansi-parser v1.6.1 // indirect
+	github.com/leaanthony/gosod v1.0.4 // indirect
+	github.com/leaanthony/slicer v1.6.0 // indirect
+	github.com/leaanthony/u v1.1.1 // indirect
+	github.com/mattn/go-colorable v0.1.14 // indirect
+	github.com/mattn/go-runewidth v0.0.19 // indirect
 	github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db // indirect
-	github.com/pjbgf/sha1cd v0.3.2 // indirect
+	github.com/pjbgf/sha1cd v0.5.0 // indirect
+	github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c // indirect
+	github.com/pkg/errors v0.9.1 // indirect
+	github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 // indirect
 	github.com/rivo/uniseg v0.4.7 // indirect
-	github.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3 // indirect
-	github.com/skeema/knownhosts v1.3.1 // indirect
-	github.com/spf13/pflag v1.0.9 // indirect
+	github.com/samber/lo v1.52.0 // indirect
+	github.com/sergi/go-diff v1.4.0 // indirect
+	github.com/skeema/knownhosts v1.3.2 // indirect
+	github.com/spf13/pflag v1.0.10 // indirect
+	github.com/tkrajina/go-reflector v0.5.8 // indirect
+	github.com/valyala/bytebufferpool v1.0.0 // indirect
+	github.com/valyala/fasttemplate v1.2.2 // indirect
+	github.com/wailsapp/go-webview2 v1.0.23 // indirect
+	github.com/wailsapp/mimetype v1.4.1 // indirect
 	github.com/xanzy/ssh-agent v0.3.3 // indirect
-	golang.org/x/crypto v0.44.0 // indirect
-	golang.org/x/sys v0.38.0 // indirect
-	golang.org/x/term v0.37.0 // indirect
+	golang.org/x/exp v0.0.0-20260212183809-81e46e3db34a // indirect
+	golang.org/x/sys v0.41.0 // indirect
+	golang.org/x/term v0.40.0 // indirect
+	golang.org/x/text v0.34.0 // indirect
 	gopkg.in/warnings.v0 v0.1.2 // indirect
 )
go.sum (126 lines changed)
@@ -1,28 +1,26 @@
-dario.cat/mergo v1.0.0 h1:AGCNq9Evsj31mOgNPcLyXc+4PNABt905YmuqPYYpBWk=
-dario.cat/mergo v1.0.0/go.mod h1:uNxQE+84aUszobStD9th8a29P2fMDhsBdgRYvZOxGmk=
+dario.cat/mergo v1.0.2 h1:85+piFYR1tMbRrLcDwR18y4UKJ3aH1Tbzi24VRW1TK8=
+forge.lthn.ai/Snider/Enchantrix v0.0.4 h1:biwpix/bdedfyc0iVeK15awhhJKH6TEMYOTXzHXx5TI=
 github.com/Microsoft/go-winio v0.5.2/go.mod h1:WpS1mjBmmwHBEWmogvA2mj8546UReBk4v8QkMxJ6pZY=
 github.com/Microsoft/go-winio v0.6.2 h1:F2VQgta7ecxGYO8k3ZZz3RS8fVIXVxONVUPlNERoyfY=
 github.com/Microsoft/go-winio v0.6.2/go.mod h1:yd8OoFMLzJbo9gZq8j5qaps8bJ9aShtEA8Ipt1oGCvU=
 github.com/ProtonMail/go-crypto v1.3.0 h1:ILq8+Sf5If5DCpHQp4PbZdS1J7HDFRXz/+xKBiRGFrw=
 github.com/ProtonMail/go-crypto v1.3.0/go.mod h1:9whxjD8Rbs29b4XWbB8irEcE8KHMqaR2e7GWU1R+/PE=
-github.com/Snider/Enchantrix v0.0.0-20251113213145-deff3a80c600 h1:9jyEgos5SNTVp3aJkhPs/fb4eTZE5l73YqaT+vFmFu0=
-github.com/Snider/Enchantrix v0.0.0-20251113213145-deff3a80c600/go.mod h1:v9HATMgLJWycy/R5ho1SL0OHbggXgEhu/qRB9gbS0BM=
-github.com/Snider/Enchantrix v0.0.2 h1:ExZQiBhfS/p/AHFTKhY80TOd+BXZjK95EzByAEgwvjs=
-github.com/Snider/Enchantrix v0.0.2/go.mod h1:CtFcLAvnDT1KcuF1JBb/DJj0KplY8jHryO06KzQ1hsQ=
 github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be h1:9AeTilPcZAjCFIImctFaOjnTIavg87rW78vTPkQqLI8=
 github.com/anmitsu/go-shlex v0.0.0-20200514113438-38f4b401e2be/go.mod h1:ySMOLuWl6zY27l47sB3qLNK6tF2fkHG55UZxx8oIVo4=
 github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5 h1:0CwZNZbxp69SHPdPJAN/hZIm0C4OItdklCFmMRWYpio=
 github.com/armon/go-socks5 v0.0.0-20160902184237-e75332964ef5/go.mod h1:wHh0iHkYZB8zMSxRWpUBQtwG5a7fFgvEO+odwuTv2gs=
+github.com/bep/debounce v1.2.1 h1:v67fRdBA9UQu2NhLFXrSg0Brw7CexQekrBwDMM8bzeY=
+github.com/bep/debounce v1.2.1/go.mod h1:H8yggRPQKLUhUoqrJC1bO2xNya7vanpDl7xR3ISbCJ0=
 github.com/chengxilo/virtualterm v1.0.4 h1:Z6IpERbRVlfB8WkOmtbHiDbBANU7cimRIof7mk9/PwM=
 github.com/chengxilo/virtualterm v1.0.4/go.mod h1:DyxxBZz/x1iqJjFxTFcr6/x+jSpqN0iwWCOK1q10rlY=
-github.com/cloudflare/circl v1.6.1 h1:zqIqSPIndyBh1bjLVVDHMPpVKqp8Su/V+6MeDzzQBQ0=
-github.com/cloudflare/circl v1.6.1/go.mod h1:uddAzsPgqdMAYatqJ0lsjX1oECcQLIlRpzZh3pJrofs=
+github.com/clipperhouse/stringish v0.1.1 h1:+NSqMOr3GR6k1FdRhhnXrLfztGzuG+VuFDfatpWHKCs=
+github.com/clipperhouse/uax29/v2 v2.4.0 h1:RXqE/l5EiAbA4u97giimKNlmpvkmz+GrBVTelsoXy9g=
+github.com/cloudflare/circl v1.6.3 h1:9GPOhQGF9MCYUeXyMYlqTR6a5gTrgR/fBLXvUgtVcg8=
 github.com/cpuguy83/go-md2man/v2 v2.0.6/go.mod h1:oOW0eioCTA6cOiMLiUPZOpcVxMig6NIQQ7OS05n1F4g=
-github.com/cyphar/filepath-securejoin v0.4.1 h1:JyxxyPEaktOD+GAnqIqTf9A8tHyAG22rowi7HkoSU1s=
-github.com/cyphar/filepath-securejoin v0.4.1/go.mod h1:Sdj7gXlvMcPZsbhwhQ33GguGLDGQL7h7bg04C/+u9jI=
+github.com/cyphar/filepath-securejoin v0.6.1 h1:5CeZ1jPXEiYt3+Z6zqprSAgSWiggmpVyciv8syjIpVE=
 github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
-github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
 github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
+github.com/davecgh/go-spew v1.1.2-0.20180830191138-d8f796af33cc h1:U9qPSI2PIWSS1VwoXQT9A3Wy9MM3WgvqSxFWenqJduM=
 github.com/elazarl/goproxy v1.7.2 h1:Y2o6urb7Eule09PjlhQRGNsqRfPmYI3KKQLFpCAV3+o=
 github.com/elazarl/goproxy v1.7.2/go.mod h1:82vkLNir0ALaW14Rc399OTTjyNREgmdL2cVoIbS6XaE=
 github.com/emirpasic/gods v1.18.1 h1:FXtiHYKDGKCW2KzwZKx0iC0PQmdlorYgdFG9jPXJ1Bc=
@@ -33,12 +31,13 @@ github.com/gliderlabs/ssh v0.3.8 h1:a4YXD1V7xMF9g5nTkdfnja3Sxy1PVDCj1Zg4Wb8vY6c=
 github.com/gliderlabs/ssh v0.3.8/go.mod h1:xYoytBv1sV0aL3CavoDuJIQNURXkkfPA/wxQ1pL1fAU=
 github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376 h1:+zs/tPmkDkHx3U66DAb0lQFJrpS6731Oaa12ikc+DiI=
 github.com/go-git/gcfg v1.5.1-0.20230307220236-3a3c6141e376/go.mod h1:an3vInlBmSxCcxctByoQdvwPiA7DTK7jaaFDBTtu0ic=
-github.com/go-git/go-billy/v5 v5.6.2 h1:6Q86EsPXMa7c3YZ3aLAQsMA0VlWmy43r6FHqa/UNbRM=
-github.com/go-git/go-billy/v5 v5.6.2/go.mod h1:rcFC2rAsp/erv7CMz9GczHcuD0D32fWzH+MJAU+jaUU=
+github.com/go-git/go-billy/v5 v5.7.0 h1:83lBUJhGWhYp0ngzCMSgllhUSuoHP1iEWYjsPl9nwqM=
 github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399 h1:eMje31YglSBqCdIqdhKBW8lokaMrL3uTkpGYlE2OOT4=
 github.com/go-git/go-git-fixtures/v4 v4.3.2-0.20231010084843-55a94097c399/go.mod h1:1OCfN199q1Jm3HZlxleg+Dw/mwps2Wbk9frAWm+4FII=
-github.com/go-git/go-git/v5 v5.16.3 h1:Z8BtvxZ09bYm/yYNgPKCzgWtaRqDTgIKRgIRHBfU6Z8=
-github.com/go-git/go-git/v5 v5.16.3/go.mod h1:4Ge4alE/5gPs30F2H1esi2gPd69R0C39lolkucHBOp8=
+github.com/go-git/go-git/v5 v5.16.4 h1:7ajIEZHZJULcyJebDLo99bGgS0jRrOxzZG4uCk2Yb2Y=
+github.com/go-ole/go-ole v1.3.0 h1:Dt6ye7+vXGIKZ7Xtk4s6/xVdGDQynvom7xCFEdWr6uE=
+github.com/go-ole/go-ole v1.3.0/go.mod h1:5LS6F96DhAwUc7C+1HLexzMXY1xGRSryjyPPKW6zv78=
+github.com/godbus/dbus/v5 v5.2.2 h1:TUR3TgtSVDmjiXOgAAyaZbYmIeP3DPkld3jgKGV8mXQ=
 github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8 h1:f+oWsMOmNPc8JmEHVZIycC7hBoQxHH9pNKQORJNozsQ=
 github.com/golang/groupcache v0.0.0-20241129210726-2c02b8208cf8/go.mod h1:wcDNUvekVysuuOpQKo3191zZyTpiI6se1N1ULghS0sw=
 github.com/golang/protobuf v1.3.1/go.mod h1:6lQm79b+lXiMfvg/cZm0SGofjICqVBUtrP5yJMmIC1U=
@@ -51,12 +50,18 @@ github.com/google/go-github/v39 v39.2.0 h1:rNNM311XtPOz5rDdsJXAp2o8F67X9FnROXTvt
 github.com/google/go-github/v39 v39.2.0/go.mod h1:C1s8C5aCC9L+JXIYpJM5GYytdX52vC1bLvHEF1IhBrE=
 github.com/google/go-querystring v1.1.0 h1:AnCroh3fv4ZBgVIf1Iwtovgjaw/GiKJo8M8yD/fhyJ8=
 github.com/google/go-querystring v1.1.0/go.mod h1:Kcdr2DB4koayq7X8pmAG4sNG59So17icRSOU623lUBU=
+github.com/google/uuid v1.6.0 h1:NIvaJDMOsjHA8n1jAhLSgzrAzy1Hgr+hNrb57e+94F0=
+github.com/google/uuid v1.6.0/go.mod h1:TIyPZe4MgqvfeYDBFedMoGGpEw/LqOeaOT+nhxU+yHo=
+github.com/gorilla/websocket v1.5.3 h1:saDtZ6Pbx/0u+bgYQ3q96pZgCzfhKXGPqt7kZ72aNNg=
+github.com/gorilla/websocket v1.5.3/go.mod h1:YR8l580nyteQvAITg2hZ9XVh4b55+EU/adAjf1fMHhE=
 github.com/inconshreveable/mousetrap v1.1.0 h1:wN+x4NVGpMsO7ErUn/mUI3vEoE6Jt13X2s0bqwp9tc8=
 github.com/inconshreveable/mousetrap v1.1.0/go.mod h1:vpF70FUmC8bwa3OWnCshd2FqLfsEA9PFc4w1p2J65bw=
 github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99 h1:BQSFePA1RWJOlocH6Fxy8MmwDt+yVQYULKfN0RoTN8A=
 github.com/jbenet/go-context v0.0.0-20150711004518-d14ea06fba99/go.mod h1:1lJo3i6rXxKeerYnT8Nvf0QmHCRC1n8sfWVwXF2Frvo=
-github.com/kevinburke/ssh_config v1.2.0 h1:x584FjTGwHzMwvHx18PXxbBVzfnxogHaAReU4gf13a4=
-github.com/kevinburke/ssh_config v1.2.0/go.mod h1:CT57kijsi8u/K/BOFA39wgDQJ9CxiF4nAY/ojJ6r6mM=
+github.com/jchv/go-winloader v0.0.0-20250406163304-c1995be93bd1 h1:njuLRcjAuMKr7kI3D85AXWkw6/+v9PwtV6M6o11sWHQ=
+github.com/kevinburke/ssh_config v1.4.0 h1:6xxtP5bZ2E4NF5tuQulISpTO2z8XbtH8cg1PWkxoFkQ=
+github.com/klauspost/compress v1.18.4 h1:RPhnKRAQ4Fh8zU2FY/6ZFDwTVTxgJ/EMydqSTzE9a2c=
+github.com/klauspost/cpuid/v2 v2.3.0 h1:S4CRMLnYUhGeDFDqkGriYKdfoFlDnMtqTiI/sFzhA9Y=
 github.com/kr/pretty v0.1.0/go.mod h1:dAy3ld7l9f0ibDNOQOHHMYYIIbhfbHSm3C4ZsoJORNo=
 github.com/kr/pretty v0.3.1 h1:flRD4NNwYAUpkphVc1HcthR4KEIFJ65n8Mw5qdRn3LE=
 github.com/kr/pretty v0.3.1/go.mod h1:hoEshYVHaxMs3cyo3Yncou5ZscifuDolrwPKZanG3xk=
@@ -64,85 +69,108 @@ github.com/kr/pty v1.1.1/go.mod h1:pFQYn66WHrOpPYNljwOMqo10TkYh1fy3cYio2l3bCsQ=
 github.com/kr/text v0.1.0/go.mod h1:4Jbv+DJW3UT/LiOwJeYQe1efqtUx/iVham/4vfdArNI=
 github.com/kr/text v0.2.0 h1:5Nx0Ya0ZqY2ygV366QzturHI13Jq95ApcVaJBhpS+AY=
 github.com/kr/text v0.2.0/go.mod h1:eLer722TekiGuMkidMxC/pM04lWEeraHUUmBw8l2grE=
-github.com/mattn/go-colorable v0.1.13 h1:fFA4WZxdEF4tXPZVKMLwD8oUnCTTo08duU7wxecdEvA=
-github.com/mattn/go-colorable v0.1.13/go.mod h1:7S9/ev0klgBDR4GtXTXX8a3vIGJpMovkB8vQcUbaXHg=
-github.com/mattn/go-isatty v0.0.16/go.mod h1:kYGgaQfpe5nmfYZH+SKPsOc2e4SrIfOl2e/yFXSvRLM=
+github.com/labstack/echo/v4 v4.13.3 h1:pwhpCPrTl5qry5HRdM5FwdXnhXSLSY+WE+YQSeCaafY=
+github.com/labstack/echo/v4 v4.13.3/go.mod h1:o90YNEeQWjDozo584l7AwhJMHN0bOC4tAfg+Xox9q5g=
+github.com/labstack/gommon v0.4.2 h1:F8qTUNXgG1+6WQmqoUWnz8WiEU60mXVVw0P4ht1WRA0=
+github.com/labstack/gommon v0.4.2/go.mod h1:QlUFxVM+SNXhDL/Z7YhocGIBYOiwB0mXm1+1bAPHPyU=
+github.com/leaanthony/debme v1.2.1 h1:9Tgwf+kjcrbMQ4WnPcEIUcQuIZYqdWftzZkBr+i/oOc=
+github.com/leaanthony/debme v1.2.1/go.mod h1:3V+sCm5tYAgQymvSOfYQ5Xx2JCr+OXiD9Jkw3otUjiA=
+github.com/leaanthony/go-ansi-parser v1.6.1 h1:xd8bzARK3dErqkPFtoF9F3/HgN8UQk0ed1YDKpEz01A=
+github.com/leaanthony/go-ansi-parser v1.6.1/go.mod h1:+vva/2y4alzVmmIEpk9QDhA7vLC5zKDTRwfZGOp3IWU=
+github.com/leaanthony/gosod v1.0.4 h1:YLAbVyd591MRffDgxUOU1NwLhT9T1/YiwjKZpkNFeaI=
+github.com/leaanthony/gosod v1.0.4/go.mod h1:GKuIL0zzPj3O1SdWQOdgURSuhkF+Urizzxh26t9f1cw=
+github.com/leaanthony/slicer v1.6.0 h1:1RFP5uiPJvT93TAHi+ipd3NACobkW53yUiBqZheE/Js=
+github.com/leaanthony/slicer v1.6.0/go.mod h1:o/Iz29g7LN0GqH3aMjWAe90381nyZlDNquK+mtH2Fj8=
+github.com/leaanthony/u v1.1.1 h1:TUFjwDGlNX+WuwVEzDqQwC2lOv0P4uhTQw7CMFdiK7M=
+github.com/leaanthony/u v1.1.1/go.mod h1:9+o6hejoRljvZ3BzdYlVL0JYCwtnAsVuN9pVTQcaRfI=
+github.com/matryer/is v1.4.0/go.mod h1:8I/i5uYgLzgsgEloJE1U6xx5HkBQpAZvepWuujKwMRU=
+github.com/matryer/is v1.4.1 h1:55ehd8zaGABKLXQUe2awZ99BD/PTc2ls+KV/dXphgEQ=
+github.com/matryer/is v1.4.1/go.mod h1:8I/i5uYgLzgsgEloJE1U6xx5HkBQpAZvepWuujKwMRU=
+github.com/mattn/go-colorable v0.1.14 h1:9A9LHSqF/7dyVVX6g0U9cwm9pG3kP9gSzcuIPHPsaIE=
 github.com/mattn/go-isatty v0.0.20 h1:xfD0iDuEKnDkl03q4limB+vH+GxLEtL/jb4xVJSWWEY=
 github.com/mattn/go-isatty v0.0.20/go.mod h1:W+V8PltTTMOvKvAeJH7IuucS94S2C6jfK/D7dTCTo3Y=
-github.com/mattn/go-runewidth v0.0.16 h1:E5ScNMtiwvlvB5paMFdw9p4kSQzbXFikJ5SQO6TULQc=
-github.com/mattn/go-runewidth v0.0.16/go.mod h1:Jdepj2loyihRzMpdS35Xk/zdY8IAYHsh153qUoGf23w=
+github.com/mattn/go-runewidth v0.0.19 h1:v++JhqYnZuu5jSKrk9RbgF5v4CGUjqRfBm05byFGLdw=
 github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db h1:62I3jR2EmQ4l5rM/4FEfDWcRD+abF5XlKShorW5LRoQ=
 github.com/mitchellh/colorstring v0.0.0-20190213212951-d06e56a500db/go.mod h1:l0dey0ia/Uv7NcFFVbCLtqEBQbrT4OCwCSKTEv6enCw=
 github.com/onsi/gomega v1.34.1 h1:EUMJIKUjM8sKjYbtxQI9A4z2o+rruxnzNvpknOXie6k=
 github.com/onsi/gomega v1.34.1/go.mod h1:kU1QgUvBDLXBJq618Xvm2LUX6rSAfRaFRTcdOeDLwwY=
-github.com/pjbgf/sha1cd v0.3.2 h1:a9wb0bp1oC2TGwStyn0Umc/IGKQnEgF0vVaZ8QF8eo4=
-github.com/pjbgf/sha1cd v0.3.2/go.mod h1:zQWigSxVmsHEZow5qaLtPYxpcKMMQpa09ixqBxuCS6A=
+github.com/pjbgf/sha1cd v0.5.0 h1:a+UkboSi1znleCDUNT3M5YxjOnN1fz2FhN48FlwCxs0=
+github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c h1:+mdjkGKdHQG3305AYmdv1U2eRNDiU2ErMBj1gwrq8eQ=
+github.com/pkg/browser v0.0.0-20240102092130-5ac0b6a4141c/go.mod h1:7rwL4CYBLnjLxUqIJNnCWiEdr3bn6IUYi15bNlnbCCU=
 github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
 github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
-github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
 github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
+github.com/pmezard/go-difflib v1.0.1-0.20181226105442-5d4384ee4fb2 h1:Jamvg5psRIccs7FGNTlIRMkT8wgtp5eCXdBlqhYGL6U=
+github.com/rivo/uniseg v0.2.0/go.mod h1:J6wj4VEh+S6ZtnVlnTBMWIodfgj8LQOQFoIToxlJtxc=
|
||||||
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
|
github.com/rivo/uniseg v0.4.7 h1:WUdvkW8uEhrYfLC4ZzdpI2ztxP1I582+49Oc5Mq64VQ=
|
||||||
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
github.com/rivo/uniseg v0.4.7/go.mod h1:FN3SvrM+Zdj16jyLfmOkMNblXMcoc8DfTHruCPUcx88=
|
||||||
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
|
github.com/rogpeppe/go-internal v1.14.1 h1:UQB4HGPB6osV0SQTLymcB4TgvyWu6ZyliaW0tI/otEQ=
|
||||||
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
|
github.com/rogpeppe/go-internal v1.14.1/go.mod h1:MaRKkUm5W0goXpeCfT7UZI6fk/L7L7so1lCWt35ZSgc=
|
||||||
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
|
github.com/russross/blackfriday/v2 v2.1.0/go.mod h1:+Rmxgy9KzJVeS9/2gXHxylqXiyQDYRxCVz55jmeOWTM=
|
||||||
|
github.com/samber/lo v1.52.0 h1:Rvi+3BFHES3A8meP33VPAxiBZX/Aws5RxrschYGjomw=
|
||||||
github.com/schollz/progressbar/v3 v3.18.0 h1:uXdoHABRFmNIjUfte/Ex7WtuyVslrw2wVPQmCN62HpA=
|
github.com/schollz/progressbar/v3 v3.18.0 h1:uXdoHABRFmNIjUfte/Ex7WtuyVslrw2wVPQmCN62HpA=
|
||||||
github.com/schollz/progressbar/v3 v3.18.0/go.mod h1:IsO3lpbaGuzh8zIMzgY3+J8l4C8GjO0Y9S69eFvNsec=
|
github.com/schollz/progressbar/v3 v3.18.0/go.mod h1:IsO3lpbaGuzh8zIMzgY3+J8l4C8GjO0Y9S69eFvNsec=
|
||||||
github.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3 h1:n661drycOFuPLCN3Uc8sB6B/s6Z4t2xvBgU1htSHuq8=
|
github.com/sergi/go-diff v1.4.0 h1:n/SP9D5ad1fORl+llWyN+D6qoUETXNZARKjyY2/KVCw=
|
||||||
github.com/sergi/go-diff v1.3.2-0.20230802210424-5b0b94c5c0d3/go.mod h1:A0bzQcvG0E7Rwjx0REVgAGH58e96+X0MeOfepqsbeW4=
|
|
||||||
github.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
|
github.com/sirupsen/logrus v1.7.0/go.mod h1:yWOB1SBYBC5VeMP7gHvWumXLIWorT60ONWic61uBYv0=
|
||||||
github.com/skeema/knownhosts v1.3.1 h1:X2osQ+RAjK76shCbvhHHHVl3ZlgDm8apHEHFqRjnBY8=
|
github.com/skeema/knownhosts v1.3.2 h1:EDL9mgf4NzwMXCTfaxSD/o/a5fxDw/xL9nkU28JjdBg=
|
||||||
github.com/skeema/knownhosts v1.3.1/go.mod h1:r7KTdC8l4uxWRyK2TpQZ/1o5HaSzh06ePQNxPwTcfiY=
|
github.com/spf13/cobra v1.10.2 h1:DMTTonx5m65Ic0GOoRY2c16WCbHxOOw6xxezuLaBpcU=
|
||||||
github.com/spf13/cobra v1.10.1 h1:lJeBwCfmrnXthfAupyUTzJ/J4Nc1RsHC/mSRU2dll/s=
|
|
||||||
github.com/spf13/cobra v1.10.1/go.mod h1:7SmJGaTHFVBY0jW4NXGluQoLvhqFQM+6XSKD+P4XaB0=
|
|
||||||
github.com/spf13/pflag v1.0.9 h1:9exaQaMOCwffKiiiYk6/BndUBv+iRViNW+4lEMi0PvY=
|
|
||||||
github.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
|
github.com/spf13/pflag v1.0.9/go.mod h1:McXfInJRrz4CZXVZOBLb0bTZqETkiAhM9Iw0y3An2Bg=
|
||||||
|
github.com/spf13/pflag v1.0.10 h1:4EBh2KAYBwaONj6b2Ye1GiHfwjqyROoF4RwYO+vPwFk=
|
||||||
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
|
||||||
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
|
||||||
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
|
github.com/stretchr/testify v1.4.0/go.mod h1:j7eGeouHqKxXV5pUuKE4zz7dFj8WfuZ+81PSLYec5m4=
|
||||||
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
|
github.com/stretchr/testify v1.11.1 h1:7s2iGBzp5EwR7/aIZr8ao5+dra3wiQyKjjFuvgVKu7U=
|
||||||
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
|
github.com/stretchr/testify v1.11.1/go.mod h1:wZwfW3scLgRK+23gO65QZefKpKQRnfz6sD981Nm4B6U=
|
||||||
|
github.com/tkrajina/go-reflector v0.5.8 h1:yPADHrwmUbMq4RGEyaOUpz2H90sRsETNVpjzo3DLVQQ=
|
||||||
|
github.com/tkrajina/go-reflector v0.5.8/go.mod h1:ECbqLgccecY5kPmPmXg1MrHW585yMcDkVl6IvJe64T4=
|
||||||
github.com/ulikunitz/xz v0.5.15 h1:9DNdB5s+SgV3bQ2ApL10xRc35ck0DuIX/isZvIk+ubY=
|
github.com/ulikunitz/xz v0.5.15 h1:9DNdB5s+SgV3bQ2ApL10xRc35ck0DuIX/isZvIk+ubY=
|
||||||
github.com/ulikunitz/xz v0.5.15/go.mod h1:nbz6k7qbPmH4IRqmfOplQw/tblSgqTqBwxkY0oWt/14=
|
github.com/ulikunitz/xz v0.5.15/go.mod h1:nbz6k7qbPmH4IRqmfOplQw/tblSgqTqBwxkY0oWt/14=
|
||||||
|
github.com/valyala/bytebufferpool v1.0.0 h1:GqA5TC/0021Y/b9FG4Oi9Mr3q7XYx6KllzawFIhcdPw=
|
||||||
|
github.com/valyala/bytebufferpool v1.0.0/go.mod h1:6bBcMArwyJ5K/AmCkWv1jt77kVWyCJ6HpOuEn7z0Csc=
|
||||||
|
github.com/valyala/fasttemplate v1.2.2 h1:lxLXG0uE3Qnshl9QyaK6XJxMXlQZELvChBOCmQD0Loo=
|
||||||
|
github.com/valyala/fasttemplate v1.2.2/go.mod h1:KHLXt3tVN2HBp8eijSv/kGJopbvo7S+qRAEEKiv+SiQ=
|
||||||
|
github.com/wailsapp/go-webview2 v1.0.23 h1:jmv8qhz1lHibCc79bMM/a/FqOnnzOGEisLav+a0b9P0=
|
||||||
|
github.com/wailsapp/mimetype v1.4.1 h1:pQN9ycO7uo4vsUUuPeHEYoUkLVkaRntMnHJxVwYhwHs=
|
||||||
|
github.com/wailsapp/mimetype v1.4.1/go.mod h1:9aV5k31bBOv5z6u+QP8TltzvNGJPmNJD4XlAL3U+j3o=
|
||||||
|
github.com/wailsapp/wails/v2 v2.11.0 h1:seLacV8pqupq32IjS4Y7V8ucab0WZwtK6VvUVxSBtqQ=
|
||||||
|
github.com/wailsapp/wails/v2 v2.11.0/go.mod h1:jrf0ZaM6+GBc1wRmXsM8cIvzlg0karYin3erahI4+0k=
|
||||||
github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
|
github.com/xanzy/ssh-agent v0.3.3 h1:+/15pJfg/RsTxqYcX6fHqOXZwwMP+2VyYWJeWM2qQFM=
|
||||||
github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
|
github.com/xanzy/ssh-agent v0.3.3/go.mod h1:6dzNDKs0J9rVPHPhaGCukekBHKqfl+L3KghI1Bc68Uw=
|
||||||
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
|
golang.org/x/crypto v0.0.0-20190308221718-c2843e01d9a2/go.mod h1:djNgcEr1/C05ACkg1iLfiJU5Ep61QUkGW8qpdssI0+w=
|
||||||
golang.org/x/crypto v0.0.0-20210817164053-32db794688a5/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
|
golang.org/x/crypto v0.0.0-20210817164053-32db794688a5/go.mod h1:GvvjBRRGRdwPK5ydBHafDWAxML/pGHZbMvKqRZ5+Abc=
|
||||||
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
|
golang.org/x/crypto v0.0.0-20220622213112-05595931fe9d/go.mod h1:IxCIyHEi3zRg3s0A5j5BB6A9Jmi73HwBIUl50j+osU4=
|
||||||
golang.org/x/crypto v0.44.0 h1:A97SsFvM3AIwEEmTBiaxPPTYpDC47w720rdiiUvgoAU=
|
golang.org/x/crypto v0.48.0 h1:/VRzVqiRSggnhY7gNRxPauEQ5Drw9haKdM0jqfcCFts=
|
||||||
golang.org/x/crypto v0.44.0/go.mod h1:013i+Nw79BMiQiMsOPcVCB5ZIJbYkerPrGnOa00tvmc=
|
golang.org/x/crypto v0.48.0/go.mod h1:r0kV5h3qnFPlQnBSrULhlsRfryS2pmewsg+XfMgkVos=
|
||||||
golang.org/x/exp v0.0.0-20240719175910-8a7402abbf56 h1:2dVuKD2vS7b0QIHQbpyTISPd0LeHDbnYEryqj5Q1ug8=
|
golang.org/x/exp v0.0.0-20260212183809-81e46e3db34a h1:ovFr6Z0MNmU7nH8VaX5xqw+05ST2uO1exVfZPVqRC5o=
|
||||||
golang.org/x/exp v0.0.0-20240719175910-8a7402abbf56/go.mod h1:M4RDyNAINzryxdtnbRXRL/OHtkFuWGRjvuhBJpk2IlY=
|
golang.org/x/mod v0.33.0 h1:tHFzIWbBifEmbwtGz65eaWyGiGZatSrT9prnU8DbVL8=
|
||||||
golang.org/x/mod v0.30.0 h1:fDEXFVZ/fmCKProc/yAXXUijritrDzahmwwefnjoPFk=
|
|
||||||
golang.org/x/mod v0.30.0/go.mod h1:lAsf5O2EvJeSFMiBxXDki7sCgAxEUcZHXoXMKT4GJKc=
|
|
||||||
golang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=
|
golang.org/x/net v0.0.0-20190603091049-60506f45cf65/go.mod h1:HSz+uSET+XFnRR8LxR5pz3Of3rY3CfYBVs4xY44aLks=
|
||||||
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
|
golang.org/x/net v0.0.0-20210226172049-e18ecbb05110/go.mod h1:m0MpNAwzfU5UDzcl9v0D8zg8gWTRqZa9RBIspLL5mdg=
|
||||||
|
golang.org/x/net v0.0.0-20210505024714-0287a6fb4125/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
||||||
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
golang.org/x/net v0.0.0-20211112202133-69e39bad7dc2/go.mod h1:9nx3DQGgdP8bBQD5qxJ1jj9UTztislL4KSBs9R2vV5Y=
|
||||||
golang.org/x/net v0.47.0 h1:Mx+4dIFzqraBXUugkia1OOvlD6LemFo1ALMHjrXDOhY=
|
golang.org/x/net v0.50.0 h1:ucWh9eiCGyDR3vtzso0WMQinm2Dnt8cFMuQa9K33J60=
|
||||||
golang.org/x/net v0.47.0/go.mod h1:/jNxtkgq5yWUGYkaZGqo27cfGZ1c5Nen03aYrrKpVRU=
|
|
||||||
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
|
golang.org/x/oauth2 v0.0.0-20180821212333-d2e6202438be/go.mod h1:N/0e6XlmueqKjAGxoOufVs8QHGRruUQn6yWY3a++T0U=
|
||||||
golang.org/x/oauth2 v0.33.0 h1:4Q+qn+E5z8gPRJfmRy7C2gGG3T4jIprK6aSYgTXGRpo=
|
golang.org/x/oauth2 v0.35.0 h1:Mv2mzuHuZuY2+bkyWXIHMfhNdJAdwW3FuWeCPYN5GVQ=
|
||||||
golang.org/x/oauth2 v0.33.0/go.mod h1:lzm5WQJQwKZ3nwavOZ3IS5Aulzxi68dUSgRHujetwEA=
|
|
||||||
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
golang.org/x/sys v0.0.0-20190215142949-d0b11bdaac8a/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=
|
||||||
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
golang.org/x/sys v0.0.0-20191026070338-33540a1f6037/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||||
|
golang.org/x/sys v0.0.0-20200810151505-1b9f1253b3ed/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||||
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
golang.org/x/sys v0.0.0-20201119102817-f84b799fce68/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||||
golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
golang.org/x/sys v0.0.0-20210124154548-22da62e12c0c/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||||
golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
golang.org/x/sys v0.0.0-20210423082822-04245dca01da/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
|
||||||
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
golang.org/x/sys v0.0.0-20210615035016-665e8c7367d1/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||||
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
golang.org/x/sys v0.0.0-20220715151400-c0bba94af5f8/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||||
golang.org/x/sys v0.0.0-20220811171246-fbc7d0a398ab/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
golang.org/x/sys v0.1.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||||
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
golang.org/x/sys v0.6.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
|
||||||
golang.org/x/sys v0.38.0 h1:3yZWxaJjBmCWXqhN1qh02AkOnCQ1poK6oF+a7xWL6Gc=
|
golang.org/x/sys v0.41.0 h1:Ivj+2Cp/ylzLiEU89QhWblYnOE9zerudt9Ftecq2C6k=
|
||||||
golang.org/x/sys v0.38.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
|
golang.org/x/sys v0.41.0/go.mod h1:OgkHotnGiDImocRcuBABYBEXf8A9a87e/uXjp9XT3ks=
|
||||||
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
|
golang.org/x/term v0.0.0-20201126162022-7de9c90e9dd1/go.mod h1:bj7SfCRtBDWHUb9snDiAeCFNEtKQo2Wmx5Cou7ajbmo=
|
||||||
golang.org/x/term v0.37.0 h1:8EGAD0qCmHYZg6J17DvsMy9/wJ7/D/4pV/wfnld5lTU=
|
golang.org/x/term v0.40.0 h1:36e4zGLqU4yhjlmxEaagx2KuYbJq3EwY8K943ZsHcvg=
|
||||||
golang.org/x/term v0.37.0/go.mod h1:5pB4lxRNYYVZuTLmy8oR2BH8dflOR+IbTYFD8fi3254=
|
golang.org/x/term v0.40.0/go.mod h1:w2P8uVp06p2iyKKuvXIm7N/y0UCRt3UfJTfZ7oOpglM=
|
||||||
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
|
golang.org/x/text v0.3.0/go.mod h1:NqM8EUOU14njkJ3fqMW+pc6Ldnwhi/IjpwHt7yyuwOQ=
|
||||||
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
|
golang.org/x/text v0.3.2/go.mod h1:bEr9sfX3Q8Zfm5fL9x+3itogRgK3+ptLWKqgva+5dAk=
|
||||||
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
golang.org/x/text v0.3.3/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
||||||
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
golang.org/x/text v0.3.6/go.mod h1:5Zoc/QRtKVWzQhOtBMvqHzDpF6irO9z98xDceosuGiQ=
|
||||||
golang.org/x/text v0.31.0 h1:aC8ghyu4JhP8VojJ2lEHBnochRno1sgL6nEi9WGFGMM=
|
golang.org/x/text v0.34.0 h1:oL/Qq0Kdaqxa1KbNeMKwQq0reLCCaFtqu2eNuSeNHbk=
|
||||||
golang.org/x/text v0.31.0/go.mod h1:tKRAlv61yKIjGGHX/4tP1LTbc13YSec1pxVEWXzfoeM=
|
golang.org/x/text v0.34.0/go.mod h1:homfLqTYRFyVYemLBFl5GgL/DWEiH5wcsQ5gSh1yziA=
|
||||||
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
|
golang.org/x/tools v0.0.0-20180917221912-90fa682c2a6e/go.mod h1:n7NCudcB/nEzxVGmLbDWY5pfWTLqBcC2KZ6jyYvM4mQ=
|
||||||
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
|
golang.org/x/xerrors v0.0.0-20191204190536-9bdfabe68543/go.mod h1:I/5z698sn9Ka8TeJc9MKroUUfqBBauWjQqLJ2OPfmY0=
|
||||||
google.golang.org/appengine v1.6.7/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
|
google.golang.org/appengine v1.6.7/go.mod h1:8WjMMxjGQR8xUklV/ARdw2HLXBOI7O7uCIDZVag1xfc=
|
||||||
|
|
|
||||||
102
go.work.sum
@@ -1,24 +1,122 @@
atomicgo.dev/cursor v0.2.0 h1:H6XN5alUJ52FZZUkI7AlJbUc1aW38GWZalpYRPpoPOw=
atomicgo.dev/cursor v0.2.0/go.mod h1:Lr4ZJB3U7DfPPOkbH7/6TOtJ4vFGHlgj1nc+n900IpU=
atomicgo.dev/keyboard v0.2.9 h1:tOsIid3nlPLZ3lwgG8KZMp/SFmr7P0ssEN5JUsm78K8=
atomicgo.dev/keyboard v0.2.9/go.mod h1:BC4w9g00XkxH/f1HXhW2sXmJFOCWbKn9xrOunSFtExQ=
atomicgo.dev/schedule v0.1.0 h1:nTthAbhZS5YZmgYbb2+DH8uQIZcTlIrd4eYr3UQxEjs=
atomicgo.dev/schedule v0.1.0/go.mod h1:xeUa3oAkiuHYh8bKiQBRojqAMq3PXXbJujjb0hw8pEU=
cloud.google.com/go/compute/metadata v0.3.0 h1:Tz+eQXMEqDIKRsmY3cHTL6FVaynIjX2QxYC4trgAKZc=
cloud.google.com/go/compute/metadata v0.3.0/go.mod h1:zFmK7XCadkQkj6TtorcaGlCW1hT1fIilQDwofLpJ20k=
github.com/Masterminds/semver v1.5.0 h1:H65muMkzWKEuNDnfl9d70GUjFniHKHRbFPGBuZ3QEww=
github.com/Masterminds/semver v1.5.0/go.mod h1:MB6lktGJrhw8PrUyiEoblNEGEQ+RzHPF078ddwwvV3Y=
github.com/StackExchange/wmi v1.2.1 h1:VIkavFPXSjcnS+O8yTq7NI32k0R5Aj+v39y29VYDOSA=
github.com/StackExchange/wmi v1.2.1/go.mod h1:rcmrprowKIVzvc+NUiLncP2uuArMWLCbu9SBzvHz7e8=
github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d h1:licZJFw2RwpHMqeKTCYkitsPqHNxTmd4SNR5r94FGM8=
github.com/acarl005/stripansi v0.0.0-20180116102854-5a71ef0e047d/go.mod h1:asat636LX7Bqt5lYEZ27JNDcqxfjdBQuJ/MM4CN/Lzo=
github.com/alecthomas/chroma/v2 v2.14.0 h1:R3+wzpnUArGcQz7fCETQBzO5n9IMNi13iIs46aU4V9E=
github.com/alecthomas/chroma/v2 v2.14.0/go.mod h1:QolEbTfmUHIMVpBqxeDnNBj2uoeI4EbYP4i6n68SG4I=
github.com/aymanbagabas/go-osc52/v2 v2.0.1 h1:HwpRHbFMcZLEVr42D4p7XBqjyuxQH5SMiErDT4WkJ2k=
github.com/aymanbagabas/go-osc52/v2 v2.0.1/go.mod h1:uYgXzlJ7ZpABp8OJ+exZzJJhRNQ2ASbcXHWsFqH8hp8=
github.com/aymerick/douceur v0.2.0 h1:Mv+mAeH1Q+n9Fr+oyamOlAkUNPWPlA8PPGR0QAaYuPk=
github.com/aymerick/douceur v0.2.0/go.mod h1:wlT5vV2O3h55X9m7iVYN0TBM0NH/MmbLnd30/FjWUq4=
github.com/bitfield/script v0.24.0 h1:ic0Tbx+2AgRtkGGIcUyr+Un60vu4WXvqFrCSumf+T7M=
github.com/bitfield/script v0.24.0/go.mod h1:fv+6x4OzVsRs6qAlc7wiGq8fq1b5orhtQdtW0dwjUHI=
github.com/bwesterb/go-ristretto v1.2.3 h1:1w53tCkGhCQ5djbat3+MH0BAQ5Kfgbt56UZQ/JMzngw=
github.com/bwesterb/go-ristretto v1.2.3/go.mod h1:fUIoIZaG73pV5biE2Blr2xEzDoMj7NFEuV9ekS419A0=
github.com/charmbracelet/glamour v0.8.0 h1:tPrjL3aRcQbn++7t18wOpgLyl8wrOHUEDS7IZ68QtZs=
github.com/charmbracelet/glamour v0.8.0/go.mod h1:ViRgmKkf3u5S7uakt2czJ272WSg2ZenlYEZXT2x7Bjw=
github.com/charmbracelet/lipgloss v0.12.1 h1:/gmzszl+pedQpjCOH+wFkZr/N90Snz40J/NR7A0zQcs=
github.com/charmbracelet/lipgloss v0.12.1/go.mod h1:V2CiwIuhx9S1S1ZlADfOj9HmxeMAORuz5izHb0zGbB8=
github.com/charmbracelet/x/ansi v0.1.4 h1:IEU3D6+dWwPSgZ6HBH+v6oUuZ/nVawMiWj5831KfiLM=
github.com/charmbracelet/x/ansi v0.1.4/go.mod h1:dk73KoMTT5AX5BsX0KrqhsTqAnhZZoCBjs7dGWp4Ktw=
github.com/containerd/console v1.0.3 h1:lIr7SlA5PxZyMV30bDW0MGbiOPXwc63yRuCP0ARubLw=
github.com/containerd/console v1.0.3/go.mod h1:7LqA/THxQ86k76b8c/EMSiaJ3h1eZkMkXar0TQ1gf3U=
github.com/cpuguy83/go-md2man/v2 v2.0.6 h1:XJtiaUW6dEEqVuZiMTn1ldk455QWwEIsMIJlo5vtkx0=
github.com/dlclark/regexp2 v1.11.0 h1:G/nrcoOa7ZXlpoa/91N3X7mM3r8eIlMBBJZvsz/mxKI=
github.com/dlclark/regexp2 v1.11.0/go.mod h1:DHkYz0B9wPfa6wondMfaivmHpzrQ3v9q8cnmRbL6yW8=
github.com/flytam/filenamify v1.2.0 h1:7RiSqXYR4cJftDQ5NuvljKMfd/ubKnW/j9C6iekChgI=
github.com/flytam/filenamify v1.2.0/go.mod h1:Dzf9kVycwcsBlr2ATg6uxjqiFgKGH+5SKFuhdeP5zu8=
github.com/fsnotify/fsnotify v1.9.0 h1:2Ml+OJNzbYCTzsxtv8vKSFD9PbJjmhYF14k/jKC7S9k=
github.com/fsnotify/fsnotify v1.9.0/go.mod h1:8jBTzvmWwFyi3Pb8djgCCO5IBqzKJ/Jwo8TRcHyHii0=
github.com/golang/protobuf v1.5.4 h1:i7eJL8qZTpSEXOPTxNKhASYpMn+8e5Q6AdndVa1dWek=
github.com/golang/protobuf v1.5.4/go.mod h1:lnTiLA8Wa4RWRcIUkrtSVa5nRhsEGBg48fD6rSs7xps=
github.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510 h1:El6M4kTTCOh6aBiKaUGG7oYTSPP8MxqL4YI3kZKwcP4=
github.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510/go.mod h1:pupxD2MaaD3pAXIBCelhxNneeOaAeabZDe5s4K6zSpQ=
github.com/gookit/color v1.5.4 h1:FZmqs7XOyGgCAxmWyPslpiok1k05wmY3SJTytgvYFs0=
github.com/gookit/color v1.5.4/go.mod h1:pZJOeOS8DM43rXbp4AZo1n9zCU2qjpcRko0b6/QJi9w=
github.com/gorilla/css v1.0.1 h1:ntNaBIghp6JmvWnxbZKANoLyuXTPZ4cAMlo6RyhlbO8=
github.com/gorilla/css v1.0.1/go.mod h1:BvnYkspnSzMmwRK+b8/xgNPLiIuNZr6vbZBTPQ2A3b0=
github.com/itchyny/gojq v0.12.13 h1:IxyYlHYIlspQHHTE0f3cJF0NKDMfajxViuhBLnHd/QU=
github.com/itchyny/gojq v0.12.13/go.mod h1:JzwzAqenfhrPUuwbmEz3nu3JQmFLlQTQMUcOdnu/Sf4=
github.com/itchyny/timefmt-go v0.1.5 h1:G0INE2la8S6ru/ZI5JecgyzbbJNs5lG1RcBqa7Jm6GE=
github.com/itchyny/timefmt-go v0.1.5/go.mod h1:nEP7L+2YmAbT2kZ2HfSs1d8Xtw9LY8D2stDBckWakZ8=
github.com/jackmordaunt/icns v1.0.0 h1:RYSxplerf/l/DUd09AHtITwckkv/mqjVv4DjYdPmAMQ=
github.com/jackmordaunt/icns v1.0.0/go.mod h1:7TTQVEuGzVVfOPPlLNHJIkzA6CoV7aH1Dv9dW351oOo=
github.com/jaypipes/ghw v0.13.0 h1:log8MXuB8hzTNnSktqpXMHc0c/2k/WgjOMSUtnI1RV4=
github.com/jaypipes/ghw v0.13.0/go.mod h1:In8SsaDqlb1oTyrbmTC14uy+fbBMvp+xdqX51MidlD8=
github.com/jaypipes/pcidb v1.0.1 h1:WB2zh27T3nwg8AE8ei81sNRb9yWBii3JGNJtT7K9Oic=
github.com/jaypipes/pcidb v1.0.1/go.mod h1:6xYUz/yYEyOkIkUt2t2J2folIuZ4Yg6uByCGFXMCeE4=
github.com/k0kubun/go-ansi v0.0.0-20180517002512-3bf9e2903213 h1:qGQQKEcAR99REcMpsXCp3lJ03zYT1PkRd3kQGPn9GVg=
github.com/k0kubun/go-ansi v0.0.0-20180517002512-3bf9e2903213/go.mod h1:vNUNkEQ1e29fT/6vq2aBdFsgNPmy8qMdSay1npru+Sw=
github.com/kr/pty v1.1.1 h1:VkoXIwSboBpnk99O/KFauAEILuNHv5DVFKZMBN/gUgw=
github.com/leaanthony/clir v1.3.0 h1:L9nPDWrmc/qU9UWZZvRaFajWYuO0np9V5p+5gxyYno0=
github.com/leaanthony/clir v1.3.0/go.mod h1:k/RBkdkFl18xkkACMCLt09bhiZnrGORoxmomeMvDpE0=
github.com/leaanthony/winicon v1.0.0 h1:ZNt5U5dY71oEoKZ97UVwJRT4e+5xo5o/ieKuHuk8NqQ=
github.com/leaanthony/winicon v1.0.0/go.mod h1:en5xhijl92aphrJdmRPlh4NI1L6wq3gEm0LpXAPghjU=
github.com/lithammer/fuzzysearch v1.1.8 h1:/HIuJnjHuXS8bKaiTMeeDlW2/AyIWk2brx1V8LFgLN4=
github.com/lithammer/fuzzysearch v1.1.8/go.mod h1:IdqeyBClc3FFqSzYq/MXESsS4S0FsZ5ajtkr5xPLts4=
github.com/lucasb-eyer/go-colorful v1.2.0 h1:1nnpGOrhyZZuNyfu1QjKiUICQ74+3FNCN69Aj6K7nkY=
github.com/lucasb-eyer/go-colorful v1.2.0/go.mod h1:R4dSotOR9KMtayYi1e77YzuveK+i7ruzyGqttikkLy0=
github.com/microcosm-cc/bluemonday v1.0.27 h1:MpEUotklkwCSLeH+Qdx1VJgNqLlpY2KXwXFM08ygZfk=
github.com/microcosm-cc/bluemonday v1.0.27/go.mod h1:jFi9vgW+H7c3V0lb6nR74Ib/DIB5OBs92Dimizgw2cA=
github.com/mitchellh/go-homedir v1.1.0 h1:lukF9ziXFxDFPkA1vsr5zpc1XuPDn/wFntq5mG+4E0Y=
github.com/mitchellh/go-homedir v1.1.0/go.mod h1:SfyaCUpYCn1Vlf4IUYiD9fPX4A5wJrkLzIz1N1q0pr0=
github.com/muesli/reflow v0.3.0 h1:IFsN6K9NfGtjeggFP+68I4chLZV2yIKsXJFNZ+eWh6s=
github.com/muesli/reflow v0.3.0/go.mod h1:pbwTDkVPibjO2kyvBQRBxTWEEGDGq0FlB1BIKtnHY/8=
github.com/muesli/termenv v0.15.3-0.20240618155329-98d742f6907a h1:2MaM6YC3mGu54x+RKAA6JiFFHlHDY1UbkxqppT7wYOg=
github.com/muesli/termenv v0.15.3-0.20240618155329-98d742f6907a/go.mod h1:hxSnBBYLK21Vtq/PHd0S2FYCxBXzBua8ov5s1RobyRQ=
github.com/nfnt/resize v0.0.0-20180221191011-83c6a9932646 h1:zYyBkD/k9seD2A7fsi6Oo2LfFZAehjjQMERAvZLEDnQ=
github.com/nfnt/resize v0.0.0-20180221191011-83c6a9932646/go.mod h1:jpp1/29i3P1S/RLdc7JQKbRpFeM1dOBd8T9ki5s+AY8=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e h1:fD57ERR4JtEqsWbfPhv4DMiApHyliiK5xCTNVSPiaAs=
github.com/niemeyer/pretty v0.0.0-20200227124842-a10e7caefd8e/go.mod h1:zD1mROLANZcx1PVRCS0qkT7pwLkGfwJo4zjcN/Tysno=
github.com/pterm/pterm v0.12.80 h1:mM55B+GnKUnLMUSqhdINe4s6tOuVQIetQ3my8JGyAIg=
github.com/pterm/pterm v0.12.80/go.mod h1:c6DeF9bSnOSeFPZlfs4ZRAFcf5SCoTwvwQ5xaKGQlHo=
github.com/russross/blackfriday/v2 v2.1.0 h1:JIOH55/0cWyOuilr9/qlrm0BSXldqnqwMsf35Ld67mk=
github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06 h1:OkMGxebDjyw0ULyrTYWeN0UNCCkmCWfjPnIA2W6oviI=
github.com/sabhiram/go-gitignore v0.0.0-20210923224102-525f6e181f06/go.mod h1:+ePHsJ1keEjQtpvf9HHw0f4ZeJ0TLRsxhunSI2hYJSs=
github.com/sirupsen/logrus v1.9.3 h1:dueUQJ1C2q9oE3F7wvmSGAaVtTmUizReu6fjN8uqzbQ=
github.com/sirupsen/logrus v1.9.3/go.mod h1:naHLuLoDiP4jHNo9R0sCBMtWGeIprob74mVsIT4qYEQ=
github.com/stretchr/objx v0.1.0 h1:4G4v2dO3VZwixGIRoQ5Lfboy6nUhCyYzaqnIAPPhYs4=
github.com/stretchr/objx v0.5.2 h1:xuMeJ0Sdp5ZMRXx/aWO6RZxdr3beISkG5/G/aIRr3pY=
github.com/stretchr/objx v0.5.2/go.mod h1:FRsXN1f5AsAjCGJKqEizvkpNtU+EGNCLh3NxZ/8L+MA=
github.com/tc-hib/winres v0.3.1 h1:CwRjEGrKdbi5CvZ4ID+iyVhgyfatxFoizjPhzez9Io4=
github.com/tc-hib/winres v0.3.1/go.mod h1:C/JaNhH3KBvhNKVbvdlDWkbMDO9H4fKKDaN7/07SSuk=
github.com/tidwall/gjson v1.14.2 h1:6BBkirS0rAHjumnjHF6qgy5d2YAJ1TLIaFE2lzfOLqo=
github.com/tidwall/gjson v1.14.2/go.mod h1:/wbyibRr2FHMks5tjHJ5F8dMZh3AcwJEMf5vlfC0lxk=
github.com/tidwall/match v1.1.1 h1:+Ho715JplO36QYgwN9PGYNhgZvoUSc9X2c80KVTi+GA=
github.com/tidwall/match v1.1.1/go.mod h1:eRSPERbgtNPcGhD8UCthc6PmLEQXEWd3PRB5JTxsfmM=
github.com/tidwall/pretty v1.2.0 h1:RWIZEg2iJ8/g6fDDYzMpobmaoGh5OLl4AXtGUGPcqCs=
github.com/tidwall/pretty v1.2.0/go.mod h1:ITEVvHYasfjBbM0u2Pg8T2nJnzm8xPwvNhhsoaGGjNU=
github.com/tidwall/sjson v1.2.5 h1:kLy8mja+1c9jlljvWTlSazM7cKDRfJuR/bOJhcY5NcY=
github.com/tidwall/sjson v1.2.5/go.mod h1:Fvgq9kS/6ociJEDnK0Fk1cpYF4FIW6ZF7LAe+6jwd28=
github.com/wzshiming/ctc v1.2.3 h1:q+hW3IQNsjIlOFBTGZZZeIXTElFM4grF4spW/errh/c=
github.com/wzshiming/ctc v1.2.3/go.mod h1:2tVAtIY7SUyraSk0JxvwmONNPFL4ARavPuEsg5+KA28=
github.com/wzshiming/winseq v0.0.0-20200112104235-db357dc107ae h1:tpXvBXC3hpQBDCc9OojJZCQMVRAbT3TTdUMP8WguXkY=
github.com/wzshiming/winseq v0.0.0-20200112104235-db357dc107ae/go.mod h1:VTAq37rkGeV+WOybvZwjXiJOicICdpLCN8ifpISjK20=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e h1:JVG44RsyaB9T2KIHavMF/ppJZNG9ZpyihvCd0w101no=
github.com/xo/terminfo v0.0.0-20220910002029-abceb7e1c41e/go.mod h1:RbqR21r5mrJuqunuUZ/Dhy/avygyECGrLceyNeo4LiM=
github.com/yuin/goldmark v1.7.4 h1:BDXOHExt+A7gwPCJgPIIq7ENvceR7we7rOS9TNoLZeg=
github.com/yuin/goldmark v1.7.4/go.mod h1:uzxRWxtg69N339t3louHJ7+O03ezfj6PlliRlaOzY1E=
github.com/yuin/goldmark-emoji v1.0.3 h1:aLRkLHOuBR2czCY4R8olwMjID+tENfhyFDMCRhbIQY4=
github.com/yuin/goldmark-emoji v1.0.3/go.mod h1:tTkZEbwu5wkPmgTcitqddVxY9osFZiavD+r4AzQrh1U=
golang.org/x/crypto v0.11.1-0.20230711161743-2e82bdd1719d/go.mod h1:xgJhtzW8F9jGdVFWZESrid1U1bjeNy4zgy5cRr/CIio=
golang.org/x/image v0.12.0 h1:w13vZbU4o5rKOFFR8y7M+c4A5jXDC0uXTdHYRP8X2DQ=
golang.org/x/image v0.12.0/go.mod h1:Lu90jvHG7GfemOIcldsh9A2hS01ocl6oNO7ype5mEnk=
golang.org/x/net v0.46.0/go.mod h1:Q9BGdFy1y4nkUwiLvT5qtyhAnEHgnQ/zd8PfU6nc210=
golang.org/x/sync v0.18.0 h1:kr88TuHDroi+UVf+0hZnirlk8o8T+4MrK6mr60WkH/I=
golang.org/x/sync v0.18.0/go.mod h1:9KTHXmSnoGruLpwFjVSX0lNNA75CykiMECbovNTZqGI=
golang.org/x/sys v0.10.0/go.mod h1:oPkhp1MJrh7nUepCBck5+mAzfO9JrbApNNgaTdGDITg=
golang.org/x/time v0.8.0 h1:9i3RxcPv3PZnitoVGMPDKZSq1xW1gK1Xy3ArNOGZfEg=
golang.org/x/time v0.8.0/go.mod h1:3BpzKBy/shNhVucY/MWOyx10tF3SFh9QdLuxbVysPQM=
golang.org/x/tools v0.37.0 h1:DVSRzp7FwePZW356yEAChSdNcQo6Nsp+fex1SUW09lE=
golang.org/x/tools v0.37.0/go.mod h1:MBN5QPQtLMHVdvsbtarmTNukZDdgwdwlO5qGacAzF0w=
golang.org/x/tools v0.38.0 h1:Hx2Xv8hISq8Lm16jvBZ2VQf+RLmbd7wVUsALibYI/IQ=
@@ -28,3 +126,7 @@ google.golang.org/appengine v1.6.7 h1:FZR1q0exgwxzPzp/aF+VccGrSfxfPpkBqjIIEq3ru6
google.golang.org/protobuf v1.33.0 h1:uNO2rsAINq/JlFpSdYEKIZ0uKD/R9cpdv0T+yoGwGmI=
google.golang.org/protobuf v1.33.0/go.mod h1:c6P6GXX6sHbq/GpV6MGZEdwhWPcYBgnhAHhKbcUYpos=
gopkg.in/yaml.v2 v2.4.0 h1:D8xgwECY7CYvx+Y2n4sBz93Jn9JRvxdiyyo8CTfuKaY=
howett.net/plist v1.0.0 h1:7CrbWYbPPO/PyNy38b2EB/+gYbjCe2DXBxgtOOZbSQM=
howett.net/plist v1.0.0/go.mod h1:lqaXoTrLY4hg8tnEzNru53gicrbv7rrk+2xJA/7hw9g=
mvdan.cc/sh/v3 v3.7.0 h1:lSTjdP/1xsddtaKfGg7Myu7DnlHItd3/M2tomOcNNBg=
mvdan.cc/sh/v3 v3.7.0/go.mod h1:K2gwkaesF/D7av7Kxl0HbF5kGOd2ArupNTX3X44+8l8=
1161
js/borg-stmf/artist-portal.html
Normal file
File diff suppressed because it is too large
1
js/borg-stmf/demo-track.smsg
Normal file
File diff suppressed because one or more lines are too long
@@ -310,6 +310,8 @@
 <nav class="nav-links">
 <a href="index.html" class="active">Form Encryption</a>
 <a href="support-reply.html">Decrypt Messages</a>
+<a href="media-player.html">Media Player</a>
+<a href="artist-portal.html">Artist Portal</a>
 </nav>

 <div id="wasm-status" class="status-indicator loading">
1290
js/borg-stmf/media-player.html
Normal file
File diff suppressed because it is too large
Binary file not shown.
@@ -389,6 +389,8 @@
 <nav class="nav-links">
 <a href="index.html">Form Encryption</a>
 <a href="support-reply.html" class="active">Decrypt Messages</a>
+<a href="media-player.html">Media Player</a>
+<a href="artist-portal.html">Artist Portal</a>
 </nav>

 <div id="wasm-status" class="status-indicator loading">
main.go (4 changed lines)

@@ -3,8 +3,8 @@ package main
 import (
 	"os"

-	"github.com/Snider/Borg/cmd"
-	"github.com/Snider/Borg/pkg/logger"
+	"forge.lthn.ai/Snider/Borg/cmd"
+	"forge.lthn.ai/Snider/Borg/pkg/logger"
 )

 var osExit = os.Exit
@@ -9,7 +9,7 @@ import (
 	"fmt"
 	"os"

-	"github.com/Snider/Borg/pkg/stmf"
+	"forge.lthn.ai/Snider/Borg/pkg/stmf"
 )

 type TestVector struct {
@@ -3,11 +3,34 @@ package compress
 import (
 	"bytes"
 	"compress/gzip"
+	"fmt"
 	"io"

 	"github.com/ulikunitz/xz"
 )

+// nopCloser wraps an io.Writer with a no-op Close method.
+type nopCloser struct{ io.Writer }
+
+func (n *nopCloser) Close() error { return nil }
+
+// NewCompressWriter returns a streaming io.WriteCloser that compresses data
+// written to it into the underlying writer w using the specified format.
+// Supported formats: "gz" (gzip), "xz", "none" or "" (passthrough).
+// Unknown formats return an error.
+func NewCompressWriter(w io.Writer, format string) (io.WriteCloser, error) {
+	switch format {
+	case "gz":
+		return gzip.NewWriter(w), nil
+	case "xz":
+		return xz.NewWriter(w)
+	case "none", "":
+		return &nopCloser{w}, nil
+	default:
+		return nil, fmt.Errorf("unsupported compression format: %q", format)
+	}
+}
+
 // Compress compresses data using the specified format.
 func Compress(data []byte, format string) ([]byte, error) {
 	var buf bytes.Buffer
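The dispatch above can be exercised end to end. Here is a minimal self-contained sketch of the same pattern using only the standard library (the `xz` case is omitted, since it needs the `github.com/ulikunitz/xz` module; the names `newCompressWriter` and `roundTrip` are illustrative, not from the package):

```go
package main

import (
	"bytes"
	"compress/gzip"
	"fmt"
	"io"
)

// nopCloser mirrors the passthrough case: Close is a no-op.
type nopCloser struct{ io.Writer }

func (n *nopCloser) Close() error { return nil }

// newCompressWriter is a stdlib-only variant of the format dispatch.
func newCompressWriter(w io.Writer, format string) (io.WriteCloser, error) {
	switch format {
	case "gz":
		return gzip.NewWriter(w), nil
	case "none", "":
		return &nopCloser{w}, nil
	default:
		return nil, fmt.Errorf("unsupported compression format: %q", format)
	}
}

// roundTrip compresses data through the streaming writer, then decompresses it.
func roundTrip(data []byte) ([]byte, error) {
	var buf bytes.Buffer
	w, err := newCompressWriter(&buf, "gz")
	if err != nil {
		return nil, err
	}
	if _, err := w.Write(data); err != nil {
		return nil, err
	}
	// Close flushes the gzip trailer; without it the stream is truncated.
	if err := w.Close(); err != nil {
		return nil, err
	}
	r, err := gzip.NewReader(&buf)
	if err != nil {
		return nil, err
	}
	defer r.Close()
	return io.ReadAll(r)
}

func main() {
	out, err := roundTrip([]byte("hello, streaming gzip world"))
	fmt.Println(string(out), err)
}
```

Note that the caller must Close the returned writer before reading the buffer: gzip only writes its trailer on Close, which is why the tests below close before comparing bytes.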
@@ -5,6 +5,108 @@ import (
 	"testing"
 )

+func TestNewCompressWriter_Gzip_Good(t *testing.T) {
+	original := []byte("hello, streaming gzip world")
+	var buf bytes.Buffer
+
+	w, err := NewCompressWriter(&buf, "gz")
+	if err != nil {
+		t.Fatalf("NewCompressWriter(gz) error: %v", err)
+	}
+	if _, err := w.Write(original); err != nil {
+		t.Fatalf("Write error: %v", err)
+	}
+	if err := w.Close(); err != nil {
+		t.Fatalf("Close error: %v", err)
+	}
+
+	compressed := buf.Bytes()
+	if bytes.Equal(original, compressed) {
+		t.Fatal("compressed data should differ from original")
+	}
+
+	decompressed, err := Decompress(compressed)
+	if err != nil {
+		t.Fatalf("Decompress error: %v", err)
+	}
+	if !bytes.Equal(original, decompressed) {
+		t.Errorf("round-trip mismatch: got %q, want %q", decompressed, original)
+	}
+}
+
+func TestNewCompressWriter_Xz_Good(t *testing.T) {
+	original := []byte("hello, streaming xz world")
+	var buf bytes.Buffer
+
+	w, err := NewCompressWriter(&buf, "xz")
+	if err != nil {
+		t.Fatalf("NewCompressWriter(xz) error: %v", err)
+	}
+	if _, err := w.Write(original); err != nil {
+		t.Fatalf("Write error: %v", err)
+	}
+	if err := w.Close(); err != nil {
+		t.Fatalf("Close error: %v", err)
+	}
+
+	compressed := buf.Bytes()
+	if bytes.Equal(original, compressed) {
+		t.Fatal("compressed data should differ from original")
+	}
+
+	decompressed, err := Decompress(compressed)
+	if err != nil {
+		t.Fatalf("Decompress error: %v", err)
+	}
+	if !bytes.Equal(original, decompressed) {
+		t.Errorf("round-trip mismatch: got %q, want %q", decompressed, original)
+	}
+}
+
+func TestNewCompressWriter_None_Good(t *testing.T) {
+	original := []byte("hello, passthrough world")
+	var buf bytes.Buffer
+
+	w, err := NewCompressWriter(&buf, "none")
+	if err != nil {
+		t.Fatalf("NewCompressWriter(none) error: %v", err)
+	}
+	if _, err := w.Write(original); err != nil {
+		t.Fatalf("Write error: %v", err)
+	}
+	if err := w.Close(); err != nil {
+		t.Fatalf("Close error: %v", err)
+	}
+
+	if !bytes.Equal(original, buf.Bytes()) {
+		t.Errorf("passthrough mismatch: got %q, want %q", buf.Bytes(), original)
+	}
+
+	// Also test empty string format
+	var buf2 bytes.Buffer
+	w2, err := NewCompressWriter(&buf2, "")
+	if err != nil {
+		t.Fatalf("NewCompressWriter('') error: %v", err)
+	}
+	if _, err := w2.Write(original); err != nil {
+		t.Fatalf("Write error: %v", err)
+	}
+	if err := w2.Close(); err != nil {
+		t.Fatalf("Close error: %v", err)
+	}
+	if !bytes.Equal(original, buf2.Bytes()) {
+		t.Errorf("passthrough (empty string) mismatch: got %q, want %q", buf2.Bytes(), original)
+	}
+}
+
+func TestNewCompressWriter_Bad(t *testing.T) {
+	var buf bytes.Buffer
+	_, err := NewCompressWriter(&buf, "invalid-format")
+	if err == nil {
+		t.Fatal("expected error for unknown compression format, got nil")
+	}
+}
+
 func TestGzip_Good(t *testing.T) {
 	originalData := []byte("hello, gzip world")
 	compressed, err := Compress(originalData, "gz")
@@ -8,8 +8,8 @@ import (
 	"os"
 	"sync"

-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/tim"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/tim"
 )

 //go:embed unlock.html
pkg/datanode/addpath_test.go (new file, 197 lines)
@@ -0,0 +1,197 @@
+package datanode
+
+import (
+	"os"
+	"path/filepath"
+	"runtime"
+	"testing"
+)
+
+func TestAddPath_Good(t *testing.T) {
+	// Create a temp directory with files and a nested subdirectory.
+	dir := t.TempDir()
+	if err := os.WriteFile(filepath.Join(dir, "hello.txt"), []byte("hello"), 0644); err != nil {
+		t.Fatal(err)
+	}
+	subdir := filepath.Join(dir, "sub")
+	if err := os.Mkdir(subdir, 0755); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(subdir, "world.txt"), []byte("world"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	dn := New()
+	if err := dn.AddPath(dir, AddPathOptions{}); err != nil {
+		t.Fatalf("AddPath failed: %v", err)
+	}
+
+	// Verify files are stored with paths relative to dir, using forward slashes.
+	file, ok := dn.files["hello.txt"]
+	if !ok {
+		t.Fatal("hello.txt not found in datanode")
+	}
+	if string(file.content) != "hello" {
+		t.Errorf("expected content 'hello', got %q", file.content)
+	}
+
+	file, ok = dn.files["sub/world.txt"]
+	if !ok {
+		t.Fatal("sub/world.txt not found in datanode")
+	}
+	if string(file.content) != "world" {
+		t.Errorf("expected content 'world', got %q", file.content)
+	}
+
+	// Directories should not be stored explicitly.
+	if _, ok := dn.files["sub"]; ok {
+		t.Error("directories should not be stored as explicit entries")
+	}
+	if _, ok := dn.files["sub/"]; ok {
+		t.Error("directories should not be stored as explicit entries")
+	}
+}
+
+func TestAddPath_SkipBrokenSymlinks_Good(t *testing.T) {
+	if runtime.GOOS == "windows" {
+		t.Skip("symlinks not reliably supported on Windows")
+	}
+
+	dir := t.TempDir()
+
+	// Create a real file.
+	if err := os.WriteFile(filepath.Join(dir, "real.txt"), []byte("real"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	// Create a broken symlink (target does not exist).
+	if err := os.Symlink("/nonexistent/target", filepath.Join(dir, "broken.txt")); err != nil {
+		t.Fatal(err)
+	}
+
+	dn := New()
+	err := dn.AddPath(dir, AddPathOptions{SkipBrokenSymlinks: true})
+	if err != nil {
+		t.Fatalf("AddPath should not error with SkipBrokenSymlinks: %v", err)
+	}
+
+	// The real file should be present.
+	if _, ok := dn.files["real.txt"]; !ok {
+		t.Error("real.txt should be present")
+	}
+
+	// The broken symlink should be skipped.
+	if _, ok := dn.files["broken.txt"]; ok {
+		t.Error("broken.txt should have been skipped")
+	}
+}
+
+func TestAddPath_ExcludePatterns_Good(t *testing.T) {
+	dir := t.TempDir()
+
+	if err := os.WriteFile(filepath.Join(dir, "app.go"), []byte("package main"), 0644); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(dir, "debug.log"), []byte("log data"), 0644); err != nil {
+		t.Fatal(err)
+	}
+	if err := os.WriteFile(filepath.Join(dir, "error.log"), []byte("error data"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	dn := New()
+	err := dn.AddPath(dir, AddPathOptions{
+		ExcludePatterns: []string{"*.log"},
+	})
+	if err != nil {
+		t.Fatalf("AddPath failed: %v", err)
+	}
+
+	// app.go should be present.
+	if _, ok := dn.files["app.go"]; !ok {
+		t.Error("app.go should be present")
+	}
+
+	// .log files should be excluded.
+	if _, ok := dn.files["debug.log"]; ok {
+		t.Error("debug.log should have been excluded")
+	}
+	if _, ok := dn.files["error.log"]; ok {
+		t.Error("error.log should have been excluded")
+	}
+}
+
+func TestAddPath_Bad(t *testing.T) {
+	dn := New()
+	err := dn.AddPath("/nonexistent/path/that/does/not/exist", AddPathOptions{})
+	if err == nil {
+		t.Fatal("expected error for nonexistent directory, got nil")
+	}
+}
+
+func TestAddPath_ValidSymlink_Good(t *testing.T) {
+	if runtime.GOOS == "windows" {
+		t.Skip("symlinks not reliably supported on Windows")
+	}
+
+	dir := t.TempDir()
+
+	// Create a real file.
+	if err := os.WriteFile(filepath.Join(dir, "target.txt"), []byte("target content"), 0644); err != nil {
+		t.Fatal(err)
+	}
+
+	// Create a valid symlink pointing to the real file.
+	if err := os.Symlink("target.txt", filepath.Join(dir, "link.txt")); err != nil {
+		t.Fatal(err)
+	}
+
+	// Default behavior (FollowSymlinks=false): store as symlink.
+	dn := New()
+	err := dn.AddPath(dir, AddPathOptions{})
+	if err != nil {
+		t.Fatalf("AddPath failed: %v", err)
+	}
+
+	// The target file should be a regular file.
+	targetFile, ok := dn.files["target.txt"]
+	if !ok {
+		t.Fatal("target.txt not found")
+	}
+	if targetFile.isSymlink() {
+		t.Error("target.txt should not be a symlink")
+	}
+	if string(targetFile.content) != "target content" {
+		t.Errorf("expected content 'target content', got %q", targetFile.content)
+	}
+
+	// The symlink should be stored as a symlink entry.
+	linkFile, ok := dn.files["link.txt"]
+	if !ok {
+		t.Fatal("link.txt not found")
+	}
+	if !linkFile.isSymlink() {
+		t.Error("link.txt should be a symlink")
+	}
+	if linkFile.symlink != "target.txt" {
+		t.Errorf("expected symlink target 'target.txt', got %q", linkFile.symlink)
+	}
+
+	// Test with FollowSymlinks=true: store as regular file with target content.
+	dn2 := New()
+	err = dn2.AddPath(dir, AddPathOptions{FollowSymlinks: true})
+	if err != nil {
+		t.Fatalf("AddPath with FollowSymlinks failed: %v", err)
+	}
+
+	linkFile2, ok := dn2.files["link.txt"]
+	if !ok {
+		t.Fatal("link.txt not found with FollowSymlinks")
+	}
+	if linkFile2.isSymlink() {
+		t.Error("link.txt should NOT be a symlink when FollowSymlinks is true")
+	}
+	if string(linkFile2.content) != "target content" {
+		t.Errorf("expected content 'target content', got %q", linkFile2.content)
+	}
+}
@@ -8,6 +8,7 @@ import (
 	"io/fs"
 	"os"
 	"path"
+	"path/filepath"
 	"sort"
 	"strings"
 	"time"
@@ -42,12 +43,15 @@ func FromTar(tarball []byte) (*DataNode, error) {
 			return nil, err
 		}

-		if header.Typeflag == tar.TypeReg {
+		switch header.Typeflag {
+		case tar.TypeReg:
 			data, err := io.ReadAll(tarReader)
 			if err != nil {
 				return nil, err
 			}
 			dn.AddData(header.Name, data)
+		case tar.TypeSymlink:
+			dn.AddSymlink(header.Name, header.Linkname)
 		}
 	}
@@ -60,17 +64,30 @@ func (d *DataNode) ToTar() ([]byte, error) {
 	tw := tar.NewWriter(buf)

 	for _, file := range d.files {
-		hdr := &tar.Header{
-			Name:    file.name,
-			Mode:    0600,
-			Size:    int64(len(file.content)),
-			ModTime: file.modTime,
+		var hdr *tar.Header
+		if file.isSymlink() {
+			hdr = &tar.Header{
+				Typeflag: tar.TypeSymlink,
+				Name:     file.name,
+				Linkname: file.symlink,
+				Mode:     0777,
+				ModTime:  file.modTime,
+			}
+		} else {
+			hdr = &tar.Header{
+				Name:    file.name,
+				Mode:    0600,
+				Size:    int64(len(file.content)),
+				ModTime: file.modTime,
+			}
 		}
 		if err := tw.WriteHeader(hdr); err != nil {
 			return nil, err
 		}
-		if _, err := tw.Write(file.content); err != nil {
-			return nil, err
+		if !file.isSymlink() {
+			if _, err := tw.Write(file.content); err != nil {
+				return nil, err
+			}
 		}
 	}
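The symlink-aware header construction above follows the standard `archive/tar` convention: a `TypeSymlink` entry carries its target in `Linkname`, has no `Size`, and gets no body written after the header. A small self-contained sketch of that round trip (the helper names here are illustrative, not from the package):

```go
package main

import (
	"archive/tar"
	"bytes"
	"fmt"
	"io"
	"time"
)

// writeArchive emits one regular file and one symlink entry.
// Symlink entries get no Size and no Write call: only the header.
func writeArchive() ([]byte, error) {
	var buf bytes.Buffer
	tw := tar.NewWriter(&buf)
	content := []byte("real content")
	if err := tw.WriteHeader(&tar.Header{
		Name: "real.txt", Mode: 0600,
		Size: int64(len(content)), ModTime: time.Unix(0, 0),
	}); err != nil {
		return nil, err
	}
	if _, err := tw.Write(content); err != nil {
		return nil, err
	}
	if err := tw.WriteHeader(&tar.Header{
		Typeflag: tar.TypeSymlink, Name: "link.txt",
		Linkname: "real.txt", Mode: 0777, ModTime: time.Unix(0, 0),
	}); err != nil {
		return nil, err
	}
	// Close must run before reading the buffer so the trailer is flushed.
	if err := tw.Close(); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}

// readLink scans the archive and returns the first symlink's target,
// mirroring the TypeSymlink branch in FromTar.
func readLink(archive []byte) (string, error) {
	tr := tar.NewReader(bytes.NewReader(archive))
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			return "", fmt.Errorf("no symlink found")
		}
		if err != nil {
			return "", err
		}
		if hdr.Typeflag == tar.TypeSymlink {
			return hdr.Linkname, nil
		}
	}
}

func main() {
	archive, err := writeArchive()
	if err != nil {
		panic(err)
	}
	target, err := readLink(archive)
	fmt.Println(target, err)
}
```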
@@ -81,6 +98,51 @@ func (d *DataNode) ToTar() ([]byte, error) {
 	return buf.Bytes(), nil
 }

+// ToTarWriter streams the DataNode contents to a tar writer.
+// File keys are sorted for deterministic output.
+func (d *DataNode) ToTarWriter(w io.Writer) error {
+	tw := tar.NewWriter(w)
+	defer tw.Close()
+
+	// Sort keys for deterministic output.
+	keys := make([]string, 0, len(d.files))
+	for k := range d.files {
+		keys = append(keys, k)
+	}
+	sort.Strings(keys)
+
+	for _, k := range keys {
+		file := d.files[k]
+		var hdr *tar.Header
+		if file.isSymlink() {
+			hdr = &tar.Header{
+				Typeflag: tar.TypeSymlink,
+				Name:     file.name,
+				Linkname: file.symlink,
+				Mode:     0777,
+				ModTime:  file.modTime,
+			}
+		} else {
+			hdr = &tar.Header{
+				Name:    file.name,
+				Mode:    0600,
+				Size:    int64(len(file.content)),
+				ModTime: file.modTime,
+			}
+		}
+		if err := tw.WriteHeader(hdr); err != nil {
+			return err
+		}
+		if !file.isSymlink() {
+			if _, err := tw.Write(file.content); err != nil {
+				return err
+			}
+		}
+	}
+
+	return nil
+}
+
 // AddData adds a file to the DataNode.
 func (d *DataNode) AddData(name string, content []byte) {
 	name = strings.TrimPrefix(name, "/")
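Go map iteration order is randomized, so ToTarWriter collects and sorts the keys before emitting entries; that is what makes two runs over the same DataNode produce byte-identical archives. The same trick in isolation (function name is illustrative):

```go
package main

import (
	"fmt"
	"sort"
)

// sortedKeys returns map keys in lexicographic order, the pattern
// ToTarWriter uses for deterministic tar output.
func sortedKeys(m map[string][]byte) []string {
	keys := make([]string, 0, len(m))
	for k := range m {
		keys = append(keys, k)
	}
	sort.Strings(keys)
	return keys
}

func main() {
	m := map[string][]byte{"foo.txt": nil, "bar/baz.txt": nil}
	fmt.Println(sortedKeys(m)) // [bar/baz.txt foo.txt]
}
```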
@@ -99,6 +161,119 @@ func (d *DataNode) AddData(name string, content []byte) {
 	}
 }

+// AddSymlink adds a symlink entry to the DataNode.
+func (d *DataNode) AddSymlink(name, target string) {
+	name = strings.TrimPrefix(name, "/")
+	if name == "" {
+		return
+	}
+	if strings.HasSuffix(name, "/") {
+		return
+	}
+	d.files[name] = &dataFile{
+		name:    name,
+		symlink: target,
+		modTime: time.Now(),
+	}
+}
+
+// AddPathOptions configures the behaviour of AddPath.
+type AddPathOptions struct {
+	SkipBrokenSymlinks bool     // skip broken symlinks instead of erroring
+	FollowSymlinks     bool     // follow symlinks and store target content (default false = store as symlinks)
+	ExcludePatterns    []string // glob patterns to exclude (matched against basename)
+}
+
+// AddPath walks a real directory and adds its files to the DataNode.
+// Paths are stored relative to dir, normalized with forward slashes.
+// Directories are implicit and not stored.
+func (d *DataNode) AddPath(dir string, opts AddPathOptions) error {
+	absDir, err := filepath.Abs(dir)
+	if err != nil {
+		return err
+	}
+
+	return filepath.WalkDir(absDir, func(p string, entry fs.DirEntry, err error) error {
+		if err != nil {
+			return err
+		}
+
+		// Skip the root directory itself.
+		if p == absDir {
+			return nil
+		}
+
+		// Compute relative path and normalize to forward slashes.
+		rel, err := filepath.Rel(absDir, p)
+		if err != nil {
+			return err
+		}
+		rel = filepath.ToSlash(rel)
+
+		// Skip directories — they are implicit in DataNode.
+		isSymlink := entry.Type()&fs.ModeSymlink != 0
+		if entry.IsDir() {
+			return nil
+		}
+
+		// Apply exclude patterns against basename.
+		base := filepath.Base(p)
+		for _, pattern := range opts.ExcludePatterns {
+			matched, matchErr := filepath.Match(pattern, base)
+			if matchErr != nil {
+				return matchErr
+			}
+			if matched {
+				return nil
+			}
+		}
+
+		// Handle symlinks.
+		if isSymlink {
+			linkTarget, err := os.Readlink(p)
+			if err != nil {
+				return err
+			}
+
+			// Resolve the symlink target to check if it exists.
+			absTarget := linkTarget
+			if !filepath.IsAbs(absTarget) {
+				absTarget = filepath.Join(filepath.Dir(p), linkTarget)
+			}
+
+			_, statErr := os.Stat(absTarget)
+			if statErr != nil {
+				// Broken symlink.
+				if opts.SkipBrokenSymlinks {
+					return nil
+				}
+				return statErr
+			}
+
+			if opts.FollowSymlinks {
+				// Read the target content and store as regular file.
+				content, err := os.ReadFile(absTarget)
+				if err != nil {
+					return err
+				}
+				d.AddData(rel, content)
+			} else {
+				// Store as symlink.
+				d.AddSymlink(rel, linkTarget)
+			}
+			return nil
+		}
+
+		// Regular file: read content and add.
+		content, err := os.ReadFile(p)
+		if err != nil {
+			return err
+		}
+		d.AddData(rel, content)
+		return nil
+	})
+}
+
 // Open opens a file from the DataNode.
 func (d *DataNode) Open(name string) (fs.File, error) {
 	name = strings.TrimPrefix(name, "/")
@@ -299,8 +474,11 @@ type dataFile struct {
 	name    string
 	content []byte
 	modTime time.Time
+	symlink string
 }

+func (d *dataFile) isSymlink() bool { return d.symlink != "" }
+
 func (d *dataFile) Stat() (fs.FileInfo, error) { return &dataFileInfo{file: d}, nil }
 func (d *dataFile) Read(p []byte) (int, error) { return 0, io.EOF }
 func (d *dataFile) Close() error               { return nil }
@@ -310,7 +488,12 @@ type dataFileInfo struct{ file *dataFile }

 func (d *dataFileInfo) Name() string { return path.Base(d.file.name) }
 func (d *dataFileInfo) Size() int64  { return int64(len(d.file.content)) }
-func (d *dataFileInfo) Mode() fs.FileMode { return 0444 }
+func (d *dataFileInfo) Mode() fs.FileMode {
+	if d.file.isSymlink() {
+		return os.ModeSymlink | 0777
+	}
+	return 0444
+}
 func (d *dataFileInfo) ModTime() time.Time { return d.file.modTime }
 func (d *dataFileInfo) IsDir() bool        { return false }
 func (d *dataFileInfo) Sys() interface{}   { return nil }
@ -580,6 +580,273 @@ func TestFromTar_Bad(t *testing.T) {
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestAddSymlink_Good(t *testing.T) {
|
||||||
|
dn := New()
|
||||||
|
dn.AddSymlink("link.txt", "target.txt")
|
||||||
|
|
||||||
|
file, ok := dn.files["link.txt"]
|
||||||
|
if !ok {
|
||||||
|
t.Fatal("symlink not found in datanode")
|
||||||
|
}
|
||||||
|
if file.symlink != "target.txt" {
|
||||||
|
t.Errorf("expected symlink target 'target.txt', got %q", file.symlink)
|
||||||
|
}
|
||||||
|
if !file.isSymlink() {
|
||||||
|
t.Error("expected isSymlink() to return true")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Stat should return ModeSymlink
|
||||||
|
info, err := dn.Stat("link.txt")
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Stat failed: %v", err)
|
||||||
|
}
|
||||||
|
if info.Mode()&os.ModeSymlink == 0 {
|
||||||
|
t.Error("expected ModeSymlink to be set in file mode")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestSymlinkTarRoundTrip_Good(t *testing.T) {
|
||||||
|
dn1 := New()
|
||||||
|
dn1.AddData("real.txt", []byte("real content"))
|
||||||
|
dn1.AddSymlink("link.txt", "real.txt")
|
||||||
|
|
||||||
|
tarball, err := dn1.ToTar()
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("ToTar failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify the tar contains a symlink entry
|
||||||
|
tr := tar.NewReader(bytes.NewReader(tarball))
|
||||||
|
foundSymlink := false
|
||||||
|
foundFile := false
|
||||||
|
for {
|
||||||
|
header, err := tr.Next()
|
||||||
|
if err == io.EOF {
|
||||||
|
break
|
||||||
|
}
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("tar.Next failed: %v", err)
|
||||||
|
}
|
||||||
|
switch header.Name {
|
||||||
|
case "link.txt":
|
||||||
|
foundSymlink = true
|
||||||
|
if header.Typeflag != tar.TypeSymlink {
|
||||||
|
t.Errorf("expected TypeSymlink, got %d", header.Typeflag)
|
||||||
|
}
|
||||||
|
if header.Linkname != "real.txt" {
|
||||||
|
t.Errorf("expected Linkname 'real.txt', got %q", header.Linkname)
|
||||||
|
}
|
||||||
|
if header.Mode != 0777 {
|
||||||
|
t.Errorf("expected mode 0777, got %o", header.Mode)
|
||||||
|
}
|
||||||
|
case "real.txt":
|
||||||
|
foundFile = true
|
||||||
|
if header.Typeflag != tar.TypeReg {
|
||||||
|
t.Errorf("expected TypeReg for real.txt, got %d", header.Typeflag)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if !foundSymlink {
|
||||||
|
t.Error("symlink entry not found in tarball")
|
||||||
|
}
|
||||||
|
if !foundFile {
|
||||||
|
t.Error("regular file entry not found in tarball")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Round-trip: FromTar should restore the symlink
|
||||||
|
dn2, err := FromTar(tarball)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("FromTar failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify the regular file survived
|
||||||
|
exists, _ := dn2.Exists("real.txt")
|
||||||
|
if !exists {
|
||||||
|
t.Error("real.txt missing after round-trip")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify the symlink survived
|
||||||
|
linkFile, ok := dn2.files["link.txt"]
|
||||||
|
if !ok {
|
||||||
|
t.Fatal("link.txt missing after round-trip")
|
||||||
|
}
|
||||||
|
if !linkFile.isSymlink() {
|
||||||
|
t.Error("expected link.txt to be a symlink after round-trip")
|
||||||
|
}
|
||||||
|
if linkFile.symlink != "real.txt" {
|
||||||
|
t.Errorf("expected symlink target 'real.txt', got %q", linkFile.symlink)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Stat should still report ModeSymlink
|
||||||
|
info, err := dn2.Stat("link.txt")
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("Stat failed: %v", err)
|
||||||
|
}
|
||||||
|
if info.Mode()&os.ModeSymlink == 0 {
|
||||||
|
t.Error("expected ModeSymlink after round-trip")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestAddSymlink_Bad(t *testing.T) {
|
||||||
|
dn := New()
|
||||||
|
|
||||||
|
// Empty name should be ignored
|
||||||
|
dn.AddSymlink("", "target.txt")
|
||||||
|
if len(dn.files) != 0 {
|
||||||
|
t.Error("expected empty name to be ignored")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Leading slash should be stripped
|
||||||
|
dn.AddSymlink("/link.txt", "target.txt")
|
||||||
|
if _, ok := dn.files["link.txt"]; !ok {
|
||||||
|
t.Error("expected leading slash to be stripped")
|
||||||
|
}
|
||||||
|
|
||||||
|
// Directory-like name (trailing slash) should be ignored
|
||||||
|
dn2 := New()
|
||||||
|
dn2.AddSymlink("dir/", "target")
|
||||||
|
if len(dn2.files) != 0 {
|
||||||
|
t.Error("expected directory-like name to be ignored")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestToTarWriter_Good(t *testing.T) {
|
||||||
|
dn := New()
|
||||||
|
dn.AddData("foo.txt", []byte("hello"))
|
||||||
|
dn.AddData("bar/baz.txt", []byte("world"))
|
||||||
|
|
||||||
|
var buf bytes.Buffer
|
||||||
|
if err := dn.ToTarWriter(&buf); err != nil {
|
||||||
|
t.Fatalf("ToTarWriter failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Round-trip through FromTar to verify contents survived.
|
||||||
|
dn2, err := FromTar(buf.Bytes())
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("FromTar failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify foo.txt
|
||||||
|
f1, ok := dn2.files["foo.txt"]
|
||||||
|
if !ok {
|
||||||
|
t.Fatal("foo.txt missing after round-trip")
|
||||||
|
}
|
||||||
|
if string(f1.content) != "hello" {
|
||||||
|
t.Errorf("expected foo.txt content 'hello', got %q", f1.content)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify bar/baz.txt
|
||||||
|
f2, ok := dn2.files["bar/baz.txt"]
|
||||||
|
if !ok {
|
||||||
|
t.Fatal("bar/baz.txt missing after round-trip")
|
||||||
|
}
|
||||||
|
if string(f2.content) != "world" {
|
||||||
|
		t.Errorf("expected bar/baz.txt content 'world', got %q", f2.content)
	}

	// Verify deterministic ordering: bar/baz.txt should come before foo.txt.
	tr := tar.NewReader(bytes.NewReader(buf.Bytes()))
	header1, err := tr.Next()
	if err != nil {
		t.Fatalf("tar.Next failed: %v", err)
	}
	header2, err := tr.Next()
	if err != nil {
		t.Fatalf("tar.Next failed: %v", err)
	}
	if header1.Name != "bar/baz.txt" || header2.Name != "foo.txt" {
		t.Errorf("expected sorted order [bar/baz.txt, foo.txt], got [%s, %s]",
			header1.Name, header2.Name)
	}
}

func TestToTarWriter_Symlinks_Good(t *testing.T) {
	dn := New()
	dn.AddData("real.txt", []byte("real content"))
	dn.AddSymlink("link.txt", "real.txt")

	var buf bytes.Buffer
	if err := dn.ToTarWriter(&buf); err != nil {
		t.Fatalf("ToTarWriter failed: %v", err)
	}

	// Round-trip through FromTar.
	dn2, err := FromTar(buf.Bytes())
	if err != nil {
		t.Fatalf("FromTar failed: %v", err)
	}

	// Verify regular file survived.
	realFile, ok := dn2.files["real.txt"]
	if !ok {
		t.Fatal("real.txt missing after round-trip")
	}
	if string(realFile.content) != "real content" {
		t.Errorf("expected 'real content', got %q", realFile.content)
	}

	// Verify symlink survived.
	linkFile, ok := dn2.files["link.txt"]
	if !ok {
		t.Fatal("link.txt missing after round-trip")
	}
	if !linkFile.isSymlink() {
		t.Error("expected link.txt to be a symlink")
	}
	if linkFile.symlink != "real.txt" {
		t.Errorf("expected symlink target 'real.txt', got %q", linkFile.symlink)
	}

	// Also verify the raw tar entries have correct types and modes.
	tr := tar.NewReader(bytes.NewReader(buf.Bytes()))
	for {
		header, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			t.Fatalf("tar.Next failed: %v", err)
		}
		switch header.Name {
		case "link.txt":
			if header.Typeflag != tar.TypeSymlink {
				t.Errorf("expected TypeSymlink for link.txt, got %d", header.Typeflag)
			}
			if header.Linkname != "real.txt" {
				t.Errorf("expected Linkname 'real.txt', got %q", header.Linkname)
			}
			if header.Mode != 0777 {
				t.Errorf("expected mode 0777 for symlink, got %o", header.Mode)
			}
		case "real.txt":
			if header.Typeflag != tar.TypeReg {
				t.Errorf("expected TypeReg for real.txt, got %d", header.Typeflag)
			}
			if header.Mode != 0600 {
				t.Errorf("expected mode 0600 for regular file, got %o", header.Mode)
			}
		}
	}
}

func TestToTarWriter_Empty_Good(t *testing.T) {
	dn := New()

	var buf bytes.Buffer
	if err := dn.ToTarWriter(&buf); err != nil {
		t.Fatalf("ToTarWriter on empty DataNode should not error, got: %v", err)
	}

	// The buffer should contain a valid (empty) tar archive.
	dn2, err := FromTar(buf.Bytes())
	if err != nil {
		t.Fatalf("FromTar on empty tar failed: %v", err)
	}
	if len(dn2.files) != 0 {
		t.Errorf("expected 0 files in empty round-trip, got %d", len(dn2.files))
	}
}
func toSortedNames(entries []fs.DirEntry) []string {
	var names []string
	for _, e := range entries {
@@ -8,7 +8,7 @@ import (
 	"strings"
 	"testing"

-	"github.com/Snider/Borg/pkg/mocks"
+	"forge.lthn.ai/Snider/Borg/pkg/mocks"
 )

 func TestGetPublicRepos_Good(t *testing.T) {
@@ -8,7 +8,7 @@ import (
 	"net/url"
 	"testing"

-	"github.com/Snider/Borg/pkg/mocks"
+	"forge.lthn.ai/Snider/Borg/pkg/mocks"
 	"github.com/google/go-github/v39/github"
 )
@@ -3,8 +3,8 @@ package mocks
 import (
 	"io"

-	"github.com/Snider/Borg/pkg/datanode"
-	"github.com/Snider/Borg/pkg/vcs"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/vcs"
 )

 // MockGitCloner is a mock implementation of the GitCloner interface.
36 pkg/player/assets.go Normal file

@@ -0,0 +1,36 @@
package player

import (
	"embed"
	"io/fs"
)

// assets embeds all frontend files for the media player.
// These are served both by Wails (memory) and HTTP (fallback).
//
//go:embed frontend/index.html
//go:embed frontend/wasm_exec.js
//go:embed frontend/stmf.wasm
//go:embed frontend/demo-track.smsg
var assets embed.FS

// Assets is the embedded filesystem with the frontend/ prefix stripped.
var Assets fs.FS

func init() {
	var err error
	Assets, err = fs.Sub(assets, "frontend")
	if err != nil {
		panic("failed to create sub filesystem: " + err.Error())
	}
}

// GetDemoTrack returns the embedded demo track content.
func GetDemoTrack() ([]byte, error) {
	return fs.ReadFile(Assets, "demo-track.smsg")
}

// GetIndex returns the main HTML page.
func GetIndex() ([]byte, error) {
	return fs.ReadFile(Assets, "index.html")
}
1290 pkg/player/frontend/index.html Normal file
File diff suppressed because it is too large

BIN pkg/player/frontend/stmf.wasm Executable file
Binary file not shown.

575 pkg/player/frontend/wasm_exec.js Normal file

@@ -0,0 +1,575 @@
// Copyright 2018 The Go Authors. All rights reserved.
// Use of this source code is governed by a BSD-style
// license that can be found in the LICENSE file.

"use strict";

(() => {
	const enosys = () => {
		const err = new Error("not implemented");
		err.code = "ENOSYS";
		return err;
	};

	if (!globalThis.fs) {
		let outputBuf = "";
		globalThis.fs = {
			constants: { O_WRONLY: -1, O_RDWR: -1, O_CREAT: -1, O_TRUNC: -1, O_APPEND: -1, O_EXCL: -1, O_DIRECTORY: -1 }, // unused
			writeSync(fd, buf) {
				outputBuf += decoder.decode(buf);
				const nl = outputBuf.lastIndexOf("\n");
				if (nl != -1) {
					console.log(outputBuf.substring(0, nl));
					outputBuf = outputBuf.substring(nl + 1);
				}
				return buf.length;
			},
			write(fd, buf, offset, length, position, callback) {
				if (offset !== 0 || length !== buf.length || position !== null) {
					callback(enosys());
					return;
				}
				const n = this.writeSync(fd, buf);
				callback(null, n);
			},
			chmod(path, mode, callback) { callback(enosys()); },
			chown(path, uid, gid, callback) { callback(enosys()); },
			close(fd, callback) { callback(enosys()); },
			fchmod(fd, mode, callback) { callback(enosys()); },
			fchown(fd, uid, gid, callback) { callback(enosys()); },
			fstat(fd, callback) { callback(enosys()); },
			fsync(fd, callback) { callback(null); },
			ftruncate(fd, length, callback) { callback(enosys()); },
			lchown(path, uid, gid, callback) { callback(enosys()); },
			link(path, link, callback) { callback(enosys()); },
			lstat(path, callback) { callback(enosys()); },
			mkdir(path, perm, callback) { callback(enosys()); },
			open(path, flags, mode, callback) { callback(enosys()); },
			read(fd, buffer, offset, length, position, callback) { callback(enosys()); },
			readdir(path, callback) { callback(enosys()); },
			readlink(path, callback) { callback(enosys()); },
			rename(from, to, callback) { callback(enosys()); },
			rmdir(path, callback) { callback(enosys()); },
			stat(path, callback) { callback(enosys()); },
			symlink(path, link, callback) { callback(enosys()); },
			truncate(path, length, callback) { callback(enosys()); },
			unlink(path, callback) { callback(enosys()); },
			utimes(path, atime, mtime, callback) { callback(enosys()); },
		};
	}

	if (!globalThis.process) {
		globalThis.process = {
			getuid() { return -1; },
			getgid() { return -1; },
			geteuid() { return -1; },
			getegid() { return -1; },
			getgroups() { throw enosys(); },
			pid: -1,
			ppid: -1,
			umask() { throw enosys(); },
			cwd() { throw enosys(); },
			chdir() { throw enosys(); },
		}
	}

	if (!globalThis.path) {
		globalThis.path = {
			resolve(...pathSegments) {
				return pathSegments.join("/");
			}
		}
	}

	if (!globalThis.crypto) {
		throw new Error("globalThis.crypto is not available, polyfill required (crypto.getRandomValues only)");
	}

	if (!globalThis.performance) {
		throw new Error("globalThis.performance is not available, polyfill required (performance.now only)");
	}

	if (!globalThis.TextEncoder) {
		throw new Error("globalThis.TextEncoder is not available, polyfill required");
	}

	if (!globalThis.TextDecoder) {
		throw new Error("globalThis.TextDecoder is not available, polyfill required");
	}

	const encoder = new TextEncoder("utf-8");
	const decoder = new TextDecoder("utf-8");

	globalThis.Go = class {
		constructor() {
			this.argv = ["js"];
			this.env = {};
			this.exit = (code) => {
				if (code !== 0) {
					console.warn("exit code:", code);
				}
			};
			this._exitPromise = new Promise((resolve) => {
				this._resolveExitPromise = resolve;
			});
			this._pendingEvent = null;
			this._scheduledTimeouts = new Map();
			this._nextCallbackTimeoutID = 1;

			const setInt64 = (addr, v) => {
				this.mem.setUint32(addr + 0, v, true);
				this.mem.setUint32(addr + 4, Math.floor(v / 4294967296), true);
			}

			const setInt32 = (addr, v) => {
				this.mem.setUint32(addr + 0, v, true);
			}

			const getInt64 = (addr) => {
				const low = this.mem.getUint32(addr + 0, true);
				const high = this.mem.getInt32(addr + 4, true);
				return low + high * 4294967296;
			}

			const loadValue = (addr) => {
				const f = this.mem.getFloat64(addr, true);
				if (f === 0) {
					return undefined;
				}
				if (!isNaN(f)) {
					return f;
				}

				const id = this.mem.getUint32(addr, true);
				return this._values[id];
			}

			const storeValue = (addr, v) => {
				const nanHead = 0x7FF80000;

				if (typeof v === "number" && v !== 0) {
					if (isNaN(v)) {
						this.mem.setUint32(addr + 4, nanHead, true);
						this.mem.setUint32(addr, 0, true);
						return;
					}
					this.mem.setFloat64(addr, v, true);
					return;
				}

				if (v === undefined) {
					this.mem.setFloat64(addr, 0, true);
					return;
				}

				let id = this._ids.get(v);
				if (id === undefined) {
					id = this._idPool.pop();
					if (id === undefined) {
						id = this._values.length;
					}
					this._values[id] = v;
					this._goRefCounts[id] = 0;
					this._ids.set(v, id);
				}
				this._goRefCounts[id]++;
				let typeFlag = 0;
				switch (typeof v) {
					case "object":
						if (v !== null) {
							typeFlag = 1;
						}
						break;
					case "string":
						typeFlag = 2;
						break;
					case "symbol":
						typeFlag = 3;
						break;
					case "function":
						typeFlag = 4;
						break;
				}
				this.mem.setUint32(addr + 4, nanHead | typeFlag, true);
				this.mem.setUint32(addr, id, true);
			}

			const loadSlice = (addr) => {
				const array = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				return new Uint8Array(this._inst.exports.mem.buffer, array, len);
			}

			const loadSliceOfValues = (addr) => {
				const array = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				const a = new Array(len);
				for (let i = 0; i < len; i++) {
					a[i] = loadValue(array + i * 8);
				}
				return a;
			}

			const loadString = (addr) => {
				const saddr = getInt64(addr + 0);
				const len = getInt64(addr + 8);
				return decoder.decode(new DataView(this._inst.exports.mem.buffer, saddr, len));
			}

			const testCallExport = (a, b) => {
				this._inst.exports.testExport0();
				return this._inst.exports.testExport(a, b);
			}

			const timeOrigin = Date.now() - performance.now();
			this.importObject = {
				_gotest: {
					add: (a, b) => a + b,
					callExport: testCallExport,
				},
				gojs: {
					// Go's SP does not change as long as no Go code is running. Some operations (e.g. calls, getters and setters)
					// may synchronously trigger a Go event handler. This makes Go code get executed in the middle of the imported
					// function. A goroutine can switch to a new stack if the current stack is too small (see morestack function).
					// This changes the SP, thus we have to update the SP used by the imported function.

					// func wasmExit(code int32)
					"runtime.wasmExit": (sp) => {
						sp >>>= 0;
						const code = this.mem.getInt32(sp + 8, true);
						this.exited = true;
						delete this._inst;
						delete this._values;
						delete this._goRefCounts;
						delete this._ids;
						delete this._idPool;
						this.exit(code);
					},

					// func wasmWrite(fd uintptr, p unsafe.Pointer, n int32)
					"runtime.wasmWrite": (sp) => {
						sp >>>= 0;
						const fd = getInt64(sp + 8);
						const p = getInt64(sp + 16);
						const n = this.mem.getInt32(sp + 24, true);
						fs.writeSync(fd, new Uint8Array(this._inst.exports.mem.buffer, p, n));
					},

					// func resetMemoryDataView()
					"runtime.resetMemoryDataView": (sp) => {
						sp >>>= 0;
						this.mem = new DataView(this._inst.exports.mem.buffer);
					},

					// func nanotime1() int64
					"runtime.nanotime1": (sp) => {
						sp >>>= 0;
						setInt64(sp + 8, (timeOrigin + performance.now()) * 1000000);
					},

					// func walltime() (sec int64, nsec int32)
					"runtime.walltime": (sp) => {
						sp >>>= 0;
						const msec = (new Date).getTime();
						setInt64(sp + 8, msec / 1000);
						this.mem.setInt32(sp + 16, (msec % 1000) * 1000000, true);
					},

					// func scheduleTimeoutEvent(delay int64) int32
					"runtime.scheduleTimeoutEvent": (sp) => {
						sp >>>= 0;
						const id = this._nextCallbackTimeoutID;
						this._nextCallbackTimeoutID++;
						this._scheduledTimeouts.set(id, setTimeout(
							() => {
								this._resume();
								while (this._scheduledTimeouts.has(id)) {
									// for some reason Go failed to register the timeout event, log and try again
									// (temporary workaround for https://github.com/golang/go/issues/28975)
									console.warn("scheduleTimeoutEvent: missed timeout event");
									this._resume();
								}
							},
							getInt64(sp + 8),
						));
						this.mem.setInt32(sp + 16, id, true);
					},

					// func clearTimeoutEvent(id int32)
					"runtime.clearTimeoutEvent": (sp) => {
						sp >>>= 0;
						const id = this.mem.getInt32(sp + 8, true);
						clearTimeout(this._scheduledTimeouts.get(id));
						this._scheduledTimeouts.delete(id);
					},

					// func getRandomData(r []byte)
					"runtime.getRandomData": (sp) => {
						sp >>>= 0;
						crypto.getRandomValues(loadSlice(sp + 8));
					},

					// func finalizeRef(v ref)
					"syscall/js.finalizeRef": (sp) => {
						sp >>>= 0;
						const id = this.mem.getUint32(sp + 8, true);
						this._goRefCounts[id]--;
						if (this._goRefCounts[id] === 0) {
							const v = this._values[id];
							this._values[id] = null;
							this._ids.delete(v);
							this._idPool.push(id);
						}
					},

					// func stringVal(value string) ref
					"syscall/js.stringVal": (sp) => {
						sp >>>= 0;
						storeValue(sp + 24, loadString(sp + 8));
					},

					// func valueGet(v ref, p string) ref
					"syscall/js.valueGet": (sp) => {
						sp >>>= 0;
						const result = Reflect.get(loadValue(sp + 8), loadString(sp + 16));
						sp = this._inst.exports.getsp() >>> 0; // see comment above
						storeValue(sp + 32, result);
					},

					// func valueSet(v ref, p string, x ref)
					"syscall/js.valueSet": (sp) => {
						sp >>>= 0;
						Reflect.set(loadValue(sp + 8), loadString(sp + 16), loadValue(sp + 32));
					},

					// func valueDelete(v ref, p string)
					"syscall/js.valueDelete": (sp) => {
						sp >>>= 0;
						Reflect.deleteProperty(loadValue(sp + 8), loadString(sp + 16));
					},

					// func valueIndex(v ref, i int) ref
					"syscall/js.valueIndex": (sp) => {
						sp >>>= 0;
						storeValue(sp + 24, Reflect.get(loadValue(sp + 8), getInt64(sp + 16)));
					},

					// valueSetIndex(v ref, i int, x ref)
					"syscall/js.valueSetIndex": (sp) => {
						sp >>>= 0;
						Reflect.set(loadValue(sp + 8), getInt64(sp + 16), loadValue(sp + 24));
					},

					// func valueCall(v ref, m string, args []ref) (ref, bool)
					"syscall/js.valueCall": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const m = Reflect.get(v, loadString(sp + 16));
							const args = loadSliceOfValues(sp + 32);
							const result = Reflect.apply(m, v, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 56, result);
							this.mem.setUint8(sp + 64, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 56, err);
							this.mem.setUint8(sp + 64, 0);
						}
					},

					// func valueInvoke(v ref, args []ref) (ref, bool)
					"syscall/js.valueInvoke": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const args = loadSliceOfValues(sp + 16);
							const result = Reflect.apply(v, undefined, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, result);
							this.mem.setUint8(sp + 48, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, err);
							this.mem.setUint8(sp + 48, 0);
						}
					},

					// func valueNew(v ref, args []ref) (ref, bool)
					"syscall/js.valueNew": (sp) => {
						sp >>>= 0;
						try {
							const v = loadValue(sp + 8);
							const args = loadSliceOfValues(sp + 16);
							const result = Reflect.construct(v, args);
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, result);
							this.mem.setUint8(sp + 48, 1);
						} catch (err) {
							sp = this._inst.exports.getsp() >>> 0; // see comment above
							storeValue(sp + 40, err);
							this.mem.setUint8(sp + 48, 0);
						}
					},

					// func valueLength(v ref) int
					"syscall/js.valueLength": (sp) => {
						sp >>>= 0;
						setInt64(sp + 16, parseInt(loadValue(sp + 8).length));
					},

					// valuePrepareString(v ref) (ref, int)
					"syscall/js.valuePrepareString": (sp) => {
						sp >>>= 0;
						const str = encoder.encode(String(loadValue(sp + 8)));
						storeValue(sp + 16, str);
						setInt64(sp + 24, str.length);
					},

					// valueLoadString(v ref, b []byte)
					"syscall/js.valueLoadString": (sp) => {
						sp >>>= 0;
						const str = loadValue(sp + 8);
						loadSlice(sp + 16).set(str);
					},

					// func valueInstanceOf(v ref, t ref) bool
					"syscall/js.valueInstanceOf": (sp) => {
						sp >>>= 0;
						this.mem.setUint8(sp + 24, (loadValue(sp + 8) instanceof loadValue(sp + 16)) ? 1 : 0);
					},

					// func copyBytesToGo(dst []byte, src ref) (int, bool)
					"syscall/js.copyBytesToGo": (sp) => {
						sp >>>= 0;
						const dst = loadSlice(sp + 8);
						const src = loadValue(sp + 32);
						if (!(src instanceof Uint8Array || src instanceof Uint8ClampedArray)) {
							this.mem.setUint8(sp + 48, 0);
							return;
						}
						const toCopy = src.subarray(0, dst.length);
						dst.set(toCopy);
						setInt64(sp + 40, toCopy.length);
						this.mem.setUint8(sp + 48, 1);
					},

					// func copyBytesToJS(dst ref, src []byte) (int, bool)
					"syscall/js.copyBytesToJS": (sp) => {
						sp >>>= 0;
						const dst = loadValue(sp + 8);
						const src = loadSlice(sp + 16);
						if (!(dst instanceof Uint8Array || dst instanceof Uint8ClampedArray)) {
							this.mem.setUint8(sp + 48, 0);
							return;
						}
						const toCopy = src.subarray(0, dst.length);
						dst.set(toCopy);
						setInt64(sp + 40, toCopy.length);
						this.mem.setUint8(sp + 48, 1);
					},

					"debug": (value) => {
						console.log(value);
					},
				}
			};
		}

		async run(instance) {
			if (!(instance instanceof WebAssembly.Instance)) {
				throw new Error("Go.run: WebAssembly.Instance expected");
			}
			this._inst = instance;
			this.mem = new DataView(this._inst.exports.mem.buffer);
			this._values = [ // JS values that Go currently has references to, indexed by reference id
				NaN,
				0,
				null,
				true,
				false,
				globalThis,
				this,
			];
			this._goRefCounts = new Array(this._values.length).fill(Infinity); // number of references that Go has to a JS value, indexed by reference id
			this._ids = new Map([ // mapping from JS values to reference ids
				[0, 1],
				[null, 2],
				[true, 3],
				[false, 4],
				[globalThis, 5],
				[this, 6],
			]);
			this._idPool = []; // unused ids that have been garbage collected
			this.exited = false; // whether the Go program has exited

			// Pass command line arguments and environment variables to WebAssembly by writing them to the linear memory.
			let offset = 4096;

			const strPtr = (str) => {
				const ptr = offset;
				const bytes = encoder.encode(str + "\0");
				new Uint8Array(this.mem.buffer, offset, bytes.length).set(bytes);
				offset += bytes.length;
				if (offset % 8 !== 0) {
					offset += 8 - (offset % 8);
				}
				return ptr;
			};

			const argc = this.argv.length;

			const argvPtrs = [];
			this.argv.forEach((arg) => {
				argvPtrs.push(strPtr(arg));
			});
			argvPtrs.push(0);

			const keys = Object.keys(this.env).sort();
			keys.forEach((key) => {
				argvPtrs.push(strPtr(`${key}=${this.env[key]}`));
			});
			argvPtrs.push(0);

			const argv = offset;
			argvPtrs.forEach((ptr) => {
				this.mem.setUint32(offset, ptr, true);
				this.mem.setUint32(offset + 4, 0, true);
				offset += 8;
			});

			// The linker guarantees global data starts from at least wasmMinDataAddr.
			// Keep in sync with cmd/link/internal/ld/data.go:wasmMinDataAddr.
			const wasmMinDataAddr = 4096 + 8192;
			if (offset >= wasmMinDataAddr) {
				throw new Error("total length of command line and environment variables exceeds limit");
			}

			this._inst.exports.run(argc, argv);
			if (this.exited) {
				this._resolveExitPromise();
			}
			await this._exitPromise;
		}

		_resume() {
			if (this.exited) {
				throw new Error("Go program has already exited");
			}
			this._inst.exports.resume();
			if (this.exited) {
				this._resolveExitPromise();
			}
		}

		_makeFuncWrapper(id) {
			const go = this;
			return function () {
				const event = { id: id, this: this, args: arguments };
				go._pendingEvent = event;
				go._resume();
				return event.result;
			};
		}
	}
})();
329
pkg/player/player.go
Normal file
329
pkg/player/player.go
Normal file
|
|
@ -0,0 +1,329 @@
|
||||||
|
// Package player provides the core media player functionality for dapp.fm
|
||||||
|
// It can be used both as Wails bindings (memory speed) or HTTP server (fallback)
|
||||||
|
package player
|
||||||
|
|
||||||
|
import (
|
||||||
|
"context"
|
||||||
|
"encoding/base64"
|
||||||
|
"encoding/json"
|
||||||
|
"fmt"
|
||||||
|
"net/http"
|
||||||
|
"time"
|
||||||
|
|
||||||
|
"forge.lthn.ai/Snider/Borg/pkg/smsg"
|
||||||
|
)
|
||||||
|
|
||||||
|
// Player provides media decryption and playback services
|
||||||
|
// Methods are exposed to JavaScript via Wails bindings
|
||||||
|
type Player struct {
|
||||||
|
ctx context.Context
|
||||||
|
}
|
||||||
|
|
||||||
|
// NewPlayer creates a new Player instance
|
||||||
|
func NewPlayer() *Player {
|
||||||
|
return &Player{}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Startup is called when the Wails app starts
|
||||||
|
func (p *Player) Startup(ctx context.Context) {
|
||||||
|
p.ctx = ctx
|
||||||
|
}
|
||||||
|
|
||||||
|
// DecryptResult holds the decrypted message data
|
||||||
|
type DecryptResult struct {
|
||||||
|
Body string `json:"body"`
|
||||||
|
Subject string `json:"subject,omitempty"`
|
||||||
|
From string `json:"from,omitempty"`
|
||||||
|
Attachments []AttachmentInfo `json:"attachments,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// AttachmentInfo describes a decrypted attachment
|
||||||
|
type AttachmentInfo struct {
|
||||||
|
Name string `json:"name"`
|
||||||
|
MimeType string `json:"mime_type"`
|
||||||
|
Size int `json:"size"`
|
||||||
|
DataURL string `json:"data_url"` // Base64 data URL for direct playback
|
||||||
|
}
|
||||||
|
|
||||||
|
// ManifestInfo holds public metadata (readable without decryption)
|
||||||
|
type ManifestInfo struct {
|
||||||
|
Title string `json:"title,omitempty"`
|
||||||
|
Artist string `json:"artist,omitempty"`
|
||||||
|
Album string `json:"album,omitempty"`
|
||||||
|
Genre string `json:"genre,omitempty"`
|
||||||
|
Year int `json:"year,omitempty"`
|
||||||
|
ReleaseType string `json:"release_type,omitempty"`
|
||||||
|
Duration int `json:"duration,omitempty"`
|
||||||
|
Format string `json:"format,omitempty"`
|
||||||
|
ExpiresAt int64 `json:"expires_at,omitempty"`
|
||||||
|
IssuedAt int64 `json:"issued_at,omitempty"`
|
||||||
|
LicenseType string `json:"license_type,omitempty"`
|
||||||
|
Tracks []TrackInfo `json:"tracks,omitempty"`
|
||||||
|
IsExpired bool `json:"is_expired"`
|
||||||
|
TimeRemaining string `json:"time_remaining,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// TrackInfo describes a track marker
|
||||||
|
type TrackInfo struct {
|
||||||
|
Title string `json:"title"`
|
||||||
|
Start float64 `json:"start"`
|
||||||
|
End float64 `json:"end,omitempty"`
|
||||||
|
Type string `json:"type,omitempty"`
|
||||||
|
TrackNum int `json:"track_num,omitempty"`
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetManifest returns public metadata without decryption
|
||||||
|
// This is memory-speed via Wails bindings
|
||||||
|
func (p *Player) GetManifest(encrypted string) (*ManifestInfo, error) {
|
||||||
|
info, err := smsg.GetInfoBase64(encrypted)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to get manifest: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
result := &ManifestInfo{}
|
||||||
|
|
||||||
|
if info.Manifest != nil {
|
||||||
|
m := info.Manifest
|
||||||
|
result.Title = m.Title
|
||||||
|
result.Artist = m.Artist
|
||||||
|
result.Album = m.Album
|
||||||
|
result.Genre = m.Genre
|
||||||
|
result.Year = m.Year
|
||||||
|
result.ReleaseType = m.ReleaseType
|
||||||
|
result.Duration = m.Duration
|
||||||
|
result.Format = m.Format
|
||||||
|
result.ExpiresAt = m.ExpiresAt
|
||||||
|
result.IssuedAt = m.IssuedAt
|
||||||
|
result.LicenseType = m.LicenseType
|
||||||
|
result.IsExpired = m.IsExpired()
|
||||||
|
|
||||||
|
if !result.IsExpired && m.ExpiresAt > 0 {
|
||||||
|
remaining := m.TimeRemaining()
|
||||||
|
result.TimeRemaining = formatDurationSeconds(remaining)
|
||||||
|
}
|
||||||
|
|
||||||
|
for _, t := range m.Tracks {
|
||||||
|
result.Tracks = append(result.Tracks, TrackInfo{
|
||||||
|
Title: t.Title,
|
||||||
|
Start: t.Start,
|
||||||
|
End: t.End,
|
||||||
|
Type: t.Type,
|
||||||
|
TrackNum: t.TrackNum,
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return result, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// IsLicenseValid checks if the license has expired
|
||||||
|
// This is memory-speed via Wails bindings
|
||||||
|
func (p *Player) IsLicenseValid(encrypted string) (bool, error) {
|
||||||
|
info, err := smsg.GetInfoBase64(encrypted)
|
||||||
|
if err != nil {
|
||||||
|
return false, fmt.Errorf("failed to check license: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if info.Manifest != nil && info.Manifest.ExpiresAt > 0 {
|
||||||
|
return !info.Manifest.IsExpired(), nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// No expiration set = perpetual license
|
||||||
|
return true, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
// Decrypt decrypts the SMSG content and returns playable media
// This is memory-speed via Wails bindings - no HTTP, no WASM
func (p *Player) Decrypt(encrypted string, password string) (*DecryptResult, error) {
	// Check license first
	valid, err := p.IsLicenseValid(encrypted)
	if err != nil {
		return nil, err
	}
	if !valid {
		return nil, fmt.Errorf("license has expired")
	}

	// Decrypt using pkg/smsg (Base64 variant for string input)
	msg, err := smsg.DecryptBase64(encrypted, password)
	if err != nil {
		return nil, fmt.Errorf("decryption failed: %w", err)
	}

	result := &DecryptResult{
		Body:    msg.Body,
		Subject: msg.Subject,
		From:    msg.From,
	}

	// Convert attachments to data URLs for direct playback
	for _, att := range msg.Attachments {
		// Decode base64 content to get size
		data, err := base64.StdEncoding.DecodeString(att.Content)
		if err != nil {
			continue
		}

		// Create data URL for the browser to play directly
		dataURL := fmt.Sprintf("data:%s;base64,%s", att.MimeType, att.Content)

		result.Attachments = append(result.Attachments, AttachmentInfo{
			Name:     att.Name,
			MimeType: att.MimeType,
			Size:     len(data),
			DataURL:  dataURL,
		})
	}

	return result, nil
}
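The attachment-to-data-URL conversion above follows the standard `data:` URI scheme. A minimal standalone sketch of the same pattern (the helper name is illustrative, not part of the package):

```go
package main

import (
	"encoding/base64"
	"fmt"
)

// toDataURL mirrors the pattern used in Decrypt: a MIME type plus the
// base64-encoded payload, playable directly by a browser media element.
func toDataURL(mimeType string, raw []byte) string {
	return fmt.Sprintf("data:%s;base64,%s", mimeType, base64.StdEncoding.EncodeToString(raw))
}

func main() {
	// Two bytes of an MP3 frame header, purely for illustration.
	fmt.Println(toDataURL("audio/mpeg", []byte{0xFF, 0xFB}))
}
```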
// QuickDecrypt returns just the first attachment as a data URL
// Optimized for single-track playback
func (p *Player) QuickDecrypt(encrypted string, password string) (string, error) {
	result, err := p.Decrypt(encrypted, password)
	if err != nil {
		return "", err
	}

	if len(result.Attachments) == 0 {
		return "", fmt.Errorf("no media attachments found")
	}

	return result.Attachments[0].DataURL, nil
}
// GetLicenseInfo returns detailed license information
func (p *Player) GetLicenseInfo(encrypted string) (map[string]interface{}, error) {
	manifest, err := p.GetManifest(encrypted)
	if err != nil {
		return nil, err
	}

	info := map[string]interface{}{
		"is_valid":       !manifest.IsExpired,
		"license_type":   manifest.LicenseType,
		"time_remaining": manifest.TimeRemaining,
	}

	if manifest.ExpiresAt > 0 {
		info["expires_at"] = time.Unix(manifest.ExpiresAt, 0).Format(time.RFC3339)
	}
	if manifest.IssuedAt > 0 {
		info["issued_at"] = time.Unix(manifest.IssuedAt, 0).Format(time.RFC3339)
	}

	return info, nil
}
// Serve starts an HTTP server for CLI/fallback mode
// This is the slower TCP path - use Wails bindings when possible
func (p *Player) Serve(addr string) error {
	mux := http.NewServeMux()

	// Serve embedded assets
	mux.Handle("/", http.FileServer(http.FS(Assets)))

	// API endpoints for WASM fallback
	mux.HandleFunc("/api/manifest", p.handleManifest)
	mux.HandleFunc("/api/decrypt", p.handleDecrypt)
	mux.HandleFunc("/api/license", p.handleLicense)

	fmt.Printf("dapp.fm player serving at http://localhost%s\n", addr)
	return http.ListenAndServe(addr, mux)
}
func (p *Player) handleManifest(w http.ResponseWriter, r *http.Request) {
	encrypted := r.URL.Query().Get("data")
	if encrypted == "" {
		http.Error(w, "missing data parameter", http.StatusBadRequest)
		return
	}

	manifest, err := p.GetManifest(encrypted)
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(manifest)
}
func (p *Player) handleDecrypt(w http.ResponseWriter, r *http.Request) {
	if r.Method != http.MethodPost {
		http.Error(w, "POST required", http.StatusMethodNotAllowed)
		return
	}

	var req struct {
		Encrypted string `json:"encrypted"`
		Password  string `json:"password"`
	}
	if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
		http.Error(w, "invalid JSON", http.StatusBadRequest)
		return
	}

	result, err := p.Decrypt(req.Encrypted, req.Password)
	if err != nil {
		http.Error(w, err.Error(), http.StatusUnauthorized)
		return
	}

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(result)
}
func (p *Player) handleLicense(w http.ResponseWriter, r *http.Request) {
	encrypted := r.URL.Query().Get("data")
	if encrypted == "" {
		http.Error(w, "missing data parameter", http.StatusBadRequest)
		return
	}

	info, err := p.GetLicenseInfo(encrypted)
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}

	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(info)
}
func formatDuration(d time.Duration) string {
	if d < 0 {
		return "expired"
	}

	days := int(d.Hours()) / 24
	hours := int(d.Hours()) % 24
	minutes := int(d.Minutes()) % 60

	if days > 0 {
		return fmt.Sprintf("%dd %dh", days, hours)
	}
	if hours > 0 {
		return fmt.Sprintf("%dh %dm", hours, minutes)
	}
	return fmt.Sprintf("%dm", minutes)
}
func formatDurationSeconds(seconds int64) string {
	if seconds < 0 {
		return "expired"
	}

	days := seconds / 86400
	hours := (seconds % 86400) / 3600
	minutes := (seconds % 3600) / 60

	if days > 0 {
		return fmt.Sprintf("%dd %dh", days, hours)
	}
	if hours > 0 {
		return fmt.Sprintf("%dh %dm", hours, minutes)
	}
	return fmt.Sprintf("%dm", minutes)
}
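A quick check of the seconds-based formatter's arithmetic; the function is copied verbatim so the sketch runs standalone:

```go
package main

import "fmt"

// formatDurationSeconds reproduces the helper above for a standalone check.
func formatDurationSeconds(seconds int64) string {
	if seconds < 0 {
		return "expired"
	}
	days := seconds / 86400
	hours := (seconds % 86400) / 3600
	minutes := (seconds % 3600) / 60
	if days > 0 {
		return fmt.Sprintf("%dd %dh", days, hours)
	}
	if hours > 0 {
		return fmt.Sprintf("%dh %dm", hours, minutes)
	}
	return fmt.Sprintf("%dm", minutes)
}

func main() {
	// 90061s = 1 day, 1 hour, 1 minute, 1 second
	fmt.Println(formatDurationSeconds(90061)) // 1d 1h
}
```

Note that sub-day remainders (minutes, seconds) are deliberately dropped once a larger unit is present.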
@@ -11,7 +11,7 @@ import (
 	"strings"
 	"sync"

-	"github.com/Snider/Borg/pkg/datanode"
+	"forge.lthn.ai/Snider/Borg/pkg/datanode"
 	"github.com/schollz/progressbar/v3"
 	"golang.org/x/net/html"
 )
@@ -217,7 +217,9 @@ func (p *pwaClient) DownloadAndPackagePWA(pwaURL, manifestURL string, bar *progr
 	if path == "" {
 		path = "index.html"
 	}
+	mu.Lock()
 	dn.AddData(path, body)
+	mu.Unlock()

 	// Parse HTML for additional assets
 	if parseHTML && isHTMLContent(resp.Header.Get("Content-Type"), body) {
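The `mu.Lock()`/`mu.Unlock()` pair added around `dn.AddData` guards the shared datanode against concurrent map writes from the download goroutines. A minimal, self-contained sketch of the same pattern (the `store` type and `fill` helper are illustrative, not the real datanode):

```go
package main

import (
	"fmt"
	"sync"
)

// store stands in for the shared datanode: a map written by many goroutines.
type store struct {
	mu   sync.Mutex
	data map[string][]byte
}

// AddData locks around the map write, mirroring the mutex added in the diff.
// Without the lock, concurrent writes would panic with "concurrent map writes".
func (s *store) AddData(path string, body []byte) {
	s.mu.Lock()
	defer s.mu.Unlock()
	s.data[path] = body
}

// fill writes n entries concurrently and reports how many landed.
func fill(n int) int {
	s := &store{data: make(map[string][]byte)}
	var wg sync.WaitGroup
	for i := 0; i < n; i++ {
		wg.Add(1)
		go func(i int) {
			defer wg.Done()
			s.AddData(fmt.Sprintf("asset-%d", i), []byte("x"))
		}(i)
	}
	wg.Wait()
	return len(s.data)
}

func main() {
	fmt.Println(fill(100))
}
```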
214	pkg/smsg/abr.go	Normal file
@@ -0,0 +1,214 @@
// Package smsg - Adaptive Bitrate Streaming (ABR) support
//
// ABR enables multi-bitrate streaming with automatic quality switching based on
// network conditions. Similar to HLS/DASH but with ChaCha20-Poly1305 encryption.
//
// Architecture:
//   - Master manifest (.json) lists available quality variants
//   - Each variant is a standard v3 chunked .smsg file
//   - Same password decrypts all variants (CEK unwrapped once)
//   - Player switches variants at chunk boundaries based on bandwidth
package smsg

import (
	"encoding/json"
	"fmt"
	"os"
	"path/filepath"
	"sort"
)

const ABRVersion = "abr-v1"

// ABRSafetyFactor is the bandwidth multiplier for variant selection.
// Using 80% of available bandwidth prevents buffering on fluctuating networks.
const ABRSafetyFactor = 0.8

// NewABRManifest creates a new ABR manifest with the given title.
func NewABRManifest(title string) *ABRManifest {
	return &ABRManifest{
		Version:    ABRVersion,
		Title:      title,
		Variants:   make([]Variant, 0),
		DefaultIdx: 0,
	}
}

// AddVariant adds a quality variant to the manifest.
// Variants are automatically sorted by bandwidth (ascending) after adding.
func (m *ABRManifest) AddVariant(v Variant) {
	m.Variants = append(m.Variants, v)
	// Sort by bandwidth ascending (lowest quality first)
	sort.Slice(m.Variants, func(i, j int) bool {
		return m.Variants[i].Bandwidth < m.Variants[j].Bandwidth
	})
	// Update default to 720p if available, otherwise middle variant
	m.DefaultIdx = m.findDefaultVariant()
}

// findDefaultVariant finds the best default variant (prefers 720p).
func (m *ABRManifest) findDefaultVariant() int {
	// Prefer 720p as default
	for i, v := range m.Variants {
		if v.Name == "720p" || v.Height == 720 {
			return i
		}
	}
	// Otherwise use middle variant
	if len(m.Variants) > 0 {
		return len(m.Variants) / 2
	}
	return 0
}

// SelectVariant selects the best variant for the given bandwidth (bits per second).
// Returns the index of the highest quality variant that fits within the bandwidth.
func (m *ABRManifest) SelectVariant(bandwidthBPS int) int {
	safeBandwidth := float64(bandwidthBPS) * ABRSafetyFactor

	// Find highest quality that fits
	selected := 0
	for i, v := range m.Variants {
		if float64(v.Bandwidth) <= safeBandwidth {
			selected = i
		}
	}
	return selected
}

// GetVariant returns the variant at the given index, or nil if out of range.
func (m *ABRManifest) GetVariant(idx int) *Variant {
	if idx < 0 || idx >= len(m.Variants) {
		return nil
	}
	return &m.Variants[idx]
}

// WriteABRManifest writes the ABR manifest to a JSON file.
func WriteABRManifest(manifest *ABRManifest, path string) error {
	data, err := json.MarshalIndent(manifest, "", "  ")
	if err != nil {
		return fmt.Errorf("marshal ABR manifest: %w", err)
	}

	// Ensure directory exists
	dir := filepath.Dir(path)
	if err := os.MkdirAll(dir, 0755); err != nil {
		return fmt.Errorf("create directory: %w", err)
	}

	if err := os.WriteFile(path, data, 0644); err != nil {
		return fmt.Errorf("write ABR manifest: %w", err)
	}

	return nil
}

// ReadABRManifest reads an ABR manifest from a JSON file.
func ReadABRManifest(path string) (*ABRManifest, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, fmt.Errorf("read ABR manifest: %w", err)
	}

	return ParseABRManifest(data)
}

// ParseABRManifest parses an ABR manifest from JSON bytes.
func ParseABRManifest(data []byte) (*ABRManifest, error) {
	var manifest ABRManifest
	if err := json.Unmarshal(data, &manifest); err != nil {
		return nil, fmt.Errorf("parse ABR manifest: %w", err)
	}

	// Validate version
	if manifest.Version != ABRVersion {
		return nil, fmt.Errorf("unsupported ABR version: %s (expected %s)", manifest.Version, ABRVersion)
	}

	return &manifest, nil
}

// VariantFromSMSG creates a Variant from an existing .smsg file.
// It reads the header to extract chunk count and file size.
func VariantFromSMSG(name string, bandwidth, width, height int, smsgPath string) (*Variant, error) {
	// Read file to get size and chunk info
	data, err := os.ReadFile(smsgPath)
	if err != nil {
		return nil, fmt.Errorf("read smsg file: %w", err)
	}

	// Get header to extract chunk count
	header, err := GetV3Header(data)
	if err != nil {
		return nil, fmt.Errorf("parse smsg header: %w", err)
	}

	chunkCount := 0
	if header.Chunked != nil {
		chunkCount = header.Chunked.TotalChunks
	}

	return &Variant{
		Name:       name,
		Bandwidth:  bandwidth,
		Width:      width,
		Height:     height,
		Codecs:     "avc1.640028,mp4a.40.2", // Default H.264 + AAC
		URL:        filepath.Base(smsgPath),
		ChunkCount: chunkCount,
		FileSize:   int64(len(data)),
	}, nil
}

// ABRBandwidthEstimator tracks download speeds for adaptive quality selection.
type ABRBandwidthEstimator struct {
	samples    []int // bandwidth samples in bps
	maxSamples int
}

// NewABRBandwidthEstimator creates a new bandwidth estimator.
func NewABRBandwidthEstimator(maxSamples int) *ABRBandwidthEstimator {
	if maxSamples <= 0 {
		maxSamples = 10
	}
	return &ABRBandwidthEstimator{
		samples:    make([]int, 0, maxSamples),
		maxSamples: maxSamples,
	}
}

// RecordSample records a bandwidth sample from a download.
// bytes is the number of bytes downloaded, durationMs is the time in milliseconds.
func (e *ABRBandwidthEstimator) RecordSample(bytes int, durationMs int) {
	if durationMs <= 0 {
		return
	}
	// Calculate bits per second: (bytes * 8 * 1000) / durationMs
	bps := (bytes * 8 * 1000) / durationMs
	e.samples = append(e.samples, bps)
	if len(e.samples) > e.maxSamples {
		e.samples = e.samples[1:]
	}
}

// Estimate returns the estimated bandwidth in bits per second.
// Uses average of recent samples, or 1 Mbps default if no samples.
func (e *ABRBandwidthEstimator) Estimate() int {
	if len(e.samples) == 0 {
		return 1000000 // 1 Mbps default
	}

	// Use average of last 3 samples (or all if fewer)
	count := 3
	if len(e.samples) < count {
		count = len(e.samples)
	}
	recent := e.samples[len(e.samples)-count:]

	sum := 0
	for _, s := range recent {
		sum += s
	}
	return sum / count
}
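The selection rule in `SelectVariant` (pick the highest-bandwidth variant that fits within 80% of measured throughput) can be exercised standalone. The local `variant` struct below mirrors only the fields the rule needs, not the package's full `Variant`:

```go
package main

import "fmt"

// variant mirrors the fields SelectVariant inspects.
type variant struct {
	Name      string
	Bandwidth int // bits per second required to stream this variant
}

const safetyFactor = 0.8 // same 80% headroom as ABRSafetyFactor

// selectVariant returns the index of the highest-bandwidth variant whose
// requirement fits within safetyFactor * measured bandwidth. Variants are
// assumed sorted ascending by bandwidth, as AddVariant guarantees.
func selectVariant(variants []variant, bandwidthBPS int) int {
	safe := float64(bandwidthBPS) * safetyFactor
	selected := 0
	for i, v := range variants {
		if float64(v.Bandwidth) <= safe {
			selected = i
		}
	}
	return selected
}

func main() {
	ladder := []variant{
		{"360p", 800_000},
		{"720p", 2_500_000},
		{"1080p", 5_000_000},
	}
	// Measured 4 Mbps -> 3.2 Mbps usable -> 720p fits, 1080p does not.
	fmt.Println(ladder[selectVariant(ladder, 4_000_000)].Name)
}
```

The fallback to index 0 means a network slower than even the lowest variant still gets the lowest rung rather than nothing.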
342	pkg/smsg/smsg.go
@@ -1,14 +1,37 @@
 package smsg

+// SMSG (Secure Message) provides ChaCha20-Poly1305 authenticated encryption.
+//
+// IMPORTANT: Nonce handling for developers
+// =========================================
+// Enchantrix embeds the nonce directly in the ciphertext:
+//
+//	[24-byte nonce][encrypted data][16-byte auth tag]
+//
+// The nonce is NOT transmitted separately in headers. It is:
+//   - Generated fresh (random) for each encryption
+//   - Extracted automatically from ciphertext during decryption
+//   - Safe to transmit (public) - only the KEY must remain secret
+//
+// This means wrapped keys, encrypted payloads, etc. are self-contained.
+// You only need the correct key to decrypt - no nonce management required.
+//
+// See: forge.lthn.ai/Snider/Enchantrix/pkg/enchantrix/crypto_sigil.go
+
 import (
+	"bytes"
+	"compress/gzip"
 	"crypto/sha256"
 	"encoding/base64"
+	"encoding/binary"
 	"encoding/json"
 	"fmt"
+	"io"
 	"time"

-	"github.com/Snider/Enchantrix/pkg/enchantrix"
-	"github.com/Snider/Enchantrix/pkg/trix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/enchantrix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/trix"
+	"github.com/klauspost/compress/zstd"
 )

 // DeriveKey derives a 32-byte key from a password using SHA-256.
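The self-contained `[24-byte nonce][encrypted data][16-byte auth tag]` layout documented in the new header comment can be illustrated with plain slicing. The sketch below performs no cryptography; the `splitSealed` helper is hypothetical and only shows how the pieces sit in the blob:

```go
package main

import (
	"errors"
	"fmt"
)

const (
	nonceSize = 24 // XChaCha20 nonce, prepended to the ciphertext
	tagSize   = 16 // Poly1305 authentication tag, appended to the ciphertext
)

// splitSealed separates a self-contained blob into nonce, ciphertext, and tag.
// Decryption libraries do this internally, which is why no separate nonce
// field appears in SMSG headers.
func splitSealed(blob []byte) (nonce, ct, tag []byte, err error) {
	if len(blob) < nonceSize+tagSize {
		return nil, nil, nil, errors.New("blob too short to contain nonce and tag")
	}
	return blob[:nonceSize], blob[nonceSize : len(blob)-tagSize], blob[len(blob)-tagSize:], nil
}

func main() {
	// A fake sealed blob: 24-byte nonce + 5 ciphertext bytes + 16-byte tag.
	blob := make([]byte, nonceSize+5+tagSize)
	nonce, ct, tag, _ := splitSealed(blob)
	fmt.Println(len(nonce), len(ct), len(tag)) // 24 5 16
}
```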
@@ -120,7 +143,64 @@ func EncryptWithHint(msg *Message, password, hint string) ([]byte, error) {
	return trix.Encode(t, Magic, nil)
}

// EncryptWithManifest encrypts with public manifest metadata in the clear text header
// The manifest is visible without decryption, enabling content discovery and indexing
func EncryptWithManifest(msg *Message, password string, manifest *Manifest) ([]byte, error) {
	if password == "" {
		return nil, ErrPasswordRequired
	}
	if msg.Body == "" && len(msg.Attachments) == 0 {
		return nil, ErrEmptyMessage
	}

	if msg.Timestamp == 0 {
		msg.Timestamp = time.Now().Unix()
	}

	payload, err := json.Marshal(msg)
	if err != nil {
		return nil, fmt.Errorf("failed to marshal message: %w", err)
	}

	key := DeriveKey(password)
	sigil, err := enchantrix.NewChaChaPolySigil(key)
	if err != nil {
		return nil, fmt.Errorf("failed to create sigil: %w", err)
	}

	encrypted, err := sigil.In(payload)
	if err != nil {
		return nil, fmt.Errorf("encryption failed: %w", err)
	}

	// Build header with manifest
	headerMap := map[string]interface{}{
		"version":   Version,
		"algorithm": "chacha20poly1305",
	}
	if manifest != nil {
		headerMap["manifest"] = manifest
	}

	t := &trix.Trix{
		Header:  headerMap,
		Payload: encrypted,
	}

	return trix.Encode(t, Magic, nil)
}

// EncryptWithManifestBase64 encrypts with manifest and returns base64
func EncryptWithManifestBase64(msg *Message, password string, manifest *Manifest) (string, error) {
	encrypted, err := EncryptWithManifest(msg, password, manifest)
	if err != nil {
		return "", err
	}
	return base64.StdEncoding.EncodeToString(encrypted), nil
}

// Decrypt decrypts an SMSG container with a password
// Automatically handles both v1 (base64) and v2 (binary) formats
func Decrypt(data []byte, password string) (*Message, error) {
	if password == "" {
		return nil, ErrPasswordRequired
@@ -132,6 +212,16 @@ func Decrypt(data []byte, password string) (*Message, error) {
		return nil, fmt.Errorf("%w: %v", ErrInvalidMagic, err)
	}

	// Extract format and compression from header
	format := ""
	compression := ""
	if f, ok := t.Header["format"].(string); ok {
		format = f
	}
	if c, ok := t.Header["compression"].(string); ok {
		compression = c
	}

	// Derive key and create sigil
	key := DeriveKey(password)
	sigil, err := enchantrix.NewChaChaPolySigil(key)
@@ -145,7 +235,28 @@ func Decrypt(data []byte, password string) (*Message, error) {
		return nil, ErrDecryptionFailed
	}

	// Decompress if needed
	switch compression {
	case CompressionGzip:
		decompressed, err := gzipDecompress(decrypted)
		if err != nil {
			return nil, fmt.Errorf("gzip decompression failed: %w", err)
		}
		decrypted = decompressed
	case CompressionZstd:
		decompressed, err := zstdDecompress(decrypted)
		if err != nil {
			return nil, fmt.Errorf("zstd decompression failed: %w", err)
		}
		decrypted = decompressed
	}

	// Parse based on format
	if format == FormatV2 {
		return parseV2Payload(decrypted)
	}

	// v1 format: plain JSON with base64 attachments
	var msg Message
	if err := json.Unmarshal(decrypted, &msg); err != nil {
		return nil, fmt.Errorf("%w: invalid message format", ErrInvalidPayload)
@@ -177,10 +288,28 @@ func GetInfo(data []byte) (*Header, error) {
	if v, ok := t.Header["algorithm"].(string); ok {
		header.Algorithm = v
	}
	if v, ok := t.Header["format"].(string); ok {
		header.Format = v
	}
	if v, ok := t.Header["compression"].(string); ok {
		header.Compression = v
	}
	if v, ok := t.Header["hint"].(string); ok {
		header.Hint = v
	}

	// Extract manifest if present
	if manifestData, ok := t.Header["manifest"]; ok && manifestData != nil {
		// Re-marshal and unmarshal to properly convert the map to a Manifest struct
		manifestBytes, err := json.Marshal(manifestData)
		if err == nil {
			var manifest Manifest
			if err := json.Unmarshal(manifestBytes, &manifest); err == nil {
				header.Manifest = &manifest
			}
		}
	}

	return header, nil
}
@@ -216,3 +345,210 @@ func QuickDecrypt(encoded, password string) (string, error) {
	}
	return msg.Body, nil
}

// EncryptV2 encrypts a message using v2 binary format (smaller file size)
// Attachments are stored as raw binary instead of base64-encoded JSON
// Uses zstd compression by default (faster than gzip, better ratio)
func EncryptV2(msg *Message, password string) ([]byte, error) {
	return EncryptV2WithOptions(msg, password, nil, CompressionZstd)
}

// EncryptV2WithManifest encrypts with v2 binary format and public manifest
// Uses zstd compression by default (faster than gzip, better ratio)
func EncryptV2WithManifest(msg *Message, password string, manifest *Manifest) ([]byte, error) {
	return EncryptV2WithOptions(msg, password, manifest, CompressionZstd)
}

// EncryptV2WithOptions encrypts with full control over format options
func EncryptV2WithOptions(msg *Message, password string, manifest *Manifest, compression string) ([]byte, error) {
	if password == "" {
		return nil, ErrPasswordRequired
	}
	if msg.Body == "" && len(msg.Attachments) == 0 {
		return nil, ErrEmptyMessage
	}

	if msg.Timestamp == 0 {
		msg.Timestamp = time.Now().Unix()
	}

	// Build v2 payload: [4-byte JSON length][JSON][binary attachments...]
	payload, err := buildV2Payload(msg)
	if err != nil {
		return nil, fmt.Errorf("failed to build v2 payload: %w", err)
	}

	// Apply compression if requested
	switch compression {
	case CompressionGzip:
		compressed, err := gzipCompress(payload)
		if err != nil {
			return nil, fmt.Errorf("gzip compression failed: %w", err)
		}
		payload = compressed
	case CompressionZstd:
		compressed, err := zstdCompress(payload)
		if err != nil {
			return nil, fmt.Errorf("zstd compression failed: %w", err)
		}
		payload = compressed
	}

	// Encrypt
	key := DeriveKey(password)
	sigil, err := enchantrix.NewChaChaPolySigil(key)
	if err != nil {
		return nil, fmt.Errorf("failed to create sigil: %w", err)
	}

	encrypted, err := sigil.In(payload)
	if err != nil {
		return nil, fmt.Errorf("encryption failed: %w", err)
	}

	// Build header
	headerMap := map[string]interface{}{
		"version":   Version,
		"algorithm": "chacha20poly1305",
		"format":    FormatV2,
	}
	if compression != CompressionNone {
		headerMap["compression"] = compression
	}
	if manifest != nil {
		headerMap["manifest"] = manifest
	}

	t := &trix.Trix{
		Header:  headerMap,
		Payload: encrypted,
	}

	return trix.Encode(t, Magic, nil)
}

// buildV2Payload creates the v2 binary payload structure
func buildV2Payload(msg *Message) ([]byte, error) {
	// Create a copy of the message with attachment content stripped;
	// the binary data is appended after the JSON
	msgCopy := *msg
	var binaryData [][]byte

	for i := range msgCopy.Attachments {
		att := &msgCopy.Attachments[i]
		if att.Content != "" {
			// Decode the base64 content to get binary
			data, err := base64.StdEncoding.DecodeString(att.Content)
			if err != nil {
				return nil, fmt.Errorf("invalid base64 in attachment %s: %w", att.Name, err)
			}
			binaryData = append(binaryData, data)
			att.Size = len(data) // Store actual binary size
			att.Content = ""     // Clear content from JSON
		} else {
			binaryData = append(binaryData, nil)
		}
	}

	// Serialize the message (without attachment content)
	jsonData, err := json.Marshal(&msgCopy)
	if err != nil {
		return nil, fmt.Errorf("failed to marshal message: %w", err)
	}

	// Build payload: [4-byte length][JSON][binary1][binary2]...
	var buf bytes.Buffer

	// Write JSON length as uint32 big-endian
	if err := binary.Write(&buf, binary.BigEndian, uint32(len(jsonData))); err != nil {
		return nil, err
	}

	// Write JSON
	buf.Write(jsonData)

	// Write binary attachments
	for _, data := range binaryData {
		buf.Write(data)
	}

	return buf.Bytes(), nil
}

// parseV2Payload extracts message and binary attachments from v2 format
func parseV2Payload(data []byte) (*Message, error) {
	if len(data) < 4 {
		return nil, fmt.Errorf("payload too short")
	}

	// Read JSON length
	jsonLen := binary.BigEndian.Uint32(data[:4])
	if int(jsonLen) > len(data)-4 {
		return nil, fmt.Errorf("invalid JSON length")
	}

	// Parse JSON
	var msg Message
	if err := json.Unmarshal(data[4:4+jsonLen], &msg); err != nil {
		return nil, fmt.Errorf("failed to parse message JSON: %w", err)
	}

	// Read binary attachments
	offset := 4 + int(jsonLen)
	for i := range msg.Attachments {
		att := &msg.Attachments[i]
		if att.Size > 0 {
			if offset+att.Size > len(data) {
				return nil, fmt.Errorf("attachment %s: data truncated", att.Name)
			}
			// Re-encode as base64 for API compatibility
			att.Content = base64.StdEncoding.EncodeToString(data[offset : offset+att.Size])
			offset += att.Size
		}
	}

	return &msg, nil
}

// gzipCompress compresses data using gzip
func gzipCompress(data []byte) ([]byte, error) {
	var buf bytes.Buffer
	w := gzip.NewWriter(&buf)
	if _, err := w.Write(data); err != nil {
		return nil, err
	}
	if err := w.Close(); err != nil {
		return nil, err
	}
	return buf.Bytes(), nil
}

// gzipDecompress decompresses gzip data
func gzipDecompress(data []byte) ([]byte, error) {
	r, err := gzip.NewReader(bytes.NewReader(data))
	if err != nil {
		return nil, err
	}
	defer r.Close()
	return io.ReadAll(r)
}

// zstdCompress compresses data using zstd (faster than gzip, better ratio)
func zstdCompress(data []byte) ([]byte, error) {
	encoder, err := zstd.NewWriter(nil)
	if err != nil {
		return nil, err
	}
	defer encoder.Close()
	return encoder.EncodeAll(data, nil), nil
}

// zstdDecompress decompresses zstd data
func zstdDecompress(data []byte) ([]byte, error) {
	decoder, err := zstd.NewReader(nil)
	if err != nil {
		return nil, err
	}
	defer decoder.Close()
	return decoder.DecodeAll(data, nil)
}
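The v2 payload layout `[4-byte big-endian JSON length][JSON][raw attachment bytes]` round-trips cleanly. A self-contained sketch with a simplified metadata type (`meta` is illustrative, not the package's `Message`):

```go
package main

import (
	"bytes"
	"encoding/binary"
	"encoding/json"
	"fmt"
)

// meta stands in for the JSON portion of a v2 payload.
type meta struct {
	Body string `json:"body"`
	Size int    `json:"size"` // length of the trailing binary attachment
}

// buildPayload writes [uint32 big-endian JSON length][JSON][binary].
func buildPayload(m meta, bin []byte) ([]byte, error) {
	j, err := json.Marshal(m)
	if err != nil {
		return nil, err
	}
	var buf bytes.Buffer
	if err := binary.Write(&buf, binary.BigEndian, uint32(len(j))); err != nil {
		return nil, err
	}
	buf.Write(j)
	buf.Write(bin)
	return buf.Bytes(), nil
}

// parsePayload reverses buildPayload, validating the length prefix.
func parsePayload(data []byte) (meta, []byte, error) {
	var m meta
	if len(data) < 4 {
		return m, nil, fmt.Errorf("payload too short")
	}
	n := binary.BigEndian.Uint32(data[:4])
	if int(n) > len(data)-4 {
		return m, nil, fmt.Errorf("invalid JSON length")
	}
	if err := json.Unmarshal(data[4:4+n], &m); err != nil {
		return m, nil, err
	}
	return m, data[4+n:], nil
}

func main() {
	p, _ := buildPayload(meta{Body: "hi", Size: 3}, []byte{1, 2, 3})
	m, bin, _ := parsePayload(p)
	fmt.Println(m.Body, len(bin))
}
```

The length-prefix check before slicing is what lets `parseV2Payload` reject truncated inputs instead of panicking.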
@ -268,3 +268,443 @@ func TestEmptyMessageError(t *testing.T) {
|
||||||
t.Errorf("Expected ErrEmptyMessage, got %v", err)
|
t.Errorf("Expected ErrEmptyMessage, got %v", err)
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
func TestEncryptWithManifest(t *testing.T) {
	msg := NewMessage("Licensed content")
	password := "license-token-123"

	// Create manifest with tracks
	manifest := NewManifest("Summer EP 2024").
		AddTrackFull("Intro", 0, 30, "intro").
		AddTrackFull("Main Track", 30, 180, "full").
		AddTrack("Outro", 180)
	manifest.Artist = "Test Artist"
	manifest.ReleaseType = "ep"
	manifest.Format = "dapp.fm/v1"

	encrypted, err := EncryptWithManifest(msg, password, manifest)
	if err != nil {
		t.Fatalf("EncryptWithManifest failed: %v", err)
	}

	// Get info without decryption - should have manifest
	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Manifest == nil {
		t.Fatal("Expected manifest in header")
	}

	if header.Manifest.Title != "Summer EP 2024" {
		t.Errorf("Title = %q, want %q", header.Manifest.Title, "Summer EP 2024")
	}

	if header.Manifest.Artist != "Test Artist" {
		t.Errorf("Artist = %q, want %q", header.Manifest.Artist, "Test Artist")
	}

	if header.Manifest.ReleaseType != "ep" {
		t.Errorf("ReleaseType = %q, want %q", header.Manifest.ReleaseType, "ep")
	}

	if len(header.Manifest.Tracks) != 3 {
		t.Errorf("Tracks count = %d, want 3", len(header.Manifest.Tracks))
	}

	// Verify tracks
	if header.Manifest.Tracks[0].Title != "Intro" {
		t.Errorf("Track 0 Title = %q, want %q", header.Manifest.Tracks[0].Title, "Intro")
	}
	if header.Manifest.Tracks[0].Start != 0 {
		t.Errorf("Track 0 Start = %v, want 0", header.Manifest.Tracks[0].Start)
	}
	if header.Manifest.Tracks[0].Type != "intro" {
		t.Errorf("Track 0 Type = %q, want %q", header.Manifest.Tracks[0].Type, "intro")
	}

	// Can still decrypt normally
	decrypted, err := Decrypt(encrypted, password)
	if err != nil {
		t.Fatalf("Decrypt failed: %v", err)
	}

	if decrypted.Body != "Licensed content" {
		t.Errorf("Body = %q, want %q", decrypted.Body, "Licensed content")
	}
}
func TestManifestBuilder(t *testing.T) {
	manifest := NewManifest("Test Album")
	manifest.Artist = "Artist Name"
	manifest.Album = "Album Name"
	manifest.Year = 2024
	manifest.Genre = "Electronic"
	manifest.ReleaseType = "album"
	manifest.Tags = []string{"electronic", "ambient"}
	manifest.Extra["custom_field"] = "custom_value"

	// Add tracks
	manifest.AddTrack("Track 1", 0)
	manifest.AddTrack("Track 2", 120)
	manifest.AddTrackFull("Track 3", 240, 360, "outro")

	if manifest.Title != "Test Album" {
		t.Errorf("Title = %q, want %q", manifest.Title, "Test Album")
	}

	if len(manifest.Tracks) != 3 {
		t.Fatalf("Track count = %d, want 3", len(manifest.Tracks))
	}

	// First track should have TrackNum 1
	if manifest.Tracks[0].TrackNum != 1 {
		t.Errorf("Track 1 TrackNum = %d, want 1", manifest.Tracks[0].TrackNum)
	}

	// Third track should have end time
	if manifest.Tracks[2].End != 360 {
		t.Errorf("Track 3 End = %v, want 360", manifest.Tracks[2].End)
	}
}
func TestManifestExpiration(t *testing.T) {
	// Test perpetual license (no expiration)
	perpetual := NewManifest("Perpetual Album")
	if perpetual.IsExpired() {
		t.Error("Perpetual license should not be expired")
	}
	if perpetual.TimeRemaining() != 0 {
		t.Error("Perpetual license should have 0 time remaining (infinite)")
	}
	if perpetual.LicenseType != "perpetual" {
		t.Errorf("LicenseType = %q, want perpetual", perpetual.LicenseType)
	}

	// Test streaming access (24 hours)
	stream := NewManifest("Stream Album").WithStreamingAccess(24)
	if stream.IsExpired() {
		t.Error("Streaming license should not be expired immediately")
	}
	if stream.LicenseType != "stream" {
		t.Errorf("LicenseType = %q, want stream", stream.LicenseType)
	}
	remaining := stream.TimeRemaining()
	if remaining < 86000 || remaining > 86400 {
		t.Errorf("TimeRemaining = %d, expected ~86400", remaining)
	}

	// Test rental with duration
	rental := NewManifest("Rental Album").WithRentalDuration(3600) // 1 hour
	if rental.IsExpired() {
		t.Error("Rental license should not be expired immediately")
	}
	if rental.LicenseType != "rental" {
		t.Errorf("LicenseType = %q, want rental", rental.LicenseType)
	}

	// Test preview (30 seconds)
	preview := NewManifest("Preview Track").WithPreviewAccess(30)
	if preview.IsExpired() {
		t.Error("Preview license should not be expired immediately")
	}
	if preview.LicenseType != "preview" {
		t.Errorf("LicenseType = %q, want preview", preview.LicenseType)
	}

	// Test already expired license
	expired := NewManifest("Expired Album")
	expired.ExpiresAt = 1000 // Very old timestamp
	if !expired.IsExpired() {
		t.Error("License with old expiration should be expired")
	}
	if expired.TimeRemaining() >= 0 {
		t.Error("Expired license should have negative time remaining")
	}
}
func TestExpirationInHeader(t *testing.T) {
	msg := NewMessage("Licensed content")
	password := "stream-token-123"

	// Create streaming license (24 hours)
	manifest := NewManifest("Streaming EP").WithStreamingAccess(24)

	encrypted, err := EncryptWithManifest(msg, password, manifest)
	if err != nil {
		t.Fatalf("EncryptWithManifest failed: %v", err)
	}

	// GetInfo should show expiration
	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Manifest == nil {
		t.Fatal("Expected manifest in header")
	}

	if header.Manifest.LicenseType != "stream" {
		t.Errorf("LicenseType = %q, want stream", header.Manifest.LicenseType)
	}

	if header.Manifest.ExpiresAt == 0 {
		t.Error("ExpiresAt should not be 0 for streaming license")
	}

	if header.Manifest.IssuedAt == 0 {
		t.Error("IssuedAt should not be 0")
	}

	if header.Manifest.IsExpired() {
		t.Error("New streaming license should not be expired")
	}
}
func TestManifestLinks(t *testing.T) {
	manifest := NewManifest("Test Track").
		AddLink("home", "https://example.com/artist").
		AddLink("beatport", "https://beatport.com/artist/test").
		AddLink("soundcloud", "https://soundcloud.com/test")

	if len(manifest.Links) != 3 {
		t.Fatalf("Links count = %d, want 3", len(manifest.Links))
	}

	if manifest.Links["home"] != "https://example.com/artist" {
		t.Errorf("Links[home] = %q, want %q", manifest.Links["home"], "https://example.com/artist")
	}

	if manifest.Links["beatport"] != "https://beatport.com/artist/test" {
		t.Errorf("Links[beatport] = %q, want %q", manifest.Links["beatport"], "https://beatport.com/artist/test")
	}

	// Test manifest with links in encrypted message
	msg := NewMessage("Track content")
	password := "link-test"

	encrypted, err := EncryptWithManifest(msg, password, manifest)
	if err != nil {
		t.Fatalf("EncryptWithManifest failed: %v", err)
	}

	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Manifest == nil {
		t.Fatal("Expected manifest in header")
	}

	if len(header.Manifest.Links) != 3 {
		t.Fatalf("Header Links count = %d, want 3", len(header.Manifest.Links))
	}

	if header.Manifest.Links["home"] != "https://example.com/artist" {
		t.Errorf("Header Links[home] = %q, want %q", header.Manifest.Links["home"], "https://example.com/artist")
	}
}
func TestV2BinaryFormat(t *testing.T) {
	// Create message with binary attachment
	binaryData := []byte("Hello, this is binary content! \x00\x01\x02\x03")
	msg := NewMessage("V2 format test").
		AddBinaryAttachment("test.bin", binaryData, "application/octet-stream")

	password := "v2-test"

	// Encrypt with v2 format
	encrypted, err := EncryptV2(msg, password)
	if err != nil {
		t.Fatalf("EncryptV2 failed: %v", err)
	}

	// Check header
	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Format != FormatV2 {
		t.Errorf("Format = %q, want %q", header.Format, FormatV2)
	}

	if header.Compression != CompressionZstd {
		t.Errorf("Compression = %q, want %q", header.Compression, CompressionZstd)
	}

	// Decrypt
	decrypted, err := Decrypt(encrypted, password)
	if err != nil {
		t.Fatalf("Decrypt failed: %v", err)
	}

	if decrypted.Body != "V2 format test" {
		t.Errorf("Body = %q, want %q", decrypted.Body, "V2 format test")
	}

	if len(decrypted.Attachments) != 1 {
		t.Fatalf("Attachments count = %d, want 1", len(decrypted.Attachments))
	}

	att := decrypted.Attachments[0]
	if att.Name != "test.bin" {
		t.Errorf("Attachment name = %q, want %q", att.Name, "test.bin")
	}

	// Decode attachment and verify content
	decoded, err := base64.StdEncoding.DecodeString(att.Content)
	if err != nil {
		t.Fatalf("Failed to decode attachment: %v", err)
	}

	if string(decoded) != string(binaryData) {
		t.Errorf("Attachment content mismatch")
	}
}
func TestV2WithManifest(t *testing.T) {
	binaryData := make([]byte, 1024) // 1KB patterned buffer
	for i := range binaryData {
		binaryData[i] = byte(i % 256)
	}

	msg := NewMessage("V2 with manifest").
		AddBinaryAttachment("data.bin", binaryData, "application/octet-stream")

	manifest := NewManifest("Test Album").
		AddLink("home", "https://example.com")
	manifest.Artist = "Test Artist"

	password := "v2-manifest-test"

	encrypted, err := EncryptV2WithManifest(msg, password, manifest)
	if err != nil {
		t.Fatalf("EncryptV2WithManifest failed: %v", err)
	}

	// Verify header
	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Format != FormatV2 {
		t.Errorf("Format = %q, want %q", header.Format, FormatV2)
	}

	if header.Manifest == nil {
		t.Fatal("Expected manifest")
	}

	if header.Manifest.Title != "Test Album" {
		t.Errorf("Manifest Title = %q, want %q", header.Manifest.Title, "Test Album")
	}

	if header.Manifest.Artist != "Test Artist" {
		t.Errorf("Manifest Artist = %q, want %q", header.Manifest.Artist, "Test Artist")
	}

	// Decrypt and verify
	decrypted, err := Decrypt(encrypted, password)
	if err != nil {
		t.Fatalf("Decrypt failed: %v", err)
	}

	if len(decrypted.Attachments) != 1 {
		t.Fatalf("Attachments count = %d, want 1", len(decrypted.Attachments))
	}

	decoded, _ := base64.StdEncoding.DecodeString(decrypted.Attachments[0].Content)
	if len(decoded) != 1024 {
		t.Errorf("Decoded length = %d, want 1024", len(decoded))
	}
}
func TestV2SizeSavings(t *testing.T) {
	// Create a message with binary data
	binaryData := make([]byte, 10000) // 10KB
	for i := range binaryData {
		binaryData[i] = byte(i % 256)
	}

	msg := NewMessage("Size comparison test")
	msg.AddBinaryAttachment("large.bin", binaryData, "application/octet-stream")

	password := "size-test"

	// Encrypt with v1 (base64)
	v1Encrypted, err := Encrypt(msg, password)
	if err != nil {
		t.Fatalf("Encrypt v1 failed: %v", err)
	}

	// Encrypt with v2 (binary + zstd)
	v2Encrypted, err := EncryptV2(msg, password)
	if err != nil {
		t.Fatalf("EncryptV2 failed: %v", err)
	}

	t.Logf("V1 size: %d bytes", len(v1Encrypted))
	t.Logf("V2 size: %d bytes", len(v2Encrypted))
	t.Logf("Savings: %.1f%%", (1.0-float64(len(v2Encrypted))/float64(len(v1Encrypted)))*100)

	// V2 should be smaller (at least 20% savings from base64 removal alone)
	if len(v2Encrypted) >= len(v1Encrypted) {
		t.Errorf("V2 should be smaller than V1: v2=%d, v1=%d", len(v2Encrypted), len(v1Encrypted))
	}

	// Both should decrypt to the same content
	d1, _ := Decrypt(v1Encrypted, password)
	d2, _ := Decrypt(v2Encrypted, password)

	if d1.Body != d2.Body {
		t.Error("Decrypted bodies don't match")
	}

	c1, _ := base64.StdEncoding.DecodeString(d1.Attachments[0].Content)
	c2, _ := base64.StdEncoding.DecodeString(d2.Attachments[0].Content)

	if string(c1) != string(c2) {
		t.Error("Decrypted attachment content doesn't match")
	}
}
func TestV2NoCompression(t *testing.T) {
	msg := NewMessage("No compression test").
		AddBinaryAttachment("test.txt", []byte("Hello World"), "text/plain")

	password := "no-compress"

	// Encrypt without compression
	encrypted, err := EncryptV2WithOptions(msg, password, nil, CompressionNone)
	if err != nil {
		t.Fatalf("EncryptV2WithOptions failed: %v", err)
	}

	header, err := GetInfo(encrypted)
	if err != nil {
		t.Fatalf("GetInfo failed: %v", err)
	}

	if header.Format != FormatV2 {
		t.Errorf("Format = %q, want %q", header.Format, FormatV2)
	}

	if header.Compression != "" {
		t.Errorf("Compression = %q, want empty", header.Compression)
	}

	// Should still decrypt
	decrypted, err := Decrypt(encrypted, password)
	if err != nil {
		t.Fatalf("Decrypt failed: %v", err)
	}

	if decrypted.Body != "No compression test" {
		t.Errorf("Body = %q, want %q", decrypted.Body, "No compression test")
	}
}
827
pkg/smsg/stream.go
Normal file

@@ -0,0 +1,827 @@
package smsg

// V3 Streaming Support with LTHN Rolling Keys
//
// This file implements zero-trust streaming where:
//   - Content is encrypted once with a random CEK (Content Encryption Key)
//   - The CEK is wrapped (encrypted) with time-bound stream keys
//   - Stream keys are derived using LTHN(date:license:fingerprint)
//   - Rolling window: today's and tomorrow's keys are valid (24-48hr window)
//   - Keys auto-expire - no revocation needed
//
// Server flow:
//   1. Generate a random CEK
//   2. Encrypt the content with the CEK
//   3. For today & tomorrow: wrap the CEK with DeriveStreamKey(date, license, fingerprint)
//   4. Store the wrapped keys in the header
//
// Client flow:
//   1. Derive the stream key for today (or tomorrow)
//   2. Try to unwrap the CEK from the header
//   3. Decrypt the content with the CEK

import (
	"crypto/rand"
	"crypto/sha256"
	"encoding/base64"
	"encoding/binary"
	"encoding/json"
	"fmt"
	"time"

	"forge.lthn.ai/Snider/Enchantrix/pkg/crypt"
	"forge.lthn.ai/Snider/Enchantrix/pkg/enchantrix"
	"forge.lthn.ai/Snider/Enchantrix/pkg/trix"
)
// StreamParams contains the parameters needed for stream key derivation
type StreamParams struct {
	License     string  // User's license identifier
	Fingerprint string  // Device/session fingerprint
	Cadence     Cadence // Key rotation cadence (default: daily)
	ChunkSize   int     // Optional: chunk size for decrypt-while-downloading (0 = no chunking)
}
// DeriveStreamKey derives a 32-byte ChaCha key from date, license, and fingerprint.
// Uses the LTHN hash, which is rainbow-table resistant (salt derived from the input itself).
//
// The derived key is: SHA256(LTHN("YYYY-MM-DD:license:fingerprint"))
func DeriveStreamKey(date, license, fingerprint string) []byte {
	// Build input string
	input := fmt.Sprintf("%s:%s:%s", date, license, fingerprint)

	// Use Enchantrix crypt service for LTHN hash
	cryptService := crypt.NewService()
	lthnHash := cryptService.Hash(crypt.LTHN, input)

	// LTHN returns a hex string; hash it again to get 32 bytes for ChaCha
	key := sha256.Sum256([]byte(lthnHash))
	return key[:]
}
// GetRollingDates returns today's and tomorrow's date strings in YYYY-MM-DD format.
// This is the default daily cadence.
func GetRollingDates() (current, next string) {
	return GetRollingPeriods(CadenceDaily, time.Now().UTC())
}

// GetRollingDatesAt returns today and tomorrow relative to a specific time
func GetRollingDatesAt(t time.Time) (current, next string) {
	return GetRollingPeriods(CadenceDaily, t.UTC())
}
// GetRollingPeriods returns the current and next period strings based on cadence.
// The period string format varies by cadence:
//   - daily: "2006-01-02"
//   - 12h:   "2006-01-02-AM" or "2006-01-02-PM"
//   - 6h:    "2006-01-02-00", "2006-01-02-06", "2006-01-02-12", "2006-01-02-18"
//   - 1h:    "2006-01-02-15" (hour in 24h format)
func GetRollingPeriods(cadence Cadence, t time.Time) (current, next string) {
	t = t.UTC()

	switch cadence {
	case CadenceHalfDay:
		// 12-hour periods: AM (00:00-11:59) and PM (12:00-23:59)
		date := t.Format("2006-01-02")
		if t.Hour() < 12 {
			current = date + "-AM"
			next = date + "-PM"
		} else {
			current = date + "-PM"
			next = t.AddDate(0, 0, 1).Format("2006-01-02") + "-AM"
		}

	case CadenceQuarter:
		// 6-hour periods: 00, 06, 12, 18
		date := t.Format("2006-01-02")
		hour := t.Hour()
		period := (hour / 6) * 6
		nextPeriod := period + 6

		current = fmt.Sprintf("%s-%02d", date, period)
		if nextPeriod >= 24 {
			next = fmt.Sprintf("%s-%02d", t.AddDate(0, 0, 1).Format("2006-01-02"), 0)
		} else {
			next = fmt.Sprintf("%s-%02d", date, nextPeriod)
		}

	case CadenceHourly:
		// Hourly periods
		current = t.Format("2006-01-02-15")
		next = t.Add(time.Hour).Format("2006-01-02-15")

	default: // CadenceDaily or empty
		current = t.Format("2006-01-02")
		next = t.AddDate(0, 0, 1).Format("2006-01-02")
	}

	return
}
// GetCadenceWindowDuration returns the duration of one period for a cadence
func GetCadenceWindowDuration(cadence Cadence) time.Duration {
	switch cadence {
	case CadenceHourly:
		return time.Hour
	case CadenceQuarter:
		return 6 * time.Hour
	case CadenceHalfDay:
		return 12 * time.Hour
	default: // CadenceDaily
		return 24 * time.Hour
	}
}
// WrapCEK wraps a Content Encryption Key with a stream key.
// Returns the base64-encoded wrapped key (includes nonce).
func WrapCEK(cek, streamKey []byte) (string, error) {
	sigil, err := enchantrix.NewChaChaPolySigil(streamKey)
	if err != nil {
		return "", fmt.Errorf("failed to create sigil: %w", err)
	}

	wrapped, err := sigil.In(cek)
	if err != nil {
		return "", fmt.Errorf("failed to wrap CEK: %w", err)
	}

	return base64.StdEncoding.EncodeToString(wrapped), nil
}
// UnwrapCEK unwraps a Content Encryption Key using a stream key.
// Takes the base64-encoded wrapped key, returns raw CEK bytes.
func UnwrapCEK(wrappedB64 string, streamKey []byte) ([]byte, error) {
	wrapped, err := base64.StdEncoding.DecodeString(wrappedB64)
	if err != nil {
		return nil, fmt.Errorf("failed to decode wrapped key: %w", err)
	}

	sigil, err := enchantrix.NewChaChaPolySigil(streamKey)
	if err != nil {
		return nil, fmt.Errorf("failed to create sigil: %w", err)
	}

	cek, err := sigil.Out(wrapped)
	if err != nil {
		return nil, ErrDecryptionFailed
	}

	return cek, nil
}
// GenerateCEK generates a random 32-byte Content Encryption Key
func GenerateCEK() ([]byte, error) {
	cek := make([]byte, 32)
	if _, err := rand.Read(cek); err != nil {
		return nil, fmt.Errorf("failed to generate CEK: %w", err)
	}
	return cek, nil
}
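The wrap/unwrap pair is a standard AEAD round trip. The package's ChaCha20-Poly1305 "sigil" is not available outside the repo, so this standalone sketch substitutes stdlib AES-256-GCM (an assumption: same AEAD shape of nonce-prefixed ciphertext plus base64, different cipher; `wrap`/`unwrap` are illustrative names):

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/rand"
	"encoding/base64"
	"fmt"
)

// wrap seals the CEK under streamKey; the random nonce is prepended
// to the ciphertext and the whole blob is base64-encoded for the header.
func wrap(cek, streamKey []byte) (string, error) {
	block, err := aes.NewCipher(streamKey)
	if err != nil {
		return "", err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return "", err
	}
	nonce := make([]byte, gcm.NonceSize())
	if _, err := rand.Read(nonce); err != nil {
		return "", err
	}
	sealed := gcm.Seal(nonce, nonce, cek, nil)
	return base64.StdEncoding.EncodeToString(sealed), nil
}

// unwrap reverses wrap; a wrong stream key fails authentication.
func unwrap(wrapped string, streamKey []byte) ([]byte, error) {
	raw, err := base64.StdEncoding.DecodeString(wrapped)
	if err != nil {
		return nil, err
	}
	block, err := aes.NewCipher(streamKey)
	if err != nil {
		return nil, err
	}
	gcm, err := cipher.NewGCM(block)
	if err != nil {
		return nil, err
	}
	nonce, ct := raw[:gcm.NonceSize()], raw[gcm.NonceSize():]
	return gcm.Open(nil, nonce, ct, nil)
}

func main() {
	key := make([]byte, 32)
	cek := []byte("0123456789abcdef0123456789abcdef")
	w, _ := wrap(cek, key)
	got, err := unwrap(w, key)
	fmt.Println(err == nil && string(got) == string(cek)) // true
}
```

The authentication tag is what makes key trial cheap on the client: trying today's key and then tomorrow's just means one or two failed `Open` calls, never silently wrong plaintext.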
// EncryptV3 encrypts a message using the v3 streaming format with rolling keys.
// The content is encrypted with a random CEK, which is then wrapped with
// stream keys for today and tomorrow.
//
// When params.ChunkSize > 0, content is split into independently decryptable
// chunks, enabling decrypt-while-downloading and seeking.
func EncryptV3(msg *Message, params *StreamParams, manifest *Manifest) ([]byte, error) {
	if params == nil || params.License == "" {
		return nil, ErrLicenseRequired
	}
	if msg.Body == "" && len(msg.Attachments) == 0 {
		return nil, ErrEmptyMessage
	}

	// Set timestamp if not set
	if msg.Timestamp == 0 {
		msg.Timestamp = time.Now().Unix()
	}

	// Generate random CEK
	cek, err := GenerateCEK()
	if err != nil {
		return nil, err
	}

	// Determine cadence (default to daily if not specified)
	cadence := params.Cadence
	if cadence == "" {
		cadence = CadenceDaily
	}

	// Get rolling periods based on cadence
	current, next := GetRollingPeriods(cadence, time.Now().UTC())

	// Wrap CEK with the current period's stream key
	currentKey := DeriveStreamKey(current, params.License, params.Fingerprint)
	wrappedCurrent, err := WrapCEK(cek, currentKey)
	if err != nil {
		return nil, fmt.Errorf("failed to wrap CEK for current period: %w", err)
	}

	// Wrap CEK with the next period's stream key
	nextKey := DeriveStreamKey(next, params.License, params.Fingerprint)
	wrappedNext, err := WrapCEK(cek, nextKey)
	if err != nil {
		return nil, fmt.Errorf("failed to wrap CEK for next period: %w", err)
	}

	// Check if chunked mode requested
	if params.ChunkSize > 0 {
		return encryptV3Chunked(msg, params, manifest, cek, cadence, current, next, wrappedCurrent, wrappedNext)
	}

	// Non-chunked v3 (original behavior)
	return encryptV3Standard(msg, params, manifest, cek, cadence, current, next, wrappedCurrent, wrappedNext)
}
// encryptV3Standard encrypts as a single block (original v3 behavior)
func encryptV3Standard(msg *Message, params *StreamParams, manifest *Manifest, cek []byte, cadence Cadence, current, next, wrappedCurrent, wrappedNext string) ([]byte, error) {
	// Build v3 payload (similar to v2 but encrypted with CEK)
	payload, attachmentData, err := buildV3Payload(msg)
	if err != nil {
		return nil, err
	}

	// Compress payload
	compressed, err := zstdCompress(payload)
	if err != nil {
		return nil, fmt.Errorf("compression failed: %w", err)
	}

	// Encrypt with CEK
	sigil, err := enchantrix.NewChaChaPolySigil(cek)
	if err != nil {
		return nil, fmt.Errorf("failed to create sigil: %w", err)
	}

	encrypted, err := sigil.In(compressed)
	if err != nil {
		return nil, fmt.Errorf("encryption failed: %w", err)
	}

	// Encrypt attachment data with CEK
	encryptedAttachments, err := sigil.In(attachmentData)
	if err != nil {
		return nil, fmt.Errorf("attachment encryption failed: %w", err)
	}

	// Create header with wrapped keys
	headerMap := map[string]interface{}{
		"version":     Version,
		"algorithm":   "chacha20poly1305",
		"format":      FormatV3,
		"compression": CompressionZstd,
		"keyMethod":   KeyMethodLTHNRolling,
		"cadence":     string(cadence),
		"wrappedKeys": []WrappedKey{
			{Date: current, Wrapped: wrappedCurrent},
			{Date: next, Wrapped: wrappedNext},
		},
	}

	if manifest != nil {
		if manifest.IssuedAt == 0 {
			manifest.IssuedAt = time.Now().Unix()
		}
		headerMap["manifest"] = manifest
	}

	// Build v3 binary format:
	// [4-byte header len][json header][4-byte payload len][encrypted payload][encrypted attachments]
	headerJSON, err := json.Marshal(headerMap)
	if err != nil {
		return nil, fmt.Errorf("failed to marshal header: %w", err)
	}

	// Calculate total size
	totalSize := 4 + len(headerJSON) + 4 + len(encrypted) + len(encryptedAttachments)
	output := make([]byte, 0, totalSize)

	// Write header length (4 bytes, big-endian)
	headerLen := make([]byte, 4)
	binary.BigEndian.PutUint32(headerLen, uint32(len(headerJSON)))
	output = append(output, headerLen...)

	// Write header JSON
	output = append(output, headerJSON...)

	// Write encrypted payload length (4 bytes, big-endian)
	payloadLen := make([]byte, 4)
	binary.BigEndian.PutUint32(payloadLen, uint32(len(encrypted)))
	output = append(output, payloadLen...)

	// Write encrypted payload
	output = append(output, encrypted...)

	// Write encrypted attachments
	output = append(output, encryptedAttachments...)

	// Wrap in trix container
	t := &trix.Trix{
		Header:  headerMap,
		Payload: output,
	}

	return trix.Encode(t, Magic, nil)
}
// encryptV3Chunked encrypts content into independently decryptable chunks
|
||||||
|
func encryptV3Chunked(msg *Message, params *StreamParams, manifest *Manifest, cek []byte, cadence Cadence, current, next, wrappedCurrent, wrappedNext string) ([]byte, error) {
|
||||||
|
chunkSize := params.ChunkSize
|
||||||
|
|
||||||
|
// Build raw content to chunk: metadata JSON + binary attachments
|
||||||
|
metaJSON, attachmentData, err := buildV3Payload(msg)
|
||||||
|
if err != nil {
|
||||||
|
return nil, err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Combine into single byte slice for chunking
|
||||||
|
rawContent := append(metaJSON, attachmentData...)
|
||||||
|
totalSize := int64(len(rawContent))
|
||||||
|
|
||||||
|
// Create sigil with CEK for chunk encryption
|
||||||
|
sigil, err := enchantrix.NewChaChaPolySigil(cek)
|
||||||
|
if err != nil {
|
||||||
|
return nil, fmt.Errorf("failed to create sigil: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Encrypt in chunks
|
||||||
|
var chunks [][]byte
|
||||||
|
	var chunkIndex []ChunkInfo
	offset := 0

	for i := 0; offset < len(rawContent); i++ {
		// Determine this chunk's size
		end := offset + chunkSize
		if end > len(rawContent) {
			end = len(rawContent)
		}
		chunkData := rawContent[offset:end]

		// Encrypt chunk (each gets its own nonce)
		encryptedChunk, err := sigil.In(chunkData)
		if err != nil {
			return nil, fmt.Errorf("failed to encrypt chunk %d: %w", i, err)
		}

		chunks = append(chunks, encryptedChunk)
		chunkIndex = append(chunkIndex, ChunkInfo{
			Offset: 0, // Will be calculated after we know all sizes
			Size:   len(encryptedChunk),
		})

		offset = end
	}

	// Calculate chunk offsets
	currentOffset := 0
	for i := range chunkIndex {
		chunkIndex[i].Offset = currentOffset
		currentOffset += chunkIndex[i].Size
	}

	// Build header with chunked info
	chunkedInfo := &ChunkedInfo{
		ChunkSize:   chunkSize,
		TotalChunks: len(chunks),
		TotalSize:   totalSize,
		Index:       chunkIndex,
	}

	headerMap := map[string]interface{}{
		"version":     Version,
		"algorithm":   "chacha20poly1305",
		"format":      FormatV3,
		"compression": CompressionNone, // No compression in chunked mode (per-chunk not supported yet)
		"keyMethod":   KeyMethodLTHNRolling,
		"cadence":     string(cadence),
		"chunked":     chunkedInfo,
		"wrappedKeys": []WrappedKey{
			{Date: current, Wrapped: wrappedCurrent},
			{Date: next, Wrapped: wrappedNext},
		},
	}

	if manifest != nil {
		if manifest.IssuedAt == 0 {
			manifest.IssuedAt = time.Now().Unix()
		}
		headerMap["manifest"] = manifest
	}

	// Concatenate all encrypted chunks
	var payload []byte
	for _, chunk := range chunks {
		payload = append(payload, chunk...)
	}

	// Wrap in trix container
	t := &trix.Trix{
		Header:  headerMap,
		Payload: payload,
	}

	return trix.Encode(t, Magic, nil)
}

// DecryptV3 decrypts a v3 streaming message using rolling keys.
// It tries today's key first, then tomorrow's key.
// Automatically handles both chunked and non-chunked v3 formats.
func DecryptV3(data []byte, params *StreamParams) (*Message, *Header, error) {
	if params == nil || params.License == "" {
		return nil, nil, ErrLicenseRequired
	}

	// Decode trix container
	t, err := trix.Decode(data, Magic, nil)
	if err != nil {
		return nil, nil, fmt.Errorf("failed to decode container: %w", err)
	}

	// Parse header
	headerJSON, err := json.Marshal(t.Header)
	if err != nil {
		return nil, nil, fmt.Errorf("failed to marshal header: %w", err)
	}

	var header Header
	if err := json.Unmarshal(headerJSON, &header); err != nil {
		return nil, nil, fmt.Errorf("failed to parse header: %w", err)
	}

	// Verify v3 format
	if header.Format != FormatV3 {
		return nil, nil, fmt.Errorf("expected v3 format, got: %s", header.Format)
	}

	if header.KeyMethod != KeyMethodLTHNRolling {
		return nil, nil, fmt.Errorf("unsupported key method: %s", header.KeyMethod)
	}

	// Determine cadence from header (or use params, or default to daily)
	cadence := header.Cadence
	if cadence == "" && params.Cadence != "" {
		cadence = params.Cadence
	}
	if cadence == "" {
		cadence = CadenceDaily
	}

	// Try to unwrap CEK with rolling keys
	cek, err := tryUnwrapCEK(header.WrappedKeys, params, cadence)
	if err != nil {
		return nil, &header, err
	}

	// Check if chunked format
	if header.Chunked != nil {
		return decryptV3Chunked(t.Payload, cek, &header)
	}

	// Non-chunked v3
	return decryptV3Standard(t.Payload, cek, &header)
}

// decryptV3Standard handles non-chunked v3 decryption
func decryptV3Standard(payload []byte, cek []byte, header *Header) (*Message, *Header, error) {
	if len(payload) < 8 {
		return nil, header, ErrInvalidPayload
	}

	// Read header length (skip - we already parsed from trix header)
	headerLen := binary.BigEndian.Uint32(payload[:4])
	pos := 4 + int(headerLen)

	if len(payload) < pos+4 {
		return nil, header, ErrInvalidPayload
	}

	// Read encrypted payload length
	encryptedLen := binary.BigEndian.Uint32(payload[pos : pos+4])
	pos += 4

	if len(payload) < pos+int(encryptedLen) {
		return nil, header, ErrInvalidPayload
	}

	// Extract encrypted payload and attachments
	encryptedPayload := payload[pos : pos+int(encryptedLen)]
	encryptedAttachments := payload[pos+int(encryptedLen):]

	// Decrypt with CEK
	sigil, err := enchantrix.NewChaChaPolySigil(cek)
	if err != nil {
		return nil, header, fmt.Errorf("failed to create sigil: %w", err)
	}

	compressed, err := sigil.Out(encryptedPayload)
	if err != nil {
		return nil, header, ErrDecryptionFailed
	}

	// Decompress
	var decompressed []byte
	if header.Compression == CompressionZstd {
		decompressed, err = zstdDecompress(compressed)
		if err != nil {
			return nil, header, fmt.Errorf("decompression failed: %w", err)
		}
	} else {
		decompressed = compressed
	}

	// Parse message
	var msg Message
	if err := json.Unmarshal(decompressed, &msg); err != nil {
		return nil, header, fmt.Errorf("failed to parse message: %w", err)
	}

	// Decrypt attachments if present
	if len(encryptedAttachments) > 0 {
		attachmentData, err := sigil.Out(encryptedAttachments)
		if err != nil {
			return nil, header, fmt.Errorf("attachment decryption failed: %w", err)
		}

		// Restore attachment content from binary data
		if err := restoreV3Attachments(&msg, attachmentData); err != nil {
			return nil, header, err
		}
	}

	return &msg, header, nil
}

// decryptV3Chunked handles chunked v3 decryption
func decryptV3Chunked(payload []byte, cek []byte, header *Header) (*Message, *Header, error) {
	if header.Chunked == nil {
		return nil, header, fmt.Errorf("v3 chunked format missing chunked info")
	}

	// Create sigil for decryption
	sigil, err := enchantrix.NewChaChaPolySigil(cek)
	if err != nil {
		return nil, header, fmt.Errorf("failed to create sigil: %w", err)
	}

	// Decrypt all chunks
	var decrypted []byte

	for i, ci := range header.Chunked.Index {
		if ci.Offset+ci.Size > len(payload) {
			return nil, header, fmt.Errorf("chunk %d out of bounds", i)
		}

		chunkData := payload[ci.Offset : ci.Offset+ci.Size]
		plaintext, err := sigil.Out(chunkData)
		if err != nil {
			return nil, header, fmt.Errorf("failed to decrypt chunk %d: %w", i, err)
		}

		decrypted = append(decrypted, plaintext...)
	}

	// Parse decrypted content (metadata JSON + attachments)
	var msg Message
	if err := json.Unmarshal(decrypted, &msg); err != nil {
		// First part should be JSON, but may be mixed with binary
		// Try to find JSON boundary
		for i := 0; i < len(decrypted); i++ {
			if decrypted[i] == '}' {
				if err := json.Unmarshal(decrypted[:i+1], &msg); err == nil {
					// Found valid JSON, rest is attachment data
					if err := restoreV3Attachments(&msg, decrypted[i+1:]); err != nil {
						return nil, header, err
					}
					return &msg, header, nil
				}
			}
		}
		return nil, header, fmt.Errorf("failed to parse message: %w", err)
	}

	return &msg, header, nil
}

// tryUnwrapCEK attempts to unwrap the CEK using current or next period's key
func tryUnwrapCEK(wrappedKeys []WrappedKey, params *StreamParams, cadence Cadence) ([]byte, error) {
	current, next := GetRollingPeriods(cadence, time.Now().UTC())

	// Build map of available wrapped keys by period
	keysByPeriod := make(map[string]string)
	for _, wk := range wrappedKeys {
		keysByPeriod[wk.Date] = wk.Wrapped
	}

	// Try current period's key first
	if wrapped, ok := keysByPeriod[current]; ok {
		streamKey := DeriveStreamKey(current, params.License, params.Fingerprint)
		if cek, err := UnwrapCEK(wrapped, streamKey); err == nil {
			return cek, nil
		}
	}

	// Try next period's key
	if wrapped, ok := keysByPeriod[next]; ok {
		streamKey := DeriveStreamKey(next, params.License, params.Fingerprint)
		if cek, err := UnwrapCEK(wrapped, streamKey); err == nil {
			return cek, nil
		}
	}

	return nil, ErrNoValidKey
}

// buildV3Payload builds the message JSON and binary attachment data
func buildV3Payload(msg *Message) ([]byte, []byte, error) {
	// Create a copy of the message without attachment content
	msgCopy := *msg
	var attachmentData []byte

	for i := range msgCopy.Attachments {
		att := &msgCopy.Attachments[i]
		if att.Content != "" {
			// Decode base64 content to binary
			data, err := base64.StdEncoding.DecodeString(att.Content)
			if err != nil {
				return nil, nil, fmt.Errorf("failed to decode attachment %s: %w", att.Name, err)
			}
			attachmentData = append(attachmentData, data...)
			att.Content = "" // Clear content, will be restored on decrypt
		}
	}

	// Marshal message (without attachment content)
	payload, err := json.Marshal(&msgCopy)
	if err != nil {
		return nil, nil, fmt.Errorf("failed to marshal message: %w", err)
	}

	return payload, attachmentData, nil
}

// restoreV3Attachments restores attachment content from decrypted binary data
func restoreV3Attachments(msg *Message, data []byte) error {
	offset := 0
	for i := range msg.Attachments {
		att := &msg.Attachments[i]
		if att.Size > 0 {
			if offset+att.Size > len(data) {
				return fmt.Errorf("attachment data truncated for %s", att.Name)
			}
			att.Content = base64.StdEncoding.EncodeToString(data[offset : offset+att.Size])
			offset += att.Size
		}
	}
	return nil
}

// =============================================================================
// V3 Chunked Streaming Helpers
// =============================================================================
//
// When StreamParams.ChunkSize > 0, v3 format uses independently decryptable
// chunks, enabling:
//   - Decrypt-while-downloading: Play media as it arrives
//   - HTTP Range requests: Fetch specific chunks by byte range
//   - Seekable playback: Jump to any position without decrypting everything
//
// Each chunk is encrypted with the same CEK but has its own nonce,
// making it independently decryptable.

// DecryptV3Chunk decrypts a single chunk by index.
// This enables streaming playback and seeking without decrypting the entire file.
//
// Usage for streaming:
//
//	header, _ := GetV3Header(data)
//	cek, _ := UnwrapCEKFromHeader(header, params)
//	payload, _ := GetV3Payload(data)
//	for i := 0; i < header.Chunked.TotalChunks; i++ {
//		chunk, _ := DecryptV3Chunk(payload, cek, i, header.Chunked)
//		player.Write(chunk)
//	}
func DecryptV3Chunk(payload []byte, cek []byte, chunkIndex int, chunked *ChunkedInfo) ([]byte, error) {
	if chunked == nil {
		return nil, fmt.Errorf("chunked info is nil")
	}
	if chunkIndex < 0 || chunkIndex >= len(chunked.Index) {
		return nil, fmt.Errorf("chunk index %d out of range [0, %d)", chunkIndex, len(chunked.Index))
	}

	ci := chunked.Index[chunkIndex]
	if ci.Offset+ci.Size > len(payload) {
		return nil, fmt.Errorf("chunk %d data out of bounds", chunkIndex)
	}

	// Create sigil and decrypt
	sigil, err := enchantrix.NewChaChaPolySigil(cek)
	if err != nil {
		return nil, fmt.Errorf("failed to create sigil: %w", err)
	}

	chunkData := payload[ci.Offset : ci.Offset+ci.Size]
	return sigil.Out(chunkData)
}

// GetV3Header extracts the header from a v3 file without decrypting.
// Useful for getting chunk index for Range requests.
func GetV3Header(data []byte) (*Header, error) {
	t, err := trix.Decode(data, Magic, nil)
	if err != nil {
		return nil, fmt.Errorf("failed to decode container: %w", err)
	}

	headerJSON, err := json.Marshal(t.Header)
	if err != nil {
		return nil, fmt.Errorf("failed to marshal header: %w", err)
	}

	var header Header
	if err := json.Unmarshal(headerJSON, &header); err != nil {
		return nil, fmt.Errorf("failed to parse header: %w", err)
	}

	if header.Format != FormatV3 {
		return nil, fmt.Errorf("not a v3 format: %s", header.Format)
	}

	return &header, nil
}

// UnwrapCEKFromHeader unwraps the CEK from a v3 header using stream params.
// Returns the CEK for use with DecryptV3Chunk.
func UnwrapCEKFromHeader(header *Header, params *StreamParams) ([]byte, error) {
	if params == nil || params.License == "" {
		return nil, ErrLicenseRequired
	}

	cadence := header.Cadence
	if cadence == "" && params.Cadence != "" {
		cadence = params.Cadence
	}
	if cadence == "" {
		cadence = CadenceDaily
	}

	return tryUnwrapCEK(header.WrappedKeys, params, cadence)
}

// GetV3Payload extracts just the payload from a v3 file.
// Use with DecryptV3Chunk for individual chunk decryption.
func GetV3Payload(data []byte) ([]byte, error) {
	t, err := trix.Decode(data, Magic, nil)
	if err != nil {
		return nil, fmt.Errorf("failed to decode container: %w", err)
	}
	return t.Payload, nil
}

// GetV3HeaderFromPrefix parses the v3 header from just the file prefix.
// This enables streaming: parse header as soon as first few KB arrive.
// Returns header and payload offset (where encrypted chunks start).
//
// File format:
//   - Bytes 0-3: Magic "SMSG"
//   - Bytes 4-5: Version (2-byte little endian)
//   - Bytes 6-8: Header length (3-byte big endian)
//   - Bytes 9+: Header JSON
//   - Payload starts at offset 9 + headerLen
func GetV3HeaderFromPrefix(data []byte) (*Header, int, error) {
	// Need at least magic + version + header length indicator
	if len(data) < 9 {
		return nil, 0, fmt.Errorf("need at least 9 bytes, got %d", len(data))
	}

	// Check magic
	if string(data[0:4]) != Magic {
		return nil, 0, ErrInvalidMagic
	}

	// Parse header length (3 bytes big endian at offset 6-8)
	headerLen := int(data[6])<<16 | int(data[7])<<8 | int(data[8])
	if headerLen <= 0 || headerLen > 16*1024*1024 {
		return nil, 0, fmt.Errorf("invalid header length: %d", headerLen)
	}

	// Calculate payload offset
	payloadOffset := 9 + headerLen

	// Check if we have enough data for the header
	if len(data) < payloadOffset {
		return nil, 0, fmt.Errorf("need %d bytes for header, got %d", payloadOffset, len(data))
	}

	// Parse header JSON
	headerJSON := data[9:payloadOffset]
	var header Header
	if err := json.Unmarshal(headerJSON, &header); err != nil {
		return nil, 0, fmt.Errorf("failed to parse header JSON: %w", err)
	}

	if header.Format != FormatV3 {
		return nil, 0, fmt.Errorf("not a v3 format: %s", header.Format)
	}

	return &header, payloadOffset, nil
}
677
pkg/smsg/stream_test.go
Normal file
@@ -0,0 +1,677 @@
package smsg

import (
	"testing"
	"time"
)

func TestDeriveStreamKey(t *testing.T) {
	// Test that same inputs produce same key
	key1 := DeriveStreamKey("2026-01-12", "license123", "fingerprint456")
	key2 := DeriveStreamKey("2026-01-12", "license123", "fingerprint456")

	if len(key1) != 32 {
		t.Errorf("Key length = %d, want 32", len(key1))
	}

	if string(key1) != string(key2) {
		t.Error("Same inputs should produce same key")
	}

	// Test that different dates produce different keys
	key3 := DeriveStreamKey("2026-01-13", "license123", "fingerprint456")
	if string(key1) == string(key3) {
		t.Error("Different dates should produce different keys")
	}

	// Test that different licenses produce different keys
	key4 := DeriveStreamKey("2026-01-12", "license789", "fingerprint456")
	if string(key1) == string(key4) {
		t.Error("Different licenses should produce different keys")
	}
}

func TestGetRollingDates(t *testing.T) {
	today, tomorrow := GetRollingDates()

	// Parse dates to verify format
	todayTime, err := time.Parse("2006-01-02", today)
	if err != nil {
		t.Fatalf("Invalid today format: %v", err)
	}

	tomorrowTime, err := time.Parse("2006-01-02", tomorrow)
	if err != nil {
		t.Fatalf("Invalid tomorrow format: %v", err)
	}

	// Tomorrow should be 1 day after today
	diff := tomorrowTime.Sub(todayTime)
	if diff != 24*time.Hour {
		t.Errorf("Tomorrow should be 24h after today, got %v", diff)
	}
}

func TestWrapUnwrapCEK(t *testing.T) {
	// Generate a test CEK
	cek, err := GenerateCEK()
	if err != nil {
		t.Fatalf("GenerateCEK failed: %v", err)
	}

	// Generate a stream key
	streamKey := DeriveStreamKey("2026-01-12", "test-license", "test-fp")

	// Wrap CEK
	wrapped, err := WrapCEK(cek, streamKey)
	if err != nil {
		t.Fatalf("WrapCEK failed: %v", err)
	}

	// Unwrap CEK
	unwrapped, err := UnwrapCEK(wrapped, streamKey)
	if err != nil {
		t.Fatalf("UnwrapCEK failed: %v", err)
	}

	// Verify CEK matches
	if string(cek) != string(unwrapped) {
		t.Error("Unwrapped CEK doesn't match original")
	}

	// Wrong key should fail
	wrongKey := DeriveStreamKey("2026-01-12", "wrong-license", "test-fp")
	_, err = UnwrapCEK(wrapped, wrongKey)
	if err == nil {
		t.Error("UnwrapCEK with wrong key should fail")
	}
}

func TestEncryptDecryptV3RoundTrip(t *testing.T) {
	msg := NewMessage("Hello, this is a v3 streaming message!").
		WithSubject("V3 Test").
		WithFrom("stream@dapp.fm")

	params := &StreamParams{
		License:     "test-license-123",
		Fingerprint: "device-fp-456",
	}

	manifest := NewManifest("Test Track")
	manifest.Artist = "Test Artist"
	manifest.LicenseType = "stream"

	// Encrypt
	encrypted, err := EncryptV3(msg, params, manifest)
	if err != nil {
		t.Fatalf("EncryptV3 failed: %v", err)
	}

	// Decrypt with same params
	decrypted, header, err := DecryptV3(encrypted, params)
	if err != nil {
		t.Fatalf("DecryptV3 failed: %v", err)
	}

	// Verify message content
	if decrypted.Body != msg.Body {
		t.Errorf("Body = %q, want %q", decrypted.Body, msg.Body)
	}
	if decrypted.Subject != msg.Subject {
		t.Errorf("Subject = %q, want %q", decrypted.Subject, msg.Subject)
	}

	// Verify header
	if header.Format != FormatV3 {
		t.Errorf("Format = %q, want %q", header.Format, FormatV3)
	}
	if header.KeyMethod != KeyMethodLTHNRolling {
		t.Errorf("KeyMethod = %q, want %q", header.KeyMethod, KeyMethodLTHNRolling)
	}
	if len(header.WrappedKeys) != 2 {
		t.Errorf("WrappedKeys count = %d, want 2", len(header.WrappedKeys))
	}

	// Verify manifest
	if header.Manifest == nil {
		t.Fatal("Manifest is nil")
	}
	if header.Manifest.Title != "Test Track" {
		t.Errorf("Manifest.Title = %q, want %q", header.Manifest.Title, "Test Track")
	}
}

func TestDecryptV3WrongLicense(t *testing.T) {
	msg := NewMessage("Secret content")

	params := &StreamParams{
		License:     "correct-license",
		Fingerprint: "device-fp",
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 failed: %v", err)
	}

	// Try to decrypt with wrong license
	wrongParams := &StreamParams{
		License:     "wrong-license",
		Fingerprint: "device-fp",
	}

	_, _, err = DecryptV3(encrypted, wrongParams)
	if err == nil {
		t.Error("DecryptV3 with wrong license should fail")
	}
	if err != ErrNoValidKey {
		t.Errorf("Error = %v, want ErrNoValidKey", err)
	}
}

func TestDecryptV3WrongFingerprint(t *testing.T) {
	msg := NewMessage("Secret content")

	params := &StreamParams{
		License:     "test-license",
		Fingerprint: "correct-fingerprint",
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 failed: %v", err)
	}

	// Try to decrypt with wrong fingerprint
	wrongParams := &StreamParams{
		License:     "test-license",
		Fingerprint: "wrong-fingerprint",
	}

	_, _, err = DecryptV3(encrypted, wrongParams)
	if err == nil {
		t.Error("DecryptV3 with wrong fingerprint should fail")
	}
}

func TestEncryptV3WithAttachment(t *testing.T) {
	msg := NewMessage("Message with attachment")
	msg.AddBinaryAttachment("test.mp3", []byte("fake audio data here"), "audio/mpeg")

	params := &StreamParams{
		License:     "test-license",
		Fingerprint: "test-fp",
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 failed: %v", err)
	}

	decrypted, _, err := DecryptV3(encrypted, params)
	if err != nil {
		t.Fatalf("DecryptV3 failed: %v", err)
	}

	// Verify attachment
	if len(decrypted.Attachments) != 1 {
		t.Fatalf("Attachment count = %d, want 1", len(decrypted.Attachments))
	}

	att := decrypted.GetAttachment("test.mp3")
	if att == nil {
		t.Fatal("Attachment not found")
	}
	if att.MimeType != "audio/mpeg" {
		t.Errorf("MimeType = %q, want %q", att.MimeType, "audio/mpeg")
	}
}

func TestEncryptV3RequiresLicense(t *testing.T) {
	msg := NewMessage("Test")

	// Nil params
	_, err := EncryptV3(msg, nil, nil)
	if err != ErrLicenseRequired {
		t.Errorf("Error = %v, want ErrLicenseRequired", err)
	}

	// Empty license
	_, err = EncryptV3(msg, &StreamParams{}, nil)
	if err != ErrLicenseRequired {
		t.Errorf("Error = %v, want ErrLicenseRequired", err)
	}
}

func TestCadencePeriods(t *testing.T) {
	// Test at a known time: 2026-01-12 15:30:00 UTC
	testTime := time.Date(2026, 1, 12, 15, 30, 0, 0, time.UTC)

	tests := []struct {
		cadence         Cadence
		expectedCurrent string
		expectedNext    string
	}{
		{CadenceDaily, "2026-01-12", "2026-01-13"},
		{CadenceHalfDay, "2026-01-12-PM", "2026-01-13-AM"},
		{CadenceQuarter, "2026-01-12-12", "2026-01-12-18"},
		{CadenceHourly, "2026-01-12-15", "2026-01-12-16"},
	}

	for _, tc := range tests {
		t.Run(string(tc.cadence), func(t *testing.T) {
			current, next := GetRollingPeriods(tc.cadence, testTime)
			if current != tc.expectedCurrent {
				t.Errorf("current = %q, want %q", current, tc.expectedCurrent)
			}
			if next != tc.expectedNext {
				t.Errorf("next = %q, want %q", next, tc.expectedNext)
			}
		})
	}
}

func TestCadenceHalfDayAM(t *testing.T) {
	// Test in the morning
	testTime := time.Date(2026, 1, 12, 9, 0, 0, 0, time.UTC)
	current, next := GetRollingPeriods(CadenceHalfDay, testTime)

	if current != "2026-01-12-AM" {
		t.Errorf("current = %q, want %q", current, "2026-01-12-AM")
	}
	if next != "2026-01-12-PM" {
		t.Errorf("next = %q, want %q", next, "2026-01-12-PM")
	}
}

func TestCadenceQuarterBoundary(t *testing.T) {
	// Test at 23:00 - should wrap to next day
	testTime := time.Date(2026, 1, 12, 23, 0, 0, 0, time.UTC)
	current, next := GetRollingPeriods(CadenceQuarter, testTime)

	if current != "2026-01-12-18" {
		t.Errorf("current = %q, want %q", current, "2026-01-12-18")
	}
	if next != "2026-01-13-00" {
		t.Errorf("next = %q, want %q", next, "2026-01-13-00")
	}
}
|
||||||
|
|
||||||
|
func TestEncryptDecryptV3WithCadence(t *testing.T) {
|
||||||
|
cadences := []Cadence{CadenceDaily, CadenceHalfDay, CadenceQuarter, CadenceHourly}
|
||||||
|
|
||||||
|
for _, cadence := range cadences {
|
||||||
|
t.Run(string(cadence), func(t *testing.T) {
|
||||||
|
msg := NewMessage("Testing " + string(cadence) + " cadence")
|
||||||
|
|
||||||
|
params := &StreamParams{
|
||||||
|
License: "cadence-test-license",
|
||||||
|
Fingerprint: "cadence-test-fp",
|
||||||
|
Cadence: cadence,
|
||||||
|
}
|
||||||
|
|
||||||
|
// Encrypt
|
||||||
|
encrypted, err := EncryptV3(msg, params, nil)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("EncryptV3 failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Decrypt with same params
|
||||||
|
decrypted, header, err := DecryptV3(encrypted, params)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("DecryptV3 failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if decrypted.Body != msg.Body {
|
||||||
|
t.Errorf("Body = %q, want %q", decrypted.Body, msg.Body)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify cadence in header
|
||||||
|
if header.Cadence != cadence {
|
||||||
|
t.Errorf("Cadence = %q, want %q", header.Cadence, cadence)
|
||||||
|
}
|
||||||
|
})
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestRollingKeyWindow(t *testing.T) {
|
||||||
|
// This test verifies that both today's and tomorrow's keys work
|
||||||
|
msg := NewMessage("Rolling window test")
|
||||||
|
|
||||||
|
// Create params
|
||||||
|
params := &StreamParams{
|
||||||
|
License: "rolling-test-license",
|
||||||
|
Fingerprint: "rolling-test-fp",
|
||||||
|
}
|
||||||
|
|
||||||
|
// Encrypt with current time
|
||||||
|
encrypted, err := EncryptV3(msg, params, nil)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("EncryptV3 failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Should decrypt successfully (within rolling window)
|
||||||
|
decrypted, header, err := DecryptV3(encrypted, params)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("DecryptV3 failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
if decrypted.Body != msg.Body {
|
||||||
|
t.Errorf("Body = %q, want %q", decrypted.Body, msg.Body)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify we have both today and tomorrow keys
|
||||||
|
today, tomorrow := GetRollingDates()
|
||||||
|
hasToday := false
|
||||||
|
hasTomorrow := false
|
||||||
|
for _, wk := range header.WrappedKeys {
|
||||||
|
if wk.Date == today {
|
||||||
|
hasToday = true
|
||||||
|
}
|
||||||
|
if wk.Date == tomorrow {
|
||||||
|
hasTomorrow = true
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if !hasToday {
|
||||||
|
t.Error("Missing today's wrapped key")
|
||||||
|
}
|
||||||
|
if !hasTomorrow {
|
||||||
|
t.Error("Missing tomorrow's wrapped key")
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// =============================================================================
|
||||||
|
// V3 Chunked Streaming Tests
|
||||||
|
// =============================================================================
|
||||||
|
|
||||||
|
func TestEncryptDecryptV3ChunkedBasic(t *testing.T) {
|
||||||
|
msg := NewMessage("This is a chunked streaming test message")
|
||||||
|
msg.WithSubject("Chunked Test")
|
||||||
|
|
||||||
|
params := &StreamParams{
|
||||||
|
License: "chunk-license",
|
||||||
|
Fingerprint: "chunk-fp",
|
||||||
|
ChunkSize: 64, // Small chunks for testing
|
||||||
|
}
|
||||||
|
|
||||||
|
manifest := NewManifest("Chunked Track")
|
||||||
|
manifest.Artist = "Test Artist"
|
||||||
|
|
||||||
|
// Encrypt with chunking
|
||||||
|
encrypted, err := EncryptV3(msg, params, manifest)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("EncryptV3 (chunked) failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Decrypt - automatically handles chunked format
|
||||||
|
decrypted, header, err := DecryptV3(encrypted, params)
|
||||||
|
if err != nil {
|
||||||
|
t.Fatalf("DecryptV3 (chunked) failed: %v", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify content
|
||||||
|
if decrypted.Body != msg.Body {
|
||||||
|
t.Errorf("Body = %q, want %q", decrypted.Body, msg.Body)
|
||||||
|
}
|
||||||
|
if decrypted.Subject != msg.Subject {
|
||||||
|
t.Errorf("Subject = %q, want %q", decrypted.Subject, msg.Subject)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Verify header
|
||||||
|
if header.Format != FormatV3 {
|
||||||
|
t.Errorf("Format = %q, want %q", header.Format, FormatV3)
|
||||||
|
}
|
||||||
|
if header.Chunked == nil {
|
||||||
|
t.Fatal("Chunked info is nil")
|
||||||
|
}
|
||||||
|
if header.Chunked.ChunkSize != 64 {
|
||||||
|
t.Errorf("ChunkSize = %d, want 64", header.Chunked.ChunkSize)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestV3ChunkedWithAttachment(t *testing.T) {
	// Create a message with attachment larger than chunk size
	attachmentData := make([]byte, 256)
	for i := range attachmentData {
		attachmentData[i] = byte(i)
	}

	msg := NewMessage("Message with large attachment")
	msg.AddBinaryAttachment("test.bin", attachmentData, "application/octet-stream")

	params := &StreamParams{
		License:     "attach-license",
		Fingerprint: "attach-fp",
		ChunkSize:   64, // Force multiple chunks
	}

	// Encrypt
	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (chunked) failed: %v", err)
	}

	// Verify we have multiple chunks
	header, err := GetV3Header(encrypted)
	if err != nil {
		t.Fatalf("GetV3Header failed: %v", err)
	}

	if header.Chunked.TotalChunks <= 1 {
		t.Errorf("TotalChunks = %d, want > 1", header.Chunked.TotalChunks)
	}

	// Decrypt
	decrypted, _, err := DecryptV3(encrypted, params)
	if err != nil {
		t.Fatalf("DecryptV3 (chunked) failed: %v", err)
	}

	// Verify attachment
	if len(decrypted.Attachments) != 1 {
		t.Fatalf("Attachment count = %d, want 1", len(decrypted.Attachments))
	}
}

func TestV3ChunkedIndividualChunks(t *testing.T) {
	// Create content that spans multiple chunks
	largeContent := make([]byte, 200)
	for i := range largeContent {
		largeContent[i] = byte(i % 256)
	}

	msg := NewMessage("Chunk-by-chunk test")
	msg.AddBinaryAttachment("data.bin", largeContent, "application/octet-stream")

	params := &StreamParams{
		License:     "individual-license",
		Fingerprint: "individual-fp",
		ChunkSize:   50, // Force ~5 chunks
	}

	// Encrypt
	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (chunked) failed: %v", err)
	}

	// Get header and payload
	header, err := GetV3Header(encrypted)
	if err != nil {
		t.Fatalf("GetV3Header failed: %v", err)
	}

	payload, err := GetV3Payload(encrypted)
	if err != nil {
		t.Fatalf("GetV3Payload failed: %v", err)
	}

	// Unwrap CEK
	cek, err := UnwrapCEKFromHeader(header, params)
	if err != nil {
		t.Fatalf("UnwrapCEKFromHeader failed: %v", err)
	}

	// Decrypt each chunk individually
	var allDecrypted []byte
	for i := 0; i < header.Chunked.TotalChunks; i++ {
		chunk, err := DecryptV3Chunk(payload, cek, i, header.Chunked)
		if err != nil {
			t.Fatalf("DecryptV3Chunk(%d) failed: %v", i, err)
		}
		allDecrypted = append(allDecrypted, chunk...)
	}

	// Verify total size matches
	if int64(len(allDecrypted)) != header.Chunked.TotalSize {
		t.Errorf("Decrypted size = %d, want %d", len(allDecrypted), header.Chunked.TotalSize)
	}
}

func TestV3ChunkedWrongLicense(t *testing.T) {
	msg := NewMessage("Secret chunked content")

	params := &StreamParams{
		License:     "correct-chunked-license",
		Fingerprint: "device-fp",
		ChunkSize:   64,
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (chunked) failed: %v", err)
	}

	// Try to decrypt with wrong license
	wrongParams := &StreamParams{
		License:     "wrong-chunked-license",
		Fingerprint: "device-fp",
	}

	_, _, err = DecryptV3(encrypted, wrongParams)
	if err == nil {
		t.Error("DecryptV3 (chunked) with wrong license should fail")
	}
	if err != ErrNoValidKey {
		t.Errorf("Error = %v, want ErrNoValidKey", err)
	}
}

func TestV3ChunkedChunkIndex(t *testing.T) {
	msg := NewMessage("Index test")
	msg.AddBinaryAttachment("test.dat", make([]byte, 150), "application/octet-stream")

	params := &StreamParams{
		License:     "index-license",
		Fingerprint: "index-fp",
		ChunkSize:   50,
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (chunked) failed: %v", err)
	}

	header, err := GetV3Header(encrypted)
	if err != nil {
		t.Fatalf("GetV3Header failed: %v", err)
	}

	// Verify index structure
	if len(header.Chunked.Index) != header.Chunked.TotalChunks {
		t.Errorf("Index length = %d, want %d", len(header.Chunked.Index), header.Chunked.TotalChunks)
	}

	// Verify offsets are sequential
	expectedOffset := 0
	for i, ci := range header.Chunked.Index {
		if ci.Offset != expectedOffset {
			t.Errorf("Chunk %d offset = %d, want %d", i, ci.Offset, expectedOffset)
		}
		expectedOffset += ci.Size
	}
}

func TestV3ChunkedSeekMiddleChunk(t *testing.T) {
	// Create predictable data
	data := make([]byte, 300)
	for i := range data {
		data[i] = byte(i % 256)
	}

	msg := NewMessage("Seek test")
	msg.AddBinaryAttachment("seek.bin", data, "application/octet-stream")

	params := &StreamParams{
		License:     "seek-license",
		Fingerprint: "seek-fp",
		ChunkSize:   100, // 3 data chunks minimum
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (chunked) failed: %v", err)
	}

	header, err := GetV3Header(encrypted)
	if err != nil {
		t.Fatalf("GetV3Header failed: %v", err)
	}

	payload, err := GetV3Payload(encrypted)
	if err != nil {
		t.Fatalf("GetV3Payload failed: %v", err)
	}

	cek, err := UnwrapCEKFromHeader(header, params)
	if err != nil {
		t.Fatalf("UnwrapCEKFromHeader failed: %v", err)
	}

	// Skip to middle chunk (simulate seeking)
	if header.Chunked.TotalChunks < 2 {
		t.Skip("Need at least 2 chunks for seek test")
	}

	middleIdx := header.Chunked.TotalChunks / 2
	chunk, err := DecryptV3Chunk(payload, cek, middleIdx, header.Chunked)
	if err != nil {
		t.Fatalf("DecryptV3Chunk(%d) failed: %v", middleIdx, err)
	}

	// Just verify we got something
	if len(chunk) == 0 {
		t.Error("Middle chunk is empty")
	}
}

func TestV3NonChunkedStillWorks(t *testing.T) {
	// Verify non-chunked v3 still works (ChunkSize = 0)
	msg := NewMessage("Non-chunked v3 test")
	msg.WithSubject("No Chunks")

	params := &StreamParams{
		License:     "non-chunk-license",
		Fingerprint: "non-chunk-fp",
		// ChunkSize = 0 (default) - no chunking
	}

	encrypted, err := EncryptV3(msg, params, nil)
	if err != nil {
		t.Fatalf("EncryptV3 (non-chunked) failed: %v", err)
	}

	decrypted, header, err := DecryptV3(encrypted, params)
	if err != nil {
		t.Fatalf("DecryptV3 (non-chunked) failed: %v", err)
	}

	if decrypted.Body != msg.Body {
		t.Errorf("Body = %q, want %q", decrypted.Body, msg.Body)
	}

	// Non-chunked should not have Chunked info
	if header.Chunked != nil {
		t.Error("Non-chunked v3 should not have Chunked info")
	}
}

@@ -2,10 +2,20 @@
 // SMSG (Secure Message) enables encrypted message exchange where the recipient
 // decrypts using a pre-shared password. Useful for secure support replies,
 // confidential documents, and any scenario requiring password-protected content.
+//
+// Format versions:
+//   - v1: JSON with base64-encoded attachments (legacy)
+//   - v2: Binary format with zstd compression (current)
+//   - v3: Streaming with LTHN rolling keys (planned)
+//
+// Encryption note: Nonces are embedded in ciphertext, not transmitted separately.
+// See smsg.go header comment for details.
 package smsg
 
 import (
+	"encoding/base64"
 	"errors"
+	"time"
 )
 
 // Magic bytes for SMSG format

@@ -16,19 +26,22 @@ const Version = "1.0"
 
 // Errors
 var (
 	ErrInvalidMagic     = errors.New("invalid SMSG magic")
 	ErrInvalidPayload   = errors.New("invalid SMSG payload")
 	ErrDecryptionFailed = errors.New("decryption failed (wrong password?)")
 	ErrPasswordRequired = errors.New("password is required")
 	ErrEmptyMessage     = errors.New("message cannot be empty")
+	ErrStreamKeyExpired = errors.New("stream key expired (outside rolling window)")
+	ErrNoValidKey       = errors.New("no valid wrapped key found for current date")
+	ErrLicenseRequired  = errors.New("license is required for stream decryption")
 )
 
 // Attachment represents a file attached to the message
 type Attachment struct {
 	Name     string `json:"name"`
-	Content  string `json:"content"` // base64-encoded
+	Content  string `json:"content,omitempty"` // base64-encoded (v1) or empty (v2, populated on decrypt)
 	MimeType string `json:"mime,omitempty"`
-	Size     int    `json:"size,omitempty"`
+	Size     int    `json:"size,omitempty"` // binary size in bytes
 }
 
 // PKIInfo contains public key information for authenticated replies

@@ -83,13 +96,25 @@ func (m *Message) WithTimestamp(ts int64) *Message {
 	return m
 }
 
-// AddAttachment adds a file attachment
+// AddAttachment adds a file attachment (content is base64-encoded)
 func (m *Message) AddAttachment(name, content, mimeType string) *Message {
 	m.Attachments = append(m.Attachments, Attachment{
 		Name:     name,
 		Content:  content,
 		MimeType: mimeType,
-		Size:     len(content),
+		Size:     len(content), // base64 size for v1 compatibility
+	})
+	return m
+}
+
+// AddBinaryAttachment adds a raw binary attachment (for v2 format)
+// The content will be base64-encoded for API compatibility
+func (m *Message) AddBinaryAttachment(name string, data []byte, mimeType string) *Message {
+	m.Attachments = append(m.Attachments, Attachment{
+		Name:     name,
+		Content:  base64.StdEncoding.EncodeToString(data),
+		MimeType: mimeType,
+		Size:     len(data), // actual binary size
 	})
 	return m
 }

@@ -128,9 +153,264 @@ func (m *Message) GetAttachment(name string) *Attachment {
 	return nil
 }
 
+// Track represents a track marker in a release (like CD chapters)
+type Track struct {
+	Title    string  `json:"title"`
+	Start    float64 `json:"start"`               // start time in seconds
+	End      float64 `json:"end,omitempty"`       // end time in seconds (0 = until next track)
+	Type     string  `json:"type,omitempty"`      // intro, verse, chorus, drop, outro, etc.
+	TrackNum int     `json:"track_num,omitempty"` // track number for multi-track releases
+}
+
+// Manifest contains public metadata visible without decryption
+// This enables content discovery, indexing, and preview
+type Manifest struct {
+	// Content identification
+	Title  string `json:"title,omitempty"`
+	Artist string `json:"artist,omitempty"`
+	Album  string `json:"album,omitempty"`
+	Genre  string `json:"genre,omitempty"`
+	Year   int    `json:"year,omitempty"`
+
+	// Release info
+	ReleaseType string `json:"release_type,omitempty"` // single, album, ep, mix
+	Duration    int    `json:"duration,omitempty"`     // total duration in seconds
+	Format      string `json:"format,omitempty"`       // dapp.fm/v1, etc.
+
+	// License expiration (for streaming/rental models)
+	ExpiresAt   int64  `json:"expires_at,omitempty"`   // Unix timestamp when license expires (0 = never)
+	IssuedAt    int64  `json:"issued_at,omitempty"`    // Unix timestamp when license was issued
+	LicenseType string `json:"license_type,omitempty"` // perpetual, rental, stream, preview
+
+	// Track list (like CD master)
+	Tracks []Track `json:"tracks,omitempty"`
+
+	// Artist links - direct to artist, skip the middlemen
+	Links map[string]string `json:"links,omitempty"` // platform -> URL (bandcamp, soundcloud, website, etc.)
+
+	// Custom metadata
+	Tags  []string          `json:"tags,omitempty"`
+	Extra map[string]string `json:"extra,omitempty"`
+}
+
+// NewManifest creates a new manifest with title
+func NewManifest(title string) *Manifest {
+	return &Manifest{
+		Title:       title,
+		Links:       make(map[string]string),
+		Extra:       make(map[string]string),
+		LicenseType: "perpetual",
+	}
+}
+
+// WithExpiration sets the license expiration time
+func (m *Manifest) WithExpiration(expiresAt int64) *Manifest {
+	m.ExpiresAt = expiresAt
+	if m.LicenseType == "perpetual" {
+		m.LicenseType = "rental"
+	}
+	return m
+}
+
+// WithRentalDuration sets expiration relative to issue time
+func (m *Manifest) WithRentalDuration(durationSeconds int64) *Manifest {
+	if m.IssuedAt == 0 {
+		m.IssuedAt = time.Now().Unix()
+	}
+	m.ExpiresAt = m.IssuedAt + durationSeconds
+	m.LicenseType = "rental"
+	return m
+}
+
+// WithStreamingAccess sets up for streaming (short expiration, e.g., 24 hours)
+func (m *Manifest) WithStreamingAccess(hours int) *Manifest {
+	m.IssuedAt = time.Now().Unix()
+	m.ExpiresAt = m.IssuedAt + int64(hours*3600)
+	m.LicenseType = "stream"
+	return m
+}
+
+// WithPreviewAccess sets up for preview (very short, e.g., 30 seconds)
+func (m *Manifest) WithPreviewAccess(seconds int) *Manifest {
+	m.IssuedAt = time.Now().Unix()
+	m.ExpiresAt = m.IssuedAt + int64(seconds)
+	m.LicenseType = "preview"
+	return m
+}
+
+// IsExpired checks if the license has expired
+func (m *Manifest) IsExpired() bool {
+	if m.ExpiresAt == 0 {
+		return false // No expiration = perpetual
+	}
+	return time.Now().Unix() > m.ExpiresAt
+}
+
+// TimeRemaining returns seconds until expiration (0 if perpetual, negative if expired)
+func (m *Manifest) TimeRemaining() int64 {
+	if m.ExpiresAt == 0 {
+		return 0 // Perpetual
+	}
+	return m.ExpiresAt - time.Now().Unix()
+}
+
+// AddTrack adds a track marker to the manifest
+func (m *Manifest) AddTrack(title string, start float64) *Manifest {
+	m.Tracks = append(m.Tracks, Track{
+		Title:    title,
+		Start:    start,
+		TrackNum: len(m.Tracks) + 1,
+	})
+	return m
+}
+
+// AddTrackFull adds a track with all details
+func (m *Manifest) AddTrackFull(title string, start, end float64, trackType string) *Manifest {
+	m.Tracks = append(m.Tracks, Track{
+		Title:    title,
+		Start:    start,
+		End:      end,
+		Type:     trackType,
+		TrackNum: len(m.Tracks) + 1,
+	})
+	return m
+}
+
+// AddLink adds an artist link (platform -> URL)
+func (m *Manifest) AddLink(platform, url string) *Manifest {
+	if m.Links == nil {
+		m.Links = make(map[string]string)
+	}
+	m.Links[platform] = url
+	return m
+}
+
+// Format versions
+const (
+	FormatV1 = ""   // Original format: JSON with base64-encoded attachments
+	FormatV2 = "v2" // Binary format: JSON header + raw binary attachments
+	FormatV3 = "v3" // Streaming format: CEK wrapped with rolling LTHN keys, optional chunking
+)
+
+// Default chunk size for v3 chunked format (1MB)
+const DefaultChunkSize = 1024 * 1024
+
+// ChunkInfo describes a single chunk in v3 chunked format
+type ChunkInfo struct {
+	Offset int `json:"offset"` // byte offset in payload
+	Size   int `json:"size"`   // encrypted chunk size (includes nonce + tag)
+}
+
+// ChunkedInfo contains chunking metadata for v3 streaming
+// When present, enables decrypt-while-downloading and seeking
+type ChunkedInfo struct {
+	ChunkSize   int         `json:"chunkSize"`   // size of each chunk before encryption
+	TotalChunks int         `json:"totalChunks"` // number of chunks
+	TotalSize   int64       `json:"totalSize"`   // total unencrypted size
+	Index       []ChunkInfo `json:"index"`       // chunk locations for seeking
+}
+
+// Compression types
+const (
+	CompressionNone = ""     // No compression (default, backwards compatible)
+	CompressionGzip = "gzip" // Gzip compression (stdlib, WASM compatible)
+	CompressionZstd = "zstd" // Zstandard compression (faster, better ratio)
+)
+
+// Key derivation methods for v3 streaming
+const (
+	// KeyMethodDirect uses password directly (v1/v2 behavior)
+	KeyMethodDirect = ""
+
+	// KeyMethodLTHNRolling uses LTHN hash with rolling date windows
+	// Key = SHA256(LTHN(date:license:fingerprint))
+	// Valid keys: current period and next period (rolling window)
+	KeyMethodLTHNRolling = "lthn-rolling"
+)
+
+// Cadence defines how often stream keys rotate
+type Cadence string
+
+const (
+	// CadenceDaily rotates keys every 24 hours (default)
+	// Date format: "2006-01-02"
+	CadenceDaily Cadence = "daily"
+
+	// CadenceHalfDay rotates keys every 12 hours
+	// Date format: "2006-01-02-AM" or "2006-01-02-PM"
+	CadenceHalfDay Cadence = "12h"
+
+	// CadenceQuarter rotates keys every 6 hours
+	// Date format: "2006-01-02-00", "2006-01-02-06", "2006-01-02-12", "2006-01-02-18"
+	CadenceQuarter Cadence = "6h"
+
+	// CadenceHourly rotates keys every hour
+	// Date format: "2006-01-02-15" (24-hour format)
+	CadenceHourly Cadence = "1h"
+)
+
+// WrappedKey represents a CEK (Content Encryption Key) wrapped with a time-bound stream key.
+// The stream key is derived from LTHN(date:license:fingerprint) and is never transmitted.
+// Only the wrapped CEK (which includes its own nonce) is stored in the header.
+type WrappedKey struct {
+	Date    string `json:"date"`    // ISO date "YYYY-MM-DD" for key derivation
+	Wrapped string `json:"wrapped"` // base64([nonce][ChaCha(CEK, streamKey)])
+}
+
 // Header represents the SMSG container header
 type Header struct {
 	Version   string `json:"version"`
 	Algorithm string `json:"algorithm"`
-	Hint      string `json:"hint,omitempty"` // optional password hint
+	Format      string    `json:"format,omitempty"`      // v2 for binary, v3 for streaming, empty for v1 (base64)
+	Compression string    `json:"compression,omitempty"` // gzip, zstd, or empty for none
+	Hint        string    `json:"hint,omitempty"`        // optional password hint
+	Manifest    *Manifest `json:"manifest,omitempty"`    // public metadata for discovery
+
+	// V3 streaming fields
+	KeyMethod   string       `json:"keyMethod,omitempty"`   // lthn-rolling for v3
+	Cadence     Cadence      `json:"cadence,omitempty"`     // key rotation frequency (daily, 12h, 6h, 1h)
+	WrappedKeys []WrappedKey `json:"wrappedKeys,omitempty"` // CEK wrapped with rolling keys
+
+	// V3 chunked streaming (optional - enables decrypt-while-downloading)
+	Chunked *ChunkedInfo `json:"chunked,omitempty"` // chunk index for seeking/range requests
 }
+
+// ========== ADAPTIVE BITRATE STREAMING (ABR) ==========
+
+// ABRManifest represents a multi-bitrate variant playlist for adaptive streaming.
+// Similar to HLS master playlist but with encrypted SMSG variants.
+type ABRManifest struct {
+	Version    string    `json:"version"`    // "abr-v1"
+	Title      string    `json:"title"`      // Content title
+	Duration   int       `json:"duration"`   // Total duration in seconds
+	Variants   []Variant `json:"variants"`   // Quality variants (sorted by bandwidth, ascending)
+	DefaultIdx int       `json:"defaultIdx"` // Default variant index (typically 720p)
+	Password   string    `json:"-"`          // Shared password for all variants (not serialized)
+}
+
+// Variant represents a single quality level in an ABR stream.
+// Each variant is a standard v3 chunked .smsg file.
+type Variant struct {
+	Name       string `json:"name"`       // Human-readable name: "1080p", "720p", etc.
+	Bandwidth  int    `json:"bandwidth"`  // Required bandwidth in bits per second
+	Width      int    `json:"width"`      // Video width in pixels
+	Height     int    `json:"height"`     // Video height in pixels
+	Codecs     string `json:"codecs"`     // Codec string: "avc1.640028,mp4a.40.2"
+	URL        string `json:"url"`        // Relative path to .smsg file
+	ChunkCount int    `json:"chunkCount"` // Number of chunks (for progress calculation)
+	FileSize   int64  `json:"fileSize"`   // File size in bytes
+}
+
+// Standard ABR quality presets
+var ABRPresets = []struct {
+	Name    string
+	Width   int
+	Height  int
+	Bitrate string // For ffmpeg
+	BPS     int    // Bits per second
+}{
+	{"1080p", 1920, 1080, "5M", 5000000},
+	{"720p", 1280, 720, "2.5M", 2500000},
+	{"480p", 854, 480, "1M", 1000000},
+	{"360p", 640, 360, "500K", 500000},
+}

@@ -7,8 +7,8 @@ import (
 	"encoding/json"
 	"fmt"
 
-	"github.com/Snider/Enchantrix/pkg/enchantrix"
-	"github.com/Snider/Enchantrix/pkg/trix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/enchantrix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/trix"
 )
 
 // Decrypt decrypts a STMF payload using the server's private key.

@@ -8,8 +8,8 @@ import (
 	"encoding/json"
 	"fmt"
 
-	"github.com/Snider/Enchantrix/pkg/enchantrix"
-	"github.com/Snider/Enchantrix/pkg/trix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/enchantrix"
+	"forge.lthn.ai/Snider/Enchantrix/pkg/trix"
 )
 
 // Encrypt encrypts form data using the server's public key.

@@ -7,7 +7,7 @@ import (
 	"net/http"
 	"net/url"
 
-	"github.com/Snider/Borg/pkg/stmf"
+	"forge.lthn.ai/Snider/Borg/pkg/stmf"
 )
 
 // contextKey is a custom type for context keys to avoid collisions