| name | description | args |
|---|---|---|
| website | Crawl and collect a website using Borg | `<url> [--depth N] [--format stim\|tim\|tar] [-o output]` |
# Website Collection

Crawl and collect websites using Borg.
## Usage

```
/collect:website https://getmasari.org
/collect:website https://docs.lethean.io --depth 3
/collect:website https://graft.network --format stim -o graft-site.stim
```
## Action

Run Borg to crawl the website:

```
borg collect website <url> [--depth <N>] [--format <format>] [-o <output>]
```

The default crawl depth is 2 levels.
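Given the documented defaults, a bare invocation is equivalent to passing `--depth 2 --format tar` explicitly. A minimal sketch that assembles the full command line without running Borg (the `DEPTH`/`FORMAT` variables are illustrative, not part of Borg):

```shell
url="https://getmasari.org"
depth="${DEPTH:-2}"      # documented default
format="${FORMAT:-tar}"  # documented default
cmd="borg collect website $url --depth $depth --format $format"
echo "$cmd"
```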
## Options

| Option | Default | Description |
|---|---|---|
| `--depth` | 2 | How many levels deep to crawl |
| `--format` | tar | Output format (tar, tim, stim) |
| `-o` | auto | Output filename |
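Only the three formats listed above are documented. A small guard sketch (a hypothetical wrapper helper, not part of Borg itself) that rejects anything else before invoking the CLI:

```shell
validate_format() {
  # Accept only the formats documented in the options table.
  case "$1" in
    tar|tim|stim) echo "ok" ;;
    *) echo "unsupported format: $1" ;;
  esac
}

validate_format stim   # → ok
validate_format zip    # → unsupported format: zip
```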
## Examples

```shell
# Basic crawl
borg collect website https://getmasari.org

# Deep crawl with encrypted output
borg collect website https://docs.lethean.io --depth 5 --format stim -o lethean-docs.stim

# Wayback Machine archive
borg collect website "https://web.archive.org/web/*/graft.network" --depth 3
```
## Use Cases

- Project Documentation - Archive docs before they go offline
- Wayback Snapshots - Collect historical versions
- Forum Threads - Archive discussion pages
- PWA Collection - Use `borg collect pwa` for progressive web apps