Summary:
- added Codex marketplace registry plus awareness/ethics/guardrails sub-plugins
- mirrored Claude plugin commands/scripts/hooks into codex api/ci/code/collect/coolify/core/issue/perf/qa/review/verify
- embedded Axioms of Life ethics modal, guardrails, and kernel files under codex/ethics
- added Codex parity report, improvements list, and MCP integration plan
- extended Gemini MCP tools and docs for Codex awareness
| name | description | args |
|---|---|---|
| website | Crawl and collect a website using Borg | `<url> [--depth N] [--format stim\|tim\|tar] [-o output]` |
# Website Collection

Crawl and collect websites using Borg.

## Usage
```
/collect:website https://getmasari.org
/collect:website https://docs.lethean.io --depth 3
/collect:website https://graft.network --format stim -o graft-site.stim
```
## Action

Run Borg to crawl the website:

```
borg collect website <url> [--depth <N>] [--format <format>] [-o <output>]
```

The default crawl depth is 2 levels.
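As a sketch of how the slash command's arguments map onto the underlying Borg invocation, the wrapper below applies the documented defaults (depth 2, `tar` format, auto output name). The `collect_website` function is hypothetical and only prints the resulting command, so `borg` itself is never invoked:

```shell
#!/bin/sh
# Hypothetical wrapper: translates /collect:website arguments into the
# documented `borg collect website` invocation, filling in defaults.
# Prints the command instead of running it.
collect_website() {
  url="$1"; shift
  depth=2        # default crawl depth per this doc
  format=tar     # default output format per this doc
  output=""      # empty means let borg pick the filename
  while [ $# -gt 0 ]; do
    case "$1" in
      --depth)  depth="$2";  shift 2 ;;
      --format) format="$2"; shift 2 ;;
      -o)       output="$2"; shift 2 ;;
      *) echo "unknown option: $1" >&2; return 1 ;;
    esac
  done
  cmd="borg collect website $url --depth $depth --format $format"
  if [ -n "$output" ]; then
    cmd="$cmd -o $output"
  fi
  echo "$cmd"
}

collect_website https://getmasari.org
collect_website https://graft.network --format stim -o graft-site.stim
```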
## Options

| Option | Default | Description |
|---|---|---|
| `--depth` | 2 | How many levels deep to crawl |
| `--format` | tar | Output format (`tar`, `tim`, `stim`) |
| `-o` | auto | Output filename |
## Examples

```
# Basic crawl
borg collect website https://getmasari.org

# Deep crawl with encryption
borg collect website https://docs.lethean.io --depth 5 --format stim -o lethean-docs.stim

# Wayback Machine archive
borg collect website "https://web.archive.org/web/*/graft.network" --depth 3
```
## Use Cases

- **Project Documentation** - Archive docs before they go offline
- **Wayback Snapshots** - Collect historical versions
- **Forum Threads** - Archive discussion pages
- **PWA Collection** - Use `borg collect pwa` for progressive web apps
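For the documentation-archiving use case, several sites can be collected in one pass by looping over a URL list. This is a minimal sketch: the site list, depth, and format choices are illustrative, and the command is printed rather than executed so it can be reviewed (or piped to `sh`) before `borg` actually runs:

```shell
#!/bin/sh
# Sketch: batch-archive a list of project sites before they go offline.
# URLs and options are examples; the borg command is only printed.
sites="https://getmasari.org https://graft.network"

for url in $sites; do
  # Derive an output name from the hostname: strip the scheme,
  # then everything after the first slash.
  name=${url#*://}
  name=${name%%/*}
  echo "borg collect website $url --depth 3 --format stim -o $name.stim"
done
```

Piping the output through `sh` would run the collections sequentially once the command list looks right.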