
Bilibili Up To Kb — Skill Tool

v0.1.0

Convert Bilibili (B站) videos into a searchable text knowledge base. Supports single videos and batch processing of entire UP主 channels. Uses local whisper.cp...

by @shanjiaming · MIT-0
License: MIT-0
Last updated: 2026/4/11
Security scans:
VirusTotal: Suspicious
OpenClaw: Suspicious (medium confidence)
The skill's code matches its stated purpose (download → transcribe → clean → index), but several runtime behaviors could leak transcripts or sensitive data (browser cookies, external model/CLI network calls) and the registry metadata doesn't fully declare the environment/credential needs — verify how the opencode CLI and model downloads behave before trusting it with private videos or credentials.
Evaluation Advice
This skill appears to do what its description says, but before running it consider: 1) Don't pass browser cookies unless you trust the environment — that option can expose other site cookies from your browser to yt-dlp. 2) Confirm how the opencode CLI and the CLEAN_MODEL operate: if opencode runs inference remotely or downloads models at runtime, your transcripts will be sent to an external service. If you need privacy, ensure opencode is configured to run locally or disable cleaning. 3) Only do...
Detailed Analysis
Purpose & Capabilities
The name/description match what the scripts do: download Bilibili videos, run whisper.cpp locally, clean via an LLM-style tool, and build a KB. The skill does not declare required env vars in registry metadata even though SKILL.md and scripts reference WHISPER_CLI, WHISPER_MODEL, OPENCODE_BIN, CLEAN_MODEL and optional GEMINI_API_KEY and browser cookies. That mismatch is unexpected but not necessarily malicious.
Instruction Scope
The scripts perform exactly the data flows described (yt-dlp → ffmpeg → whisper → clean with opencode → index). However, they optionally use --cookies-from-browser to access member-only content (this reads browser cookies via yt-dlp) and they feed transcript chunks into the opencode CLI. If opencode.run or the chosen CLEAN_MODEL execute remotely or fetch from a remote model hub, transcripts will be sent to a network service. The SKILL.md gives broad discretion (batching, auto-chunking) but the main risk is exfiltration of transcript text via third-party model/CLI or cloud LLM keys if configured (GEMINI_API_KEY referenced in docs).
Installation Mechanism
There is no automated install spec (instruction-only), so nothing is dropped automatically. The references recommend downloading whisper models from Hugging Face or a mirror (hf-mirror.com) via curl — these are expected but are external downloads the user must trust. Because the skill doesn't auto-extract or run arbitrary remote payloads, installation risk is moderate but depends on which model/CLI the user chooses to install.
Credential Requirements
Registry metadata lists no required credentials, which aligns with local whisper usage, but scripts and docs reference optional sensitive inputs: --cookies-from-browser (access to browser cookies), GEMINI_API_KEY (for an alternate summarize tool), and environment variables pointing at opencode and whisper binaries. Requesting browser cookies or an LLM API key is proportionate only for gated content or LLM-based cleaning — these are optional but sensitive. The skill does not require unrelated cloud credentials, so the issue is more about optional sensitive inputs that could expose transcripts to external services.
Persistence & Permissions
The skill is user-invocable, not always-enabled, and does not request persistent platform privileges. Scripts operate in working directories and temporary folders; they do not modify other skills or system-wide settings.
Security comes in layers; review the code before running.

License

MIT-0

Free to use, modify, and redistribute; no attribution required.

Runtime Dependencies

No special dependencies

Versions

latest · v0.1.0 · 2026/2/28

Initial release – Convert Bilibili videos/channels into a structured text knowledge base. - Supports single videos and batch processing of entire UP主 channels. - Uses local whisper.cpp for transcription; no API key required. - Automated transcript cleaning with paragraph-level coverage to fix ASR errors. - Outputs cleaned transcripts, metadata, and an index; raw transcripts stored separately. - Handles large channels and includes resumable, concurrency-safe bash scripts.


Install Command

Official: npx clawhub@latest install bilibili-up-to-kb
Mirror (CN): npx clawhub@latest install bilibili-up-to-kb --registry https://cn.clawhub-mirror.com

Skill Documentation

Convert B站 videos (single or entire channels) into cleaned, structured text knowledge bases.

Design Principle

Agent orchestrates, scripts execute. The agent's job is to decide WHAT to do and kick off the right script. All mechanical, repetitive work (downloading, transcribing, cleaning) is handled by shell scripts with built-in parallelism. The agent NEVER loops through videos one by one — it runs ONE command and the script handles concurrency internally.
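
As a sketch of that principle, the internal fan-out is typically a single `xargs -P` pipeline rather than an agent-side loop. The BV IDs and the `echo` below are placeholders, not the skill's real per-video logic:

```shell
# One command in, parallel work out: xargs -P handles concurrency internally.
# BV IDs are placeholders; echo stands in for the real per-video script.
printf '%s\n' BV1aaa BV1bbb BV1ccc |
  xargs -P 3 -I {} echo "would transcribe: {}"
```

The agent only chooses the input list and the parallelism level; everything else stays inside the script.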

Output Structure

kb/UP主名_UID/
├── BV号_视频标题.txt          # Cleaned transcript (user-facing)
├── BV号_视频标题.meta.json    # Video metadata
├── index.md                   # Summary index
└── .raw/                      # Hidden: whisper transcripts (if any)
    └── BV号_视频标题.txt

Key decisions:

  • File names include title for readability (BV1xxx_标题.txt)
  • Folder includes UP主 name (UP主名_UID/)
  • Raw transcripts hidden in .raw/
  • No _clean suffix — clean files are the main files
  • Per-video .meta.json with title, uploader, duration, etc.

Full Pipeline

Step 1: Download AI subtitles (fast, high concurrency OK)

# 30-50 concurrent is fine — B站 CDN handles it
scripts/batch_channel.sh "https://space.bilibili.com/UID/" ./kb/output zh 0 30

Step 2: For videos without AI subtitles, run whisper (LOW concurrency!)

# Metal GPU can only handle 1-4 parallel whisper instances
# More = slower total (GPU saturation)
scripts/batch_channel.sh "https://space.bilibili.com/UID/" ./kb/output zh 0 2 --whisper-only

Step 3: Clean + Index

# Clean whisper transcripts (AI subtitles skip automatically)
scripts/batch_clean.sh ./kb/UP主名_UID/
scripts/generate_index.sh ./kb/UP主名_UID/

Concurrency Guide

Critical: Different stages need different concurrency!

Stage | Bottleneck | Recommended | Why
AI subtitle download | Network | 30-50 | B站 CDN handles high parallelism
Whisper transcribe | Metal GPU | 1-4 | GPU saturates; more instances are slower overall
Transcript cleaning | API rate limit | ALL (0) | Network I/O only

Quick Start — Single Video

scripts/transcribe.sh "https://www.bilibili.com/video/BV..." ./output zh

Transcript Cleaning

AI subtitles are clean enough — skipped by default.

Source | Cleaning needed?
B站 AI subtitles | No — directly usable
whisper fallback | Yes — goes through cleaning

Cleaning uses opencode/minimax-m2.5-free:
  • Fix homophones and garbled words
  • Add punctuation
  • Output MUST be Simplified Chinese
  • Keep uncertain proper nouns unchanged
  • Never substitute one real term for another

Chunk size: 80 lines. Retry: 3 attempts with 3s delay.
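
A minimal sketch of that chunk-and-retry scheme, with a placeholder in place of the real opencode call (the actual batch_clean.sh may differ):

```shell
# Demo input: 200 numbered lines stand in for a whisper transcript.
seq 1 200 > transcript.txt

# Placeholder cleaner; the real script would invoke "$OPENCODE_BIN".
clean_chunk() { cat "$1"; }

# Split into 80-line chunks; clean each with up to 3 attempts, 3s apart.
rm -f /tmp/chunk_* cleaned.txt
split -l 80 transcript.txt /tmp/chunk_
for chunk in /tmp/chunk_*; do
  for attempt in 1 2 3; do
    clean_chunk "$chunk" >> cleaned.txt && break
    sleep 3   # documented 3-second retry delay
  done
done
```

With 200 input lines, `split -l 80` produces three chunks (80 + 80 + 40).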

⚠️ Long-running tasks

Use nohup to avoid session compaction killing processes:

nohup bash scripts/batch_clean.sh ./kb/UP主名_UID/ 0 80 > /tmp/clean.log 2>&1 &
batch_clean.sh is resumable — safe to re-run after interruption.
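
Resumability here presumably follows the skip-if-output-exists pattern; a hedged sketch with an illustrative channel folder, not the script's exact logic:

```shell
# Resumability as skip-if-done: re-running only processes missing outputs.
# 'kb/demo_UID' is an illustrative path, not a real channel folder.
mkdir -p kb/demo_UID/.raw
printf 'raw\n'  > kb/demo_UID/.raw/BV1aaa_t.txt
printf 'raw\n'  > kb/demo_UID/.raw/BV1bbb_t.txt
printf 'done\n' > kb/demo_UID/BV1aaa_t.txt   # already cleaned

for raw in kb/demo_UID/.raw/*.txt; do
  out="kb/demo_UID/$(basename "$raw")"
  [ -s "$out" ] && continue   # cleaned output exists; skip
  echo "would clean: $raw"
done
# prints: would clean: kb/demo_UID/.raw/BV1bbb_t.txt
```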

⚠️ Large Channel Handling (1000+ videos)

Script auto-detects large channels (>800 videos) and fetches in chunks to avoid timeout.

# Auto-chunked, just re-run to resume
nohup bash scripts/batch_channel.sh "https://space.bilibili.com/UID/" ./kb/output > /tmp/batch.log 2>&1 &

If still fails, manually fetch URL list:

for i in $(seq 1 500 2000); do
  yt-dlp --flat-playlist --playlist-start $i --playlist-end $((i+499)) \
    --print url "https://space.bilibili.com/UID/" >> /tmp/urls.txt
done
cat /tmp/urls.txt | xargs -P 20 -I {} bash scripts/transcribe.sh {} ./kb/OUTPUT zh

⚠️ Thermal & Fan Warning

Keep system cool — avoid fan spin!

Stage | Risk | Mitigation
Whisper (GPU) | HIGH | Keep concurrency ≤2, monitor temps
AI subtitle download | Low | Can run 30-50 concurrent
Cleaning (API) | None | Pure network I/O, no local load
If fans start spinning:
  • Stop whisper processes immediately
  • Wait for cooldown
  • Resume with lower concurrency (1-2)
# Check GPU temp (if using CUDA)
nvidia-smi

# Check Mac CPU/GPU temp
sudo powermetrics --sample-rate 1000 -i 1 -n 1 | grep -E "CPU|GPU"

Dependencies

Required: yt-dlp, ffmpeg, whisper.cpp (+ model), opencode CLI
Optional: Browser cookies for member-only content (--cookies-from-browser chrome)

Environment Variables

Variable | Default | Description
WHISPER_CLI | whisper-cli | Path to whisper.cpp
WHISPER_MODEL | ~/.whisper-cpp/ggml-small.bin | Whisper model
OPENCODE_BIN | ~/.opencode/bin/opencode | opencode CLI
CLEAN_MODEL | opencode/minimax-m2.5-free | Cleaning model
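
Since these are ordinary environment variables, overrides can be passed inline for one run or exported for a session. The paths below are illustrative, not required locations:

```shell
# One-off override: use a custom whisper build and a larger model.
WHISPER_CLI=/opt/whisper.cpp/build/bin/whisper-cli \
WHISPER_MODEL="$HOME/.whisper-cpp/ggml-medium.bin" \
  bash scripts/transcribe.sh "https://www.bilibili.com/video/BV..." ./output zh

# Session-wide override:
export CLEAN_MODEL=opencode/minimax-m2.5-free
```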

Tips

  • China users: Use hf-mirror.com for whisper model
  • Long videos (1h+): Auto-segmented into 10-min chunks
  • Resumable: All batch scripts skip already-processed files
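
For the mirror tip, one hedged example; the repo path assumes the standard whisper.cpp model hosting (ggerganov/whisper.cpp on Hugging Face), so verify it before relying on it:

```shell
# Fetch the small whisper model via hf-mirror.com (China-friendly mirror).
# The exact URL path is an assumption; confirm against the upstream repo.
mkdir -p ~/.whisper-cpp
curl -L -o ~/.whisper-cpp/ggml-small.bin \
  "https://hf-mirror.com/ggerganov/whisper.cpp/resolve/main/ggml-small.bin"
```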
Data source: ClawHub