---
name: OpenClaw-dual-agent
description: "Run two OpenClaw agents simultaneously — a paid Anthropic agent and a free agent using either OpenRouter (cloud) or local Ollama models. Note: the Ollama example defaults to a hybrid setup with OpenRouter fallback — remove fallback/heartbeat refs for fully offline operation. Trigger phrases: multi-agent setup, add a second agent, free agent OpenClaw, run two agents, openrouter OpenClaw, ollama agent, local model OpenClaw, parallel agents, cost optimization agent."
metadata: {"clawdbot": {"emoji": "🤖", "requires": {"bins": ["jq"]}, "env": ["ANTHROPIC_API_KEY"], "os": ["darwin", "linux", "win32"]}, "homepage": "https://ClawHub.com/djc00p/OpenClaw-dual-agent"}
---

# Multi-Agent OpenClaw Setup
Run a paid Anthropic agent and a free OpenRouter agent side by side with separate Telegram bots.
## Quick Start
1. Create two Telegram bots via @BotFather and extract their chat IDs:
```shell
curl https://api.telegram.org/bot{TOKEN}/getUpdates | jq '.result[0].message.chat.id'
```
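If you want to sanity-check that jq path offline, here is the same filter run against a minimal sample getUpdates payload (the JSON shape below follows the standard Telegram Bot API response; the chat id is made up):

```shell
# Minimal sample getUpdates response (structure per the Telegram Bot API)
payload='{"ok":true,"result":[{"update_id":1,"message":{"message_id":7,"chat":{"id":123456789,"type":"private"},"text":"hi"}}]}'

# Same filter as the curl pipeline above
chat_id=$(printf '%s' "$payload" | jq '.result[0].message.chat.id')
echo "$chat_id"   # 123456789
```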
2. Authenticate the agents:
```shell
# Run interactively — avoids exposing keys in shell history
OpenClaw onboard
```
⚠️ Never pass API keys directly on the command line (e.g. `--anthropic-api-key ...`) — it exposes them in shell history. Always use `OpenClaw onboard` interactively. Credential files (`auth-profiles.json`, `OpenClaw.json`) should be `chmod 600`.
3. Configure OpenClaw.json with two agents, separate bindings, and Telegram accounts.
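A minimal sketch of what that two-agent layout might look like. The field names here (id, workspace, agentDir, model.primary, bindings with accountId) follow the examples later in this doc, but the top-level keys are assumptions — check them against your actual OpenClaw.json:

```json
{
  "agents": [
    {
      "id": "main",
      "workspace": "/Users/YOUR_USERNAME/.OpenClaw/workspace-main",
      "agentDir": "/Users/YOUR_USERNAME/.OpenClaw/agents/main/agent",
      "model": { "primary": "anthropic/claude-sonnet-4-6" }
    },
    {
      "id": "free-agent",
      "workspace": "/Users/YOUR_USERNAME/.OpenClaw/workspace-free",
      "agentDir": "/Users/YOUR_USERNAME/.OpenClaw/agents/free-agent/agent",
      "model": { "primary": "openrouter/free" }
    }
  ],
  "bindings": [
    { "accountId": "tg1", "agentId": "main" },
    { "accountId": "tg2", "agentId": "free-agent" }
  ]
}
```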
4. Verify the setup:
```shell
OpenClaw doctor
OpenClaw sessions cleanup \
  --store /Users/YOUR_USERNAME/.OpenClaw/agents/main/store \
  --enforce --fix-missing
OpenClaw restart
```
## Key Concepts

- **Agent isolation:** Each agent has its own agentDir, workspace, and model config.
- **Binding routing:** accountId in bindings directs Telegram messages to the correct agent.
- **Model refs:** Use `provider/modelid` format (e.g., `anthropic/claude-sonnet-4-6`).
- **Per-agent auth:** OpenRouter requires auth-profiles.json in each agent's directory.

## Common Usage
Adding a free agent:
1. Create the agentDir at `/Users/YOUR_USERNAME/.OpenClaw/agents/free-agent/agent`
2. Add an agent entry to OpenClaw.json with `model.primary: "openrouter/..."`
3. Create auth-profiles.json with the OpenRouter API key in the agent's directory
4. Add a binding with a unique accountId (e.g., "tg2")
5. Restart: `OpenClaw restart`
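The filesystem side of those steps can be sketched as follows (paths taken from the steps above; the auth-profiles.json contents are left empty here because its exact schema isn't shown in this doc):

```shell
# Create the free agent's agentDir (path from step 1)
AGENT_DIR="$HOME/.OpenClaw/agents/free-agent/agent"
mkdir -p "$AGENT_DIR"

# auth-profiles.json will hold your OpenRouter API key (step 3);
# create it and lock it down so only your user can read it
touch "$AGENT_DIR/auth-profiles.json"
chmod 600 "$AGENT_DIR/auth-profiles.json"
```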
Switching models: Edit the agent's model.primary and fallbacks in OpenClaw.json, using valid provider/id strings.
Masking secrets for logs:
```shell
cat ~/.OpenClaw/OpenClaw.json | \
  jq '.channels.telegram.accounts |= map_values(.botToken = "[REDACTED]")'
```
## Option B: Local Ollama Agent (Free + Hybrid)
Instead of OpenRouter, run your second agent on a local Ollama model — free, private, and locally hosted. The default config uses a hybrid approach with an OpenRouter cloud fallback for reliability. For fully offline operation, see "Fully Offline Config" below.
### Install & Configure Ollama
macOS:
```shell
# Install via Homebrew
brew install ollama
# Or download from https://ollama.ai
```
Start Ollama:
```shell
# In a dedicated terminal, keep it running
ollama serve
```
Pull a model (choose one based on your needs):
```shell
# Google Gemma 4 26B — good balance of capability and speed (17GB)
ollama pull gemma4:26b

# Meta Llama 3.3 70B — very capable, excellent reasoning (43GB)
ollama pull llama3.3:70b

# Qwen 2.5 32B — strong coding and multilingual (20GB)
ollama pull qwen2.5:32b

# Mistral 7B — fast and lightweight, good for quick responses (4GB)
ollama pull mistral:7b
```
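Before configuring the agent, it's worth confirming the server is reachable and your model was actually pulled. This probe uses Ollama's local REST API (default port 11434; `/api/tags` lists installed models):

```shell
# Check that the Ollama server is up and list which models are pulled
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  models=$(curl -s http://localhost:11434/api/tags | jq -r '.models[].name')
else
  models="(Ollama is not running; start it with: ollama serve)"
fi
echo "$models"
```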
### Configure OpenClaw with the Ollama Agent
Add the agent entry to OpenClaw.json (e.g., id: "ayo"):
```json
{
  "id": "ayo",
  "name": "Ayo",
  "workspace": "/Users/YOUR_USERNAME/.OpenClaw/workspace-ayo",
  "agentDir": "/Users/YOUR_USERNAME/.OpenClaw/agents/ayo/agent",
  "model": {
    "primary": "ollama/gemma4:26b",
    "fallbacks": ["openrouter/free"]
  },
  "heartbeat": {
    "every": "1h",
    "model": "openrouter/free"
  }
}
```
Key points:
- **Model format:** Always use `ollama/modelname:tag` (e.g., `ollama/gemma4:26b`, `ollama/llama3.3:70b`)
- **No API key needed for Ollama:** Ollama runs entirely locally. No auth-profiles.json required.
- **Ollama must be running:** Start `ollama serve` in a terminal before the gateway starts
- **Pull first:** Run `ollama pull modelname:tag` before configuring (the model must exist locally)
- **⚠️ Hybrid setup:** The default example uses `openrouter/free` for fallback and heartbeat — this means prompts and context MAY route to OpenRouter cloud. See below for the fully offline config.
- **Add a Telegram binding:** Include a separate binding with a unique accountId (e.g., "tg_ollama") to route messages to Ayo

### Fully Offline Config (No Cloud Routing)
If you want a truly local, offline-only Ollama agent with no external provider calls:
```json
{
  "id": "ayo",
  "name": "Ayo",
  "workspace": "/Users/YOUR_USERNAME/.OpenClaw/workspace-ayo",
  "agentDir": "/Users/YOUR_USERNAME/.OpenClaw/agents/ayo/agent",
  "model": {
    "primary": "ollama/gemma4:26b",
    "fallbacks": []
  },
  "heartbeat": {
    "every": "1h",
    "model": "ollama/gemma4:26b"
  }
}
```
This config uses ONLY local Ollama models — no cloud provider traffic for prompts or heartbeats.
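As a final check, you can scan a config for model refs that still point at a cloud provider. This small helper is a sketch under an assumption: it treats any `openrouter/` or `anthropic/` string in the JSON as a cloud ref, per the provider/modelid format used throughout this doc:

```shell
# Print every cloud model ref found in an OpenClaw.json;
# empty output means the agent config is fully offline
cloud_refs() {
  jq -r '.. | strings | select(test("^(openrouter|anthropic)/"))' "$1" | sort -u
}
```

Running `cloud_refs` on the offline config above prints nothing, while the hybrid config would print `openrouter/free`.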