
🤖 Relay To Agent — Agent Relay

v0.0.1

An agent relay tool that supports message forwarding.

by @ericsantos (Eric Santos) · MIT-0
License: MIT-0
Last updated: 2026/2/26
Security scan:
VirusTotal: harmless
OpenClaw: safe (high confidence)
The skill is internally consistent with its stated purpose: it relays messages to OpenAI-compatible chat endpoints, requires only a node runtime and an API key, and its behavior (config, session storage, network calls) matches the description.
Assessment Recommendations
This skill appears to do what it says: it sends messages to an OpenAI-compatible API and stores local session history. Before installing or running it:
1) Verify agents.json (or the RELAY_CONFIG path) to ensure baseUrl points to a trusted provider — your RELAY_API_KEY will be sent to that base URL.
2) Be aware that message contents are written to ~/.cache/relay-to-agent/sessions (up to 50 messages) — remove these files if you need to protect sensitive data.
3) If you run npm install to use the s...
Detailed Analysis
Purpose and Capabilities
Name/description, required binary (node), primary credential (RELAY_API_KEY), and the included script all align: the tool reads agents.json, calls an OpenAI-compatible base URL, and returns replies. No unrelated credentials or binaries are requested.
Instruction Scope
Runtime instructions and the script perform expected actions: they read agents.json (or RELAY_CONFIG), use RELAY_API_KEY and optional RELAY_BASE_URL, send chat completions to the configured endpoint, and persist up to 50 messages per agent+session under ~/.cache/relay-to-agent/sessions. Note: user content is written to disk (session files) and transmitted to the configured remote endpoint — verify that endpoint is trusted for sensitive content.
Installation Mechanism
This is instruction-only (no install spec). A node script and package.json/package-lock are included; there is no automated download-from-URL or other high-risk installer. Running it may require doing an npm install locally; the declared dependency (openai-fetch) is reasonable for the stated purpose.
Credential Requirements
Only RELAY_API_KEY is required as the primary secret; optional RELAY_BASE_URL and RELAY_CONFIG are documented. The requested env vars are proportional and justified by the skill's function.
Persistence and Permissions
The skill is not always-enabled and does not modify other skills or system-wide settings. It stores session data under the user's home directory (~/.cache/relay-to-agent/sessions), which is expected for multi-turn state; this is a moderate local persistence but scoped to the user's home.
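The assessment's main actionable step is checking where your key will be sent. A minimal pre-flight sketch using node (which the skill already requires): the agents.json written here is an illustrative stand-in for the skill's bundled config, and api.example.com is a placeholder.

```shell
# Illustrative pre-flight check: print the configured baseUrl so you can
# confirm it is a trusted provider before exporting RELAY_API_KEY.
# This sample agents.json is a stand-in; inspect the skill's real file.
cat > agents.json <<'EOF'
{ "baseUrl": "https://api.example.com/v1", "agents": [] }
EOF
# Parse and print the endpoint the relay would send requests (and the key) to
node -e 'console.log(JSON.parse(require("fs").readFileSync("agents.json", "utf8")).baseUrl)'
```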
Security comes in degrees; review the code before running it.

License

MIT-0

Free to use, modify, and redistribute; no attribution required.

Runtime Dependencies

No special dependencies

Versions

latest · v0.0.1 · 2026/1/28

Initial release of relay-to-agent.
- Added support for relaying messages to AI agents on any OpenAI-compatible API.
- Supports multi-turn conversations with session management.
- Includes script to list available agents, send messages, and reset sessions.
- Configuration via agents.json and environment variables.
- Compatible with Connect Chat, OpenRouter, LiteLLM, vLLM, Ollama, and other OpenAI-compatible services.


Installation Commands

Official:
npx clawhub@latest install relay-to-agent

Mirror (CN):
npx clawhub@latest install relay-to-agent --registry https://cn.clawhub-mirror.com

Skill Documentation

Send messages to AI agents on any OpenAI-compatible endpoint. Works with Connect Chat, OpenRouter, LiteLLM, vLLM, Ollama, and any service implementing the Chat Completions API.

List available agents

node {baseDir}/scripts/relay.mjs --list

Send a message to an agent

node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist "Transform this article into a LinkedIn post"

Multi-turn conversation

# First message
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Analyze our latest campaign"

# Follow-up (same session, agent remembers context)
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Compare with last month"

Reset a session

node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist --reset "Start fresh with this article..."

Options

Flag          Description                         Default
--agent ID    Target agent identifier             (required)
--reset       Reset conversation before sending   off
--list        List available agents
--session ID  Custom session identifier           default
--json        Raw JSON output                     off

Configuration

agents.json

Configure agents and endpoint in {baseDir}/agents.json:

{
  "baseUrl": "https://api.example.com/v1",
  "agents": [
    {
      "id": "my-agent",
      "name": "My Agent",
      "description": "What this agent does",
      "model": "model-id-on-the-api"
    }
  ]
}

Environment variables

export RELAY_API_KEY="sk-..."          # API key (required)
export RELAY_BASE_URL="https://..."    # Override base URL from config
export RELAY_CONFIG="/path/to/agents.json"  # Custom config path

Compatible Services

  • Connect Chat: api.connectchat.ai/api
  • OpenRouter: openrouter.ai/api/v1
  • LiteLLM: localhost:4000/v1
  • vLLM: localhost:8000/v1
  • Ollama: localhost:11434/v1
  • Any OpenAI-compatible API
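For example, pointing the relay at a local Ollama instance only requires a matching baseUrl. This is a hedged sketch: the agent id "local-llama" and model "llama3" are placeholder names, not defaults shipped with the skill.

```shell
# Example local setup against Ollama's OpenAI-compatible endpoint.
# "local-llama" and "llama3" are illustrative placeholders.
cat > agents.json <<'EOF'
{
  "baseUrl": "http://localhost:11434/v1",
  "agents": [
    {
      "id": "local-llama",
      "name": "Local Llama",
      "description": "Local model served by Ollama",
      "model": "llama3"
    }
  ]
}
EOF
export RELAY_API_KEY="ollama"            # Ollama ignores the key, but one must be set
export RELAY_CONFIG="$PWD/agents.json"   # point the relay script at this config
```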

Session Management

Sessions are stored locally at ~/.cache/relay-to-agent/sessions/. Each agent+session combination keeps up to 50 messages. Use --session for parallel conversations with the same agent.
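Because session history persists on disk, clearing it (as the assessment recommends for sensitive content) is a plain file operation. A minimal sketch, assuming the storage path documented above:

```shell
# Sessions live under ~/.cache/relay-to-agent/sessions/ per the docs above.
SESSIONS_DIR="${HOME}/.cache/relay-to-agent/sessions"
mkdir -p "$SESSIONS_DIR"   # the relay script creates this on first use
ls -A "$SESSIONS_DIR"      # inspect which agent+session files exist
rm -rf "$SESSIONS_DIR"     # wipe all stored message history
```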

Data source: ClawHub ↗ · Chinese localization: 龙虾技能库