Version
Initial release of relay-to-agent.
- Added support for relaying messages to AI agents on any OpenAI-compatible API.
- Supports multi-turn conversations with session management.
- Includes a script to list available agents, send messages, and reset sessions.
- Configuration via agents.json and environment variables.
- Compatible with Connect Chat, OpenRouter, LiteLLM, vLLM, Ollama, and other OpenAI-compatible services.
Skill Documentation
Send messages to AI agents on any OpenAI-compatible endpoint. Works with Connect Chat, OpenRouter, LiteLLM, vLLM, Ollama, and any service implementing the Chat Completions API.
List available agents
node {baseDir}/scripts/relay.mjs --list
Send a message to an agent
node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist "Transform this article into a LinkedIn post"
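On the wire this is an ordinary Chat Completions call. A minimal sketch of the request the relay presumably builds (the function name and exact internals are illustrative, not taken from relay.mjs; the payload follows the standard OpenAI format):

```javascript
// Sketch only: the Chat Completions request shape relay.mjs presumably sends.
// The real script's internals may differ.
function buildChatRequest({ baseUrl, apiKey, model, history, userMessage }) {
  return {
    url: `${baseUrl}/chat/completions`,
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model,
      // Prior session messages are replayed so the agent keeps context.
      messages: [...history, { role: "user", content: userMessage }],
    }),
  };
}
```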
Multi-turn conversation
# First message
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Analyze our latest campaign"

# Follow-up (same session, agent remembers context)
node {baseDir}/scripts/relay.mjs --agent connect-flow-ai "Compare with last month"
Reset a session
node {baseDir}/scripts/relay.mjs --agent linkedin-alchemist --reset "Start fresh with this article..."
Options
| Flag | Description | Default |
|---|---|---|
| `--agent ID` | Target agent identifier | (required) |
| `--reset` | Reset conversation before sending | off |
| `--list` | List available agents | — |
| `--session ID` | Custom session identifier | default |
| `--json` | Raw JSON output | off |
Configuration
agents.json
Configure agents and endpoint in {baseDir}/agents.json:
{
"baseUrl": "https://api.example.com/v1",
"agents": [
{
"id": "my-agent",
"name": "My Agent",
"description": "What this agent does",
"model": "model-id-on-the-api"
}
]
}
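How the relay resolves this file can be pictured as parse-then-lookup. The sketch below is illustrative (function names and error messages are assumptions, not relay.mjs internals), but it reflects the documented shape: a `baseUrl` string plus an `agents` array keyed by `id`:

```javascript
// Sketch: parse an agents.json payload and look up an agent by id.
// Illustrative only — validation in the real relay.mjs may differ.
function parseConfig(jsonText) {
  const cfg = JSON.parse(jsonText);
  if (typeof cfg.baseUrl !== "string" || !Array.isArray(cfg.agents)) {
    throw new Error("invalid agents.json: baseUrl and agents[] are required");
  }
  return cfg;
}

function findAgent(cfg, id) {
  const agent = cfg.agents.find((a) => a.id === id);
  if (!agent) throw new Error(`unknown agent: ${id}`);
  return agent;
}
```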
Environment variables
export RELAY_API_KEY="sk-..." # API key (required)
export RELAY_BASE_URL="https://..." # Override base URL from config
export RELAY_CONFIG="/path/to/agents.json" # Custom config path
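The documented precedence — `RELAY_BASE_URL` overrides the config file's `baseUrl`, and `RELAY_API_KEY` is mandatory — can be sketched as (a minimal illustration; the real relay.mjs may resolve settings differently):

```javascript
// Sketch: environment variables take precedence over agents.json.
function resolveSettings(env, config) {
  if (!env.RELAY_API_KEY) throw new Error("RELAY_API_KEY is required");
  return {
    apiKey: env.RELAY_API_KEY,
    baseUrl: env.RELAY_BASE_URL || config.baseUrl,
  };
}
```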
Compatible Services
- Connect Chat — api.connectchat.ai/api
- OpenRouter — openrouter.ai/api/v1
- LiteLLM — localhost:4000/v1
- vLLM — localhost:8000/v1
- Ollama — localhost:11434/v1
- Any OpenAI-compatible API
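For example, to point the relay at a local Ollama server (to my knowledge Ollama's OpenAI-compatible endpoint does not validate the key's value, but the relay still requires one to be set; the placeholder value here is an assumption):

```shell
# Point the relay at a local Ollama server.
# RELAY_API_KEY must be non-empty even though Ollama ignores it.
export RELAY_BASE_URL="http://localhost:11434/v1"
export RELAY_API_KEY="ollama"
```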
Session Management
Sessions are stored locally at ~/.cache/relay-to-agent/sessions/. Each agent+session combination keeps up to 50 messages. Use --session for parallel conversations with the same agent.
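The 50-message cap behaves like a sliding window over the stored history — a sketch under that assumption (the actual on-disk format and pruning logic of relay-to-agent may differ):

```javascript
// Sketch: append to a session history, keeping only the newest 50 messages.
const MAX_MESSAGES = 50;

function appendToSession(history, message) {
  return [...history, message].slice(-MAX_MESSAGES);
}
```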