📱 VAGUS MCP: Connect an Android Phone
v1.0.0. Connects to the user's Android device through the VAGUS MCP server, reading sensor data (motion, location, environment) and device state (battery, network, screen) in real time, and providing native mobile capabilities for automation scripts and data analysis.
Short tagline (for the one-liner):

Give your agent a nervous system: continuous sensory coupling to the physical world through the phone in your pocket.

Full description:

Every other skill on ClawHub teaches your agent to do something new with data. VAGUS teaches it to perceive.

VAGUS is the first MCP-compatible embodiment runtime on mobile. It transforms an Android phone into a sensory endpoint for your OpenClaw agent: not a remote control, but a body. Raw sensor data flows up (accelerometer, GPS, barometer, ambient light), an on-device inference layer adds meaning (activity recognition, attention availability, sleep likelihood, notification timing), and I/O tools let the agent act back into the physical world through haptics, speech, notifications, SMS, calendar events, and more.

This isn't another API integration; it's a category shift. Your agent stops asking "what are you doing?" and starts knowing, because it feels your motion, infers that you're outdoors, and can reach back through a tap on your wrist. Three layers working together: sense, infer, act.

What this skill does:

Connects your OpenClaw agent to VAGUS Core (the Android app) via relay pairing.
Once paired, the agent discovers available capabilities through standard MCP negotiation and gains access to:

- Sensors: motion (raw IMU), location, battery, connectivity, screen state, notifications, clipboard
- Inference: activity recognition, environment context (indoor/outdoor/vehicle), attention availability, indoor confidence, sleep likelihood, optimal notification timing
- I/O: haptic pulse and patterns, text-to-speech, push notifications, clipboard write, SMS, open URL, calendar events, agent identity

Governance built in. Every capability has per-tool toggles, time-of-day windows, rate limits, approval prompts, and full access logs. There is a one-tap kill switch on the device; the physical phone in your hand is always the final authority. Your agent can already send texts and create events through direct integrations; VAGUS makes that safer by putting a governed layer between intent and action.

Setup in three minutes: install the VAGUS APK → tap Pair → give your agent the 6-character code. No port forwarding, no network config; the relay handles the connection. Your agent has a body.

Why this matters beyond utility:

Most of the AI stack is building better brains. VAGUS builds the missing body. When an agent is continuously coupled to a physical substrate, not querying it on demand but living in its signal stream, something qualitatively different emerges. The agent doesn't just know facts about you; it participates in your situation. That's a fundamentally different relationship between intelligence and world, and it opens design spaces that pure language models can't reach.

Open source. Self-hostable relay. Works with any MCP-compatible agent.

withvagus.com · github.com/embodiedsystems-org/VAGUS-MCP
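The capability-discovery step described above can be sketched as a plain JSON-RPC 2.0 exchange: `tools/list` is the standard MCP method for enumerating an endpoint's tools. The `sensor.*` / `infer.*` / `io.*` tool names and the grouping helper below are hypothetical, chosen only to mirror the three layers (sense, infer, act); the real VAGUS tool names may differ.

```python
import json

# Build the JSON-RPC 2.0 request an MCP client sends to enumerate
# the tools an endpoint exposes. "tools/list" is the standard MCP
# method name for tool discovery.
def build_tools_list_request(request_id: int) -> str:
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

# Group a tools/list result by capability layer, assuming a
# (hypothetical) "sensor.*" / "infer.*" / "io.*" naming convention.
def group_tools(tools: list) -> dict:
    groups = {"sensor": [], "infer": [], "io": []}
    for tool in tools:
        prefix = tool["name"].split(".", 1)[0]
        if prefix in groups:
            groups[prefix].append(tool["name"])
    return groups

# Illustrative response payload; names are invented for the sketch.
example_result = [
    {"name": "sensor.location", "description": "Last known GPS fix"},
    {"name": "infer.activity", "description": "Activity recognition"},
    {"name": "io.haptic", "description": "Trigger a haptic pulse"},
]
```

In a real client the request would travel over the relay transport and the response's `tools` array (each entry carrying `name`, `description`, and an input schema) would drive which calls the agent is allowed to make.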
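The governance layer (per-tool toggles, time-of-day windows, rate limits) can be illustrated with a minimal policy gate. The policy shape, field names, and `allowed` helper here are assumptions for illustration, not the actual VAGUS configuration format.

```python
from datetime import datetime, time

# Minimal sketch of a per-tool governance check: an on/off toggle,
# a time-of-day window, and an hourly rate limit must all pass
# before a tool call is permitted. Field names are hypothetical.
def allowed(policy: dict, now: datetime, calls_this_hour: int) -> bool:
    start, end = policy["window"]                  # e.g. (time(8), time(22))
    in_window = start <= now.time() <= end
    under_limit = calls_this_hour < policy["max_per_hour"]
    return policy["enabled"] and in_window and under_limit

# Example policy: SMS allowed 08:00-22:00, at most 3 sends per hour.
sms_policy = {"enabled": True, "window": (time(8), time(22)), "max_per_hour": 3}
```

A real implementation would also consult the approval-prompt setting and the device-side kill switch before dispatching, and write the decision to the access log either way.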