📦 Llama Llama3 — Run the Llama 3 family locally

v1.0.1

Run Llama 3.3/3.2/3.1 across a fleet of local devices with a single command. Requests are automatically routed to the best available hardware, so you can use Meta's full open-source model lineup without the cloud.

by @twinsgeeks (Twin Geeks)
Last updated: 2026/4/4
Security scan: VirusTotal · Pending
OpenClaw review: Safe (high confidence)
Assessment advice
This skill is internally consistent with being a local fleet router, but you should still do basic hygiene before installing:
1) Inspect the PyPI package 'ollama-herd' and the linked GitHub repository to confirm the code matches the docs and that model downloads are interactive as stated.
2) Run the software in an isolated/test environment first (or a VM) to verify it only listens on localhost or your intended network interfaces.
3) Review ~/.fleet-manager/ contents and logs for any sensitive da...
Detailed analysis
Purpose and capabilities
Name/description (a local fleet router for Llama models) lines up with what's requested and documented: the SKILL.md instructs installing a herd router package, running local binaries (herd, herd-node), and talking to localhost endpoints. Required binaries (curl/wget, optional python/pip) are appropriate. Declared config paths (~/.fleet-manager/latency.db and logs/herd.jsonl) are consistent with a fleet manager that records latency and logs.
Instruction scope
SKILL.md is instruction-only and stays within the stated purpose: it tells the operator to pip install the herd package, run herd and herd-node, and call local HTTP endpoints. It does not instruct reading arbitrary user files or exfiltrating data. One point to note: metadata lists fleet config paths (logs/db) which are sensitive — the doc warns not to modify them, but if installed, the herd software will likely read/write those files. Verify that behavior in the package source before trusting logs/latency data.
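As a minimal sketch of that verification, the helper below (hypothetical; not part of ollama-herd) shows how one could confirm that a base URL found in the skill's config or logs really points at loopback before trusting that the router only talks to localhost:

```python
import ipaddress
from urllib.parse import urlparse

def is_loopback_url(url: str) -> bool:
    """Return True only if the URL's host is localhost or a loopback IP.

    The listing claims the skill calls only local HTTP endpoints, so a
    non-loopback base URL in its configuration would be a red flag.
    """
    host = urlparse(url).hostname or ""
    if host == "localhost":
        return True
    try:
        return ipaddress.ip_address(host).is_loopback
    except ValueError:
        return False  # a remote hostname, not a loopback literal

print(is_loopback_url("http://127.0.0.1:11434/api"))  # True
print(is_loopback_url("http://10.0.0.5:11434"))       # False
```

The endpoint paths themselves are undocumented in this listing, so the check validates only the host portion of whatever URLs the installed package is configured with.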
Installation mechanism
There is no formal install spec in the registry (instruction-only), but the SKILL.md recommends 'pip install ollama-herd' from PyPI. Installing third-party packages from PyPI is a normal distribution route but carries modest risk; inspect the PyPI package and the linked GitHub repo before installing. No downloads from unknown personal servers or archive extracts are specified.
Credential requirements
The skill requests no environment variables, no credentials, and no system config paths beyond its own fleet config directory. That is proportionate for a local fleet router that operates on localhost and local devices.
Persistence and privileges
always is false and the skill does not request system-wide privileges or modifications to other skills. The declared config paths imply it will maintain local state under ~/.fleet-manager/, which is expected for this type of software; ensure you are comfortable with that directory being created/used.
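If you do install it, the local state under ~/.fleet-manager/ can be triaged without trusting the software. The sketch below assumes only that logs/herd.jsonl holds one JSON object per line (the actual schema is undocumented here) and flags any non-loopback IPv4 addresses the fleet manager has recorded; on a multi-device fleet, LAN addresses may be expected, but anything else deserves a closer look:

```python
import json
import re

# Matches bare or URL-embedded IPv4 literals in a log line.
HOST_RE = re.compile(r"(?:https?://)?(\d{1,3}(?:\.\d{1,3}){3})")

def non_loopback_hosts(jsonl_lines):
    """Collect non-loopback IPv4 addresses seen in JSONL log lines."""
    hits = set()
    for line in jsonl_lines:
        try:
            json.loads(line)  # skip lines that are not valid JSON
        except json.JSONDecodeError:
            continue
        for ip in HOST_RE.findall(line):
            if not ip.startswith("127."):
                hits.add(ip)
    return sorted(hits)

sample = [
    '{"event": "route", "target": "http://127.0.0.1:11434"}',
    '{"event": "route", "target": "http://192.168.1.20:11434"}',
    "not json",
]
print(non_loopback_hosts(sample))  # ['192.168.1.20']
```

Reading the log this way keeps the audit one-directional: you inspect what the router wrote without modifying the fleet config files, which the docs warn against touching.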
Security comes in layers; review the code before running it.

Runtime dependencies

🖥️ OS: macOS · Linux · Windows

Versions

latest: v1.0.1 (2026/3/31) · Pending

Install commands

Official: `npx clawhub@latest install llama-llama3`
Mirror (accelerated): `npx clawhub@latest install llama-llama3 --registry https://cn.longxiaskill.com`
Data source: ClawHub · Chinese localization: 龙虾技能库