Overview
The seven-layer architecture automatically transforms business questions into structured analytical reports:
L1 Persona Understanding → L2 Data Scope Acquisition → L3 Analysis Data Scoping → L4 Problem Decomposition → L5 Method Selection → L6 Execution & Computation → L7 Result Output
The agent writes Python analysis code as needed; no pre-packaged code library is required. Each layer acts as a quality gate for the next.
Trigger Scenarios
- User requests data/business metric analysis
- User wants to understand causes/trends/patterns in data
- User needs a data analysis report
- User mentions "analyze", "attribute", "predict", "cluster", "trend", "data report"
- User presents business questions requiring data-driven insights
Core Principles
- Code on Demand: Use pandas/numpy/scipy/sklearn/statsmodels to write analysis scripts tailored to the actual data.
- Data-Aware Routing: Method selection jointly weighs question semantics, data structure, and problem type.
- Quality Gate: Every conclusion output must include a confidence annotation and a cross-validation description.
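The quality-gate principle can be sketched as a small helper; the function name and confidence levels below are illustrative assumptions, not part of the skill:

```python
def annotate_conclusion(statement: str, confidence: str, cross_check: str) -> str:
    """Hypothetical helper enforcing the quality-gate rule: every conclusion
    carries a confidence level and a cross-validation note."""
    assert confidence in {"high", "medium", "low"}
    return f"{statement} [confidence: {confidence}; cross-check: {cross_check}]"

print(annotate_conclusion(
    "Q3 churn rose 4pp, driven mainly by the app channel",
    "medium",
    "consistent under both cohort and funnel decompositions",
))
```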
Seven-Layer Process
L1 — Persona Understanding
Identify the questioner's role, decision-making scenario, and success criteria.
Key Questions:
- Role: Executive, analyst, product manager, operations personnel?
- Decision Scenario: Strategic planning, operational review, special investigation?
- Success Criteria: What metrics/goals define "good"?
- Data Literacy: How technical should the report be?
Output: Structured persona summary (role, decision scenario, success criteria).
L2 — Data Scope Acquisition
Discover and validate available data sources.
Steps:
- Ask the user for provided data (files, databases, APIs)
- Upon receiving files, inspect the schema, sample the data, and check types and missing values
- Identify relevant fields/tables
- Confirm data timeliness (update frequency)
Output: Data inventory (source, schema, quality notes).
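The inspection steps above can be sketched with pandas; the sample data and column names are illustrative assumptions standing in for a user-provided file:

```python
import io
import pandas as pd

# Illustrative CSV standing in for a user-provided file
raw = io.StringIO(
    "order_id,region,amount,order_date\n"
    "1,North,120.5,2024-01-03\n"
    "2,South,,2024-01-04\n"
    "3,North,98.0,2024-01-05\n"
)
df = pd.read_csv(raw, parse_dates=["order_date"])

# Schema view: column names, dtypes, and missing-value counts
inventory = pd.DataFrame({
    "dtype": df.dtypes.astype(str),
    "missing": df.isna().sum(),
})
print(inventory)

# Sample data and timeliness (latest record date)
print(df.head(2))
print("latest record:", df["order_date"].max().date())
```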
L3 — Analysis Data Scoping
Narrow down from "all available data" to "data relevant to the question".
Steps:
- Map business problem keywords to required data dimensions
- Filter to only the necessary fields/records
- Define the time window, aggregation granularity, and filter conditions
- Identify potential confounding factors
Output: Data scope specification (dimensions, time window, filters, aggregation granularity).
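A minimal sketch of applying such a scope specification with pandas; the field names and the 2024 Q1 window are assumptions for illustration:

```python
import pandas as pd

# Illustrative transaction data; fields are assumptions
df = pd.DataFrame({
    "date": pd.to_datetime(["2024-01-05", "2024-02-10", "2024-03-15", "2023-12-20"]),
    "region": ["North", "South", "North", "North"],
    "channel": ["web", "app", "web", "web"],
    "amount": [100.0, 250.0, 175.0, 90.0],
})

# Scope: 2024 Q1 time window, North region only, monthly aggregation
window = (df["date"] >= "2024-01-01") & (df["date"] <= "2024-03-31")
scoped = df.loc[window & (df["region"] == "North"), ["date", "channel", "amount"]]
monthly = scoped.groupby(scoped["date"].dt.to_period("M"))["amount"].sum()
print(monthly)
```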
L4 — Problem Decomposition
Break down the business problem into analyzable sub-problems.
Framework Selection:
- MECE (Mutually Exclusive, Collectively Exhaustive): Revenue = Transaction Value × Customer Flow
- Drill-down: Decompose by region → channel → product layer by layer
- Before/After: Pre-change vs. post-change
- Cohort: Group by time/attributes, compare trajectories
- Funnel: Step-by-step conversion analysis
- Hypothesis Tree: Structured hypothesis testing
Output: Problem tree (decomposition structure with clear hypothesis statements).
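A MECE decomposition can be sanity-checked arithmetically: the sub-components must reproduce the total with no overlap and no gap. The segment figures below are hypothetical:

```python
# Hypothetical figures checking that a MECE split reproduces the total:
# Revenue = Average Transaction Value x Customer Flow, per segment
segments = {
    "online":  {"avg_txn": 80.0, "customers": 500},
    "offline": {"avg_txn": 120.0, "customers": 300},
}
revenue_by_segment = {k: v["avg_txn"] * v["customers"] for k, v in segments.items()}
total = sum(revenue_by_segment.values())
print(revenue_by_segment, "total:", total)

# MECE check: segment revenues must sum to the reported total (assumed 76000)
assert total == 76_000
```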
L5 — Method Selection
Select analysis method based on data-aware routing. See references/routing.md.
Three-Dimensional Routing:
- Question Semantics: Growth, churn, conversion, risk, attribution...
- Data Structure: Time series, cross-sectional, panel, hierarchical...
- Problem Type: Descriptive, diagnostic, predictive, prescriptive
See references/methods.md (details on 15 methods).
Output: Method plan (primary method + an alternative cross-validation method).
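The three-dimensional routing can be sketched as a lookup table; the route entries and method names below are illustrative assumptions, and the real routing logic lives in references/routing.md:

```python
# Toy routing table keyed on (semantics, data structure, problem type)
ROUTES = {
    ("churn", "panel", "diagnostic"): "survival analysis + logistic regression",
    ("growth", "time series", "predictive"): "trend model",
    ("conversion", "cross-sectional", "descriptive"): "funnel breakdown",
}

def select_method(semantics: str, structure: str, problem_type: str) -> tuple[str, str]:
    """Return (primary, fallback) methods for the three routing dimensions."""
    primary = ROUTES.get((semantics, structure, problem_type), "exploratory analysis")
    fallback = "nonparametric cross-check"  # alternative method for cross-validation
    return primary, fallback

print(select_method("churn", "panel", "diagnostic"))
```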
L6 — Execution & Computation
Execute the analysis using Python. Process:
- Environment Check: run pip list to confirm pandas/numpy availability; install any missing packages
- Data Loading: Load data according to the L3 scope specification
- Data Cleaning: Handle missing values, outliers, and type conversions
- Analysis Execution: Write and execute a Python script for the selected method(s)
- Cross-Validation: Run a comparison using the alternative method (see references/quality.md)
- Result Interpretation: Extract key figures, statistics, and effect sizes
Coding Standards:
- Use pandas for data manipulation, scipy.stats for statistical tests, and statsmodels/sklearn for modeling
- Print results with clear labels; the script output directly constitutes report content
- Handle edge cases (empty data, all-null columns, single-category variables)
- Output structured text rather than raw numbers
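A sketch of an analysis function that follows these standards, including the edge-case guards listed above; the function name and demo data are illustrative assumptions:

```python
import pandas as pd
from scipy import stats

def compare_groups(df: pd.DataFrame, group_col: str, value_col: str) -> str:
    """Two-group comparison with edge-case guards, returning labeled text."""
    if df.empty:
        return "no data in scope"
    if df[value_col].isna().all():
        return f"column {value_col} is entirely null"
    groups = [g.dropna() for _, g in df.groupby(group_col)[value_col]]
    if len(groups) < 2:
        return "only one category; comparison not applicable"
    t, p = stats.ttest_ind(groups[0], groups[1], equal_var=False)
    return f"t={t:.2f}, p={p:.3f}"  # labeled output feeds the report directly

demo = pd.DataFrame({"grp": ["A"] * 5 + ["B"] * 5,
                     "val": [1.0, 1.2, 0.9, 1.1, 1.0, 2.0, 2.1, 1.9, 2.2, 2.0]})
print(compare_groups(demo, "grp", "val"))
```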
L7 — Result Output
Output the analytical report in Feishu document format. See references/feishu-report.md.
Report Structure:
- Analysis Overview — Executive summary (1 paragraph)
- Key Findings — Data-backed key in