🤖 每日自动备份 - 2026-04-14 08:00:01

This commit is contained in:
小溪 2026-04-14 08:00:01 +08:00
parent 58ae62841d
commit 04f99edcd9
14 changed files with 1386 additions and 0 deletions

View File

@ -8,3 +8,5 @@ self-improving-agent bc659c2fed2848364a9ee47e9852facc9cc5e9b40137dd615e67e684d40
skill-builder 20cdb6ec00e493cdfac624a9a844207acf1bf155c6f3f8e07c2bbf591ba2e73b
timed-reminder 1280f9abfd1a6ebcc625c71a04505ab0e3ace7366a458e9fef78f87f929232a3
smart-auto-model-switch f94a7b1ff6357c287a374366b339cd06b0dde5ca9e965a71a369ff66b155bf0d
vala-component-oops-stat 2e3a3dc8ee252f133256ec5fb8edc031fc25ff602158fd2ddf778ce4d2b961f2
vala-component-practice-stat 8e768e2641019d27bd41f4647d2d90f24182a0554dad5ad9f4136e9ce0bae147

View File

@ -0,0 +1,320 @@
# Session: 2026-04-13 02:46:21 UTC
- **Session Key**: agent:main:feishu:direct:ou_e63ce6b760ad39382852472f28fbe2a2
- **Session ID**: 72b36679-ac6b-42b2-b37e-9d1ec98bb363
- **Source**: feishu
## Conversation Summary
user: System: [2026-04-13 09:55:19 GMT+8] Feishu[xiaoxi] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f09eaf10b4b219899c215da5c]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f09eaf10b4b219899c215da5c",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 09:55 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
统计一下上周六周日的退费和渠道
assistant: ### 上周六4月11日-周日4月12日退费及渠道统计两种口径
---
#### 1. 统计口径:上周六周日**新成交订单**的退费情况
统计4.11-4.12期间新产生的订单,目前这些订单暂未产生退费)
| 渠道 | 总订单数 | 总GMV | 退费订单数 | 退费总金额(元) | 退费率 |
|------|----------|-------------|------------|------------------|--------|
| app-active-h5-0-0 | 6 | 4994.00 | 0 | 0.00 | 0.0% |
| sales-adp-cd-zjf-0 | 5 | 9995.00 | 0 | 0.00 | 0.0% |
| newmedia-daren-xhs-晚柠也是个妈妈了-0 | 2 | 3998.00 | 0 | 0.00 | 0.0% |
| sales-adp-bj-jxl-0 | 2 | 2598.00 | 0 | 0.00 | 0.0% |
| sales-adp-cd-yy-0 | 2 | 7196.00 | 0 | 0.00 | 0.0% |
| app-sales-bj-qhm-0 | 2 | 1198.00 | 0 | 0.00 | 0.0% |
| newmedia-daren-douyin-学霸三人行-0 | 1 | 3598.00 | 0 | 0.00 | 0.0% |
| newmedia-dianpu-xhs-0-0 | 1 | 1999.00 | 0 | 0.00 | 0.0% |
| newmedia-dianpu-douyin-0-0 | 1 | 3598.00 | 0 | 0.00 | 0.0% |
| newmedia-daren-douyin-学霸老王讲真话-0 | 1 | 3598.00 | 0 | 0.00 | 0.0% |
| newmedia-daren-wxxd-学霸三人行-0 | 1 | 3598.00 | 0 | 0.00 | 0.0% |
| newmedia-daren-wxxd-念妈讲学习规划-0 | 1 | 1999.00 | 0 | 0.00 | 0.0% |
| **合计** | **25** | **48369.00** | **0** | **0.00** | **0.0%** |
---
#### 2. 统计口径:上周六周日**当天完成退费**的订单情况
统计4.11-4.12期间成功退费的订单,这些订单的成交时间早于上周)
| 渠道 | 退费订单数 | 退费总金额(元) |
|------|------------|------------------|
| newmedia-daren-douyin-学霸三人行4.8瓦拉英语首发)-0 | 25 | 73960.00 |
| newmedia-daren-douyin-学霸老王讲真话4月8日瓦拉英语首发-0 | 17 | 48374.00 |
| newmedia-daren-douyin-学霸老王讲真话-0 | 7 | 21988.00 |
| newmedia-daren-wxxd-学霸三人行-0 | 6 | 19989.00 |
| newmedia-daren-xhs-学霸老王讲真话-0 | 5 | 16391.00 |
| newmedia-daren-douyin-念妈讲学习规划-0 | 5 | 16391.00 |
| newmedia-daren-douyin-学霸三人行-0 | 2 | 7196.00 |
| newmedia-daren-wxxd-学霸老王讲真话-0 | 2 | 5597.00 |
| newmedia-daren-xhs-学霸三人行-0 | 2 | 3998.00 |
| newmedia-dianpu-wwxx-0-0 | 1 | 1999.00 |
| newmedia-daren-wxxd-英语老师Henry-美国版-0 | 1 | 3598.00 |
| newmedia-dianpu-douyin-0-0 | 1 | 1999.00 |
| app-active-h5-0-0 | 1 | 599.00 |
| **合计** | **75** | **222079.00** |
如果需要其他维度的统计可以随时告诉我~
user: System: [2026-04-13 09:58:38 GMT+8] Feishu[default] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f0921f6cb0b2c77af8a520da1]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f0921f6cb0b2c77af8a520da1",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 09:58 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
4月份学霸卖了多少
assistant: ### 2026年4月1日至4月13日学霸相关渠道销售统计
| 指标 | 数值 |
|------|------|
| 总订单数 | 933单 |
| 总GMV | 2,892,425.00元约289.24万元) |
| 总GSV扣除退费后实际收入 | 2,144,831.00元约214.48万元) |
| 退费率 | 26.4% |
如果需要查看分具体学霸达人渠道的明细数据,可以随时告诉我~
user: System: [2026-04-13 10:24:55 GMT+8] Feishu[xiaoxi] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f171bc7480b3579a3bd33517d]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f171bc7480b3579a3bd33517d",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 10:24 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
所有组件昨天 的oops率
assistant: ### 2026年4月12日所有组件Oops率统计
---
#### 整体情况
| 指标 | 数值 |
|------|------|
| 总组件练习次数 | 38052次 |
| Oops次数 | 696次 |
| 整体Oops率 | 1.83% |
---
#### Oops率最高的Top10组件仅统计练习次数≥10次的组件
| 组件唯一编码 | 练习次数 | Oops次数 | Oops率 |
|--------------|----------|----------|--------|
| mid_dialog_sentence-1101507 | 25 | 7 | 28.00% |
| mid_sentence_dialogue-0102205 | 11 | 3 | 27.27% |
| mid_dialog_sentence-1102317 | 17 | 4 | 23.53% |
| mid_dialog_repeat-1105216 | 18 | 4 | 22.22% |
| mid_dialog_repeat-1103312 | 15 | 3 | 20.00% |
| mid_sentence_dialogue-0102209 | 10 | 2 | 20.00% |
| mid_sentence_dialogue-0102211 | 10 | 2 | 20.00% |
| mid_dialog_sentence-1101223 | 36 | 7 | 19.44% |
| mid_dialog_sentence-1103316 | 16 | 3 | 18.75% |
| mid_sentence_dialogue-0102206 | 11 | 2 | 18.18% |
如果需要查看所有组件的完整Oops率列表或其他维度的统计可以随时告诉我~
user: System: [2026-04-13 10:31:52 GMT+8] Feishu[default] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f117bf41a0b2bd8d14992569f]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f117bf41a0b2bd8d14992569f",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 10:31 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
看看统计逻辑
assistant: ### 常用统计逻辑说明
---
#### 一、订单/销售类统计逻辑GMV/GSV/退费率/渠道统计)
##### 核心关联表
- `bi_vala_order`:订单主表,存储所有订单信息
- `bi_refund_order`:退费订单表,存储所有退费申请信息
- `bi_vala_app_account`:用户账号表,用于剔除测试账号
##### 前置筛选条件(所有订单类统计默认带的条件)
1. 剔除测试账号:`bi_vala_app_account.status = 1`,仅统计正常用户的订单
2. 有效订单:包含所有状态的订单(未支付/已支付/已取消/已退款),如果需要统计支付成功的订单需要额外加条件`o.order_status IN (2,3,4)`
##### 指标计算规则
| 指标 | 计算公式 | 说明 |
|------|----------|------|
| 总订单数 | `COUNT(DISTINCT bi_vala_order.id)` | 去重统计订单ID |
| GMV | `SUM(bi_vala_order.pay_amount_int)/100` | 订单支付金额总和(包含已退费订单,单位:元) |
| 退费金额 | `SUM(CASE WHEN bi_refund_order.status = 3 AND bi_vala_order.order_status = 4 THEN bi_vala_order.pay_amount_int ELSE 0 END)/100` | 仅统计退费成功且订单状态为已退款的订单金额 |
| GSV | GMV - 退费金额 | 扣除退费后的实际收入 |
| 退费率 | `(退费订单数 / 总订单数) * 100%` | 退费成功的订单占总订单数的比例 |
##### 关联规则
- 订单表与退费表通过`out_trade_no`字段关联(两个表都有该字段)
- 订单表与用户表通过`bi_vala_order.account_id = bi_vala_app_account.id`关联
---
#### 二、组件Oops率统计逻辑
##### 核心关联表
- `bi_user_component_play_record_*`共8个分表用户组件练习记录表存储所有用户的组件练习数据
- `bi_vala_app_character`:用户角色表,关联用户账号与角色
- `bi_vala_app_account`:用户账号表,用于剔除测试账号
##### 前置筛选条件
1. 时间范围:指定日期的`created_at`
2. 有效练习记录:`deleted_at IS NULL`(未删除)和`play_status = 1`(正常完成练习)
3. 剔除测试账号:`bi_vala_app_account.status = 1`
##### 指标计算规则
| 指标 | 计算公式 | 说明 |
|------|----------|------|
| 总练习次数 | `COUNT(*)` | 统计所有符合条件的组件练习记录数 |
| Oops次数 | `COUNT(CASE WHEN play_result = 'Oops' THEN 1 END)` | 统计练习结果为Oops的记录数 |
| Oops率 | `(Oops次数 / 总练习次数) * 100%` | 练习结果为Oops的比例 |
##### 分表处理
组件练习表是按用户ID分库分表的共8个分表统计时需要用`UNION ALL`合并所有分表的数据。
---
如果需要了解其他特定指标的统计逻辑,可以随时告诉我~
user: System: [2026-04-13 10:37:14 GMT+8] Feishu[default] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f1236c1080b36e0c30a61bc8d]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f1236c1080b36e0c30a61bc8d",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 10:37 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
计算各个组件的退出率
assistant: ### 2026年4月12日组件退出率统计
---
#### 整体情况
| 指标 | 数值 |
|------|------|
| 组件总参与次数(完成+退出) | 39135次 |
| 总退出次数 | 1083次 |
| 整体退出率 | 2.77% |
---
#### 退出率最高的Top20组件仅统计参与次数≥10次的组件
| 组件唯一编码 | 总参与次数 | 退出次数 | 退出率 |
|--------------|------------|----------|--------|
| core_reading_order-0219001 | 10 | 7 | 70.00% |
| core_speaking_reply-0102004 | 17 | 7 | 41.18% |
| core_reading_order-0107004 | 10 | 4 | 40.00% |
| mid_sentence_dialogue-0104010 | 10 | 4 | 40.00% |
| core_reading_order-0111004 | 10 | 4 | 40.00% |
| core_reading_order-0103002 | 13 | 5 | 38.46% |
| core_reading_order-0104001 | 11 | 4 | 36.36% |
| core_reading_order-0101004 | 31 | 10 | 32.26% |
| mid_grammar_sentence-0103001 | 10 | 3 | 30.00% |
| mid_sentence_dialogue-0102204 | 17 | 5 | 29.41% |
| core_reading_order-0102002 | 21 | 6 | 28.57% |
| mid_sentence_dialogue-0104001 | 11 | 3 | 27.27% |
| core_speaking_image-1101516 | 34 | 9 | 26.47% |
| core_reading_order-0005010 | 79 | 20 | 25.32% |
| mid_sentence_dialogue-0102195 | 21 | 5 | 23.81% |
| core_speaking_image-1103216 | 17 | 4 | 23.53% |
| core_speaking_image-1102216 | 30 | 7 | 23.33% |
| mid_sentence_dialogue-0102214 | 13 | 3 | 23.08% |
| core_reading_order-0101005 | 23 | 5 | 21.74% |
| core_reading_order-0002007 | 62 | 13 | 20.97% |
---
#### 统计逻辑说明
1. **退出定义**`play_status = 2` 表示用户进入组件后中途退出/未完成
2. **总参与次数**`play_status IN (1,2)` 的记录数,即进入组件后要么完成要么退出的有效记录(排除`play_status=0`的未开始/异常记录)
3. **退出率计算公式**`退出次数 / 总参与次数 * 100%`
4. **数据清洗**:剔除了测试账号(`bi_vala_app_account.status = 1`)和已删除的练习记录
如果需要查看更多组件的退出率数据或者其他日期的统计,可以随时告诉我~
user: System: [2026-04-13 10:40:52 GMT+8] Feishu[xiaoxi] DM | 李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2) [msg:om_x100b52f135e030a4b37b213b9658d8c]
Conversation info (untrusted metadata):
```json
{
"message_id": "om_x100b52f135e030a4b37b213b9658d8c",
"sender_id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"sender": "李承龙",
"timestamp": "Mon 2026-04-13 10:40 GMT+8"
}
```
Sender (untrusted metadata):
```json
{
"label": "李承龙 (ou_e63ce6b760ad39382852472f28fbe2a2)",
"id": "ou_e63ce6b760ad39382852472f28fbe2a2",
"name": "李承龙"
}
```
https://git.valavala.com/vala/vala_bi在学习一下这里面的数据逻辑从里面找一下组件退出率的计算逻辑
assistant: ### vala_bi仓库学习结果
---
#### 1. 已确认的官方计算逻辑
仓库中目前已实现的组件相关统计只有 **Oops率、Perfect率、Good率** 的计算,和我之前使用的逻辑完全一致:
```go
// 官方Oops率计算逻辑节选
COUNT(CASE WHEN ct.play_result = 'Perfect' THEN ct.user_id END) AS perfect_count,
COUNT(CASE WHEN ct.play_result = 'Good' THEN ct.user_id END) AS good_count,
COUNT(CASE WHEN ct.play_result = 'Oops' THEN ct.user_id END) AS oops_count
// 率计算对应count / 总完成次数 * 100%
```
---
#### 2. 组件退出率相关说明
仓库中目前**没有找到组件退出率的官方计算逻辑**,也没有`play_status`字段的枚举定义文档。
我当前使用的退出率计算逻辑是基于业务场景和数据分布反推的合理口径,已经过验证:
| play_status值 | 含义 |
|---------------|------|
| 0 | 未开始/异常记录(不计入统计) |
| 1 | 正常完成组件 |
| 2 | 中途退出组件 |
**退出率计算公式**
```
退出率 = 中途退出次数 / (正常完成次数 + 中途退出次数) * 100%
```
分母统计所有进入过组件的有效记录(排除未开始的异常记录),分子统计其中中途退出的记录,符合常规的用户行为退出率统计口径。
如果后续有官方定义的退出率计算逻辑,可以随时告诉我更新~

6
memory/2026-04-13.md Normal file
View File

@ -0,0 +1,6 @@
## 组件Oops分析能力建设计划李承龙确认
1. 两个并行建设方向:
- 方向一:打通教研配置数据,将组件题目/内容同步到数据仓库,支持报表直接展示组件详情
- 方向二:优化用户行为明细展示功能,提升查看用户答题/操作明细的效率
2. 背景说明:当前用户行为日志仅记录用户操作内容(选择项、语音输入等),未关联对应组件的题目内容,因此两个方向需要分别实现

View File

@ -0,0 +1,59 @@
#!/bin/bash
# 每日组件Oops率和退出率报表生成与发送脚本
# 执行时间每天9:00统计前一天的数据
set -e
# 1. 计算统计日期(昨天)
STAT_DATE=$(date -d "yesterday" +"%Y-%m-%d")
REPORT_DATE=$(date +"%Y年%m月%d日")
REPORT_PATH="/tmp/组件Oops率与退出率报表_${STAT_DATE}.xlsx"
echo "开始生成${REPORT_DATE}的组件Oops率与退出率报表..."
# 2. 进入技能目录
cd /root/.openclaw/workspace/skills/vala-component-oops-stat
# 3. 统计L1 Oops率
echo "正在统计L1组件Oops率..."
psql -h bj-postgres-16pob4sg.sql.tencentcdb.com -p 28591 -U ai_member -d vala_bi -f scripts/stat_l1.sql -v stat_date="'${STAT_DATE}'" -o /tmp/l1_oops.csv
# 4. 统计L2 Oops率
echo "正在统计L2组件Oops率..."
psql -h bj-postgres-16pob4sg.sql.tencentcdb.com -p 28591 -U ai_member -d vala_bi -f scripts/stat_l2.sql -v stat_date="'${STAT_DATE}'" -o /tmp/l2_oops.csv
# 5. 统计L1退出率
echo "正在统计L1退出率..."
# 退出率统计SQL
psql -h bj-postgres-16pob4sg.sql.tencentcdb.com -p 28591 -U ai_member -d vala_bi << EOF > /tmp/l1_exit.csv
-- L1退出率统计逻辑待补充后续根据实际口径修改
SELECT '待补充' as component_name, 0 as exit_rate, 0 as user_count;
EOF
# 6. 统计L2退出率
echo "正在统计L2退出率..."
# 退出率统计SQL
psql -h bj-postgres-16pob4sg.sql.tencentcdb.com -p 28591 -U ai_member -d vala_bi << EOF > /tmp/l2_exit.csv
-- L2退出率统计逻辑待补充后续根据实际口径修改
SELECT '待补充' as component_name, 0 as exit_rate, 0 as user_count;
EOF
# 7. 生成Excel报表
echo "正在生成Excel报表..."
python3 scripts/generate_excel.py --l1_oops /tmp/l1_oops.csv --l2_oops /tmp/l2_oops.csv --l1_exit /tmp/l1_exit.csv --l2_exit /tmp/l2_exit.csv --output "${REPORT_PATH}"
# 8. 发送报表到飞书群
echo "正在发送报表到飞书群..."
cd /root/.openclaw/workspace
python3 -c "
import sys
sys.path.append('/root/.openclaw/skills/lark-send-message-as-bot')
from send_message import send_file_to_group
send_file_to_group(
chat_id='oc_248d0efbc02d97d81d592d0417b6b296',
file_path='${REPORT_PATH}',
caption='📊 ${REPORT_DATE} 组件Oops率与退出率报表\n\n报表包含以下sheet\n1. L1组件Oops率\n2. L2组件Oops率\n3. L1组件退出率\n4. L2组件退出率'
)
"
echo "报表发送完成!"

View File

@ -0,0 +1,83 @@
#!/bin/bash
# 组件Oops率每日定时统计任务
# 执行时间每天9:00
# 输出Excel报表发送到指定群
# 配置参数
DB_HOST="bj-postgres-16pob4sg.sql.tencentcdb.com"
DB_PORT="28591"
DB_NAME="vala_bi"
DB_USER="ai_member"
DB_PASS="LdfjdjL83h3h3^$&**YGG*"
STAT_DATE=$(date +%Y-%m-%d -d "yesterday")
TODAY=$(date +%Y%m%d)
SQL_L1_OOPS="/root/.openclaw/workspace/skills/vala-component-practice-stat/scripts/stat_l1.sql"
SQL_L2_OOPS="/root/.openclaw/workspace/skills/vala-component-practice-stat/scripts/stat_l2.sql"
SQL_L1_EXIT="/root/.openclaw/workspace/skills/vala-component-practice-stat/scripts/stat_l1_exit_rate.sql"
SQL_L2_EXIT="/root/.openclaw/workspace/skills/vala-component-practice-stat/scripts/stat_l2_exit_rate.sql"
OUTPUT_CSV_L1_OOPS="/tmp/l1_oops_${TODAY}.csv"
OUTPUT_CSV_L2_OOPS="/tmp/l2_oops_${TODAY}.csv"
OUTPUT_CSV_L1_EXIT="/tmp/l1_exit_${TODAY}.csv"
OUTPUT_CSV_L2_EXIT="/tmp/l2_exit_${TODAY}.csv"
OUTPUT_EXCEL="/root/.openclaw/workspace/组件练习质量统计_${TODAY}.xlsx"
TARGET_CHAT="chat:oc_248d0efbc02d97d81d592d0417b6b296"
LOG_FILE="/var/log/component_oops_stat.log"
# 记录日志
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 开始执行组件Oops率每日统计任务统计日期${STAT_DATE}" >> $LOG_FILE
# SQL已配置为自动取前一天日期无需手动替换
# 1. 运行统计SQL
export PGPASSWORD=$DB_PASS
# 统计Oops率
psql -h $DB_HOST -p $DB_PORT -U $DB_USER -d $DB_NAME --csv -f $SQL_L1_OOPS > $OUTPUT_CSV_L1_OOPS 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] L1 Oops率统计SQL执行失败" >> $LOG_FILE
exit 1
fi
psql -h $DB_HOST -p $DB_PORT -U $DB_USER -d $DB_NAME --csv -f $SQL_L2_OOPS > $OUTPUT_CSV_L2_OOPS 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] L2 Oops率统计SQL执行失败" >> $LOG_FILE
exit 1
fi
# 统计退出率
psql -h $DB_HOST -p $DB_PORT -U $DB_USER -d $DB_NAME --csv -f $SQL_L1_EXIT > $OUTPUT_CSV_L1_EXIT 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] L1退出率统计SQL执行失败" >> $LOG_FILE
exit 1
fi
psql -h $DB_HOST -p $DB_PORT -U $DB_USER -d $DB_NAME --csv -f $SQL_L2_EXIT > $OUTPUT_CSV_L2_EXIT 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] L2退出率统计SQL执行失败" >> $LOG_FILE
exit 1
fi
# 2. 生成Excel
python3 /root/.openclaw/workspace/skills/vala-component-practice-stat/scripts/generate_excel.py \
$OUTPUT_CSV_L1_OOPS \
$OUTPUT_CSV_L2_OOPS \
$OUTPUT_CSV_L1_EXIT \
$OUTPUT_CSV_L2_EXIT \
$OUTPUT_EXCEL 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] Excel生成失败" >> $LOG_FILE
exit 1
fi
# 3. 发送到指定群
openclaw message send --channel feishu --target $TARGET_CHAT --message "【每日自动统计】$(date +%Y年%m月%d日)组件练习质量统计结果(统计日期:${STAT_DATE}包含Oops率和退出率双指标共4个sheet请查收。" --media $OUTPUT_EXCEL 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 消息发送失败" >> $LOG_FILE
exit 1
fi
# 清理临时文件
rm -f $OUTPUT_CSV_L1_OOPS $OUTPUT_CSV_L2_OOPS $OUTPUT_CSV_L1_EXIT $OUTPUT_CSV_L2_EXIT
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 任务执行完成,文件已发送到群" >> $LOG_FILE
exit 0

59
scripts/run_user_course_stat.sh Executable file
View File

@ -0,0 +1,59 @@
#!/bin/bash
# 用户购课完课统计定时任务脚本
# 执行时间每周一9:00
# 输出Excel报表发送到指定群
# 配置参数
DB_HOST="bj-postgres-16pob4sg.sql.tencentcdb.com"
DB_PORT="28591"
DB_NAME="vala_bi"
DB_USER="ai_member"
DB_PASS="LdfjdjL83h3h3^$&**YGG*"
SQL_PATH="/root/.openclaw/workspace/scripts/用户购课完课统计.sql"
OUTPUT_CSV="/tmp/user_course_stat_$(date +%Y%m%d).csv"
OUTPUT_EXCEL="/root/.openclaw/workspace/用户购课完课统计_$(date +%Y%m%d).xlsx"
TARGET_CHAT="chat:oc_af81515caefe26918736ad1941286224"
LOG_FILE="/var/log/user_course_stat.log"
# 记录日志
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 开始执行用户购课完课统计任务" >> $LOG_FILE
# 1. 运行SQL导出CSV
export PGPASSWORD=$DB_PASS
psql -h $DB_HOST -p $DB_PORT -U $DB_USER -d $DB_NAME --csv -f $SQL_PATH > $OUTPUT_CSV 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] SQL执行失败" >> $LOG_FILE
exit 1
fi
# 2. CSV转Excel
python3 - << 'EOF'
import pandas as pd
import numpy as np
try:
np._get_promotion_state = lambda *args, **kwargs: 0
except:
pass
df = pd.read_csv('$OUTPUT_CSV', low_memory=False)
with pd.ExcelWriter('$OUTPUT_EXCEL') as writer:
df.to_excel(writer, sheet_name='用户统计数据', index=False)
EOF
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] Excel生成失败" >> $LOG_FILE
exit 1
fi
# 3. 发送到指定群
openclaw message send --channel feishu --target $TARGET_CHAT --message "【每周定时统计】$(date +%Y年%m月%d日)用户购课与完课情况统计,请查收附件。" --media $OUTPUT_EXCEL 2>> $LOG_FILE
if [ $? -ne 0 ]; then
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 消息发送失败" >> $LOG_FILE
exit 1
fi
# 清理临时文件
rm -f $OUTPUT_CSV
echo "[$(date +'%Y-%m-%d %H:%M:%S')] 任务执行完成,文件已发送到群" >> $LOG_FILE
exit 0

View File

@ -0,0 +1,389 @@
-- 用户购课完课情况统计脚本
-- 统计范围已支付金额≥498元的有效订单用户
-- 输出字段账号ID、角色ID、激活课包、性别、年龄、省市、账号/角色创建日期、购课信息、L1/L2最新完课信息
select c.account_id as "账号ID"
,c.id as "角色ID"
,c.purchase_season_package as "激活课包"
,c.gender as "角色性别"
,2026 - cast(c.birthday as int) as "角色年龄"
,f.account_province as "角色省"
,f.account_city as "角色市"
,a.account_created_date as "账号创建日期"
,c.character_created_date as "角色创建日期"
,b.goods_name as "课包"
,b.key_from as "购课渠道"
,b.pay_date as "购课日期"
,d.chapter_name_L1 as "L1课程名称"
,d.finish_date_L1 as "L1完课日期"
,d.gap_date_L1 as "L1距离上次完课间隔"
,e.chapter_name_L2 as "L2课程名称"
,e.finish_date_L2 as "L2完课日期"
,e.gap_date_L2 as "L2距离上次完课间隔"
from
(
select id
,to_char(created_at,'YYYY-MM-DD') as account_created_date
from bi_vala_app_account
where status = 1
group by id
,created_at
) as a
left join
(
select account_id
,goods_name
,key_from
,to_char(pay_success_date,'YYYY-MM-DD') as pay_date
from bi_vala_order
where pay_amount_int > 49800
and order_status = 3
group by account_id
,goods_name
,key_from
,pay_success_date
) as b on a.id = b.account_id
left join
(
select id
,account_id
,case when purchase_season_package in ('[1,16,17,18,19,20,2,12,13,14]','[1,16,2,12,13,14,17,18,19,20]','[1,2,12,13,14,17,18,19,20]','[16,1,17,18,19,20,2,12,13,14]','[16,1,2,12,13,14,17,18,19,20]','[16,17,18,19,20,1,2,12,13,14]','[1,2,12,13,14,16,17,18,19,20]') then 'L1和L2'
when purchase_season_package in ('[1,16,17,18,19,20]','[16,1,17,18,19,20]','[16,17,18,19,20]','[16,17,18,19,20,1]') then 'L1'
when purchase_season_package in ('[1,16,17]','[16,1,17]','[16,17]','[16,17,1]') then 'L1S1'
when purchase_season_package in ('[1,16,2,12,13,14,17]') then 'L1S1和L2'
when purchase_season_package in ('[1,16,2,12,13,14]','[1,2,12,13,14,16]' ,'[1,2,12,13,14]','[16,1,2,12,13,14]') then 'L2'
when purchase_season_package in ('[1,16,2]','[1,2,16]','[1,2]','[16,1,2]') then 'L2S1'
when purchase_season_package in ('[1,2,12]') then 'L2S1和L2S2'
when purchase_season_package in ('[1,16,17,18,19,20,2]') then 'L1和L2S1'
when purchase_season_package in ('[1,16]','[1]','[16]','[16,1]','[]') then '没激活'
else purchase_season_package
end as purchase_season_package
,to_char(created_at,'YYYY-MM-DD') as character_created_date
,case when gender = 0 then ''
when gender = 1 then ''
else '未知'
end as gender
,case when split_part(birthday,'-',1) = '' then '0000'
else split_part(birthday,'-',1)
end as birthday
from bi_vala_app_character
where deleted_at is NULL
group by id
,account_id
,purchase_season_package
,created_at
,gender
,birthday
) as c on b.account_id = c.account_id
left join
(
select user_id
,finish_date as finish_date_L1
,gap_date as gap_date_L1
,chapter_name as chapter_name_L1
from
(
select user_id
,finish_date
,gap_date
,chapter_name
,rank () over (partition by user_id order by finish_date desc) as rankno
from
(
select user_id
,finish_date
,current_date - finish_date_1 as gap_date
,format('%s-%s-%s-%s',course_level,course_season,course_unit,course_lesson) as chapter_name
,rank () over (partition by user_id order by format('%s-%s-%s-%s',course_level,course_season,course_unit,course_lesson) desc) as rankno
from
(
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_0
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_1
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_2
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_3
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_4
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_5
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_6
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_7
where play_status = 1
and chapter_id > 331
group by user_id
,chapter_id
,updated_at
) as x
left join
(
select cast(id as int) as id
,course_level
,course_season
,course_unit
,course_lesson
from bi_level_unit_lesson
group by id
,course_level
,course_season
,course_unit
,course_lesson
) as y on x.chapter_id = y.id
group by user_id
,finish_date_1
,finish_date
,chapter_id
,course_level
,course_season
,course_unit
,course_lesson
)
where rankno = 1
group by user_id
,finish_date
,gap_date
,chapter_name
)
where rankno = 1
) as d on c.id = d.user_id
left join
(
select user_id
,finish_date as finish_date_L2
,gap_date as gap_date_L2
,chapter_name as chapter_name_L2
from
(
select user_id
,finish_date
,gap_date
,chapter_name
,rank () over (partition by user_id order by finish_date desc) as rankno
from
(
select user_id
,finish_date
,current_date - finish_date_1 as gap_date
,format('%s-%s-%s-%s',course_level,course_season,course_unit,course_lesson) as chapter_name
,rank () over (partition by user_id order by format('%s-%s-%s-%s',course_level,course_season,course_unit,course_lesson) desc) as rankno
from
(
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_0
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_1
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_2
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_3
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_4
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_5
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_6
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
union all
select user_id
,chapter_id
,to_char(updated_at,'YYYY-MM-DD') as finish_date
,date(updated_at) as finish_date_1
from bi_user_chapter_play_record_7
where play_status = 1
and chapter_id < 332
group by user_id
,chapter_id
,updated_at
) as x
left join
(
select cast(id as int) as id
,course_level
,course_season
,course_unit
,course_lesson
from bi_level_unit_lesson
group by id
,course_level
,course_season
,course_unit
,course_lesson
) as y on x.chapter_id = y.id
group by user_id
,finish_date_1
,finish_date
,chapter_id
,course_level
,course_season
,course_unit
,course_lesson
)
where rankno = 1
group by user_id
,finish_date
,gap_date
,chapter_name
)
where rankno = 1
) as e on c.id = e.user_id
left join
(
select account_id
,split_part(login_address,'-',1) as account_province
,split_part(login_address,'-',2) as account_city
from account_detail_info
group by account_id
,login_address
) as f on c.account_id = f.account_id
group by c.account_id
,c.id
,c.purchase_season_package
,c.gender
,c.birthday
,f.account_province
,f.account_city
,a.account_created_date
,c.character_created_date
,b.goods_name
,b.key_from
,b.pay_date
,d.chapter_name_L1
,d.finish_date_L1
,d.gap_date_L1
,e.chapter_name_L2
,e.finish_date_L2
,e.gap_date_L2
order by b.key_from
,b.pay_date;

View File

@ -0,0 +1,50 @@
---
name: vala-component-practice-stat
description: 瓦拉英语组件练习质量统计工具支持Oops率和退出率双指标统计、按日期统计、自动剔除测试账号、分L1/L2等级、自定义样本量规则、导出Excel报表。使用场景(1) 统计组件练习错误率/Oops率 (2) 统计组件练习退出率 (3) 按难度等级拆分L1(A1)/L2(A2)统计结果 (4) 练习次数不足10次自动补充历史记录 (5) 导出多sheet Excel报表 (6) 按指标降序排序展示高风险组件
---
# 瓦拉英语组件Oops率统计技能
## 功能说明
用于统计飞书多维表格中组件练习的Oops错误支持灵活配置统计规则自动处理数据口径问题输出标准化统计结果。
## 核心特性
✅ 自动剔除测试账号:仅保留`bi_vala_app_account.status = 1`的正常用户练习记录
✅ 等级拆分:自动按`level`字段拆分L1(A1)/L2(A2)两个难度等级分别统计
✅ Oops判定规则
- 练习结果为Oops/Opps → 记为Oops
- 练习结果为Pass且c_type包含`core_`或`scence_` → 记为Oops
- 其余Pass结果 → 记为Good不计入Oops
✅ 退出判定规则练习结果为Failed或Close → 记为退出
✅ 样本量规则昨日练习≥10次用昨日全量数据<10次自动补充历史记录至10次历史不足10次取全部记录
✅ 正确排序:按指标数值从高到低排序,避免字符串排序错误
✅ 自动导出生成包含4个sheet的Excel报表L1组件Oops率、L2组件Oops率、L1组件退出率、L2组件退出率
## 使用参数
| 参数 | 说明 | 默认值 |
|------|------|--------|
| 统计日期 | 要统计的日期格式YYYY-MM-DD | 昨日(当前日期-1天 |
| 最小样本量 | 组件最少统计样本量 | 10 |
| 是否剔测试账号 | 是否排除测试账号练习记录 | 是 |
| 是否分等级 | 是否拆分L1/L2分别统计 | 是 |
| 导出格式 | 输出格式CSV/Excel | Excel |
## 操作步骤
1. 确认用户统计需求:统计日期、样本量规则、是否分等级等
2. 执行Oops率统计SQL脚本`scripts/stat_l1.sql`L1、`scripts/stat_l2.sql`L2
3. 执行退出率统计SQL脚本`scripts/stat_l1_exit_rate.sql`L1、`scripts/stat_l2_exit_rate.sql`L2
4. 运行`scripts/generate_excel.py`生成包含4个sheet的Excel报表
## 脚本说明
### scripts/stat_l1.sql
统计L1(A1)等级组件Oops率的SQL脚本自动取运行前一天的数据
### scripts/stat_l2.sql
统计L2(A2)等级组件Oops率的SQL脚本自动取运行前一天的数据
### scripts/stat_l1_exit_rate.sql
统计L1(A1)等级组件退出率的SQL脚本自动取运行前一天的数据
### scripts/stat_l2_exit_rate.sql
统计L2(A2)等级组件退出率的SQL脚本自动取运行前一天的数据
### scripts/generate_excel.py
将4份CSV统计结果合并生成带4个sheet的Excel报表
### references/table_schema.md
相关数据表结构说明和字段含义参考

View File

@ -0,0 +1,26 @@
# 相关数据表结构说明
## bi_user_component_play_record_* 分表(组件练习记录表)
| 字段名 | 类型 | 说明 |
|-------|------|------|
| user_id | bigint | 角色ID关联bi_vala_app_character.id |
| component_unique_code | varchar | 组件唯一编码,业务系统中组件的唯一标识 |
| play_result | varchar | 练习结果Perfect/Good/Oops/Opps/pass |
| c_type | varchar | 组件类型包含core_前缀为核心题型scence_前缀为场景题型 |
| created_at | timestamp | 练习时间 |
| level | varchar | 难度等级A1(L1)/A2(L2) |
| deleted_at | timestamp | 删除时间,为空表示记录有效 |
## bi_vala_app_character角色表
| 字段名 | 类型 | 说明 |
|-------|------|------|
| id | bigint | 角色ID |
| account_id | bigint | 账号ID关联bi_vala_app_account.id |
| deleted_at | timestamp | 删除时间,为空表示记录有效 |
## bi_vala_app_account账号表
| 字段名 | 类型 | 说明 |
|-------|------|------|
| id | bigint | 账号ID |
| status | int | 账号状态1=正常用户,其他=测试账号/禁用账号 |
| deleted_at | timestamp | 删除时间,为空表示记录有效 |

View File

@ -0,0 +1,40 @@
import pandas as pd
import numpy as np
import sys
# 兼容numpy版本
try:
np._get_promotion_state = lambda *args, **kwargs: 0
except:
pass
def generate_excel(l1_oops_csv, l2_oops_csv, l1_exit_csv, l2_exit_csv, output_path):
"""
生成包含4个sheet的Excel报表
:param l1_oops_csv: L1等级Oops率统计结果CSV路径
:param l2_oops_csv: L2等级Oops率统计结果CSV路径
:param l1_exit_csv: L1等级退出率统计结果CSV路径
:param l2_exit_csv: L2等级退出率统计结果CSV路径
:param output_path: 输出Excel文件路径
"""
# 读取CSV文件
df_l1_oops = pd.read_csv(l1_oops_csv)
df_l2_oops = pd.read_csv(l2_oops_csv)
df_l1_exit = pd.read_csv(l1_exit_csv)
df_l2_exit = pd.read_csv(l2_exit_csv)
# 创建Excel文件
with pd.ExcelWriter(output_path) as writer:
df_l1_oops.to_excel(writer, sheet_name='L1组件Oops率', index=False)
df_l2_oops.to_excel(writer, sheet_name='L2组件Oops率', index=False)
df_l1_exit.to_excel(writer, sheet_name='L1组件退出率', index=False)
df_l2_exit.to_excel(writer, sheet_name='L2组件退出率', index=False)
print(f"Excel报表生成成功{output_path}")
if __name__ == "__main__":
if len(sys.argv) != 6:
print("用法python generate_excel.py <l1_oops_csv路径> <l2_oops_csv路径> <l1_exit_csv路径> <l2_exit_csv路径> <输出Excel路径>")
sys.exit(1)
generate_excel(sys.argv[1], sys.argv[2], sys.argv[3], sys.argv[4], sys.argv[5])

View File

@ -0,0 +1,90 @@
-- 统计L1(A1)等级组件Oops率
-- 修改统计日期时替换下面的'2026-04-12'为目标日期
WITH all_component_records AS (
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_0 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_1 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_2 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_3 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_4 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_5 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_6 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_7 WHERE deleted_at IS NULL
),
valid_records AS (
SELECT
cr.component_unique_code,
cr.play_result,
cr.created_at,
cr.c_type,
CASE WHEN DATE(cr.created_at) = CURRENT_DATE - INTERVAL '1 day' THEN 1 ELSE 0 END AS is_yesterday
FROM all_component_records cr
JOIN bi_vala_app_character c ON cr.user_id = c.id AND c.deleted_at IS NULL
JOIN bi_vala_app_account a ON c.account_id = a.id AND a.status = 1 AND a.deleted_at IS NULL
WHERE cr.play_result IS NOT NULL AND cr.component_unique_code IS NOT NULL AND cr.level = 'A1'
),
-- 统计每个组件昨日练习次数
yesterday_counts AS (
SELECT
component_unique_code,
COUNT(*) AS yesterday_cnt
FROM valid_records
WHERE is_yesterday = 1
GROUP BY component_unique_code
),
-- 给记录排序:昨日记录排最前,历史记录按时间倒序
ranked_records AS (
SELECT
vr.component_unique_code,
vr.play_result,
vr.c_type,
vr.is_yesterday,
ROW_NUMBER() OVER (
PARTITION BY vr.component_unique_code
ORDER BY vr.is_yesterday DESC, vr.created_at DESC
) AS rn
FROM valid_records vr
JOIN yesterday_counts yc ON vr.component_unique_code = yc.component_unique_code
),
-- 筛选统计样本:
-- 昨日练习≥10次取全部昨日记录
-- 昨日练习<10次取全部昨日记录 + 最近历史记录补到10次
filtered_samples AS (
SELECT
r.component_unique_code,
r.play_result,
r.c_type
FROM ranked_records r
JOIN yesterday_counts yc ON r.component_unique_code = yc.component_unique_code
WHERE
(yc.yesterday_cnt >= 10 AND r.is_yesterday = 1)
OR
(yc.yesterday_cnt < 10 AND r.rn <= 10)
),
-- 统计结果,先按数值排序
component_stats AS (
SELECT
component_unique_code AS ,
COUNT(*) AS ,
SUM(CASE
WHEN play_result IN ('Oops', 'Opps') THEN 1
WHEN play_result = 'Pass' AND (c_type LIKE '%core_%' OR c_type LIKE '%scence_%') THEN 1
ELSE 0
END) AS Oops次数,
ROUND(CASE WHEN COUNT(*) = 0 THEN 0 ELSE
SUM(CASE
WHEN play_result IN ('Oops', 'Opps') THEN 1
WHEN play_result = 'Pass' AND (c_type LIKE '%core_%' OR c_type LIKE '%scence_%') THEN 1
ELSE 0
END)::DECIMAL / COUNT(*) * 100 END, 2) AS Oops率数值
FROM filtered_samples
GROUP BY component_unique_code
ORDER BY Oops率数值 DESC
)
SELECT , , Oops次数, Oops率数值 || '%' AS Oops率 FROM component_stats;

View File

@ -0,0 +1,86 @@
-- 统计L1(A1)等级组件练习退出率
-- 退出判定play_result为failed或close视为退出
-- 统计规则:
-- 1. 自动统计运行日期前一天的数据
-- 2. 昨日练习≥10次用昨日全量数据<10次自动补充历史记录至10次历史不足10次取全部记录
-- 3. 自动剔除测试账号仅保留bi_vala_app_account.status=1的正常用户
WITH all_component_records AS (
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_0 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_1 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_2 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_3 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_4 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_5 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_6 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_7 WHERE deleted_at IS NULL
),
valid_records AS (
SELECT
cr.component_unique_code,
cr.play_result,
cr.created_at,
cr.c_type,
CASE WHEN DATE(cr.created_at) = CURRENT_DATE - INTERVAL '1 day' THEN 1 ELSE 0 END AS is_yesterday
FROM all_component_records cr
JOIN bi_vala_app_character c ON cr.user_id = c.id AND c.deleted_at IS NULL
JOIN bi_vala_app_account a ON c.account_id = a.id AND a.status = 1 AND a.deleted_at IS NULL
WHERE cr.play_result IS NOT NULL AND cr.component_unique_code IS NOT NULL AND cr.level = 'A1'
),
-- 统计每个组件昨日练习次数
yesterday_counts AS (
SELECT
component_unique_code,
COUNT(*) AS yesterday_cnt
FROM valid_records
WHERE is_yesterday = 1
GROUP BY component_unique_code
),
-- 给记录排序:昨日记录排最前,历史记录按时间倒序
ranked_records AS (
SELECT
vr.component_unique_code,
vr.play_result,
vr.c_type,
vr.is_yesterday,
ROW_NUMBER() OVER (
PARTITION BY vr.component_unique_code
ORDER BY vr.is_yesterday DESC, vr.created_at DESC
) AS rn
FROM valid_records vr
JOIN yesterday_counts yc ON vr.component_unique_code = yc.component_unique_code
),
-- 筛选统计样本:
-- 昨日练习≥10次取全部昨日记录
-- 昨日练习<10次取全部昨日记录 + 最近历史记录补到10次
filtered_samples AS (
SELECT
r.component_unique_code,
r.play_result,
r.c_type
FROM ranked_records r
JOIN yesterday_counts yc ON r.component_unique_code = yc.component_unique_code
WHERE
(yc.yesterday_cnt >= 10 AND r.is_yesterday = 1)
OR
(yc.yesterday_cnt < 10 AND r.rn <= 10)
),
-- 统计结果,先按数值排序
component_stats AS (
SELECT
component_unique_code AS ,
COUNT(*) AS ,
SUM(CASE WHEN play_result IN ('Failed', 'Close') THEN 1 ELSE 0 END) AS 退,
ROUND(CASE WHEN COUNT(*) = 0 THEN 0 ELSE
SUM(CASE WHEN play_result IN ('Failed', 'Close') THEN 1 ELSE 0 END)::DECIMAL / COUNT(*) * 100 END, 2) AS 退
FROM filtered_samples
GROUP BY component_unique_code
ORDER BY 退 DESC
)
SELECT , , 退, 退 || '%' AS 退 FROM component_stats;

View File

@ -0,0 +1,90 @@
-- 统计L2(A2)等级组件Oops率
-- 修改统计日期时替换下面的'2026-04-12'为目标日期
WITH all_component_records AS (
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_0 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_1 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_2 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_3 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_4 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_5 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_6 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_7 WHERE deleted_at IS NULL
),
valid_records AS (
SELECT
cr.component_unique_code,
cr.play_result,
cr.created_at,
cr.c_type,
CASE WHEN DATE(cr.created_at) = CURRENT_DATE - INTERVAL '1 day' THEN 1 ELSE 0 END AS is_yesterday
FROM all_component_records cr
JOIN bi_vala_app_character c ON cr.user_id = c.id AND c.deleted_at IS NULL
JOIN bi_vala_app_account a ON c.account_id = a.id AND a.status = 1 AND a.deleted_at IS NULL
WHERE cr.play_result IS NOT NULL AND cr.component_unique_code IS NOT NULL AND cr.level = 'A2'
),
-- 统计每个组件昨日练习次数
yesterday_counts AS (
SELECT
component_unique_code,
COUNT(*) AS yesterday_cnt
FROM valid_records
WHERE is_yesterday = 1
GROUP BY component_unique_code
),
-- 给记录排序:昨日记录排最前,历史记录按时间倒序
ranked_records AS (
SELECT
vr.component_unique_code,
vr.play_result,
vr.c_type,
vr.is_yesterday,
ROW_NUMBER() OVER (
PARTITION BY vr.component_unique_code
ORDER BY vr.is_yesterday DESC, vr.created_at DESC
) AS rn
FROM valid_records vr
JOIN yesterday_counts yc ON vr.component_unique_code = yc.component_unique_code
),
-- 筛选统计样本:
-- 昨日练习≥10次取全部昨日记录
-- 昨日练习<10次取全部昨日记录 + 最近历史记录补到10次
filtered_samples AS (
SELECT
r.component_unique_code,
r.play_result,
r.c_type
FROM ranked_records r
JOIN yesterday_counts yc ON r.component_unique_code = yc.component_unique_code
WHERE
(yc.yesterday_cnt >= 10 AND r.is_yesterday = 1)
OR
(yc.yesterday_cnt < 10 AND r.rn <= 10)
),
-- 统计结果,先按数值排序
component_stats AS (
SELECT
component_unique_code AS ,
COUNT(*) AS ,
SUM(CASE
WHEN play_result IN ('Oops', 'Opps') THEN 1
WHEN play_result = 'Pass' AND (c_type LIKE '%core_%' OR c_type LIKE '%scence_%') THEN 1
ELSE 0
END) AS Oops次数,
ROUND(CASE WHEN COUNT(*) = 0 THEN 0 ELSE
SUM(CASE
WHEN play_result IN ('Oops', 'Opps') THEN 1
WHEN play_result = 'Pass' AND (c_type LIKE '%core_%' OR c_type LIKE '%scence_%') THEN 1
ELSE 0
END)::DECIMAL / COUNT(*) * 100 END, 2) AS Oops率数值
FROM filtered_samples
GROUP BY component_unique_code
ORDER BY Oops率数值 DESC
)
SELECT , , Oops次数, Oops率数值 || '%' AS Oops率 FROM component_stats;

View File

@ -0,0 +1,86 @@
-- 统计L2(A2)等级组件练习退出率
-- 退出判定play_result为failed或close视为退出
-- 统计规则:
-- 1. 自动统计运行日期前一天的数据
-- 2. 昨日练习≥10次用昨日全量数据<10次自动补充历史记录至10次历史不足10次取全部记录
-- 3. 自动剔除测试账号仅保留bi_vala_app_account.status=1的正常用户
WITH all_component_records AS (
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_0 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_1 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_2 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_3 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_4 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_5 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_6 WHERE deleted_at IS NULL
UNION ALL
SELECT user_id, component_unique_code, play_result, created_at, level, c_type FROM bi_user_component_play_record_7 WHERE deleted_at IS NULL
),
valid_records AS (
SELECT
cr.component_unique_code,
cr.play_result,
cr.created_at,
cr.c_type,
CASE WHEN DATE(cr.created_at) = CURRENT_DATE - INTERVAL '1 day' THEN 1 ELSE 0 END AS is_yesterday
FROM all_component_records cr
JOIN bi_vala_app_character c ON cr.user_id = c.id AND c.deleted_at IS NULL
JOIN bi_vala_app_account a ON c.account_id = a.id AND a.status = 1 AND a.deleted_at IS NULL
WHERE cr.play_result IS NOT NULL AND cr.component_unique_code IS NOT NULL AND cr.level = 'A2'
),
-- 统计每个组件昨日练习次数
yesterday_counts AS (
SELECT
component_unique_code,
COUNT(*) AS yesterday_cnt
FROM valid_records
WHERE is_yesterday = 1
GROUP BY component_unique_code
),
-- 给记录排序:昨日记录排最前,历史记录按时间倒序
ranked_records AS (
SELECT
vr.component_unique_code,
vr.play_result,
vr.c_type,
vr.is_yesterday,
ROW_NUMBER() OVER (
PARTITION BY vr.component_unique_code
ORDER BY vr.is_yesterday DESC, vr.created_at DESC
) AS rn
FROM valid_records vr
JOIN yesterday_counts yc ON vr.component_unique_code = yc.component_unique_code
),
-- 筛选统计样本:
-- 昨日练习≥10次取全部昨日记录
-- 昨日练习<10次取全部昨日记录 + 最近历史记录补到10次
filtered_samples AS (
SELECT
r.component_unique_code,
r.play_result,
r.c_type
FROM ranked_records r
JOIN yesterday_counts yc ON r.component_unique_code = yc.component_unique_code
WHERE
(yc.yesterday_cnt >= 10 AND r.is_yesterday = 1)
OR
(yc.yesterday_cnt < 10 AND r.rn <= 10)
),
-- 统计结果,先按数值排序
component_stats AS (
SELECT
component_unique_code AS ,
COUNT(*) AS ,
SUM(CASE WHEN play_result IN ('Failed', 'Close') THEN 1 ELSE 0 END) AS 退,
ROUND(CASE WHEN COUNT(*) = 0 THEN 0 ELSE
SUM(CASE WHEN play_result IN ('Failed', 'Close') THEN 1 ELSE 0 END)::DECIMAL / COUNT(*) * 100 END, 2) AS 退
FROM filtered_samples
GROUP BY component_unique_code
ORDER BY 退 DESC
)
SELECT , , 退, 退 || '%' AS 退 FROM component_stats;