fix: update

This commit is contained in:
Daniel
2026-03-03 17:27:55 +08:00
parent 29c921f498
commit 1764a44eb3
22 changed files with 818 additions and 30 deletions

.env

@@ -1,2 +1,3 @@
# Mapbox map token
VITE_MAPBOX_ACCESS_TOKEN=pk.eyJ1IjoiZDI5cTAiLCJhIjoiY21oaGRmcTkzMGltZzJscHR1N2FhZnY5dCJ9.7ueF2lS6-C9Mm_xon7NnIA
VITE_MAPBOX_ACCESS_TOKEN=pk.eyJ1IjoiZDI5cTAiLCJhIjoiY21tYWQyOXI3MGFrZzJwcjJmZGltODI4ZCJ9.0jW_aK91VJExw6ffKGqWIA
DASHSCOPE_API_KEY=sk-029a4c4d761d49b99cfe6073234ac443


@@ -1,4 +1,4 @@
# Mapbox map token (Persian Gulf display)
# Mapbox map token (set it only here or in .env, never in source code; if it has leaked, rotate it in the Mapbox console)
# Free token: https://account.mapbox.com/access-tokens/
VITE_MAPBOX_ACCESS_TOKEN=your_mapbox_public_token_here

.env的副本 (new file)

@@ -0,0 +1,3 @@
# Mapbox map token
VITE_MAPBOX_ACCESS_TOKEN=pk.eyJ1IjoiZDI5cTAiLCJhIjoiY21tYWQyOXI3MGFrZzJwcjJmZGltODI4ZCJ9.0jW_aK91VJExw6ffKGqWIA
DASHSCOPE_API_KEY=sk-029a4c4d761d49b99cfe6073234ac443

.gitignore

@@ -26,8 +26,8 @@ dist-ssr
# API database
# server/data.db
# Env
# .env
# .env.local
# .env.*.local
# Env (contains tokens; do not commit)
.env
.env.local
.env.*.local
.pyc


@@ -54,6 +54,86 @@ pip install -r requirements.txt
**When the event timeline does not update**: most likely `npm run gdelt` was not started. With only `npm run api` running, the timeline is empty or shows cached data only.
## How to check that the crawler is working
Follow the steps below in order to confirm the whole chain (crawler → database → Node reload → API/WebSocket) works.
### 1. One-shot verification (recommended)
Start the API first, then run the verification script (optionally starting the crawler as well):
```bash
# Terminal 1 (required)
npm run api
# Terminal 2: run the verification (does not start the crawler; checks current state only)
./scripts/verify-pipeline.sh
# Or: also start the crawler, wait for the first fetch, then verify
./scripts/verify-pipeline.sh --start-crawler
```
The script checks: API health, situation data (including `lastUpdated`), crawler-service reachability, `news_content`/`situation_update` rows, combat-loss fields, and whether `POST /api/crawler/notify` is usable.
### 2. Quick manual checks
| Step | Command / action | Expected result |
|-----|-------------|----------|
| Is the API running | `curl -s http://localhost:3001/api/health` | Returns `{"ok":true}` |
| Is situation data readable | `curl -s http://localhost:3001/api/situation \| head -c 300` | Contains `lastUpdated`, `usForces`, `recentUpdates` |
| Does RSS fetch anything | `npm run crawler:test` | Prints "RSS fetched: N items"; N > 0 means keyword hits |
| Crawler service (gdelt) | `curl -s http://localhost:8000/crawler/status` | Returns JSON (`db_path`/`db_exists`, etc.) |
| Is crawler data in the DB | `sqlite3 server/data.db "SELECT COUNT(*) FROM situation_update; SELECT COUNT(*) FROM news_content;"` or visit `http://localhost:3001/api/db/dashboard` | situation_update and news_content counts > 0 (after the pipeline has run) |
| Does notify trigger a reload | After writing to the DB the crawler POSTs `/api/crawler/notify`; Node runs `reloadFromFile` and broadcasts | `lastUpdated` and content change on the frontend and on `/api/situation` |
### 3. Run the pipeline once (when the crawler is not running as a service)
Without starting gdelt you can run the full pipeline a single time (fetch → dedupe → write tables → notify):
```bash
npm run api  # keep this running
cd crawler && python3 -c "
from pipeline import run_full_pipeline
from config import DB_PATH, API_BASE
n_fetched, n_news, n_panel = run_full_pipeline(db_path=DB_PATH, api_base=API_BASE, notify=True)
print('fetched:', n_fetched, 'new after dedupe:', n_news, 'panel rows:', n_panel)
"
```
With network access and keyword hits you should see non-zero numbers; then check `curl -s http://localhost:3001/api/situation` or the event timeline in the frontend for new data.
### 4. Test extraction logic only (no DB writes)
```bash
npm run crawler:test:extraction  # rule / db_merge tests
# Or, following the README "quick self-test commands", feed sample text to extract_from_news and inspect combat_losses_delta / key_location_updates
```
**Common symptoms**: 0 items fetched → network/RSS blocked, or no keyword hits; situation_update empty → pipeline not run, or nothing new after dedupe; frontend not refreshing → `npm run api` or the crawler (gdelt) is not running.
### 5. Are the crawler and the panel connected
Checks specifically whether what the crawler writes matches what the panel shows:
```bash
./scripts/check-crawler-panel-connectivity.sh
```
It compares the crawler-side `situation_update` row count with the `recentUpdates` count returned by the panel API, and explains why combat losses, bases, etc. do not necessarily change with every news item.
## How crawler data feeds the panels
| Panel | Data source (table/endpoint) | Updated by crawler | Notes |
|----------|---------------------|----------------|------|
| **Event timeline** (recentUpdates) | situation_update → getSituation() | ✅ yes | Each deduplicated news item is written to situation_update; on notify, Node reloads the DB and broadcasts |
| **Map conflict points** (conflictEvents) | gdelt_events, or RSS→gdelt backfill | ✅ yes | From GDELT, or (when GDELT is disabled) synced from situation_update into gdelt_events |
| **Combat losses / equipment damage** (combatLosses) | combat_losses | ⚠️ conditional | merge writes a delta only when AI/rules extract numbers from the news (e.g. "2 US soldiers killed") |
| **Base / location status** (keyLocations) | key_location | ⚠️ conditional | Updated only when key_location_updates are extracted (e.g. a base was attacked) |
| **Force summary / index / assets** (summary, powerIndex, assets) | force_summary, power_index, force_asset | ❌ no | Seeded at init only; the crawler never writes these |
| **Wall Street / retaliation sentiment** (wallStreet, retaliation) | wall_street_trend, retaliation_* | ⚠️ conditional | Updated only when the extractor outputs the corresponding fields |
So **lots of news but static loss/base numbers** is normal — most headlines contain no parseable casualty or base figures; only the event timeline (recentUpdates) and the map conflict points grow with every item. If the **event timeline also stops updating**, confirm the Node terminal prints the `[crawler/notify]` reload line after each crawl round; if it never appears, check that the crawler's `API_BASE` points at the current API (default `http://localhost:3001`).
## Write pipeline (matches server/README section 5)
Both RSS and the main entry point go through the unified pipeline `pipeline.run_full_pipeline`:
@@ -80,6 +160,7 @@ RSS → fetch → clean → dedupe → write news_content / situation_update /
- `DB_PATH`: SQLite path, default `../server/data.db`
- `API_BASE`: Node API address, default `http://localhost:3001`
- **`DASHSCOPE_API_KEY`**: Alibaba Cloud Tongyi (DashScope) API key. **When set, the commercial model is used throughout and no local Ollama install is needed** (useful when the machine cannot run Ollama). Get one at [Aliyun Bailian / DashScope](https://dashscope.console.aliyun.com/) → create an API-KEY → put it in an environment variable or as `DASHSCOPE_API_KEY=sk-xxx` in the project root `.env`. Summarization, classification, and combat-loss/base extraction all go through Tongyi.
- `GDELT_QUERY`: search keywords, default `United States Iran military`
- `GDELT_MAX_RECORDS`: maximum records, default 30
- `GDELT_TIMESPAN`: time span, `1h` / `1d` / `1week`, default `1d` (recent news)
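The variables above can be read in one place with defaults matching this list. A minimal sketch of what such a `config.py` might look like — an illustration built from the documented names and defaults, not the project's actual file:

```python
import os

# Defaults mirror the variable list above; each is overridable via the environment.
DB_PATH = os.environ.get("DB_PATH", "../server/data.db")
API_BASE = os.environ.get("API_BASE", "http://localhost:3001")
DASHSCOPE_API_KEY = os.environ.get("DASHSCOPE_API_KEY", "").strip()
GDELT_QUERY = os.environ.get("GDELT_QUERY", "United States Iran military")
GDELT_MAX_RECORDS = int(os.environ.get("GDELT_MAX_RECORDS", "30"))
GDELT_TIMESPAN = os.environ.get("GDELT_TIMESPAN", "1d")  # 1h / 1d / 1week
```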


@@ -2,6 +2,7 @@
"""
AI-clean news data, strictly constrained to the panel field schema
Fields required by EventTimelinePanel: summary (≤120 chars), category (enum), severity (enum)
Prefer DASHSCOPE_API_KEY (Tongyi, no Ollama needed); else Ollama; rules as the final fallback
"""
import os
import re
@@ -9,6 +10,7 @@ from typing import Optional
CLEANER_AI_DISABLED = os.environ.get("CLEANER_AI_DISABLED", "0") == "1"
OLLAMA_MODEL = os.environ.get("OLLAMA_MODEL", "llama3.1")
DASHSCOPE_API_KEY = os.environ.get("DASHSCOPE_API_KEY", "").strip()
# Panel schema: must match EventTimelinePanel / SituationUpdate
SUMMARY_MAX_LEN = 120 # panel renders with line-clamp-2
@@ -30,6 +32,38 @@ def _rule_clean(text: str, max_len: int = SUMMARY_MAX_LEN) -> str:
return _sanitize_summary(text, max_len)
def _call_dashscope_summary(text: str, max_len: int, timeout: int = 8) -> Optional[str]:
"""Summarize via Alibaba Cloud Tongyi (DashScope); no Ollama needed. Requires DASHSCOPE_API_KEY"""
if not DASHSCOPE_API_KEY or CLEANER_AI_DISABLED or not text or len(str(text).strip()) < 5:
return None
try:
import dashscope
from http import HTTPStatus
dashscope.api_key = DASHSCOPE_API_KEY
prompt = f"""将新闻提炼为1-2句简洁中文事实,直接输出纯文本,不要标号、引号、解释。限{max_len}字内。
原文:{str(text)[:350]}
输出:"""
r = dashscope.Generation.call(
model="qwen-turbo",
messages=[{"role": "user", "content": prompt}],
result_format="message",
max_tokens=150,
)
if r.status_code != HTTPStatus.OK:
return None
out = (r.output.get("choices", [{}])[0].get("message", {}).get("content", "") or "").strip()
out = re.sub(r"^[\d\.\-\*\s]+", "", out)
out = re.sub(r"^['\"\s]+|['\"\s]+$", "", out)
out = _sanitize_summary(out, max_len)
if out and len(out) > 3:
return out
return None
except Exception:
return None
def _call_ollama_summary(text: str, max_len: int, timeout: int = 6) -> Optional[str]:
"""Summarize via Ollama; output must be plain text within max_len chars"""
if CLEANER_AI_DISABLED or not text or len(str(text).strip()) < 5:
@@ -71,7 +105,11 @@ def clean_news_for_panel(text: str, max_len: int = SUMMARY_MAX_LEN) -> str:
t = str(text).strip()
if not t:
return ""
res = _call_ollama_summary(t, max_len, timeout=6)
# Prefer the commercial model (Tongyi), then Ollama, then rules
if DASHSCOPE_API_KEY:
res = _call_dashscope_summary(t, max_len, timeout=8)
else:
res = _call_ollama_summary(t, max_len, timeout=6)
if res:
return res
return _rule_clean(t, max_len)
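The selection logic in `clean_news_for_panel` reduces to a three-step fallback: the preferred model is tried once, and the rule cleaner always succeeds. A standalone sketch with stand-in callables instead of the real DashScope/Ollama calls (function names here are illustrative):

```python
from typing import Callable, Optional

def summarize_with_fallback(
    text: str,
    max_len: int,
    call_dashscope: Callable[[str, int], Optional[str]],
    call_ollama: Callable[[str, int], Optional[str]],
    rule_clean: Callable[[str, int], str],
    has_dashscope_key: bool,
) -> str:
    """Prefer the commercial model when a key is set, else the local one; rules as the guaranteed fallback."""
    t = str(text).strip()
    if not t:
        return ""
    res = call_dashscope(t, max_len) if has_dashscope_key else call_ollama(t, max_len)
    return res if res else rule_clean(t, max_len)

# Stand-ins: DashScope "fails" (returns None), so the rule cleaner kicks in.
out = summarize_with_fallback(
    "some headline", 120,
    call_dashscope=lambda t, n: None,
    call_ollama=lambda t, n: "ollama summary",
    rule_clean=lambda t, n: t[:n],
    has_dashscope_key=True,
)
# → "some headline" (rules, because the preferred model returned None)
```

Note that, as in the diff, a configured DashScope key that fails falls straight to rules rather than retrying with Ollama.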


@@ -1,7 +1,7 @@
# -*- coding: utf-8 -*-
"""
AI news classification and severity assessment
Prefer the local Ollama model (free); fall back to rules on failure
Prefer DASHSCOPE_API_KEY (Tongyi, no Ollama needed); else Ollama; rules as the final fallback
Set PARSER_AI_DISABLED=1 to use rules only (faster)
"""
import os
@@ -11,7 +11,8 @@ Category = Literal["deployment", "alert", "intel", "diplomatic", "other"]
Severity = Literal["low", "medium", "high", "critical"]
PARSER_AI_DISABLED = os.environ.get("PARSER_AI_DISABLED", "0") == "1"
OLLAMA_MODEL = os.environ.get("OLLAMA_MODEL", "llama3.1") # 或 qwen2.5:7b
OLLAMA_MODEL = os.environ.get("OLLAMA_MODEL", "llama3.1")
DASHSCOPE_API_KEY = os.environ.get("DASHSCOPE_API_KEY", "").strip()
_CATEGORIES = ("deployment", "alert", "intel", "diplomatic", "other")
_SEVERITIES = ("low", "medium", "high", "critical")
@@ -32,8 +33,37 @@ def _parse_ai_response(text: str) -> Tuple[Category, Severity]:
return cat, sev # type: ignore
def _call_dashscope(text: str, timeout: int = 6) -> Optional[Tuple[Category, Severity]]:
"""Classify via Alibaba Cloud Tongyi (DashScope); no Ollama needed. Requires DASHSCOPE_API_KEY"""
if not DASHSCOPE_API_KEY or PARSER_AI_DISABLED:
return None
try:
import dashscope
from http import HTTPStatus
dashscope.api_key = DASHSCOPE_API_KEY
prompt = f"""Classify this news about US-Iran / Middle East (one line only):
- category: deployment|alert|intel|diplomatic|other
- severity: low|medium|high|critical
News: {text[:300]}
Reply format: category:severity (e.g. alert:high)"""
r = dashscope.Generation.call(
model="qwen-turbo",
messages=[{"role": "user", "content": prompt}],
result_format="message",
max_tokens=32,
)
if r.status_code != HTTPStatus.OK:
return None
out = r.output.get("choices", [{}])[0].get("message", {}).get("content", "")
return _parse_ai_response(out)
except Exception:
return None
def _call_ollama(text: str, timeout: int = 5) -> Optional[Tuple[Category, Severity]]:
"""Call the local Ollama model. Requires a prior ollama run llama3.1 or qwen2.5:7b"""
"""Call the local Ollama model. Requires a prior ollama run llama3.1"""
if PARSER_AI_DISABLED:
return None
try:
@@ -73,9 +103,16 @@ def _rule_severity(text: str, category: Category) -> Severity:
return severity(text, category)
def _call_ai(text: str) -> Optional[Tuple[Category, Severity]]:
"""Prefer Tongyi, then Ollama"""
if DASHSCOPE_API_KEY:
return _call_dashscope(text)
return _call_ollama(text)
def classify(text: str) -> Category:
"""Classify; fall back to rules when AI fails"""
res = _call_ollama(text)
res = _call_ai(text)
if res:
return res[0]
return _rule_classify(text)
@@ -83,7 +120,7 @@ def classify(text: str) -> Category:
def severity(text: str, category: Category) -> Severity:
"""Severity; fall back to rules when AI fails"""
res = _call_ollama(text)
res = _call_ai(text)
if res:
return res[1]
return _rule_severity(text, category)
@@ -95,7 +132,7 @@ def classify_and_severity(text: str) -> Tuple[Category, Severity]:
from parser import classify, severity
c = classify(text)
return c, severity(text, c)
res = _call_ollama(text)
res = _call_ai(text)
if res:
return res
c = _rule_classify(text)
return c, _rule_severity(text, c)
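`_parse_ai_response` itself is not shown in this hunk; a plausible sketch of what parsing the `category:severity` reply involves, using the enums defined above (clamping unknown tokens to `other`/`medium` is an assumption, not necessarily the project's choice):

```python
import re
from typing import Tuple

_CATEGORIES = ("deployment", "alert", "intel", "diplomatic", "other")
_SEVERITIES = ("low", "medium", "high", "critical")

def parse_ai_response(text: str) -> Tuple[str, str]:
    """Parse a 'category:severity' reply (e.g. 'alert:high'); unknown tokens fall back to safe defaults."""
    m = re.search(r"([a-z]+)\s*:\s*([a-z]+)", str(text).strip().lower())
    cat = m.group(1) if m else ""
    sev = m.group(2) if m else ""
    if cat not in _CATEGORIES:
        cat = "other"
    if sev not in _SEVERITIES:
        sev = "medium"
    return cat, sev

print(parse_ai_response("alert:high"))  # → ('alert', 'high')
```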


@@ -14,10 +14,14 @@ def _notify_api(api_base: str) -> bool:
"""Call the Node API to trigger an immediate broadcast"""
try:
import urllib.request
token = os.environ.get("API_CRAWLER_TOKEN", "").strip()
req = urllib.request.Request(
f"{api_base.rstrip('/')}/api/crawler/notify",
method="POST",
headers={"Content-Type": "application/json"},
headers={
"Content-Type": "application/json",
**({"X-Crawler-Token": token} if token else {}),
},
)
with urllib.request.urlopen(req, timeout=5) as resp:
return resp.status == 200
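The header construction above uses conditional dict unpacking so the token header only appears when `API_CRAWLER_TOKEN` is set. The same pattern isolated (the env-var and header names come from the diff; the helper itself is illustrative):

```python
import os

def build_notify_headers() -> dict:
    """Attach X-Crawler-Token only when API_CRAWLER_TOKEN is set, so an unconfigured crawler still works against an unprotected API."""
    token = os.environ.get("API_CRAWLER_TOKEN", "").strip()
    return {
        "Content-Type": "application/json",
        **({"X-Crawler-Token": token} if token else {}),
    }

os.environ.pop("API_CRAWLER_TOKEN", None)
print(build_notify_headers())  # → {'Content-Type': 'application/json'}
os.environ["API_CRAWLER_TOKEN"] = "s3cret"
print(build_notify_headers())
```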


@@ -242,7 +242,16 @@ def _write_to_db(events: List[dict]) -> None:
def _notify_node() -> None:
try:
r = requests.post(f"{API_BASE}/api/crawler/notify", timeout=5, proxies={"http": None, "https": None})
headers = {}
token = os.environ.get("API_CRAWLER_TOKEN", "").strip()
if token:
headers["X-Crawler-Token"] = token
r = requests.post(
f"{API_BASE}/api/crawler/notify",
timeout=5,
headers=headers,
proxies={"http": None, "https": None},
)
if r.status_code != 200:
print(" [warn] notify API failed")
except Exception as e:
@@ -340,7 +349,10 @@ def crawler_backfill():
return {"ok": False, "error": "db not found"}
try:
from db_merge import merge
if os.environ.get("CLEANER_AI_DISABLED", "0") == "1":
use_dashscope = bool(os.environ.get("DASHSCOPE_API_KEY", "").strip())
if use_dashscope:
from extractor_dashscope import extract_from_news
elif os.environ.get("CLEANER_AI_DISABLED", "0") == "1":
from extractor_rules import extract_from_news
else:
from extractor_ai import extract_from_news
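The backfill branch above selects an extractor module from the environment. The same decision as a pure function, mirroring the order in the diff (module names from the diff; the function itself is a sketch for testing the precedence):

```python
def pick_extractor_module(env: dict) -> str:
    """Mirror crawler_backfill's selection order: DashScope first, then rules-only, then local AI."""
    if env.get("DASHSCOPE_API_KEY", "").strip():
        return "extractor_dashscope"
    if env.get("CLEANER_AI_DISABLED", "0") == "1":
        return "extractor_rules"
    return "extractor_ai"

print(pick_extractor_module({"DASHSCOPE_API_KEY": "sk-xxx"}))  # → extractor_dashscope
print(pick_extractor_module({"CLEANER_AI_DISABLED": "1"}))     # → extractor_rules
print(pick_extractor_module({}))                               # → extractor_ai
```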


@@ -0,0 +1,61 @@
#!/usr/bin/env bash
# Check whether crawler data and panel data are connected
# Usage: ./scripts/check-crawler-panel-connectivity.sh
# Start first: npm run api (optional: npm run gdelt)
set -e
API_URL="${API_URL:-http://localhost:3001}"
CRAWLER_URL="${CRAWLER_URL:-http://localhost:8000}"
echo "=========================================="
echo "Crawler ↔ Panel connectivity check"
echo "API: $API_URL | Crawler: $CRAWLER_URL"
echo "=========================================="
# 1. Crawler side: situation_update row count
CRAWLER_SU_COUNT=""
if curl -sf "$CRAWLER_URL/crawler/status" >/dev/null 2>&1; then
if command -v jq &>/dev/null; then
CRAWLER_SU_COUNT=$(curl -sf "$CRAWLER_URL/crawler/status" | jq -r '.situation_update_count // "?"')
else
CRAWLER_SU_COUNT="(install jq to view)"
fi
echo "[crawler] situation_update rows: $CRAWLER_SU_COUNT"
else
echo "[crawler] not running or unreachable (curl $CRAWLER_URL/crawler/status failed)"
fi
# 2. Panel side: recentUpdates count and lastUpdated returned by the API
if ! curl -sf "$API_URL/api/health" >/dev/null 2>&1; then
echo "[API] not running; start it first: npm run api"
exit 1
fi
SIT=$(curl -sf "$API_URL/api/situation" 2>/dev/null || echo "{}")
if command -v jq &>/dev/null; then
RU_LEN=$(echo "$SIT" | jq '.recentUpdates | length')
LAST=$(echo "$SIT" | jq -r '.lastUpdated // "?"')
echo "[panel] recentUpdates rows: $RU_LEN | lastUpdated: $LAST"
else
echo "[panel] situation data fetched (install jq to show counts)"
fi
# 3. Consistency: the crawler writes server/data.db; after notify, Node reloads it, so both sides should agree
echo ""
echo "--- How the pieces link up ---"
echo " • Event timeline (recentUpdates) ← situation_update table, written by the crawler's write_updates()"
echo " • After each fetch the crawler POSTs $API_URL/api/crawler/notify; Node runs reloadFromFile() and broadcasts"
echo " • Crawler has data but panel recentUpdates is sparse/empty: check the Node terminal for the [crawler/notify] reload log"
echo " • If it never appears: check that API_BASE points at the current API (default http://localhost:3001)"
echo " • Combat losses / bases / power index: updated only when AI/rules extract numbers from news; most items do not trigger this"
echo "=========================================="
# 4. Optional: send one notify to see whether Node reloads (usable as a test when the crawler is not running)
# Skipped when non-interactive; interactively you can also run: echo y | ./scripts/check-crawler-panel-connectivity.sh
if [[ -t 0 ]]; then
echo ""
read -r -p "Send one POST /api/crawler/notify to test the Node reload? [y/N] " ans
if [[ "${ans,,}" = "y" ]]; then
curl -sf -X POST "$API_URL/api/crawler/notify" && echo " notify sent; check the Node terminal for the [crawler/notify] reload log"
fi
fi

Binary file not shown.


@@ -7,6 +7,8 @@ const fs = require('fs')
const dbPath = process.env.DB_PATH || path.join(__dirname, 'data.db')
let _db = null
/** sql.js constructor; injected in initDb, used by reloadFromFile */
let _sqlJs = null
function getDb() {
if (!_db) throw new Error('DB not initialized. Call initDb() first.')
@@ -239,6 +241,7 @@ function runMigrations(db) {
async function initDb() {
const initSqlJs = require('sql.js')
const SQL = await initSqlJs()
_sqlJs = SQL
let data = new Uint8Array(0)
if (fs.existsSync(dbPath)) {
data = new Uint8Array(fs.readFileSync(dbPath))
@@ -261,6 +264,30 @@ async function initDb() {
return _db
}
/**
* Reload the DB from disk (call after the crawler writes the same file, so Node's in-memory copy matches it)
*/
function reloadFromFile() {
if (!_sqlJs || !_db) throw new Error('DB not initialized. Call initDb() first.')
let data = new Uint8Array(0)
if (fs.existsSync(dbPath)) {
data = new Uint8Array(fs.readFileSync(dbPath))
}
const nativeDb = new _sqlJs.Database(data)
function persist() {
try {
const buf = nativeDb.export()
fs.writeFileSync(dbPath, Buffer.from(buf))
} catch (e) {
console.error('[db] persist error:', e.message)
}
}
nativeDb.run('PRAGMA journal_mode = WAL')
const wrapped = wrapDatabase(nativeDb, persist)
runMigrations(wrapped)
_db = wrapped
}
const proxy = {
prepare(sql) {
return getDb().prepare(sql)
@@ -276,3 +303,4 @@ const proxy = {
module.exports = proxy
module.exports.initDb = initDb
module.exports.getDb = getDb
module.exports.reloadFromFile = reloadFromFile
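`reloadFromFile` exists because sql.js keeps the whole database in memory: a writer process updating the file stays invisible until the reader re-reads it. The same effect demonstrated with Python's `sqlite3` (an in-memory snapshot only sees the file state at load time; table and column names below are illustrative):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "data.db")

# Writer (the "crawler"): creates the file and inserts a row.
w = sqlite3.connect(path)
w.execute("CREATE TABLE situation_update (id INTEGER PRIMARY KEY, summary TEXT)")
w.execute("INSERT INTO situation_update (summary) VALUES ('first')")
w.commit()

def load_snapshot(p):
    """Load the file fully into memory, the way sql.js does at init."""
    mem = sqlite3.connect(":memory:")
    sqlite3.connect(p).backup(mem)
    return mem

snap = load_snapshot(path)
w.execute("INSERT INTO situation_update (summary) VALUES ('second')")
w.commit()

count_stale = snap.execute("SELECT COUNT(*) FROM situation_update").fetchone()[0]
snap = load_snapshot(path)  # the equivalent of reloadFromFile()
count_fresh = snap.execute("SELECT COUNT(*) FROM situation_update").fetchone()[0]
print(count_stale, count_fresh)  # → 1 2
```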


@@ -14,6 +14,9 @@ const openApiSpec = require('./openapi')
const app = express()
const PORT = process.env.API_PORT || 3001
// Shared secret for crawler notifications (API_CRAWLER_TOKEN); exchanged only between the server and the crawler process
const CRAWLER_TOKEN = process.env.API_CRAWLER_TOKEN || ''
app.set('trust proxy', 1)
app.use(cors())
app.use(express.json())
@@ -23,7 +26,14 @@ app.use('/api-docs', swaggerUi.serve, swaggerUi.setup(openApiSpec))
app.use('/api', routes)
app.get('/api/health', (_, res) => res.json({ ok: true }))
app.post('/api/crawler/notify', (_, res) => {
app.post('/api/crawler/notify', (req, res) => {
// If API_CRAWLER_TOKEN is configured, require the crawler to send the X-Crawler-Token header
if (CRAWLER_TOKEN) {
const token = req.headers['x-crawler-token']
if (typeof token !== 'string' || token !== CRAWLER_TOKEN) {
return res.status(401).json({ error: 'unauthorized' })
}
}
notifyCrawlerUpdate()
res.json({ ok: true })
})
@@ -59,13 +69,18 @@ function broadcastSituation() {
app.set('broadcastSituation', broadcastSituation)
setInterval(broadcastSituation, 3000)
// Called by the crawler: update situation.updated_at and broadcast immediately
// Called by the crawler: first reload the DB from disk (picking up crawler writes), update updated_at, then broadcast immediately
function notifyCrawlerUpdate() {
try {
const db = require('./db')
db.reloadFromFile()
db.prepare("INSERT OR REPLACE INTO situation (id, data, updated_at) VALUES (1, '{}', ?)").run(new Date().toISOString())
broadcastSituation()
} catch (_) {}
const n = db.prepare('SELECT COUNT(*) as c FROM situation_update').get().c
console.log('[crawler/notify] DB reloaded and broadcast; situation_update rows:', n)
} catch (e) {
console.error('[crawler/notify]', e?.message || e)
}
}
db.initDb().then(() => {
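The token gate above compares the header with `!==`; for a shared secret, a constant-time comparison additionally avoids timing side channels. A Python sketch of the same gate (the Express handler is unchanged; `hmac.compare_digest` is the stdlib way to do this, and the behavior when no token is configured matches the diff):

```python
import hmac

def notify_allowed(configured_token: str, header_value) -> bool:
    """Allow when no token is configured; otherwise require an exact, constant-time match."""
    if not configured_token:
        return True  # auth disabled, as in the diff
    if not isinstance(header_value, str):
        return False
    return hmac.compare_digest(configured_token, header_value)

print(notify_allowed("", None))            # → True  (no token configured)
print(notify_allowed("s3cret", "s3cret"))  # → True
print(notify_allowed("s3cret", "wrong"))   # → False
```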


@@ -5,8 +5,22 @@ const db = require('./db')
const router = express.Router()
// DB dashboard: returns raw rows of each table
router.get('/db/dashboard', (req, res) => {
// Simple auth: protect sensitive endpoints with API_ADMIN_KEY from the environment (the real key is never returned)
const ADMIN_API_KEY = process.env.API_ADMIN_KEY || ''
function requireAdmin(req, res, next) {
if (!ADMIN_API_KEY) {
return res.status(500).json({ error: 'admin key not configured' })
}
const token = req.headers['x-api-key']
if (typeof token !== 'string' || token !== ADMIN_API_KEY) {
return res.status(401).json({ error: 'unauthorized' })
}
return next()
}
// DB dashboard: returns raw rows of each table (admin auth required)
router.get('/db/dashboard', requireAdmin, (req, res) => {
try {
const tables = [
'feedback',
@@ -58,8 +72,14 @@ router.get('/db/dashboard', (req, res) => {
}
})
// News content (separate table, for downstream consumers)
// News content (separate table, for downstream consumers; optional admin key — if ADMIN_API_KEY is configured, auth is required here too)
router.get('/news', (req, res) => {
if (ADMIN_API_KEY) {
const token = req.headers['x-api-key']
if (typeof token !== 'string' || token !== ADMIN_API_KEY) {
return res.status(401).json({ error: 'unauthorized' })
}
}
try {
const limit = Math.min(parseInt(req.query.limit, 10) || 50, 200)
const rows = db.prepare('SELECT id, title, summary, url, source, published_at, category, severity, created_at FROM news_content ORDER BY published_at DESC LIMIT ?').all(limit)
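The `/news` handler clamps `limit` with `Math.min(parseInt(req.query.limit, 10) || 50, 200)`. The same guard as a small Python helper (the 50/200 values come from the diff; the extra guard against non-positive values is an addition of this sketch):

```python
def clamp_limit(raw, default=50, maximum=200) -> int:
    """Parse a query-string limit, falling back to the default and capping at the maximum."""
    try:
        n = int(raw)
    except (TypeError, ValueError):
        return default
    if n <= 0:
        return default
    return min(n, maximum)

print(clamp_limit("25"))    # → 25
print(clamp_limit("9999"))  # → 200
print(clamp_limit("abc"))   # → 50
```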


@@ -15,6 +15,7 @@ import {
ISRAEL_STRIKE_SOURCE,
ISRAEL_STRIKE_TARGETS,
} from '@/data/mapLocations'
import { EXTENDED_WAR_ZONES } from '@/data/extendedWarData'
const MAPBOX_TOKEN = config.mapboxAccessToken || ''
@@ -64,6 +65,9 @@ const ALLIES_ADMIN = [
// Iran attack source: Tehran [lng, lat]
const TEHRAN_SOURCE: [number, number] = [51.389, 35.6892]
// Hezbollah strike source (approximate position in southern Lebanon), used to draw attack vectors toward northern Israel
const HEZBOLLAH_SOURCE: [number, number] = [35.3, 33.2]
/** Quadratic Bézier path for a smoother arc; height controls the arc height */
function parabolaPath(
start: [number, number],
@@ -150,6 +154,7 @@ export function WarMap() {
const lincolnPathsRef = useRef<[number, number][][]>([])
const fordPathsRef = useRef<[number, number][][]>([])
const israelPathsRef = useRef<[number, number][][]>([])
const hezbollahPathsRef = useRef<[number, number][][]>([])
const situation = useReplaySituation()
const { usForces, iranForces, conflictEvents = [] } = situation
@@ -210,9 +215,15 @@ export function WarMap() {
() => ISRAEL_STRIKE_TARGETS.map((t) => parabolaPath(ISRAEL_STRIKE_SOURCE, t)),
[]
)
// Hezbollah → three targets in northern Israel (low flat arcs)
const hezbollahPaths = useMemo(
() => EXTENDED_WAR_ZONES.activeAttacks.map((t) => parabolaPath(HEZBOLLAH_SOURCE, t.coords, 1.5)),
[]
)
lincolnPathsRef.current = lincolnPaths
fordPathsRef.current = fordPaths
israelPathsRef.current = israelPaths
hezbollahPathsRef.current = hezbollahPaths
const lincolnLinesGeoJson = useMemo(
() => ({
@@ -247,6 +258,17 @@ export function WarMap() {
}),
[israelPaths]
)
const hezbollahLinesGeoJson = useMemo(
() => ({
type: 'FeatureCollection' as const,
features: hezbollahPaths.map((coords) => ({
type: 'Feature' as const,
properties: {},
geometry: { type: 'LineString' as const, coordinates: coords },
})),
}),
[hezbollahPaths]
)
const attackLinesGeoJson = useMemo(
() => ({
@@ -260,6 +282,23 @@ export function WarMap() {
[attackPaths]
)
// Current Hezbollah attack target points
const hezbollahTargetsGeoJson = useMemo(
() => ({
type: 'FeatureCollection' as const,
features: EXTENDED_WAR_ZONES.activeAttacks.map((t) => ({
type: 'Feature' as const,
properties: { name: t.name, type: t.type, damage: t.damage },
geometry: { type: 'Point' as const, coordinates: t.coords },
})),
}),
[]
)
// Strait of Hormuz combat zone & Hezbollah zone (static polygons)
const hormuzZone = EXTENDED_WAR_ZONES.hormuzCombatZone
const hezbollahZone = EXTENDED_WAR_ZONES.hezbollahZone
// GDELT conflict events: 1-3 green, 4-6 orange flashing, 7-10 red pulsing
const { conflictEventsGreen, conflictEventsOrange, conflictEventsRed } = useMemo(() => {
const green: GeoJSON.Feature<GeoJSON.Point>[] = []
@@ -404,6 +443,24 @@ export function WarMap() {
)
israelSrc.setData({ type: 'FeatureCollection', features })
}
// Hezbollah strikes on northern Israel: orange-red glow dots on a low flat flight path
const hezSrc = map.getSource('hezbollah-strike-dots') as
| { setData: (d: GeoJSON.FeatureCollection) => void }
| undefined
const hezPaths = hezbollahPathsRef.current
if (hezSrc && hezPaths.length > 0) {
const features: GeoJSON.Feature<GeoJSON.Point>[] = hezPaths.map((path, i) => {
const progress =
(elapsed / FLIGHT_DURATION_MS + 0.2 + i / Math.max(hezPaths.length, 1)) % 1
const coord = interpolateOnPath(path, progress)
return {
type: 'Feature' as const,
properties: {},
geometry: { type: 'Point' as const, coordinates: coord },
}
})
hezSrc.setData({ type: 'FeatureCollection', features })
}
// Targets struck inside Iran: blue pulse (2 s cycle), radius scales with zoom; phase/r/opacity clamped
if (map.getLayer('allied-strike-targets-pulse')) {
const cycle = 2000
@@ -427,6 +484,15 @@ export function WarMap() {
map.setPaintProperty('gdelt-events-red-pulse', 'circle-radius', r)
map.setPaintProperty('gdelt-events-red-pulse', 'circle-opacity', opacity)
}
// Hezbollah attack targets: orange-red pulse, matching the allied-strike-targets effect
if (map.getLayer('hezbollah-attack-targets-pulse')) {
const cycle = 2000
const phase = Math.max(0, Math.min(1, (elapsed % cycle) / cycle))
const r = Math.max(0, 30 * phase * zoomScale)
const opacity = Math.min(1, Math.max(0, 1 - phase * 1.15))
map.setPaintProperty('hezbollah-attack-targets-pulse', 'circle-radius', r)
map.setPaintProperty('hezbollah-attack-targets-pulse', 'circle-opacity', opacity)
}
} catch (_) {}
}
animRef.current = requestAnimationFrame(tick)
@@ -532,6 +598,12 @@ export function WarMap() {
<span className="flex items-center gap-1">
<span className="h-1.5 w-1.5 rounded-full bg-[#EF4444]" />
</span>
<span className="flex items-center gap-1">
<span className="h-1.5 w-1.5 rounded-sm bg-yellow-400/50" />
</span>
<span className="flex items-center gap-1">
<span className="h-1.5 w-1.5 rounded-sm bg-lime-400/40" />
</span>
</div>
<Map
ref={mapRef}
@@ -676,6 +748,72 @@ export function WarMap() {
/>
</Source>
{/* Hezbollah attack vector lines toward northern Israel (low flat red lines) */}
<Source id="hezbollah-attack-lines" type="geojson" data={hezbollahLinesGeoJson}>
<Layer
id="hezbollah-attack-lines"
type="line"
paint={{
'line-color': 'rgba(248, 113, 113, 0.7)',
'line-width': ['interpolate', ['linear'], ['zoom'], 4, 0.6, 8, 1.2, 12, 2],
}}
/>
</Source>
{/* Hezbollah strike dots (moving along the vector paths) */}
<Source
id="hezbollah-strike-dots"
type="geojson"
data={{
type: 'FeatureCollection',
features: hezbollahPaths.map((path) => ({
type: 'Feature' as const,
properties: {},
geometry: { type: 'Point' as const, coordinates: path[0] },
})),
}}
>
<Layer
id="hezbollah-strike-dots-glow"
type="circle"
paint={{
'circle-radius': ['interpolate', ['linear'], ['zoom'], 4, 2.5, 8, 4.5, 12, 7],
'circle-color': 'rgba(248, 113, 113, 0.6)',
'circle-blur': 0.25,
}}
/>
<Layer
id="hezbollah-strike-dots-core"
type="circle"
paint={{
'circle-radius': ['interpolate', ['linear'], ['zoom'], 4, 1, 8, 2, 12, 3.5],
'circle-color': '#fb923c',
'circle-stroke-width': 0.5,
'circle-stroke-color': '#fff',
}}
/>
</Source>
<Source id="hezbollah-attack-targets" type="geojson" data={hezbollahTargetsGeoJson}>
<Layer
id="hezbollah-attack-targets-dot"
type="circle"
paint={{
'circle-radius': ['interpolate', ['linear'], ['zoom'], 4, 2, 8, 3.5, 12, 5],
'circle-color': '#F97316',
'circle-stroke-width': 0.5,
'circle-stroke-color': '#fff',
}}
/>
<Layer
id="hezbollah-attack-targets-pulse"
type="circle"
paint={{
'circle-radius': 0,
'circle-color': 'rgba(248, 113, 113, 0.45)',
'circle-opacity': 0,
}}
/>
</Source>
{/* US-Israel allied strikes on Iran: path lines */}
<Source id="allied-strike-lines-lincoln" type="geojson" data={lincolnLinesGeoJson}>
<Layer
@@ -1061,6 +1199,107 @@ export function WarMap() {
}}
/>
</Source>
{/* Strait of Hormuz combat zone - golden mesh area */}
<Source id="hormuz-combat-zone" type="geojson" data={hormuzZone}>
<Layer
id="hormuz-combat-fill"
type="fill"
paint={{
'fill-color': (hormuzZone.properties as any).style.fillColor,
'fill-opacity': (hormuzZone.properties as any).style.fillOpacity ?? 0.4,
}}
/>
<Layer
id="hormuz-combat-outline"
type="line"
paint={{
'line-color': '#FACC15',
'line-width': 1.5,
'line-dasharray': [1.5, 1.5],
}}
/>
</Source>
{/* Hezbollah zone - translucent green area */}
<Source id="hezbollah-zone" type="geojson" data={hezbollahZone}>
<Layer
id="hezbollah-fill"
type="fill"
paint={{
'fill-color': (hezbollahZone.properties as any).color || '#32CD32',
'fill-opacity': 0.28,
}}
/>
<Layer
id="hezbollah-outline"
type="line"
paint={{
'line-color': '#22C55E',
'line-width': 1.2,
}}
/>
</Source>
{/* Strait of Hormuz area label */}
<Source
id="hormuz-label"
type="geojson"
data={{
type: 'Feature',
properties: { name: (hormuzZone.properties as any).name },
geometry: {
type: 'Point',
coordinates: EXTENDED_WAR_ZONES.hormuzLabelCenter,
},
}}
>
<Layer
id="hormuz-label-text"
type="symbol"
layout={{
'text-field': ['get', 'name'],
// Keep the font size small to avoid occluding the map
'text-size': ['interpolate', ['linear'], ['zoom'], 4, 7, 7, 9, 10, 11],
'text-anchor': 'center',
}}
paint={{
'text-color': '#FACC15',
'text-halo-color': '#1a1a1a',
'text-halo-width': 1,
}}
/>
</Source>
{/* Hezbollah zone label */}
<Source
id="hezbollah-label"
type="geojson"
data={{
type: 'Feature',
properties: { name: (hezbollahZone.properties as any).name },
geometry: {
type: 'Point',
coordinates: EXTENDED_WAR_ZONES.hezbollahLabelCenter,
},
}}
>
<Layer
id="hezbollah-label-text"
type="symbol"
layout={{
'text-field': ['get', 'name'],
// Keep the font size small to avoid occluding the map
'text-size': ['interpolate', ['linear'], ['zoom'], 4, 7, 7, 9, 10, 11],
'text-anchor': 'center',
}}
paint={{
'text-color': '#22C55E',
'text-halo-color': '#1a1a1a',
'text-halo-width': 1,
}}
/>
</Source>
</Map>
</div>
)
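The arc and dot animation in this component rest on two pieces of math: a quadratic Bézier between start and end (what `parabolaPath` appears to compute) and a looping progress value of the form `(elapsed / FLIGHT_DURATION_MS + offset) % 1` seen in the tick loop. A Python sketch of both; the control-point choice is an assumption, since the component's exact formula is not in the hunk:

```python
def quad_bezier(start, end, height=1.0, steps=32):
    """Sample a quadratic Bézier arc from start to end ([lng, lat]); height lifts the midpoint control point."""
    cx = (start[0] + end[0]) / 2
    cy = (start[1] + end[1]) / 2 + height  # assumed: arc raised by `height` degrees of latitude
    pts = []
    for i in range(steps + 1):
        t = i / steps
        x = (1 - t) ** 2 * start[0] + 2 * (1 - t) * t * cx + t ** 2 * end[0]
        y = (1 - t) ** 2 * start[1] + 2 * (1 - t) * t * cy + t ** 2 * end[1]
        pts.append((x, y))
    return pts

def loop_progress(elapsed_ms, duration_ms, offset=0.0):
    """Phase in [0, 1) that wraps around, staggering dots via `offset` as in the tick loop."""
    return (elapsed_ms / duration_ms + offset) % 1

# Hezbollah source → Meron, coordinates from the diff.
path = quad_bezier((35.3, 33.2), (35.41, 32.99), height=1.5)
print(path[0], path[-1])  # endpoints equal start and end exactly
print(loop_progress(1500, 1000, 0.2))  # ≈ 0.7 (up to float rounding)
```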


@@ -1,11 +1,11 @@
/**
* App config (no .env dependency)
* App config: sensitive values are read only from environment variables; never put tokens in source
* At build time Vite inlines VITE_* into the frontend; the token belongs only in .env (and .env is not committed)
*/
export const config = {
/** Mapbox map token */
mapboxAccessToken:
'pk.eyJ1IjoiZDI5cTAiLCJhIjoiY21oaGRmcTkzMGltZzJscHR1N2FhZnY5dCJ9.7ueF2lS6-C9Mm_xon7NnIA',
/** Mapbox map token (read only from VITE_MAPBOX_ACCESS_TOKEN; never hardcode it) */
mapboxAccessToken: import.meta.env.VITE_MAPBOX_ACCESS_TOKEN ?? '',
/** Whether to show the scrolling news ticker */
showNewsTicker: false,
showNewsTicker: import.meta.env.VITE_SHOW_NEWS_TICKER === 'true',
}

src/data/extendedWarData.ts (new file)

@@ -0,0 +1,245 @@
// Extended war zone and strike data (situation as of 2026-03-03)
// Frontend display only; not used for any real assessment
export const EXTENDED_WAR_ZONES = {
// 1. Strait of Hormuz combat zone — polygon enveloping the strait channel and adjacent waters [lng, lat]
hormuzCombatZone: {
type: 'Feature' as const,
properties: {
name: '霍尔木兹海峡交战区',
status: 'BLOCKED / ENGAGED',
style: {
fillColor: '#FFD700',
fillOpacity: 0.4,
meshPattern: 'diagonal-line',
},
},
geometry: {
type: 'Polygon' as const,
coordinates: [
[
[55.0, 25.0],
[55.5, 25.4],
[56.2, 26.0],
[56.8, 26.6],
[57.2, 27.0],
[57.0, 27.4],
[56.4, 27.2],
[55.8, 26.6],
[55.2, 25.9],
[54.8, 25.4],
[55.0, 25.0],
],
],
},
},
// Hormuz label point (near the polygon center, used for the text label)
hormuzLabelCenter: [56.0, 26.2] as [number, number],
// 2. Hezbollah zone — southern Lebanon + Bekaa Valley polygons [lng, lat]
hezbollahZone: {
type: 'Feature' as const,
properties: {
name: '真主党势力范围',
status: 'OFFENSIVE ACTIVE',
color: '#32CD32',
},
geometry: {
type: 'MultiPolygon' as const,
coordinates: [
// 黎巴嫩南部(利塔尼河以南)
[
[
[35.05, 33.05],
[35.45, 33.15],
[35.85, 33.35],
[35.95, 33.65],
[35.75, 33.95],
[35.35, 33.85],
[35.05, 33.55],
[35.05, 33.05],
],
],
// Bekaa Valley
[
[
[35.85, 33.75],
[36.15, 33.85],
[36.45, 34.05],
[36.55, 34.35],
[36.35, 34.55],
[35.95, 34.45],
[35.75, 34.15],
[35.85, 33.75],
],
],
],
},
},
// Hezbollah label point (used for the text label)
hezbollahLabelCenter: [35.7, 33.7] as [number, number],
// 3. Current Hezbollah attack targets (North Israel Targets)
activeAttacks: [
{
name: 'Meron Intelligence Base',
coords: [35.41, 32.99] as [number, number],
type: 'Rocket Strike',
damage: 'High',
},
{
name: 'Ramat David Airbase',
coords: [35.18, 32.66] as [number, number],
type: 'Drone Swarm',
damage: 'Moderate',
},
{
name: 'Mishmar HaCarmel (Haifa)',
coords: [35.01, 32.76] as [number, number],
type: 'Precision Missile',
damage: 'Intercepted',
},
],
} as const
// Strike damage assessment points (Israeli strikes on Lebanon & allied strikes on the Iranian mainland)
export const STRIKE_DAMAGE_ASSESSMENT = {
lebanonFront: [
{
id: 'L1',
name: 'Dahieh Command',
coords: [35.5, 33.86] as [number, number],
type: 'Leadership',
color: '#ff4d4d',
},
{
id: 'L2',
name: 'Litani Ammo Depot',
coords: [35.32, 33.34] as [number, number],
type: 'Logistics',
color: '#ff4d4d',
},
{
id: 'L3',
name: 'Baalbek Logistics Hub',
coords: [36.2, 34.01] as [number, number],
type: 'Logistics',
color: '#ffb84d',
},
{
id: 'L4',
name: 'Tyre Coastal Battery',
coords: [35.19, 33.27] as [number, number],
type: 'Naval',
color: '#ffb84d',
},
{
id: 'L5',
name: 'Hermel UAV Site',
coords: [36.38, 34.39] as [number, number],
type: 'UAV',
color: '#ffd84d',
},
],
iranMainland: [
{
id: 'I1',
name: 'Parchin Military Complex',
coords: [51.76, 35.53] as [number, number],
type: 'Strategic',
severity: 'Critical',
marker: 'Explosion',
},
{
id: 'I2',
name: 'Mehrabad Airbase',
coords: [51.31, 35.68] as [number, number],
type: 'Airbase',
severity: 'High',
marker: 'Runway',
},
{
id: 'I3',
name: 'Hesa Aircraft Factory',
coords: [51.59, 32.92] as [number, number],
type: 'Industrial',
severity: 'Moderate',
marker: 'Factory',
},
{
id: 'I4',
name: 'Natanz Enrichment Entrance',
coords: [51.91, 33.72] as [number, number],
type: 'Nuclear',
severity: 'Critical',
marker: 'Radiation',
},
{
id: 'I5',
name: 'Bushehr Air Defense Net',
coords: [50.88, 28.82] as [number, number],
type: 'AirDefense',
severity: 'High',
marker: 'Radar',
},
{
id: 'I6',
name: 'Shahid Rajaee Port',
coords: [56.12, 27.14] as [number, number],
type: 'Naval',
severity: 'Critical',
marker: 'Blocked',
},
{
id: 'I7',
name: 'Kermanshah Silo Cluster',
coords: [47.16, 34.35] as [number, number],
type: 'Missile',
severity: 'Critical',
marker: 'Silo',
},
{
id: 'I8',
name: 'Tabriz Tactical Airbase 2',
coords: [46.24, 38.12] as [number, number],
type: 'Airbase',
severity: 'High',
marker: 'Runway',
},
{
id: 'I9',
name: 'Arak Heavy Water Support',
coords: [49.23, 34.11] as [number, number],
type: 'Nuclear',
severity: 'High',
marker: 'Power',
},
{
id: 'I10',
name: 'Fordow Entrance',
coords: [50.99, 34.88] as [number, number],
type: 'Nuclear',
severity: 'Critical',
marker: 'Tunnel',
},
{
id: 'I11',
name: 'Nojeh Airbase',
coords: [48.8, 35.21] as [number, number],
type: 'Airbase',
severity: 'High',
marker: 'Runway',
},
{
id: 'I12',
name: 'Kish SIGINT Site',
coords: [53.98, 26.54] as [number, number],
type: 'Radar',
severity: 'Moderate',
marker: 'Sensor',
},
],
} as const
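The `activeAttacks` list is turned into a GeoJSON FeatureCollection by the map component; the transformation is simple enough to show standalone (coordinates below are a subset of the data above; the helper itself is illustrative):

```python
def to_feature_collection(targets):
    """Build a GeoJSON FeatureCollection of Point features, one per attack target."""
    return {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "properties": {"name": t["name"], "type": t["type"], "damage": t["damage"]},
                "geometry": {"type": "Point", "coordinates": list(t["coords"])},
            }
            for t in targets
        ],
    }

targets = [
    {"name": "Meron Intelligence Base", "coords": (35.41, 32.99), "type": "Rocket Strike", "damage": "High"},
    {"name": "Ramat David Airbase", "coords": (35.18, 32.66), "type": "Drone Swarm", "damage": "Moderate"},
]
fc = to_feature_collection(targets)
print(len(fc["features"]), fc["features"][0]["geometry"]["coordinates"])  # → 2 [35.41, 32.99]
```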


@@ -3,9 +3,13 @@
set -e
cd "$(dirname "$0")"
# Without Ollama, disable AI; GDELT often times out in mainland China, so update via RSS only
export CLEANER_AI_DISABLED=1
export PARSER_AI_DISABLED=1
# Load .env if present (set DASHSCOPE_API_KEY=sk-xxx there; never commit .env)
[ -f .env ] && set -a && . ./.env && set +a
# AI mode: with DASHSCOPE_API_KEY use Tongyi (commercial model, no Ollama needed); otherwise Ollama or rules
export CLEANER_AI_DISABLED=0
export PARSER_AI_DISABLED=0
# GDELT often times out in mainland China; RSS-only updates (set to 0 to re-enable GDELT)
export GDELT_DISABLED=1
export RSS_INTERVAL_SEC=60