# Docker Deployment to a Server

Package the US-Iran situation dashboard as Docker images so it can be moved to any server.

## Architecture

| Service | Port | Description                                  |
|---------|------|----------------------------------------------|
| api     | 3001 | Frontend static files + REST API + WebSocket |
| crawler | 8000 | RSS crawler + GDELT (internal service)       |

- Database: SQLite, mounted on the `app-data` volume (`/data/data.db`)
- The frontend and the API are merged into one image; just open `http://<host>:3001`

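For reference, a minimal sketch of a `docker-compose.yml` matching the table above. The build contexts and the published crawler port are assumptions; the repo's own compose file is authoritative:

```yaml
services:
  api:
    build: ./server          # assumed build context
    ports:
      - "3001:3001"
    volumes:
      - app-data:/data
    environment:
      - DB_PATH=/data/data.db
  crawler:
    build: ./crawler         # assumed build context
    ports:
      - "8000:8000"          # exposes /crawler/status
    volumes:
      - app-data:/data
    environment:
      - DB_PATH=/data/data.db
      - API_BASE=http://api:3001

volumes:
  app-data:
```
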
## Quick Deployment

```bash
# 1. Clone the project
git clone <repo> usa-dashboard && cd usa-dashboard

# 2. Build and start (configure the Mapbox token first; see below)
docker compose up -d --build

# 3. Access
# Frontend + API:  http://localhost:3001
# Crawler status:  http://localhost:8000/crawler/status
```

## Mapbox Token (map rendering)

The token must be passed to the frontend at build time; otherwise the map falls back to placeholder mode:

```bash
# Option 1: .env file
echo "VITE_MAPBOX_ACCESS_TOKEN=pk.xxx" > .env
docker compose up -d --build

# Option 2: environment variable
VITE_MAPBOX_ACCESS_TOKEN=pk.xxx docker compose up -d --build
```

## Push to a Private Registry and Migrate

```bash
# 1. Tag the images (replace with your registry address)
docker compose build
docker tag usa-dashboard-api your-registry/usa-dashboard-api:latest
docker tag usa-dashboard-crawler your-registry/usa-dashboard-crawler:latest

# 2. Push
docker push your-registry/usa-dashboard-api:latest
docker push your-registry/usa-dashboard-crawler:latest

# 3. Pull and start on the target server
docker pull your-registry/usa-dashboard-api:latest
docker pull your-registry/usa-dashboard-crawler:latest
# Prepare a docker-compose.yml or equivalent orchestration; see below
```

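The tag-and-push steps can be scripted. A minimal sketch (the `push_cmds` helper is illustrative, not part of the repo) that prints the commands for both images given a registry prefix:

```shell
# Print the tag/push commands for both images; pass your registry as $1.
push_cmds() {
  registry="$1"
  for img in usa-dashboard-api usa-dashboard-crawler; do
    printf 'docker tag %s %s/%s:latest\n' "$img" "$registry" "$img"
    printf 'docker push %s/%s:latest\n' "$registry" "$img"
  done
}

push_cmds your-registry
```

Pipe the output through `sh` (or drop the `printf`s and run the commands directly) once the printed commands look right.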
## Starting from Images Only (no compose)

```bash
# 1. Create the network and the data volume
docker network create usa-net
docker volume create usa-data

# 2. Start the API (frontend + endpoints)
docker run -d --name api --network usa-net \
  -p 3001:3001 \
  -v usa-data:/data \
  -e DB_PATH=/data/data.db \
  usa-dashboard-api

# 3. Start the crawler (reaches the API over usa-net)
docker run -d --name crawler --network usa-net \
  -v usa-data:/data \
  -e DB_PATH=/data/data.db \
  -e API_BASE=http://api:3001 \
  -e CLEANER_AI_DISABLED=1 \
  -e GDELT_DISABLED=1 \
  usa-dashboard-crawler
```

The crawler calls the Node server's `/api/crawler/notify` endpoint via `API_BASE`, so the two containers must be on the same network.

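As a sketch, the crawler's entrypoint can resolve this configuration with POSIX parameter-expansion defaults. The variable names match the `docker run` flags above; the default values are assumptions, not taken from the repo:

```shell
# Resolve configuration from the environment, falling back to
# assumed defaults when a variable is unset.
DB_PATH="${DB_PATH:-/data/data.db}"
API_BASE="${API_BASE:-http://api:3001}"
echo "db=$DB_PATH api=$API_BASE"
```

With this pattern the container works out of the box on `usa-net`, and either value can still be overridden with `-e` at `docker run` time.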
## Servers in Mainland China / Registry Mirrors

When pulling base images such as `node` and `python` is slow:

1. **Docker registry mirror**: see [docs/DOCKER_MIRROR.md](docs/DOCKER_MIRROR.md)
2. **Use a domestic mirror source at build time**:
   ```bash
   docker compose build --build-arg REGISTRY=docker.m.daocloud.io/library/
   docker compose up -d
   ```

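For the `REGISTRY` build argument to take effect, each Dockerfile has to prefix its base image with it. A plausible sketch (the base image tag is an assumption; check the repo's Dockerfiles for the real one):

```dockerfile
# Empty by default, so plain `docker compose build` still pulls from Docker Hub.
ARG REGISTRY=""
FROM ${REGISTRY}node:20-alpine
```
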
## Common Operations

```bash
# Tail the logs
docker compose logs -f

# Restart
docker compose restart

# Stop and remove the containers (the data volume is kept)
docker compose down

# Backfill battle-damage data (re-extracted from situation_update)
curl -X POST http://localhost:8000/crawler/backfill
```