# Cosma Nav + QC: Completion Plan

> **For agentic workers:** REQUIRED SUB-SKILL: Use superpowers:subagent-driven-development (recommended) or superpowers:executing-plans to implement this plan task-by-task. Steps use checkbox (`- [ ]`) syntax for tracking.

**Goal:** Integrate cosma-nav into cosma-qc on cosma-vm (.83): /map and /nav routes, post-job hook, archiving to NAS .156, all Gitea issues closed, E2E validated with real data.

**Architecture:** cosma-nav (Flask :5051) runs as a systemd service on .83 behind Caddy. The cosma-qc dispatcher calls a post-job hook that decimates the PLY on ml-stack (.84) and archives it to NAS .156. The cosma-nav viewer serves per-job HTML pages via `/api/job/<id>/...` routes.

**Tech Stack:** Python 3, Flask, Leaflet.js (CDN), Chart.js (CDN), Three.js (CDN), open3d, h5py, pyproj, systemd, Caddy, rsync, CIFS

---
## Files created / modified

| File (cosma-nav repo) | Action | Role |
|-----------------------|--------|------|
| `viz/server.py` | Modify | Add `_DATA_BASE`, `_load_job_data()`, routes `/map`, `/nav`, `/api/job/<id>/...` |
| `viz/templates/map.html` | Create | Leaflet GPS/USBL page |
| `viz/templates/nav.html` | Create | Chart.js depth/altitude/RTK page |
| `viz/static/js/map.js` | Create | Leaflet init, USV track, USBL markers |
| `viz/static/js/nav_charts.js` | Create | Chart.js depth + altitude + RTK status |
| `scripts/pre_decimate.py` | Create | open3d PLY decimation + SCP to cosma-vm |
| `scripts/archive_job.sh` | Create | rsync job → NAS .156 |
| `scripts/check_jobs.py` | Create | Job integrity check (H5 + PLY present) |
| `deploy/cosma-nav.service` | Create | systemd unit (user=cosma, port 5051) |
| `deploy/caddy-fragment.conf` | Create | Caddy routes /nav /map → :5051 |
| `tests/test_server_routes.py` | Create | Flask route tests map/nav/api |
| `tests/test_pre_decimate.py` | Create | PLY decimation test |
| `tests/test_check_jobs.py` | Create | check_jobs test |

| File (cosma-qc repo, on .83) | Action | Role |
|------------------------------|--------|------|
| `scripts/dispatcher.py` | Modify | Call `post_job_hook(job_id)` once a job reaches "done" |
| `scripts/post_job_hook.py` | Create | Orchestrates pre_decimate + archive + notification |
| `app/templates/_jobs_table.html` | Modify | "QC →" button linking to `/nav?job=<id>` |

---
## Task 1: Live audit of .83

**Files:** none (inspection only)

- [ ] **Step 1.1: SSH into .83, check cosma-qc**

```bash
ssh cosma@192.168.0.83
systemctl status cosma-qc-dispatcher
docker ps | grep cosma-qc-app
```

Expected: dispatcher `active (running)`, container `cosma-qc-app` on port 3849.
Record the actual state.

- [ ] **Step 1.2: Check Caddy**

```bash
systemctl status caddy
cat /etc/caddy/Caddyfile
```

Note whether the `/nav` and `/map` routes already exist.

- [ ] **Step 1.3: Check NAS .156**

```bash
mount | grep nas
ls /mnt/nas 2>/dev/null || echo "NAS not mounted"
```

If the NAS is not mounted, create issue `[infra] Mount NAS .156`.

- [ ] **Step 1.4: Check cosma-nav**

```bash
ls /home/cosma/cosma-nav 2>/dev/null || echo "cosma-nav missing"
systemctl status cosma-nav 2>/dev/null || echo "service missing"
```

- [ ] **Step 1.5: Check worker access**

```bash
ssh gpu "nvidia-smi | head -5" 2>/dev/null || echo "worker .84 unreachable"
```

- [ ] **Step 1.6: Record the gaps**

Fill in this list (check whatever is missing):
- [ ] NAS .156 not mounted
- [ ] cosma-nav missing
- [ ] cosma-nav systemd unit missing
- [ ] Caddy routes /nav /map missing
- [ ] Post-job hook missing
- [ ] pre_decimate.py missing
- [ ] archive_job.sh missing

---
## Task 2: Gitea labels and issues

**Files:** none (Chrome actions on Gitea .82)

- [ ] **Step 2.1: Open the cosma-qc Gitea repo in Chrome**

URL: `http://192.168.0.82:3000/floppyrj45/cosma-qc`

- [ ] **Step 2.2: Create the missing labels**

Go to Issues → Labels → New Label:

| Name | Hex color |
|------|-----------|
| `infra` | `#2ecc71` |
| `backend` | `#3498db` |
| `frontend` | `#f39c12` |
| `deploy` | `#e67e22` |
| `test` | `#9b59b6` |

- [ ] **Step 2.3: Create one issue per gap identified in Task 1**

For each gap checked in Step 1.6, create an issue in this format:

Title: `[label] Short description`
Body:
```
## Context
<one line on why this is needed>

## Done criteria
- [ ] <concrete verification>
```

Standard issues to create (where the gap is confirmed):

1. `[infra] Mount NAS .156 on cosma-vm and ml-stack`
2. `[backend] Create pre_decimate.py (PLY decimation + SCP)`
3. `[backend] Create archive_job.sh (rsync → NAS .156)`
4. `[backend] Create check_jobs.py (job integrity)`
5. `[frontend] Routes /map /api/job/<id>/map-data + map.html + map.js`
6. `[frontend] Routes /nav /api/job/<id>/trajectory + nav.html + nav_charts.js`
7. `[deploy] Clone cosma-nav on .83 + requirements`
8. `[deploy] Create cosma-nav systemd service (port 5051)`
9. `[deploy] Configure Caddy routes /nav /map → :5051`
10. `[deploy] Integrate post-job hook into the cosma-qc dispatcher`
11. `[test] E2E validation with real data`

- [ ] **Step 2.4: Record the numbers of the issues created**

(e.g. #1=infra, #2=pre_decimate, etc.); these numbers are referenced in the commit messages.

---
## Task 3: [infra] Mount NAS .156

**Files:** `/etc/fstab` on cosma-vm (.83) and ml-stack (.84)

*Skip this task if the NAS is already mounted (Step 1.3 OK).*

- [ ] **Step 3.1: Install cifs-utils on .83**

```bash
ssh cosma@192.168.0.83
sudo apt-get install -y cifs-utils
```

- [ ] **Step 3.2: Create the NAS credentials file on .83**

```bash
sudo bash -c 'cat > /root/.nas-credentials << EOF
username=admin
password=vj'"'"']C9yJA-jYt)U
EOF
chmod 600 /root/.nas-credentials'
```

- [ ] **Step 3.3: Create the mount point and add it to fstab on .83**

```bash
sudo mkdir -p /mnt/nas
echo "//192.168.0.156/cosma /mnt/nas cifs credentials=/root/.nas-credentials,uid=cosma,gid=cosma,iocharset=utf8,_netdev 0 0" | sudo tee -a /etc/fstab
sudo mount /mnt/nas
```

- [ ] **Step 3.4: Verify the mount**

```bash
ls /mnt/nas
```

Expected: the NAS directory listing, with no error.

- [ ] **Step 3.5: Repeat on ml-stack (.84)**

```bash
ssh root@192.168.0.84
apt-get install -y cifs-utils
mkdir -p /mnt/nas-cosma
bash -c 'cat > /root/.nas-credentials << EOF
username=admin
password=vj'"'"']C9yJA-jYt)U
EOF
chmod 600 /root/.nas-credentials'
echo "//192.168.0.156/cosma /mnt/nas-cosma cifs credentials=/root/.nas-credentials,iocharset=utf8,_netdev 0 0" >> /etc/fstab
mount /mnt/nas-cosma
ls /mnt/nas-cosma
```

- [ ] **Step 3.6: Infra commit (no code; comment on the issue instead)**

Comment on Gitea issue #1: "NAS .156 mounted on .83 (/mnt/nas) and .84 (/mnt/nas-cosma). Verified OK."
Close the issue.

---
## Task 4: [backend] pre_decimate.py

**Files:**
- Create: `scripts/pre_decimate.py`
- Create: `tests/test_pre_decimate.py`

- [ ] **Step 4.1: Create the scripts directory**

```bash
mkdir -p scripts
touch scripts/__init__.py
```

- [ ] **Step 4.2: Write the test first**
```python
# tests/test_pre_decimate.py
import numpy as np
import tempfile
from pathlib import Path


def _make_tiny_ply(path):
    """Create a minimal PLY with 1000 points."""
    import open3d as o3d
    pcd = o3d.geometry.PointCloud()
    pts = np.random.rand(1000, 3).astype(np.float64)
    pcd.points = o3d.utility.Vector3dVector(pts)
    o3d.io.write_point_cloud(str(path), pcd)


def test_decimate_reduces_points():
    from scripts.pre_decimate import decimate_ply
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp) / "model.ply"
        dst = Path(tmp) / "model_decimated.ply"
        _make_tiny_ply(src)
        decimate_ply(str(src), str(dst), max_pts=100)
        import open3d as o3d
        pcd = o3d.io.read_point_cloud(str(dst))
        # Voxel downsampling targets max_pts but is not a hard cap, so allow
        # some slack while still requiring a strong reduction from 1000.
        assert len(pcd.points) <= 200


def test_decimate_small_ply_unchanged():
    from scripts.pre_decimate import decimate_ply
    with tempfile.TemporaryDirectory() as tmp:
        src = Path(tmp) / "small.ply"
        dst = Path(tmp) / "small_decimated.ply"
        _make_tiny_ply(src)
        decimate_ply(str(src), str(dst), max_pts=5000)
        import open3d as o3d
        pcd = o3d.io.read_point_cloud(str(dst))
        assert len(pcd.points) == 1000


def test_decimate_missing_src_raises():
    from scripts.pre_decimate import decimate_ply
    import pytest
    with tempfile.TemporaryDirectory() as tmp:
        with pytest.raises(FileNotFoundError):
            decimate_ply("/nonexistent.ply", str(Path(tmp) / "out.ply"))
```
- [ ] **Step 4.3: Run the test and confirm it fails**

```bash
pytest tests/test_pre_decimate.py -v
```

Expected: `ModuleNotFoundError: No module named 'scripts.pre_decimate'`

- [ ] **Step 4.4: Implement pre_decimate.py**
```python
# scripts/pre_decimate.py
"""Decimate a PLY point cloud and optionally SCP to cosma-vm."""
import argparse
import subprocess
from pathlib import Path

import numpy as np
import open3d as o3d


def decimate_ply(src: str, dst: str, max_pts: int = 300_000) -> None:
    src_path = Path(src)
    if not src_path.exists():
        raise FileNotFoundError(src)
    pcd = o3d.io.read_point_cloud(str(src_path))
    n = len(pcd.points)
    if n > max_pts:
        # Pick a voxel edge so bbox_volume / voxel_volume ~= max_pts, with a
        # 2 cm floor for near-planar clouds whose bbox volume collapses.
        vol = float(np.prod(pcd.get_max_bound() - pcd.get_min_bound()))
        vox = max((vol / max_pts) ** (1 / 3), 0.02)
        pcd = pcd.voxel_down_sample(vox)
    o3d.io.write_point_cloud(dst, pcd)


def scp_to_cosma(local_path: str, job_id: int, cosma_host: str = "192.168.0.83",
                 cosma_user: str = "cosma", data_base: str = "/home/cosma/cosma-qc-data/jobs") -> None:
    remote = f"{cosma_user}@{cosma_host}:{data_base}/{job_id}/"
    subprocess.run(["ssh", f"{cosma_user}@{cosma_host}", f"mkdir -p {data_base}/{job_id}"], check=True)
    subprocess.run(["scp", local_path, remote], check=True)


def main() -> None:
    p = argparse.ArgumentParser(description="Decimate PLY and SCP to cosma-vm")
    p.add_argument("src", help="Source PLY path")
    p.add_argument("dst", help="Destination PLY path (decimated)")
    p.add_argument("--job-id", type=int, help="Job ID for SCP destination")
    p.add_argument("--scp", action="store_true", help="SCP decimated file to cosma-vm after")
    p.add_argument("--max-pts", type=int, default=300_000)
    p.add_argument("--cosma-host", default="192.168.0.83")
    args = p.parse_args()
    decimate_ply(args.src, args.dst, args.max_pts)
    print(f"Decimated: {args.dst}")
    if args.scp and args.job_id:
        scp_to_cosma(args.dst, args.job_id, args.cosma_host)
        print(f"SCP done → {args.cosma_host}:{args.job_id}/")


if __name__ == "__main__":
    main()
```
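The voxel edge length above is chosen so that the bounding-box volume divided by the voxel volume lands near `max_pts`, floored at 2 cm. A quick standalone sketch of that heuristic (the `voxel_size` helper is illustrative, not part of the repo):

```python
# Sketch of the voxel-size heuristic in decimate_ply: pick a voxel edge such
# that bbox_volume / edge^3 ~= max_pts, never going below 2 cm.
def voxel_size(bbox_dims, max_pts, min_vox=0.02):
    vol = bbox_dims[0] * bbox_dims[1] * bbox_dims[2]
    return max((vol / max_pts) ** (1 / 3), min_vox)

# A 100 m x 50 m x 10 m survey volume targeted at 300k points:
print(round(voxel_size((100.0, 50.0, 10.0), 300_000), 3))  # → 0.55 (metres)
```

The floor matters for near-planar clouds, where one bounding-box dimension (and hence the volume) collapses toward zero.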
- [ ] **Step 4.5: Run the tests and confirm they pass**

```bash
pytest tests/test_pre_decimate.py -v
```

Expected: 3 tests PASSED.

- [ ] **Step 4.6: Commit**

```bash
git add scripts/pre_decimate.py scripts/__init__.py tests/test_pre_decimate.py
git commit -m "feat: closes #2 — pre_decimate.py PLY decimation + SCP to cosma-vm"
```

---
## Task 5: [backend] archive_job.sh

**Files:**
- Create: `scripts/archive_job.sh`

*No unit test for bash; manual verification in Step 5.3.*

- [ ] **Step 5.1: Create archive_job.sh**

```bash
#!/usr/bin/env bash
# archive_job.sh <job_id> [data_base] [nas_base]
# Rsyncs job data to NAS .156
set -euo pipefail

JOB_ID="${1:?Usage: archive_job.sh <job_id>}"
DATA_BASE="${2:-/home/cosma/cosma-qc-data/jobs}"
NAS_BASE="${3:-/mnt/nas/archives}"

SRC="${DATA_BASE}/${JOB_ID}/"
DST="${NAS_BASE}/${JOB_ID}/"

mkdir -p "${DST}"
rsync -av --progress "${SRC}" "${DST}"
echo "Archive done: ${DST}"
```

```bash
# Make it executable
chmod +x scripts/archive_job.sh
```

- [ ] **Step 5.2: Commit**

```bash
git add scripts/archive_job.sh
git commit -m "feat: closes #3 — archive_job.sh rsync job → NAS .156"
```

- [ ] **Step 5.3: Manual test (on .83 after deployment)**

```bash
# Create a dummy job
mkdir -p /tmp/job_test/99
echo "test" > /tmp/job_test/99/dummy.txt
mkdir -p /mnt/nas/archives
bash scripts/archive_job.sh 99 /tmp/job_test /mnt/nas/archives
ls /mnt/nas/archives/99/
```

Expected: `dummy.txt` present in `/mnt/nas/archives/99/`.

---
## Task 6: [backend] check_jobs.py

**Files:**
- Create: `scripts/check_jobs.py`
- Create: `tests/test_check_jobs.py`

- [ ] **Step 6.1: Write the tests**
```python
# tests/test_check_jobs.py
import tempfile
from pathlib import Path


def _make_job_dir(base: Path, job_id: int, has_h5: bool = True, has_ply: bool = True) -> Path:
    job_dir = base / str(job_id)
    job_dir.mkdir(parents=True)
    if has_h5:
        (job_dir / "sparse_fixes.h5").touch()
        (job_dir / "trajectory_world.h5").touch()
    if has_ply:
        (job_dir / "model_decimated.ply").touch()
    return job_dir


def test_complete_job_is_ok():
    from scripts.check_jobs import check_job
    with tempfile.TemporaryDirectory() as tmp:
        base = Path(tmp)
        _make_job_dir(base, 1)
        result = check_job(1, str(base))
        assert result["status"] == "ok"
        assert result["job_id"] == 1


def test_missing_h5_flagged():
    from scripts.check_jobs import check_job
    with tempfile.TemporaryDirectory() as tmp:
        base = Path(tmp)
        _make_job_dir(base, 2, has_h5=False)
        result = check_job(2, str(base))
        assert result["status"] == "incomplete"
        assert "sparse_fixes.h5" in result["missing"]


def test_missing_ply_flagged():
    from scripts.check_jobs import check_job
    with tempfile.TemporaryDirectory() as tmp:
        base = Path(tmp)
        _make_job_dir(base, 3, has_ply=False)
        result = check_job(3, str(base))
        assert result["status"] == "incomplete"
        assert "model_decimated.ply" in result["missing"]


def test_missing_job_dir():
    from scripts.check_jobs import check_job
    result = check_job(999, "/nonexistent/path")
    assert result["status"] == "missing"
```
- [ ] **Step 6.2: Run the tests and confirm they fail**

```bash
pytest tests/test_check_jobs.py -v
```

Expected: `ModuleNotFoundError` (the module does not exist yet)

- [ ] **Step 6.3: Implement check_jobs.py**
```python
# scripts/check_jobs.py
"""Check integrity of processed jobs (H5 + PLY present)."""
import argparse
import json
from pathlib import Path
from typing import Any

REQUIRED_FILES = ["sparse_fixes.h5", "trajectory_world.h5", "model_decimated.ply"]


def check_job(job_id: int, data_base: str = "/home/cosma/cosma-qc-data/jobs") -> dict[str, Any]:
    job_dir = Path(data_base) / str(job_id)
    if not job_dir.exists():
        return {"job_id": job_id, "status": "missing", "missing": []}
    missing = [f for f in REQUIRED_FILES if not (job_dir / f).exists()]
    return {
        "job_id": job_id,
        "status": "ok" if not missing else "incomplete",
        "missing": missing,
    }


def main() -> None:
    p = argparse.ArgumentParser()
    p.add_argument("job_ids", nargs="*", type=int)
    p.add_argument("--data-base", default="/home/cosma/cosma-qc-data/jobs")
    p.add_argument("--all", action="store_true", help="Check all job dirs")
    args = p.parse_args()
    base = Path(args.data_base)
    if args.all:
        ids = [int(d.name) for d in base.iterdir() if d.is_dir() and d.name.isdigit()]
    else:
        ids = args.job_ids
    results = [check_job(jid, args.data_base) for jid in sorted(ids)]
    print(json.dumps(results, indent=2))


if __name__ == "__main__":
    main()
```
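For reference, this is the JSON shape the checker emits for an incomplete job. A standalone sketch (the `check_job` logic is re-declared here so the snippet runs without the repo on the path):

```python
# Standalone sketch of check_job from scripts/check_jobs.py (same logic,
# duplicated here so the example is self-contained).
import json
import tempfile
from pathlib import Path

REQUIRED = ["sparse_fixes.h5", "trajectory_world.h5", "model_decimated.ply"]

def check_job(job_id, data_base):
    job_dir = Path(data_base) / str(job_id)
    if not job_dir.exists():
        return {"job_id": job_id, "status": "missing", "missing": []}
    missing = [f for f in REQUIRED if not (job_dir / f).exists()]
    return {"job_id": job_id,
            "status": "ok" if not missing else "incomplete",
            "missing": missing}

with tempfile.TemporaryDirectory() as tmp:
    job = Path(tmp) / "7"
    job.mkdir()
    (job / "sparse_fixes.h5").touch()  # leave the other two files missing
    print(json.dumps(check_job(7, tmp), indent=2))
```

The `missing` list names exactly the required files absent from the job directory, which is what the E2E validation in Task 11 can grep for.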
- [ ] **Step 6.4: Run the tests and confirm they pass**

```bash
pytest tests/test_check_jobs.py -v
```

Expected: 4 tests PASSED.

- [ ] **Step 6.5: Commit**

```bash
git add scripts/check_jobs.py tests/test_check_jobs.py
git commit -m "feat: closes #4 — check_jobs.py job integrity check"
```

---
## Task 7: [frontend] /map route + map.html + map.js

**Files:**
- Modify: `viz/server.py`
- Create: `viz/templates/map.html`
- Create: `viz/static/js/map.js`
- Create: `tests/test_server_routes.py`

- [ ] **Step 7.1: Write the route tests**
```python
# tests/test_server_routes.py
import h5py
import numpy as np
import pytest

from viz.server import app
import viz.server as srv


@pytest.fixture
def client(tmp_path):
    """Flask test client with a synthetic job directory."""
    job_dir = tmp_path / "1"
    job_dir.mkdir()
    # Create sparse_fixes.h5
    with h5py.File(str(job_dir / "sparse_fixes.h5"), "w") as f:
        f.attrs["utm_zone_number"] = 32
        f.attrs["utm_south"] = False
        grp = f.create_group("usv_gps")
        n = 20
        grp.create_dataset("easting", data=np.linspace(700000, 700500, n))
        grp.create_dataset("northing", data=np.linspace(4800000, 4800500, n))
        grp.create_dataset("t_ns", data=np.arange(n, dtype=np.int64) * int(1e9))
        grp.create_dataset("rtk_status", data=np.ones(n, dtype=np.int32))
    srv._DATA_BASE = str(tmp_path)
    app.config["TESTING"] = True
    with app.test_client() as c:
        yield c


def test_map_route_returns_200(client):
    r = client.get("/map?job=1")
    assert r.status_code == 200
    assert b"leaflet" in r.data.lower()


def test_nav_route_returns_200(client):
    r = client.get("/nav?job=1")
    assert r.status_code == 200
    assert b"chart" in r.data.lower()


def test_api_map_data_returns_latlon(client):
    r = client.get("/api/job/1/map-data")
    assert r.status_code == 200
    data = r.get_json()
    assert "usv_gps" in data
    assert "lat" in data["usv_gps"]
    assert len(data["usv_gps"]["lat"]) > 0
    # Mediterranean coords sanity check
    assert 40 < data["usv_gps"]["lat"][0] < 50
```
- [ ] **Step 7.2: Run the tests and confirm they fail**

```bash
pytest tests/test_server_routes.py -v
```

Expected: assertion failures (the /map and /nav routes do not exist yet and return 404).

- [ ] **Step 7.3: Modify viz/server.py — add DATA_BASE and the map/nav routes**

Add after the existing imports:
```python
from pathlib import Path
from flask import jsonify, render_template
```
Add after `_PLY_PATH = ""`:

```python
_DATA_BASE = "/home/cosma/cosma-qc-data/jobs"
```

Add after the `_load_data()` function:
```python
def _load_job_map_data(job_id: int) -> dict:
    from pyproj import Proj
    job_dir = Path(_DATA_BASE) / str(job_id)
    fixes_h5 = job_dir / "sparse_fixes.h5"
    out: dict = {}
    try:
        with h5py.File(str(fixes_h5), "r") as f:
            zone_num = int(f.attrs.get("utm_zone_number", 32))
            south = bool(f.attrs.get("utm_south", False))
            proj = Proj(proj="utm", zone=zone_num, ellps="WGS84", south=south)
            if "usv_gps" in f:
                e = f["usv_gps/easting"][:]
                n = f["usv_gps/northing"][:]
                step = max(1, len(e) // 2000)
                lon, lat = proj(e[::step], n[::step], inverse=True)
                rtk = f["usv_gps/rtk_status"][::step].tolist()
                out["usv_gps"] = {"lat": lat.tolist(), "lon": lon.tolist(), "rtk": rtk}
    except Exception as ex:
        out["error"] = str(ex)
    return out
```
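The `step` stride above bounds the JSON payload: however long the GPS track, the number of points sent to the browser stays near the 2000-point target (strictly under twice it). A standalone sketch of that behavior (the `stride` helper is illustrative, not part of the repo):

```python
# Sketch of the down-sampling stride in _load_job_map_data: the stride grows
# with track length, so the point count sent stays near the 2000 target.
def stride(n_points, target=2000):
    return max(1, n_points // target)

for n in (150, 2000, 10_000, 1_000_000):
    kept = len(range(0, n, stride(n)))
    print(n, "->", kept)
# 150 -> 150, 2000 -> 2000, 10000 -> 2000, 1000000 -> 2000
```

Short tracks pass through untouched; long ones are thinned before the UTM conversion, which also keeps the pyproj call cheap.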
Add the Flask routes (before `_main()`):
```python
@app.route("/map")
def map_viewer():
    return render_template("map.html")


@app.route("/nav")
def nav_viewer():
    return render_template("nav.html")


@app.route("/api/job/<int:job_id>/map-data")
def api_job_map_data(job_id: int):
    return jsonify(_load_job_map_data(job_id))


@app.route("/api/job/<int:job_id>/trajectory")
def api_job_trajectory(job_id: int):
    job_dir = Path(_DATA_BASE) / str(job_id)
    global _TRAJ_H5, _FIXES_H5, _PLY_PATH
    _TRAJ_H5 = str(job_dir / "trajectory_world.h5")
    _FIXES_H5 = str(job_dir / "sparse_fixes.h5")
    _PLY_PATH = str(job_dir / "model_decimated.ply")
    return jsonify(_load_data())
```
Modify `_main()` to add a `--data-base` argument:

```python
p.add_argument("--data-base", default="/home/cosma/cosma-qc-data/jobs")
# after args = p.parse_args():
global _DATA_BASE
_DATA_BASE = args.data_base
```
- [ ] **Step 7.4: Create viz/templates/map.html**
```html
<!DOCTYPE html>
<html lang="fr">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>COSMA — Carte GPS/USBL</title>
  <link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css"/>
  <style>
    body { margin: 0; font-family: sans-serif; background: #111; color: #eee; }
    #header { padding: 8px 16px; background: #1a1a2e; font-size: 14px; }
    #map { height: calc(100vh - 36px); }
    .legend { background: rgba(0,0,0,0.7); padding: 8px; border-radius: 4px; color: #eee; font-size: 12px; }
    .legend-item { display: flex; align-items: center; gap: 6px; margin-bottom: 4px; }
    .dot { width: 12px; height: 12px; border-radius: 50%; }
  </style>
</head>
<body>
  <div id="header">COSMA — Carte GPS/USBL | Job <span id="job-id">—</span></div>
  <div id="map"></div>
  <script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
  <script src="/static/js/map.js"></script>
</body>
</html>
```
- [ ] **Step 7.5: Create viz/static/js/map.js**
```javascript
(function () {
  const params = new URLSearchParams(window.location.search);
  const jobId = params.get('job') || '1';
  document.getElementById('job-id').textContent = jobId;

  const map = L.map('map', { preferCanvas: true });
  L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
    attribution: '© OpenStreetMap',
    maxZoom: 19,
  }).addTo(map);

  fetch('/api/job/' + jobId + '/map-data')
    .then(function (r) { return r.json(); })
    .then(function (data) {
      if (data.error) { alert('Erreur données: ' + data.error); return; }

      // USV GPS track
      if (data.usv_gps) {
        const pts = data.usv_gps.lat.map(function (lat, i) {
          return [lat, data.usv_gps.lon[i]];
        });
        const line = L.polyline(pts, { color: '#2196f3', weight: 2 }).addTo(map);
        map.fitBounds(line.getBounds(), { padding: [20, 20] });

        // RTK quality markers (show only RTK-fixed = status 1)
        data.usv_gps.lat.forEach(function (lat, i) {
          if (data.usv_gps.rtk[i] === 1) {
            L.circleMarker([lat, data.usv_gps.lon[i]], {
              radius: 3, color: '#4caf50', fillOpacity: 0.8, weight: 0,
            }).addTo(map);
          }
        });
      }

      // USBL AUV markers
      if (data.auv_usbl) {
        data.auv_usbl.lat.forEach(function (lat, i) {
          L.circleMarker([lat, data.auv_usbl.lon[i]], {
            radius: 4, color: '#f44336', fillOpacity: 0.7, weight: 0,
          }).addTo(map);
        });
      }

      // Legend
      const legend = L.control({ position: 'bottomright' });
      legend.onAdd = function () {
        const div = L.DomUtil.create('div', 'legend');
        div.innerHTML =
          '<div class="legend-item"><div class="dot" style="background:#2196f3"></div>GPS USV</div>' +
          '<div class="legend-item"><div class="dot" style="background:#4caf50"></div>GPS RTK fixed</div>' +
          '<div class="legend-item"><div class="dot" style="background:#f44336"></div>USBL AUV</div>';
        return div;
      };
      legend.addTo(map);
    })
    .catch(function (err) { console.error('map.js:', err); });
}());
```
- [ ] **Step 7.6: Run the tests**

```bash
pytest tests/test_server_routes.py -v
```

Expected: 3 tests PASSED.

- [ ] **Step 7.7: Run the full project test suite**

```bash
pytest -v
```

Expected: all PASSED (old tests and new).

- [ ] **Step 7.8: Commit**

```bash
git add viz/server.py viz/templates/map.html viz/static/js/map.js tests/test_server_routes.py
git commit -m "feat: closes #5 — route /map + api/job/<id>/map-data + Leaflet viewer"
```

---
## Task 8: [frontend] /nav route + nav.html + nav_charts.js

**Files:**
- Create: `viz/templates/nav.html`
- Create: `viz/static/js/nav_charts.js`

*(the /nav and /api/job/<id>/trajectory routes were already added in Task 7)*

- [ ] **Step 8.1: Add a trajectory API test to test_server_routes.py**

Add to the existing file:
```python
def test_api_job_trajectory_returns_profile(client):
    r = client.get("/api/job/1/trajectory")
    assert r.status_code == 200
    data = r.get_json()
    # sparse_fixes.h5 has usv_gps — should appear
    assert "usv_gps" in data or "error_fixes" in data
```
- [ ] **Step 8.2: Run it and confirm it already passes**

```bash
pytest tests/test_server_routes.py::test_api_job_trajectory_returns_profile -v
```

Expected: PASSED (the route was added in Task 7).

- [ ] **Step 8.3: Create viz/templates/nav.html**
```html
<!DOCTYPE html>
<html lang="fr">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>COSMA — Navigation AUV</title>
  <script src="https://cdn.jsdelivr.net/npm/chart.js@4.4.0/dist/chart.umd.min.js"></script>
  <style>
    body { margin: 0; font-family: sans-serif; background: #111; color: #eee; }
    #header { padding: 8px 16px; background: #1a1a2e; font-size: 14px; }
    .charts { display: grid; grid-template-columns: 1fr 1fr; gap: 12px; padding: 12px; height: calc(100vh - 60px); box-sizing: border-box; }
    .chart-box { background: #1a1a2e; border-radius: 8px; padding: 12px; display: flex; flex-direction: column; }
    .chart-box h3 { margin: 0 0 8px; font-size: 13px; color: #9c88ff; }
    canvas { flex: 1; }
    #status { padding: 8px 16px; font-size: 12px; color: #888; }
  </style>
</head>
<body>
  <div id="header">COSMA — Navigation AUV | Job <span id="job-id">—</span></div>
  <div id="status">Chargement…</div>
  <div class="charts">
    <div class="chart-box"><h3>Profondeur (m)</h3><canvas id="depth-chart"></canvas></div>
    <div class="chart-box"><h3>Altitude fond (m)</h3><canvas id="alt-chart"></canvas></div>
  </div>
  <script src="/static/js/nav_charts.js"></script>
</body>
</html>
```
- [ ] **Step 8.4: Create viz/static/js/nav_charts.js**
```javascript
(function () {
  const params = new URLSearchParams(window.location.search);
  const jobId = params.get('job') || '1';
  document.getElementById('job-id').textContent = jobId;

  const chartDefaults = {
    type: 'line',
    options: {
      responsive: true,
      maintainAspectRatio: false,
      animation: false,
      plugins: { legend: { display: false } },
      scales: {
        x: { ticks: { color: '#888', maxTicksLimit: 8 }, grid: { color: '#333' } },
        y: { ticks: { color: '#888' }, grid: { color: '#333' } },
      },
    },
  };

  fetch('/api/job/' + jobId + '/trajectory')
    .then(function (r) { return r.json(); })
    .then(function (data) {
      const status = document.getElementById('status');
      if (!data.auv_profile) {
        status.textContent = 'Aucun profil AUV disponible pour ce job.';
        return;
      }
      status.textContent = 'Traj status: ' + (data.traj_status || 'unknown');
      const labels = data.auv_profile.t_s.map(function (t) { return t.toFixed(1) + 's'; });

      new Chart(document.getElementById('depth-chart').getContext('2d'), Object.assign({}, chartDefaults, {
        data: {
          labels: labels,
          datasets: [{
            data: data.auv_profile.depth_m,
            borderColor: '#2196f3',
            borderWidth: 1.5,
            pointRadius: 0,
            fill: false,
          }],
        },
        options: Object.assign({}, chartDefaults.options, {
          scales: Object.assign({}, chartDefaults.options.scales, {
            y: { reverse: true, ticks: { color: '#888' }, grid: { color: '#333' } },
          }),
        }),
      }));

      new Chart(document.getElementById('alt-chart').getContext('2d'), Object.assign({}, chartDefaults, {
        data: {
          labels: labels,
          datasets: [{
            data: data.auv_profile.altitude_m,
            borderColor: '#4caf50',
            borderWidth: 1.5,
            pointRadius: 0,
            fill: false,
          }],
        },
      }));
    })
    .catch(function (err) {
      document.getElementById('status').textContent = 'Erreur: ' + err;
    });
}());
```
- [ ] **Step 8.5: Run the full test suite**

```bash
pytest -v
```

Expected: all PASSED.

- [ ] **Step 8.6: Commit**

```bash
git add viz/templates/nav.html viz/static/js/nav_charts.js tests/test_server_routes.py
git commit -m "feat: closes #6 — route /nav + Chart.js depth/altitude viewer"
```

---
## Task 9: [deploy] Clone cosma-nav on .83 + requirements

**Files:** on cosma-vm (.83) only

- [ ] **Step 9.1: SSH into .83 and clone the repo**

```bash
ssh cosma@192.168.0.83
cd /home/cosma
git clone http://floppyrj45:67e1615fcfd06cf2df7872ac25e824f3afdb2bc1@192.168.0.82:3000/floppyrj45/cosma-nav.git
cd cosma-nav
```
- [ ] **Step 9.2: Create a virtualenv and install the dependencies**

```bash
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
```

- [ ] **Step 9.3: Check that the server starts**

```bash
source .venv/bin/activate
python -m viz.server --port 5051 &
sleep 2
curl -s http://localhost:5051/ | head -5
kill %1
```

Expected: an HTTP response (redirect or HTML).
- [ ] **Step 9.4: Create deploy/cosma-nav.service**

*(locally, then push)*

```ini
# deploy/cosma-nav.service
[Unit]
Description=COSMA Nav viewer Flask
After=network.target

[Service]
Type=simple
User=cosma
WorkingDirectory=/home/cosma/cosma-nav
ExecStart=/home/cosma/cosma-nav/.venv/bin/python -m viz.server \
    --port 5051 \
    --data-base /home/cosma/cosma-qc-data/jobs
Restart=on-failure
RestartSec=5
Environment=PYTHONUNBUFFERED=1

[Install]
WantedBy=multi-user.target
```

```bash
git add deploy/cosma-nav.service
git commit -m "feat: closes #7,#8 — deploy systemd cosma-nav.service"
git push origin master
```
- [ ] **Step 9.5: Install the service on .83**

```bash
ssh cosma@192.168.0.83
cd /home/cosma/cosma-nav && git pull
sudo cp deploy/cosma-nav.service /etc/systemd/system/
sudo systemctl daemon-reload
sudo systemctl enable cosma-nav
sudo systemctl start cosma-nav
systemctl status cosma-nav
```

Expected: `active (running)`.

- [ ] **Step 9.6: Check the port**

```bash
curl -s http://localhost:5051/ | head -3
```

Expected: an HTTP response.

---
## Task 10: [deploy] Caddy config for the /nav and /map routes

**Files:**
- Create: `deploy/caddy-fragment.conf` (local, for reference)
- Modify: `/etc/caddy/Caddyfile` (directly on .83)

- [ ] **Step 10.1: Create the Caddy config fragment (local, for reference)**

```
# deploy/caddy-fragment.conf
# To insert into the cosma-vm site block in /etc/caddy/Caddyfile.
# Use `handle` (not `handle_path`): the prefix must be preserved so that
# Flask's /nav, /map and /api/job/... routes match.

handle /nav* {
    reverse_proxy localhost:5051
}

handle /map* {
    reverse_proxy localhost:5051
}

handle /api/job/* {
    reverse_proxy localhost:5051
}
```

```bash
git add deploy/caddy-fragment.conf
git commit -m "docs: closes #9 — Caddy fragment for cosma-nav routes /nav /map"
git push origin master
```

- [ ] **Step 10.2: Edit /etc/caddy/Caddyfile on .83**

```bash
ssh cosma@192.168.0.83
sudo nano /etc/caddy/Caddyfile
```

In the site block (probably `192.168.0.83` or the Caddy domain), add before the cosma-qc block (`handle` keeps the path prefix intact for Flask):

```
handle /nav* {
    reverse_proxy localhost:5051
}

handle /map* {
    reverse_proxy localhost:5051
}

handle /api/job/* {
    reverse_proxy localhost:5051
}
```

- [ ] **Step 10.3: Reload Caddy**

```bash
sudo caddy reload --config /etc/caddy/Caddyfile
```

- [ ] **Step 10.4: Check the routes**

```bash
curl -s -o /dev/null -w "%{http_code}" http://localhost/nav?job=1
curl -s -o /dev/null -w "%{http_code}" http://localhost/map?job=1
```

Expected: `200` for both.

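The same check in scripted form (our sketch, stdlib only), collecting the status code per route so error statuses are reported rather than raised:

```python
import urllib.error
import urllib.request


def route_status(base: str, paths: list[str]) -> dict[str, int]:
    """Return {path: HTTP status} for each route under base, including 4xx/5xx."""
    out: dict[str, int] = {}
    for path in paths:
        try:
            with urllib.request.urlopen(base + path, timeout=5) as resp:
                out[path] = resp.status
        except urllib.error.HTTPError as exc:
            out[path] = exc.code   # error response still carries a status code
    return out
```

Usage: `route_status("http://localhost", ["/nav?job=1", "/map?job=1"])` should return 200 for both once Caddy is reloaded.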
---

## Task 11: [deploy] Post-job hook in the cosma-qc dispatcher

**Files (cosma-qc repo on .83):**
- Create: `scripts/post_job_hook.py`
- Modify: `scripts/dispatcher.py`
- Modify: `app/templates/_jobs_table.html`
- Modify: `app/main.py`

- [ ] **Step 11.1: SSH into .83, go to cosma-qc**

```bash
ssh cosma@192.168.0.83
cd /home/cosma/cosma-qc
```

- [ ] **Step 11.2: Create scripts/post_job_hook.py**

```python
# scripts/post_job_hook.py
"""Post-job hook: PLY decimation on ml-stack + SCP + NAS archiving."""
import subprocess
import sys
from pathlib import Path

COSMA_NAV_DIR = "/home/cosma/cosma-nav"  # repo + venv must also exist on ml-stack
DATA_BASE = "/home/cosma/cosma-qc-data/jobs"
ML_STACK = "root@192.168.0.84"
NAS_ARCHIVE = "/mnt/nas/archives"


def run_post_job(job_id: int) -> None:
    job_dir = Path(DATA_BASE) / str(job_id)
    ply_src = job_dir / "model.ply"
    ply_dst = job_dir / "model_decimated.ply"

    # 1. Decimate the PLY on ml-stack if a raw PLY is present
    if ply_src.exists():
        print(f"[post_job] Decimating PLY for job {job_id} on ml-stack...")
        venv_python = f"{COSMA_NAV_DIR}/.venv/bin/python"
        # SCP the raw PLY to ml-stack, decimate there, SCP it back
        remote_src = f"/tmp/cosma_job_{job_id}_model.ply"
        remote_dst = f"/tmp/cosma_job_{job_id}_decimated.ply"
        subprocess.run(["scp", str(ply_src), f"{ML_STACK}:{remote_src}"], check=True)
        subprocess.run([
            "ssh", ML_STACK,
            f"cd {COSMA_NAV_DIR} && {venv_python} scripts/pre_decimate.py {remote_src} {remote_dst}",
        ], check=True)
        subprocess.run(["scp", f"{ML_STACK}:{remote_dst}", str(ply_dst)], check=True)
        subprocess.run(["ssh", ML_STACK, f"rm -f {remote_src} {remote_dst}"], check=False)
        print(f"[post_job] Decimated PLY: {ply_dst}")

    # 2. rsync archive → NAS
    print(f"[post_job] Archiving job {job_id} → NAS...")
    archive_sh = f"{COSMA_NAV_DIR}/scripts/archive_job.sh"
    subprocess.run(["bash", archive_sh, str(job_id), DATA_BASE, NAS_ARCHIVE], check=True)
    print(f"[post_job] Job {job_id} archived.")


if __name__ == "__main__":
    if len(sys.argv) != 2:
        sys.exit("usage: post_job_hook.py <job_id>")
    run_post_job(int(sys.argv[1]))
```

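The hook above is hard to unit-test because it shells out directly. One option (a hypothetical refactor, function name ours) is to extract the scp/ssh command construction into a pure function and have `run_post_job` iterate over it:

```python
from pathlib import Path

# Same constants as the hook above
ML_STACK = "root@192.168.0.84"
COSMA_NAV_DIR = "/home/cosma/cosma-nav"


def decimation_commands(job_id: int, job_dir: Path) -> list[list[str]]:
    """Return the scp/ssh command sequence the hook would run, without executing anything."""
    ply_src = job_dir / "model.ply"
    ply_dst = job_dir / "model_decimated.ply"
    remote_src = f"/tmp/cosma_job_{job_id}_model.ply"
    remote_dst = f"/tmp/cosma_job_{job_id}_decimated.ply"
    venv_python = f"{COSMA_NAV_DIR}/.venv/bin/python"
    return [
        ["scp", str(ply_src), f"{ML_STACK}:{remote_src}"],
        ["ssh", ML_STACK,
         f"cd {COSMA_NAV_DIR} && {venv_python} scripts/pre_decimate.py {remote_src} {remote_dst}"],
        ["scp", f"{ML_STACK}:{remote_dst}", str(ply_dst)],
        ["ssh", ML_STACK, f"rm -f {remote_src} {remote_dst}"],
    ]
```

This keeps the side effects in one place and lets `tests/` assert on the exact commands without touching ml-stack.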
- [ ] **Step 11.3: Find and modify dispatcher.py**

```bash
grep -n "done\|status\|complete" scripts/dispatcher.py | head -20
```

Find the line where the job transitions to "done" (e.g. `job.status = "done"` or `update_job_status(job_id, "done")`).

Add right after that line (`sys.executable` reuses the dispatcher's own interpreter/venv instead of whatever `python` resolves to):

```python
# Post-job cosma-nav hook
import subprocess
import sys
import threading

threading.Thread(
    target=lambda: subprocess.run(
        [sys.executable, "scripts/post_job_hook.py", str(job_id)],
        cwd="/home/cosma/cosma-qc",
    ),
    daemon=True,
).start()
```

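The inline lambda above swallows hook failures silently. A slightly sturdier variant (our sketch, names of our choosing) logs the exit code and stderr while staying fire-and-forget:

```python
import logging
import subprocess
import sys
import threading

log = logging.getLogger("cosma.post_job")


def run_hook(cmd: list[str], cwd: str = ".") -> int:
    """Run one hook command, log a failure instead of dropping it, return the exit code."""
    proc = subprocess.run(cmd, cwd=cwd, capture_output=True, text=True)
    if proc.returncode != 0:
        log.error("hook %s failed (rc=%s): %s", cmd, proc.returncode, proc.stderr.strip())
    return proc.returncode


def launch_post_job_hook(job_id: int, cwd: str = "/home/cosma/cosma-qc") -> threading.Thread:
    """Dispatcher-side entry point: same command as above, in a daemon thread."""
    cmd = [sys.executable, "scripts/post_job_hook.py", str(job_id)]
    t = threading.Thread(target=run_hook, args=(cmd, cwd), daemon=True)
    t.start()
    return t
```

A daemon thread means a hook still running at dispatcher shutdown is killed; acceptable here since `check_jobs.py` (Task 12.6) catches incomplete jobs.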
- [ ] **Step 11.4: Add a QC button in _jobs_table.html**

```bash
grep -n "done\|status\|btn" app/templates/_jobs_table.html | head -20
```

In the job's row/column, add a conditional link button (if status == done):

```html
{% if job.status == 'done' %}
<a href="http://192.168.0.83/nav?job={{ job.id }}" target="_blank"
   style="font-size:11px;padding:2px 6px;background:#9c88ff;color:#fff;border-radius:3px;text-decoration:none">
  QC →
</a>
{% endif %}
```

- [ ] **Step 11.5: Restart the dispatcher**

```bash
sudo systemctl restart cosma-qc-dispatcher
systemctl status cosma-qc-dispatcher
```

Expected: `active (running)`.

- [ ] **Step 11.6: Commit cosma-qc**

```bash
cd /home/cosma/cosma-qc
git add scripts/post_job_hook.py scripts/dispatcher.py app/templates/_jobs_table.html
git commit -m "feat: closes #10 — post-job cosma-nav hook + QC button in dashboard"
git push
```

---

## Task 12: [test] E2E validation with real data

**Files:** none (operational validation)

- [ ] **Step 12.1: Run all unit tests**

```bash
# On cosma-vm (.83), in the cosma-nav repo
cd /home/cosma/cosma-nav
source .venv/bin/activate
pytest -v
```

Expected: all PASSED.

- [ ] **Step 12.2: Open the cosma-qc dashboard in Chrome**

URL: `http://192.168.0.83:3849` or `http://192.168.0.83/cosma-qc`

Check that a "done" job shows the "QC →" button.

- [ ] **Step 12.3: Click "QC →" on a done job**

The viewer `/nav?job=<id>` should open with the depth/altitude charts.

- [ ] **Step 12.4: Open /map?job=<id> in Chrome**

URL: `http://192.168.0.83/map?job=<id>`

Check: blue GPS track visible, red USBL markers if available.

- [ ] **Step 12.5: Check the NAS**

```bash
ssh cosma@192.168.0.83
ls /mnt/nas/archives/<job_id>/
```

Expected: the job's files are present (frames + PLY).

- [ ] **Step 12.6: Run check_jobs.py**

```bash
cd /home/cosma/cosma-nav && source .venv/bin/activate
python scripts/check_jobs.py --all --data-base /home/cosma/cosma-qc-data/jobs
```

Expected: done jobs report `"status": "ok"`.

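`check_jobs.py` is only specified by its role in the file table (H5 + PLY present). A minimal sketch of the per-job check it implies, with filenames that are our assumption and may differ from the real repo:

```python
from pathlib import Path


def check_job(job_dir: Path) -> dict:
    """Report whether the artifacts the viewer needs exist in one job directory."""
    required = ["nav.h5", "model.ply"]  # assumed names; adjust to the real layout
    missing = [name for name in required if not (job_dir / name).exists()]
    return {
        "job": job_dir.name,
        "status": "ok" if not missing else "missing",
        "missing": missing,
    }
```

With `--all`, the script would iterate `sorted(Path(data_base).iterdir())` and print one JSON line per job.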
- [ ] **Step 12.7: Check that all Gitea issues are closed**

Open Chrome → `http://192.168.0.82:3000/floppyrj45/cosma-qc/issues`

Filter by "Closed" and check that all 11 issues are closed.

- [ ] **Step 12.8: Final cosma-nav commit**

```bash
cd /home/cosma/cosma-nav
git add -A
git commit -m "feat: closes #11 — E2E validated, product finished"
git push origin master
```

---

## Final success criteria

- [ ] `systemctl is-active cosma-nav` → `active`
- [ ] `http://192.168.0.83/nav?job=<id>` → 200 + charts visible
- [ ] `http://192.168.0.83/map?job=<id>` → 200 + GPS/USBL map visible
- [ ] PLY archived on NAS .156 for at least one job
- [ ] "QC →" button visible on done jobs in the cosma-qc dashboard
- [ ] `pytest -v` → all PASSED (cosma-nav)
- [ ] All cosma-qc Gitea issues closed