etape laptop
9
.env
Normal file
@@ -0,0 +1,9 @@
# Ports
FRONTEND_PORT=8880
BACKEND_PORT=8800

# Upload
MAX_UPLOAD_SIZE=52428800

# Paths
DATA_DIR=/data
9
.env.example
Normal file
@@ -0,0 +1,9 @@
# Ports
FRONTEND_PORT=8080
BACKEND_PORT=8000

# Upload
MAX_UPLOAD_SIZE=52428800

# Paths
DATA_DIR=/data
80
CLAUDE.md
Normal file
@@ -0,0 +1,80 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Project Overview

WebCarto is a self-hosted web application for displaying and editing cartographic data (KML/GeoJSON files) over satellite/hybrid/vector map backgrounds. Deployed via Docker. The UI uses the "gruvbox dark vintage" theme. The specification lives in `consigne.md`.

## Tech Stack

- **Frontend**: Vite + React + TypeScript, MapLibre GL JS, TailwindCSS v4, Zustand, @tmcw/togeojson, DOMPurify
- **Backend**: Python FastAPI, Pydantic, SQLite via SQLModel, uvicorn
- **Deploy**: Docker Compose — 2 services: backend (FastAPI + /data volume), frontend (Vite build + Nginx)

## Architecture

```
frontend/               → React SPA (Vite)
  src/components/       → MapView, Header, LayerPanel, PropertyPanel, ImportDialog, ToastContainer, StatusBar
  src/stores/mapStore   → Zustand store (datasets, features, selection, undo, toasts)
  src/api/client        → API client (fetch wrapper)
backend/                → FastAPI REST API
  app/main.py           → FastAPI app + CORS + startup
  app/models.py         → SQLModel models: Dataset, Feature, FeatureVersion
  app/routes/datasets   → Import, list, get, export
  app/routes/features   → Update feature (with versioning)
  app/config.py         → DATA_DIR, DATABASE_URL, MAX_UPLOAD_SIZE
  app/database.py       → Engine + session dependency
  tests/test_api.py     → 9 tests (health, import, CRUD, export, 404)
  samples/              → example.geojson, example.kml
```

## Development Commands

### Docker
```bash
docker-compose up --build   # Launch the full stack
docker-compose down         # Stop
```

### Frontend (from `frontend/`)
```bash
npm install      # Install dependencies
npm run dev      # Dev server (Vite, port 5173, proxy /api → backend:8000)
npm run build    # Production build (tsc + vite)
npx tsc -b       # TypeScript check only
npm run lint     # ESLint
```

### Backend (from `backend/`)
```bash
python3 -m venv .venv && .venv/bin/pip install -r requirements.txt pytest httpx   # Setup
.venv/bin/uvicorn app.main:app --reload             # Dev server
.venv/bin/pytest tests/ -v                          # All tests
.venv/bin/pytest tests/test_api.py -k test_import   # A single test
```

## Key Design Decisions

- **KML parsed on the frontend** (@tmcw/togeojson), sent to the backend as normalized GeoJSON
- **Base maps** defined in MapView.tsx (OSM raster, Esri satellite, Stamen labels). 3 modes: vector/satellite/hybrid
- **Versioning**: the `feature_versions` table stores before/after JSON for every modification
- **No auth in v1** — LAN-only access, structure ready for OIDC/reverse proxy
- **In local dev**, the backend stores data in `backend/data/` (in Docker: the `/data` volume)
- **Backend tests** use in-memory SQLite with `StaticPool` for isolation

## API Endpoints

```
GET  /api/health                               # Health check
POST /api/datasets/import                      # Import (multipart: file + geojson)
GET  /api/datasets                             # List datasets
GET  /api/datasets/{id}                        # Dataset + features
PUT  /api/features/{id}                        # Update geometry/properties (with versioning)
POST /api/datasets/{id}/export?format=geojson  # Export GeoJSON
```

## Language

All user-facing text, comments in code, and commit messages should be in French. Technical identifiers (variable names, API paths) stay in English.
10
amelioration.md
Normal file
@@ -0,0 +1,10 @@
- [x] add Google Maps layers (satellite, hybrid, vector) in addition to OpenStreetMap
- [x] allow editing an imported KML (content shown in the right panel)
- [x] allow selecting/moving an object on the map (highlighted in the left panel, details shown in the right panel) via select + click-and-drag
- [x] extract and display the images embedded in a KML file
- [x] add a button to delete a KML object or its child objects (left panel)
- [x] save the position and zoom level so that any new connection restores the same location and zoom
- [x] after a page reload, the objects are not displayed?
- [x] clicking an image opens a popup with the image at full size
- [x] add an eye icon to show or hide (crossed-out eye) a KML object file, or one or more objects within it (shown in the left panel)
- [ ] allow selecting the next/previous object with the up and down arrow keys
14
backend/Dockerfile
Normal file
@@ -0,0 +1,14 @@
FROM python:3.12-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

RUN mkdir -p /data

EXPOSE 8000

CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000"]
0
backend/app/__init__.py
Normal file
BIN
backend/app/__pycache__/__init__.cpython-313.pyc
Normal file
BIN
backend/app/__pycache__/config.cpython-313.pyc
Normal file
BIN
backend/app/__pycache__/database.cpython-313.pyc
Normal file
BIN
backend/app/__pycache__/main.cpython-313.pyc
Normal file
BIN
backend/app/__pycache__/models.cpython-313.pyc
Normal file
9
backend/app/config.py
Normal file
@@ -0,0 +1,9 @@
import os
from pathlib import Path

_default_data = os.path.join(os.path.dirname(os.path.dirname(__file__)), "data")
DATA_DIR = Path(os.getenv("DATA_DIR", _default_data))
DATA_DIR.mkdir(parents=True, exist_ok=True)

DATABASE_URL = os.getenv("DATABASE_URL", f"sqlite:///{DATA_DIR / 'webcarto.db'}")
MAX_UPLOAD_SIZE = int(os.getenv("MAX_UPLOAD_SIZE", 50 * 1024 * 1024))  # 50 MB
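The config module above resolves each value from an environment variable, falling back to a default. A minimal standalone sketch of that same precedence (the `resolve_config` helper is hypothetical, written here only to make the composition visible; it is not part of `app/config.py`):

```python
from pathlib import Path


def resolve_config(env: dict) -> tuple[str, int]:
    """Mirror the precedence used in app/config.py: env var wins, else default."""
    data_dir = Path(env.get("DATA_DIR", "/data"))
    database_url = env.get("DATABASE_URL", f"sqlite:///{data_dir / 'webcarto.db'}")
    max_upload = int(env.get("MAX_UPLOAD_SIZE", 50 * 1024 * 1024))
    return database_url, max_upload


# With no overrides, the database lands next to the other data files:
print(resolve_config({}))   # ('sqlite:////data/webcarto.db', 52428800)
```

Note the four slashes in the SQLite URL: `sqlite:///` plus an absolute path starting with `/`.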
13
backend/app/database.py
Normal file
@@ -0,0 +1,13 @@
from sqlmodel import SQLModel, create_engine, Session
from .config import DATABASE_URL

engine = create_engine(DATABASE_URL, echo=False)


def init_db():
    SQLModel.metadata.create_all(engine)


def get_session():
    with Session(engine) as session:
        yield session
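CLAUDE.md notes that the backend tests use in-memory SQLite with `StaticPool`. The reason is visible with the stdlib `sqlite3` module alone: every new connection to `:memory:` gets its own empty database, so a pool that opens fresh connections per session would lose the schema between requests. A stdlib-only illustration (no SQLModel involved):

```python
import sqlite3

# Two separate connections to ":memory:" are two separate databases.
a = sqlite3.connect(":memory:")
a.execute("CREATE TABLE dataset (id INTEGER PRIMARY KEY, name TEXT)")

b = sqlite3.connect(":memory:")
tables = b.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'"
).fetchall()
print(tables)   # [] — connection b never sees the table created on a
```

`StaticPool` sidesteps this by handing every session the same single underlying connection.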
33
backend/app/main.py
Normal file
@@ -0,0 +1,33 @@
from fastapi import FastAPI
from fastapi.middleware.cors import CORSMiddleware
from starlette.formparsers import MultiPartParser
from .database import init_db
from .config import MAX_UPLOAD_SIZE
from .routes import datasets, features, images, settings

# Raise the multipart part-size limit (Starlette default: 1 MB)
MultiPartParser.max_part_size = MAX_UPLOAD_SIZE

app = FastAPI(title="WebCarto API", version="0.1.0")

app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_methods=["*"],
    allow_headers=["*"],
)

app.include_router(datasets.router, prefix="/api")
app.include_router(features.router, prefix="/api")
app.include_router(images.router, prefix="/api")
app.include_router(settings.router, prefix="/api")


@app.on_event("startup")
def on_startup():
    init_db()


@app.get("/api/health")
def health():
    return {"status": "ok"}
42
backend/app/models.py
Normal file
@@ -0,0 +1,42 @@
from datetime import datetime, timezone
from typing import Optional
from sqlmodel import SQLModel, Field, Column
import sqlalchemy as sa


class Dataset(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    name: str
    raw_filename: str
    feature_count: int = 0
    bbox_json: Optional[str] = None  # JSON string [minLng, minLat, maxLng, maxLat]
    created_at: datetime = Field(
        default_factory=lambda: datetime.now(timezone.utc),
        sa_column=Column(sa.DateTime(timezone=True), default=sa.func.now()),
    )


class Feature(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    dataset_id: int = Field(foreign_key="dataset.id", index=True)
    geometry_json: str  # GeoJSON geometry as JSON string
    properties_json: str  # GeoJSON properties as JSON string


class FeatureVersion(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    feature_id: int = Field(foreign_key="feature.id", index=True)
    before_json: str
    after_json: str
    timestamp: datetime = Field(
        default_factory=lambda: datetime.now(timezone.utc),
        sa_column=Column(sa.DateTime(timezone=True), default=sa.func.now()),
    )


class MapSettings(SQLModel, table=True):
    id: Optional[int] = Field(default=None, primary_key=True)
    center_lng: float = 2.35
    center_lat: float = 48.85
    zoom: float = 5.0
    base_layer: str = "vector"
0
backend/app/routes/__init__.py
Normal file
BIN
backend/app/routes/__pycache__/__init__.cpython-313.pyc
Normal file
BIN
backend/app/routes/__pycache__/datasets.cpython-313.pyc
Normal file
BIN
backend/app/routes/__pycache__/features.cpython-313.pyc
Normal file
BIN
backend/app/routes/__pycache__/images.cpython-313.pyc
Normal file
BIN
backend/app/routes/__pycache__/settings.cpython-313.pyc
Normal file
296
backend/app/routes/datasets.py
Normal file
@@ -0,0 +1,296 @@
import json
import shutil
import xml.etree.ElementTree as ET
import re
import base64
import logging
from pathlib import Path
from fastapi import APIRouter, Depends, UploadFile, File, Form, HTTPException
from fastapi.responses import Response
from sqlmodel import Session, select
from ..database import get_session
from ..models import Dataset, Feature, FeatureVersion
from ..config import DATA_DIR, MAX_UPLOAD_SIZE
from .images import extract_and_save_images, IMAGES_DIR

logger = logging.getLogger(__name__)

router = APIRouter(prefix="/datasets", tags=["datasets"])


@router.get("")
def list_datasets(session: Session = Depends(get_session)):
    datasets = session.exec(select(Dataset)).all()
    results = []
    for ds in datasets:
        bbox = json.loads(ds.bbox_json) if ds.bbox_json else None
        results.append({
            "id": ds.id,
            "name": ds.name,
            "feature_count": ds.feature_count,
            "created_at": ds.created_at.isoformat(),
            "bbox": bbox,
        })
    return results


@router.get("/{dataset_id}")
def get_dataset(dataset_id: int, session: Session = Depends(get_session)):
    ds = session.get(Dataset, dataset_id)
    if not ds:
        raise HTTPException(404, "Dataset non trouvé")

    features = session.exec(
        select(Feature).where(Feature.dataset_id == dataset_id)
    ).all()

    bbox = json.loads(ds.bbox_json) if ds.bbox_json else None
    return {
        "id": ds.id,
        "name": ds.name,
        "feature_count": ds.feature_count,
        "created_at": ds.created_at.isoformat(),
        "bbox": bbox,
        "raw_filename": ds.raw_filename,
        "features": [
            {
                "id": f.id,
                "geometry": json.loads(f.geometry_json),
                "properties": json.loads(f.properties_json),
            }
            for f in features
        ],
    }


@router.delete("/{dataset_id}")
def delete_dataset(dataset_id: int, session: Session = Depends(get_session)):
    ds = session.get(Dataset, dataset_id)
    if not ds:
        raise HTTPException(404, "Dataset non trouvé")

    # Delete the versions of every feature
    features = session.exec(
        select(Feature).where(Feature.dataset_id == dataset_id)
    ).all()
    for f in features:
        versions = session.exec(
            select(FeatureVersion).where(FeatureVersion.feature_id == f.id)
        ).all()
        for v in versions:
            session.delete(v)
        session.delete(f)

    # Delete the images directory
    img_dir = IMAGES_DIR / str(dataset_id)
    if img_dir.exists():
        shutil.rmtree(img_dir)

    # Delete the raw file
    raw_path = DATA_DIR / "raw" / ds.raw_filename
    if raw_path.exists():
        raw_path.unlink()

    session.delete(ds)
    session.commit()
    return {"ok": True}


@router.post("/import")
async def import_dataset(
    file: UploadFile = File(...),
    geojson: str = Form(...),
    session: Session = Depends(get_session),
):
    # Save the raw file
    raw_dir = DATA_DIR / "raw"
    raw_dir.mkdir(exist_ok=True)

    content = await file.read()
    # Keep only the basename so a crafted filename cannot escape raw_dir
    safe_name = Path(file.filename).name
    raw_path = raw_dir / safe_name
    # Avoid overwrites
    counter = 1
    while raw_path.exists():
        stem = Path(safe_name).stem
        suffix = Path(safe_name).suffix
        raw_path = raw_dir / f"{stem}_{counter}{suffix}"
        counter += 1
    raw_path.write_bytes(content)

    # Parse the GeoJSON
    try:
        fc = json.loads(geojson)
    except json.JSONDecodeError:
        raise HTTPException(400, "GeoJSON invalide")

    if fc.get("type") != "FeatureCollection":
        raise HTTPException(400, "Le JSON doit être un FeatureCollection")

    features_data = fc.get("features", [])

    # Compute the bbox
    bbox = _compute_bbox(features_data)

    # Create the dataset
    ds = Dataset(
        name=Path(safe_name).stem,
        raw_filename=raw_path.name,
        feature_count=len(features_data),
        bbox_json=json.dumps(bbox) if bbox else None,
    )
    session.add(ds)
    session.commit()
    session.refresh(ds)

    # Create the features
    for i, f_data in enumerate(features_data):
        geometry = f_data.get("geometry", {})
        properties = f_data.get("properties", {})
        # Extract any inline base64 images (sent in the JSON)
        properties = extract_and_save_images(properties, ds.id, i)
        feature = Feature(
            dataset_id=ds.id,
            geometry_json=json.dumps(geometry),
            properties_json=json.dumps(properties),
        )
        session.add(feature)
    session.commit()

    # For KML, extract base64 images from the raw file
    if file.filename and file.filename.lower().endswith(".kml"):
        _extract_kml_images(raw_path, ds.id, session)

    bbox_out = json.loads(ds.bbox_json) if ds.bbox_json else None
    return {
        "id": ds.id,
        "name": ds.name,
        "feature_count": ds.feature_count,
        "created_at": ds.created_at.isoformat(),
        "bbox": bbox_out,
    }


@router.post("/{dataset_id}/export")
def export_dataset(dataset_id: int, format: str = "geojson", session: Session = Depends(get_session)):
    ds = session.get(Dataset, dataset_id)
    if not ds:
        raise HTTPException(404, "Dataset non trouvé")

    features = session.exec(
        select(Feature).where(Feature.dataset_id == dataset_id)
    ).all()

    fc = {
        "type": "FeatureCollection",
        "features": [
            {
                "type": "Feature",
                "geometry": json.loads(f.geometry_json),
                "properties": json.loads(f.properties_json),
            }
            for f in features
        ],
    }

    return Response(
        content=json.dumps(fc, ensure_ascii=False, indent=2),
        media_type="application/geo+json",
        headers={"Content-Disposition": f'attachment; filename="{ds.name}.geojson"'},
    )


def _compute_bbox(features: list) -> list | None:
    coords = []
    for f in features:
        _extract_coords(f.get("geometry", {}), coords)
    if not coords:
        return None
    lngs = [c[0] for c in coords]
    lats = [c[1] for c in coords]
    return [min(lngs), min(lats), max(lngs), max(lats)]


def _extract_coords(geometry: dict, coords: list):
    gtype = geometry.get("type", "")
    coordinates = geometry.get("coordinates")
    if not coordinates:
        return
    if gtype == "Point":
        coords.append(coordinates)
    elif gtype in ("MultiPoint", "LineString"):
        coords.extend(coordinates)
    elif gtype in ("MultiLineString", "Polygon"):
        for ring in coordinates:
            coords.extend(ring)
    elif gtype == "MultiPolygon":
        for polygon in coordinates:
            for ring in polygon:
                coords.extend(ring)
    elif gtype == "GeometryCollection":
        for g in geometry.get("geometries", []):
            _extract_coords(g, coords)


def _extract_kml_images(kml_path: Path, dataset_id: int, session: Session):
    """Extract base64 images from the raw KML's gx:imageUrl elements
    and attach them to the matching features by Placemark index."""
    try:
        tree = ET.parse(kml_path)
    except ET.ParseError as e:
        logger.warning(f"Could not parse KML {kml_path}: {e}")
        return

    root = tree.getroot()
    ns = {
        "kml": "http://www.opengis.net/kml/2.2",
        "gx": "http://www.google.com/kml/ext/2.2",
    }

    placemarks = root.findall(".//kml:Placemark", ns)
    features = session.exec(
        select(Feature).where(Feature.dataset_id == dataset_id)
    ).all()

    if len(placemarks) != len(features):
        logger.warning(
            f"KML {kml_path}: {len(placemarks)} placemarks vs {len(features)} features, "
            "cannot extract images by index"
        )
        return

    img_dir = IMAGES_DIR / str(dataset_id)
    img_dir.mkdir(parents=True, exist_ok=True)
    data_uri_re = re.compile(r"data:image/(\w+);base64,(.+)", re.DOTALL)

    for i, (pm, feature) in enumerate(zip(placemarks, features)):
        image_urls = pm.findall(".//gx:imageUrl", ns)
        if not image_urls:
            continue

        saved = []
        for j, img_el in enumerate(image_urls):
            data_uri = (img_el.text or "").strip()
            match = data_uri_re.match(data_uri)
            if not match:
                continue
            ext = match.group(1)
            if ext == "jpeg":
                ext = "jpg"
            b64_data = match.group(2)
            try:
                raw = base64.b64decode(b64_data)
                filename = f"{i}_{j}.{ext}"
                (img_dir / filename).write_bytes(raw)
                saved.append(f"/api/images/{dataset_id}/{filename}")
            except Exception as e:
                logger.warning(f"Error decoding image for placemark {i}, img {j}: {e}")
                continue

        if saved:
            props = json.loads(feature.properties_json)
            existing = props.get("_images", [])
            props["_images"] = existing + saved
            feature.properties_json = json.dumps(props)
            session.add(feature)

    session.commit()
96
backend/app/routes/features.py
Normal file
@@ -0,0 +1,96 @@
import json
from fastapi import APIRouter, Depends, HTTPException
from pydantic import BaseModel
from typing import Optional
from sqlmodel import Session, select, func
from ..database import get_session
from ..models import Dataset, Feature, FeatureVersion
from .images import IMAGES_DIR

router = APIRouter(prefix="/features", tags=["features"])


class FeatureUpdate(BaseModel):
    geometry: Optional[dict] = None
    properties: Optional[dict] = None


@router.put("/{feature_id}")
def update_feature(
    feature_id: int,
    data: FeatureUpdate,
    session: Session = Depends(get_session),
):
    feature = session.get(Feature, feature_id)
    if not feature:
        raise HTTPException(404, "Feature non trouvée")

    before = {
        "geometry": json.loads(feature.geometry_json),
        "properties": json.loads(feature.properties_json),
    }

    if data.geometry is not None:
        feature.geometry_json = json.dumps(data.geometry)
    if data.properties is not None:
        feature.properties_json = json.dumps(data.properties)

    after = {
        "geometry": json.loads(feature.geometry_json),
        "properties": json.loads(feature.properties_json),
    }

    # Save the version
    version = FeatureVersion(
        feature_id=feature_id,
        before_json=json.dumps(before),
        after_json=json.dumps(after),
    )
    session.add(version)
    session.add(feature)
    session.commit()
    session.refresh(feature)

    # Count the versions for this feature
    count = session.exec(
        select(func.count()).where(FeatureVersion.feature_id == feature_id)
    ).one()

    return {"id": feature.id, "version": count}


@router.delete("/{feature_id}")
def delete_feature(
    feature_id: int,
    session: Session = Depends(get_session),
):
    feature = session.get(Feature, feature_id)
    if not feature:
        raise HTTPException(404, "Feature non trouvée")

    # Delete the versions
    versions = session.exec(
        select(FeatureVersion).where(FeatureVersion.feature_id == feature_id)
    ).all()
    for v in versions:
        session.delete(v)

    # Delete the associated image files
    props = json.loads(feature.properties_json)
    for img_url in props.get("_images", []):
        if img_url.startswith("/api/images/"):
            filename = img_url.split("/")[-1]
            filepath = IMAGES_DIR / str(feature.dataset_id) / filename
            if filepath.exists():
                filepath.unlink()

    # Decrement the dataset's feature counter
    dataset = session.get(Dataset, feature.dataset_id)
    if dataset:
        dataset.feature_count = max(0, dataset.feature_count - 1)
        session.add(dataset)

    session.delete(feature)
    session.commit()
    return {"ok": True}
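Each `PUT /api/features/{id}` stores a full before/after snapshot in `feature_versions`. A small sketch of how such snapshots would support restoring a previous state (the `record_version`/`undo` helpers are hypothetical, not part of the routes; in the app the undo stack actually lives in the frontend store):

```python
import json


def record_version(before: dict, after: dict) -> dict:
    """Serialize a snapshot pair the way FeatureVersion stores it."""
    return {"before_json": json.dumps(before), "after_json": json.dumps(after)}


def undo(version: dict) -> dict:
    """Restoring a feature is just re-applying the 'before' snapshot."""
    return json.loads(version["before_json"])


before = {"geometry": {"type": "Point", "coordinates": [2.35, 48.85]},
          "properties": {"name": "Paris"}}
after = {"geometry": {"type": "Point", "coordinates": [2.40, 48.80]},
         "properties": {"name": "Paris"}}

v = record_version(before, after)
print(undo(v) == before)   # True
```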
141
backend/app/routes/images.py
Normal file
@@ -0,0 +1,141 @@
import json
import base64
import re
import uuid
from pathlib import Path
from fastapi import APIRouter, Depends, UploadFile, File, HTTPException
from fastapi.responses import FileResponse
from sqlmodel import Session
from ..database import get_session
from ..models import Feature
from ..config import DATA_DIR

router = APIRouter(prefix="/images", tags=["images"])

IMAGES_DIR = DATA_DIR / "images"


@router.get("/{dataset_id}/{filename}")
def get_image(dataset_id: int, filename: str):
    """Serve a stored image."""
    path = IMAGES_DIR / str(dataset_id) / filename
    # Security: check that the resolved path really lives under IMAGES_DIR
    # (before revealing via 404 whether anything exists there)
    if not path.resolve().is_relative_to(IMAGES_DIR.resolve()):
        raise HTTPException(403, "Accès interdit")
    if not path.exists() or not path.is_file():
        raise HTTPException(404, "Image non trouvée")
    media_type = "image/jpeg"
    if filename.endswith(".png"):
        media_type = "image/png"
    elif filename.endswith(".webp"):
        media_type = "image/webp"
    return FileResponse(path, media_type=media_type)


@router.post("/features/{feature_id}")
async def upload_image(
    feature_id: int,
    file: UploadFile = File(...),
    session: Session = Depends(get_session),
):
    """Upload a new image for a feature."""
    feature = session.get(Feature, feature_id)
    if not feature:
        raise HTTPException(404, "Feature non trouvée")

    props = json.loads(feature.properties_json)
    images = props.get("_images", [])

    # Save the file
    img_dir = IMAGES_DIR / str(feature.dataset_id)
    img_dir.mkdir(parents=True, exist_ok=True)

    ext = Path(file.filename).suffix or ".jpg"
    filename = f"{feature_id}_{uuid.uuid4().hex[:8]}{ext}"
    filepath = img_dir / filename
    content = await file.read()
    filepath.write_bytes(content)

    # Add the URL to the properties
    url = f"/api/images/{feature.dataset_id}/{filename}"
    images.append(url)
    props["_images"] = images
    feature.properties_json = json.dumps(props)
    session.add(feature)
    session.commit()

    return {"url": url, "images": images}


@router.delete("/features/{feature_id}/{filename}")
def delete_image(
    feature_id: int,
    filename: str,
    session: Session = Depends(get_session),
):
    """Delete one of a feature's images."""
    feature = session.get(Feature, feature_id)
    if not feature:
        raise HTTPException(404, "Feature non trouvée")

    props = json.loads(feature.properties_json)
    images = props.get("_images", [])

    # Find and remove the matching URL
    url = f"/api/images/{feature.dataset_id}/{filename}"
    if url not in images:
        raise HTTPException(404, "Image non trouvée dans cette feature")

    images.remove(url)
    props["_images"] = images
    feature.properties_json = json.dumps(props)
    session.add(feature)
    session.commit()

    # Delete the file
    filepath = IMAGES_DIR / str(feature.dataset_id) / filename
    if filepath.exists() and filepath.resolve().is_relative_to(IMAGES_DIR.resolve()):
        filepath.unlink()

    return {"images": images}


def extract_and_save_images(properties: dict, dataset_id: int, feature_index: int) -> dict:
    """Extract base64 images from the properties and save them as files.

    Data URIs in _images are replaced with /api/images/... URLs.
    """
    images = properties.get("_images", [])
    if not images:
        return properties

    img_dir = IMAGES_DIR / str(dataset_id)
    img_dir.mkdir(parents=True, exist_ok=True)

    saved_urls = []
    for i, img in enumerate(images):
        if isinstance(img, str) and img.startswith("data:image"):
            # Extract the base64 payload
            match = re.match(r"data:image/(\w+);base64,(.+)", img, re.DOTALL)
            if match:
                ext = match.group(1)
                if ext == "jpeg":
                    ext = "jpg"
                b64_data = match.group(2)
                try:
                    raw = base64.b64decode(b64_data)
                    filename = f"{feature_index}_{i}.{ext}"
                    filepath = img_dir / filename
                    filepath.write_bytes(raw)
                    saved_urls.append(f"/api/images/{dataset_id}/{filename}")
                except Exception:
                    continue
        elif isinstance(img, str) and img.startswith("/api/images/"):
            # Already a server URL
            saved_urls.append(img)
        elif isinstance(img, str) and img.startswith("http"):
            # External URL, keep as-is
            saved_urls.append(img)

    properties["_images"] = saved_urls
    return properties
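`extract_and_save_images` relies on inline images arriving in the `data:image/<ext>;base64,<payload>` shape. A minimal round-trip of that parsing, using the same regex standalone (`decode_data_uri` is a hypothetical helper for illustration):

```python
import base64
import re

DATA_URI_RE = re.compile(r"data:image/(\w+);base64,(.+)", re.DOTALL)


def decode_data_uri(uri: str):
    """Return (extension, raw bytes), normalizing 'jpeg' to 'jpg' as the route does."""
    match = DATA_URI_RE.match(uri)
    if not match:
        return None
    ext = "jpg" if match.group(1) == "jpeg" else match.group(1)
    return ext, base64.b64decode(match.group(2))


payload = base64.b64encode(b"fake image bytes").decode()
print(decode_data_uri(f"data:image/jpeg;base64,{payload}"))
# ('jpg', b'fake image bytes')
print(decode_data_uri("https://example.com/a.png"))   # None
```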
62
backend/app/routes/settings.py
Normal file
@@ -0,0 +1,62 @@
from fastapi import APIRouter, Depends
from pydantic import BaseModel
from typing import Optional
from sqlmodel import Session
from ..database import get_session
from ..models import MapSettings

router = APIRouter(prefix="/settings", tags=["settings"])


class MapSettingsUpdate(BaseModel):
    center_lng: Optional[float] = None
    center_lat: Optional[float] = None
    zoom: Optional[float] = None
    base_layer: Optional[str] = None


@router.get("/map")
def get_map_settings(session: Session = Depends(get_session)):
    settings = session.get(MapSettings, 1)
    if not settings:
        return {
            "center_lng": 2.35,
            "center_lat": 48.85,
            "zoom": 5.0,
            "base_layer": "vector",
        }
    return {
        "center_lng": settings.center_lng,
        "center_lat": settings.center_lat,
        "zoom": settings.zoom,
        "base_layer": settings.base_layer,
    }


@router.put("/map")
def save_map_settings(
    data: MapSettingsUpdate,
    session: Session = Depends(get_session),
):
    settings = session.get(MapSettings, 1)
    if not settings:
        settings = MapSettings(id=1)

    if data.center_lng is not None:
        settings.center_lng = data.center_lng
    if data.center_lat is not None:
        settings.center_lat = data.center_lat
    if data.zoom is not None:
        settings.zoom = data.zoom
    if data.base_layer is not None:
        settings.base_layer = data.base_layer

    session.add(settings)
    session.commit()
    session.refresh(settings)
    return {
        "center_lng": settings.center_lng,
        "center_lat": settings.center_lat,
        "zoom": settings.zoom,
        "base_layer": settings.base_layer,
    }
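The PUT handler applies each field with an explicit if-chain. That pattern can be factored into a small helper; a sketch where `MapSettingsRow` and `apply_partial_update` are hypothetical stand-ins for the SQLModel row and the route logic, with the same field names and defaults:

```python
from dataclasses import dataclass


@dataclass
class MapSettingsRow:
    center_lng: float = 2.35
    center_lat: float = 48.85
    zoom: float = 5.0
    base_layer: str = "vector"


def apply_partial_update(row: MapSettingsRow, payload: dict) -> MapSettingsRow:
    """Copy only the non-None keys from the request body, like the route's if-chains."""
    for field in ("center_lng", "center_lat", "zoom", "base_layer"):
        if payload.get(field) is not None:
            setattr(row, field, payload[field])
    return row


row = apply_partial_update(MapSettingsRow(), {"zoom": 12.0, "base_layer": None})
print(row.zoom, row.base_layer)  # 12.0 vector
```

Fields the client omits (or sends as null) keep their stored values, which is the partial-update behavior the endpoint implements.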
1
backend/data/raw/bad.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_1.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_10.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_2.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_3.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_4.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_5.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_6.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_7.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_8.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/bad_9.geojson
Normal file
@@ -0,0 +1 @@
{"type": "Point"}
1
backend/data/raw/test.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_1.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_10.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_11.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_12.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_13.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_14.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_15.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_16.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_17.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_18.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_19.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_2.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_20.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_21.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_22.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_23.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_24.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_25.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_26.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_27.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_28.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_29.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_3.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_30.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_31.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_32.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_33.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_34.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_35.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_4.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_5.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_6.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_7.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_8.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/test_9.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_1.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_10.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_2.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_3.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_4.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_5.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_6.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_7.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_8.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
1
backend/data/raw/villes_9.geojson
Normal file
@@ -0,0 +1 @@
{"type": "FeatureCollection", "features": [{"type": "Feature", "geometry": {"type": "Point", "coordinates": [2.35, 48.85]}, "properties": {"name": "Paris", "description": "Capitale de la France"}}, {"type": "Feature", "geometry": {"type": "Point", "coordinates": [5.37, 43.3]}, "properties": {"name": "Marseille"}}]}
0
backend/data/webcarto.db
Normal file
7
backend/requirements.txt
Normal file
@@ -0,0 +1,7 @@
fastapi>=0.115
uvicorn[standard]>=0.34
sqlmodel>=0.0.22
alembic>=1.15
pydantic>=2.10
python-multipart>=0.0.17
aiofiles>=24.1
0
backend/tests/__init__.py
Normal file
BIN
backend/tests/__pycache__/__init__.cpython-313.pyc
Normal file
BIN
backend/tests/__pycache__/test_api.cpython-313-pytest-9.0.2.pyc
Normal file
188
backend/tests/test_api.py
Normal file
@@ -0,0 +1,188 @@
import json
import pytest
from fastapi.testclient import TestClient
from sqlmodel import SQLModel, create_engine, Session
from sqlalchemy.pool import StaticPool
from app.main import app
from app.database import get_session
from app.models import Dataset, Feature, FeatureVersion, MapSettings  # noqa: F401

# In-memory SQLite with StaticPool so every session reuses the same connection
engine = create_engine(
    "sqlite://",
    connect_args={"check_same_thread": False},
    poolclass=StaticPool,
)


def override_get_session():
    with Session(engine) as session:
        yield session


app.dependency_overrides[get_session] = override_get_session
client = TestClient(app)

SAMPLE_GEOJSON = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [2.35, 48.85]},
            "properties": {"name": "Paris", "description": "Capitale de la France"},
        },
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [5.37, 43.30]},
            "properties": {"name": "Marseille"},
        },
    ],
}


@pytest.fixture(autouse=True)
def setup_db():
    SQLModel.metadata.create_all(engine)
    yield
    SQLModel.metadata.drop_all(engine)


def test_health():
    r = client.get("/api/health")
    assert r.status_code == 200
    assert r.json() == {"status": "ok"}


def test_list_datasets_empty():
    r = client.get("/api/datasets")
    assert r.status_code == 200
    assert r.json() == []


def test_import_geojson():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("test.geojson", geojson_str.encode(), "application/json")},
    )
    assert r.status_code == 200
    data = r.json()
    assert data["name"] == "test"
    assert data["feature_count"] == 2
    assert data["id"] is not None


def test_import_invalid_geojson():
    r = client.post(
        "/api/datasets/import",
        data={"geojson": '{"type": "Point"}'},
        files={"file": ("bad.geojson", b'{"type": "Point"}', "application/json")},
    )
    assert r.status_code == 400


def test_get_dataset():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("villes.geojson", geojson_str.encode(), "application/json")},
    )
    ds_id = r.json()["id"]

    r = client.get(f"/api/datasets/{ds_id}")
    assert r.status_code == 200
    data = r.json()
    assert data["name"] == "villes"
    assert len(data["features"]) == 2
    assert data["features"][0]["properties"]["name"] == "Paris"


def test_update_feature():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("test.geojson", geojson_str.encode(), "application/json")},
    )
    ds_id = r.json()["id"]

    r = client.get(f"/api/datasets/{ds_id}")
    feature_id = r.json()["features"][0]["id"]

    r = client.put(
        f"/api/features/{feature_id}",
        json={"properties": {"name": "Paris modifié", "description": "Mis à jour"}},
    )
    assert r.status_code == 200
    assert r.json()["version"] == 1

    r = client.get(f"/api/datasets/{ds_id}")
    f = next(f for f in r.json()["features"] if f["id"] == feature_id)
    assert f["properties"]["name"] == "Paris modifié"


def test_export_dataset():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("test.geojson", geojson_str.encode(), "application/json")},
    )
    ds_id = r.json()["id"]

    r = client.post(f"/api/datasets/{ds_id}/export?format=geojson")
    assert r.status_code == 200
    fc = r.json()
    assert fc["type"] == "FeatureCollection"
    assert len(fc["features"]) == 2


def test_dataset_not_found():
    r = client.get("/api/datasets/9999")
    assert r.status_code == 404


def test_feature_not_found():
    r = client.put("/api/features/9999", json={"properties": {"name": "x"}})
    assert r.status_code == 404


def test_delete_feature():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("test.geojson", geojson_str.encode(), "application/json")},
    )
    ds_id = r.json()["id"]
    r = client.get(f"/api/datasets/{ds_id}")
    feature_id = r.json()["features"][0]["id"]

    r = client.delete(f"/api/features/{feature_id}")
    assert r.status_code == 200
    assert r.json()["ok"] is True

    # Verify the feature no longer exists
    r = client.get(f"/api/datasets/{ds_id}")
    assert len(r.json()["features"]) == 1
    assert r.json()["feature_count"] == 1


def test_delete_dataset():
    geojson_str = json.dumps(SAMPLE_GEOJSON)
    r = client.post(
        "/api/datasets/import",
        data={"geojson": geojson_str},
        files={"file": ("test.geojson", geojson_str.encode(), "application/json")},
    )
    ds_id = r.json()["id"]

    r = client.delete(f"/api/datasets/{ds_id}")
    assert r.status_code == 200
    assert r.json()["ok"] is True

    # Verify the dataset no longer exists
    r = client.get(f"/api/datasets/{ds_id}")
    assert r.status_code == 404
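The `StaticPool` choice in the test setup matters because every plain `sqlite3.connect(":memory:")` call opens a fresh, empty database; pinning a single connection is what lets the fixture and the app share tables. A quick stdlib demonstration of the underlying behavior:

```python
import sqlite3

# Two separate connections to ":memory:" get two *different* databases --
# which is why the test setup pins a single connection via StaticPool.
a = sqlite3.connect(":memory:")
b = sqlite3.connect(":memory:")
a.execute("CREATE TABLE t (x INTEGER)")
a.execute("INSERT INTO t VALUES (1)")
a.commit()
tables_b = b.execute("SELECT name FROM sqlite_master WHERE type='table'").fetchall()
print(tables_b)  # [] -- connection b sees none of connection a's tables
```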
226
consigne.md
Normal file
@@ -0,0 +1,226 @@
|
|||||||
|
Prompt Claude Code (WebApp carto KML/GeoJSON + fonds type Google Earth, Docker)
|
||||||
|
|
||||||
|
Tu es Claude Code. Objectif : développer une webapp self-hosted moderne (UI gruvbox dark vintage) installable en Docker, qui affiche une carte avec fonds satellite / hybride / vecteur, permet d’importer des fichiers locaux KML et GeoJSON, puis d’éditer/ déplacer des points et modifier leurs attributs (texte/HTML), en affichant aussi les images intégrées (ex : dans les descriptions KML ou champs GeoJSON).
|
||||||
|
|
||||||
|
Contraintes & exigences
|
||||||
|
|
||||||
|
Projet “production-ready” minimal, mais extensible.
|
||||||
|
|
||||||
|
Import local via navigateur (drag&drop + bouton).
|
||||||
|
|
||||||
|
Aucune dépendance à Google Earth desktop : tout doit marcher via navigateur.
|
||||||
|
|
||||||
|
Style : gruvbox dark vintage (thème cohérent, contrastes, lisibilité).
|
||||||
|
|
||||||
|
Docker : un docker-compose.yml pour lancer l’ensemble.
|
||||||
|
|
||||||
|
Stockage : persistance des datasets importés + historique des modifications (versioning simple).
|
||||||
|
|
||||||
|
Sécurité : accès LAN seulement (pas d’auth pour v1), mais structure prête pour ajout futur.
|
||||||
|
|
||||||
|
Performance : datasets raisonnables (ex: 5k–50k points). Prévoir clustering/virtualisation si besoin.
|
||||||
|
|
||||||
|
Architecture attendue
|
||||||
|
|
||||||
|
Frontend : SPA moderne (TypeScript), composants UI, panneau latéral, import, édition.
|
||||||
|
|
||||||
|
Backend : API REST (Python FastAPI) pour persister les couches/datasets, servir les fichiers, gérer versions.
|
||||||
|
|
||||||
|
Stockage : SQLite + stockage fichiers sur volume (KML/GeoJSON originaux + export).
|
||||||
|
|
||||||
|
Export : permettre d’exporter la couche éditée en GeoJSON (v1) + KML (v2).
|
||||||
|
|
||||||
|
Tests : au moins tests unitaires backend (API import/validation) + tests frontend basiques (import parsing).
|
||||||
|
|
||||||
|
Fonctionnalités détaillées (MVP)
|
||||||
|
|
||||||
|
Carte & fonds
|
||||||
|
|
||||||
|
Afficher une carte plein écran.
|
||||||
|
|
||||||
|
Proposer 3 fonds : “Satellite”, “Hybride”, “Vecteur”.
|
||||||
|
|
||||||
|
Utiliser un fournisseur de tuiles compatible (pas besoin d’API key au début si possible), mais prévoir la configuration via .env (URL templates, attribution).
|
||||||
|
|
||||||
|
Afficher l’attribution correctement.
|
||||||
|
|
||||||
|
Import fichiers
|
||||||
|
|
||||||
|
Import KML local (fichier .kml / .kmz si simple).
|
||||||
|
|
||||||
|
Import GeoJSON local.
|
||||||
|
|
||||||
|
À l’import : parser, valider, normaliser (projection WGS84).
|
||||||
|
|
||||||
|
Convertir en “couches” internes : points / lignes / polygones.
|
||||||
|
|
||||||
|
Conserver le “raw” original pour re-téléchargement.
|
||||||
|
|
||||||
|
### Editing

- Selection on the map (click) plus a feature list in a panel.
- Point dragging plus undo/redo (at least 1-level undo).
- Property editing: name, description, free-form fields.
- Display rich content (sanitized HTML) in a viewer.
- If images:
  - Case 1: the description contains an http(s) URL -> display the image.
  - Case 2: the KML references images (or CDATA) -> extract URLs when present.
- No OCR attempts, no embedded binaries in v1.
### UI/UX

- Layout:
  - Header: project name + Import / Export / Save / Settings buttons.
  - Left panel: layer list + search + filters.
  - Right panel: "Properties" panel for the selected feature (with image previews).
- Gruvbox dark vintage theme: palette, CSS tokens, readable typography.
- Toasts + status bar (import, parsing, saving).
### Backend

- Endpoints:
  - `POST /api/datasets/import` (receives metadata + content if small, or a multipart upload).
  - `GET /api/datasets`
  - `GET /api/datasets/{id}`
  - `PUT /api/features/{id}` (update geometry + properties)
  - `POST /api/datasets/{id}/export?format=geojson`
- Pydantic validation: GeoJSON structure, max size, allowed fields.
- Simple versioning: a `feature_versions` table (timestamp, before/after).
## Technical choices (to apply)

- Map: prefer MapLibre GL JS (vector) with raster support, or Leaflet if simpler.
- KML parsing: parse on the frontend (JS lib), then send normalized GeoJSON to the backend.
- GeoJSON: editing via drawing libs (MapLibre + draw) or (Leaflet + Geoman).
- State management: keep it simple (Zustand or equivalent).
- UI: TailwindCSS + components (shadcn-like or headless) for speed.
## Expected deliverables (from you, Claude Code)

- Complete project tree.
- `docker-compose.yml` + frontend/backend Dockerfiles.
- Ready-to-run FastAPI backend: routes, models, migrations (Alembic) if needed.
- Ready-to-run frontend: map, import, list, selection, point dragging, property editing.
- Clear README: installation, `.env` variables, known limitations, roadmap.
- Sample datasets (1 KML, 1 GeoJSON) in `/samples/`.
## Roadmap (post-MVP)

- KMZ support, KML styles (icons, colors).
- Line/polygon editing.
- Auth (OIDC / reverse proxy).
- Multi-user support, permissions.
- Large dataset import (streaming, worker).
- Offline tiles / internal tile server.
Start by proposing the repo structure + Docker files + a first working MVP. Then implement incrementally: map -> GeoJSON import -> point editing -> KML import -> persistence -> export.
## Tool brainstorming (Docker stack) for this project
### Map rendering (frontend)

- MapLibre GL JS: modern, performant vector rendering, JSON styles, supports both "vector" and raster.
- Leaflet: very simple, huge ecosystem, but less GPU-accelerated / natively vector.
- Mapbox Draw / MapLibre Draw or Leaflet-Geoman: editing/drawing tools (drag points, edit).
### Import / conversion

- togeojson (KML -> GeoJSON, JS): the classic, effective option.
- @tmcw/togeojson variants, or modern KML parser libs.
- geojson-validation, or custom validation on the backend.
- For KMZ later: unzip + KML parsing.
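The "unzip + KML parsing" step for future KMZ support is mostly stdlib work: a KMZ is a zip archive whose main document is conventionally named `doc.kml`. A minimal sketch (the function name is illustrative; parsing the extracted KML is out of scope here):

```python
import zipfile
from io import BytesIO

def read_kml_from_kmz(kmz_bytes: bytes) -> str:
    """Extract the main KML document from a KMZ (zip) archive.

    Prefers the conventional doc.kml entry, otherwise falls back to the
    first *.kml entry found in the archive.
    """
    with zipfile.ZipFile(BytesIO(kmz_bytes)) as zf:
        names = zf.namelist()
        kml_name = "doc.kml" if "doc.kml" in names else next(
            n for n in names if n.lower().endswith(".kml")
        )
        return zf.read(kml_name).decode("utf-8")
```

A KMZ may also embed image files alongside the KML; those map onto "case 2" of the image handling above and can be extracted the same way.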
### Modern UI

- Vite + React + TypeScript
- TailwindCSS + theme tokens (gruvbox).
- Components: Radix UI (headless) or shadcn/ui-like.
- State: Zustand (simple, effective).
### Backend & persistence

- FastAPI + Pydantic
- SQLite (MVP) via SQLModel or SQLAlchemy
- Alembic (migrations, if you want it solid)
- File storage: Docker volume (`/data`)
### Export & geo processing

- Shapely / pyproj on the backend if operations are needed (simplification, reprojection); v1 can stay "pass-through" in WGS84.
- Otherwise do it 100% on the frontend (GeoJSON only) and let the backend just persist.
### Dev / quality

- pytest (backend)
- ruff + black (backend)
- eslint + prettier (frontend)
- Auto-generated OpenAPI docs (FastAPI) for quick testing.
### Docker / deployment

- 2 services:
  - backend (FastAPI + `/data` volume)
  - frontend (Vite build served by Nginx)
- Optional third service: Caddy or Traefik later (not required for the MVP).
- `.env` variables: basemap URLs, upload limits, data paths.
### "Satellite / hybrid" basemaps

- Caution: "Google Earth" in the strict sense is not a free tile provider. For the MVP:
- Use raster tiles (satellite) and vector tiles (OSM vector) from compatible, configurable providers.
- Plan a configurable "BaseLayerProvider" abstraction (URL template + attribution + raster/vector type).
- "Hybrid" = satellite + labels (two stacked layers).
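On the configuration side, the "BaseLayerProvider" abstraction above might look like the following sketch (field names and the example URL are assumptions); "hybrid" then falls out as a satellite provider with a labels overlay stacked on top:

```python
from dataclasses import dataclass
from typing import Literal

@dataclass(frozen=True)
class BaseLayerProvider:
    """Configurable basemap: URL template + attribution + raster/vector type."""
    id: str
    url_template: str  # e.g. "https://tiles.example.com/{z}/{x}/{y}.png" (hypothetical)
    attribution: str
    kind: Literal["raster", "vector"]

def hybrid_stack(satellite: BaseLayerProvider, labels: BaseLayerProvider) -> list[BaseLayerProvider]:
    """'Hybrid' = satellite imagery with a labels layer stacked on top."""
    return [satellite, labels]
```

The frontend would translate each provider into a MapLibre source/layer pair; keeping the definition in config (fed from `.env`) is what lets tile providers be swapped without a rebuild.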