generated from gilles/template-webapp
claude code
@@ -12,38 +12,33 @@ Everything listed here is the reference for backend agents.
---
## Backend objective

- Business problem covered: <TO FILL - PROJECT> (example: manual spreadsheet tracking; to be removed)
- Main responsibilities: <TO BE COMPLETED BY AGENT>
- Out of scope: <TO FILL - PROJECT> (example: to be customized; to be removed)
- Business problem covered: Centralize and structure household inventory data, with efficient search and file management <!-- completed by codex -->
- Main responsibilities: REST CRUD API (items, locations, categories), file upload/storage, full-text search, data validation, OpenAPI generation <!-- completed by codex -->
- Out of scope: Frontend rendering (independent SPA), complex authentication (single-user MVP), advanced analytics <!-- completed by codex -->
## Interfaces

- Public API (API = Application Programming Interface): <TO BE COMPLETED BY AGENT>
- Authentication/authorization: <TO BE COMPLETED BY AGENT>
- External integrations: <TO FILL - PROJECT> (example: existing ERP; to be removed)
- Public API: JSON REST at `/api/v1/`, auto-generated OpenAPI 3.0 at `/docs`, main endpoints = items, locations, categories, documents, search <!-- completed by codex -->
- Authentication/authorization: Optional for the MVP (local deployment); if enabled, a basic session cookie, no complex roles (single user) <!-- completed by codex -->
- External integrations: None, self-contained system <!-- completed by codex -->
## Data

- Database(s) used: <TO BE COMPLETED BY AGENT>
- Key data model: <TO BE COMPLETED BY AGENT>
- Migration strategy: <TO BE COMPLETED BY AGENT>
- Database(s) used: SQLite (homestock.db file) with the FTS5 extension for full-text search <!-- completed by codex -->
- Key data model: Item (main object), Location (room/furniture/drawer hierarchy), Category (domain), Document (file), with Many-to-One and Many-to-Many relations <!-- completed by codex -->
- Migration strategy: Alembic for versioned migrations, auto-generated from the SQLAlchemy models, reversible up/down migrations <!-- completed by codex -->
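The SQLite + FTS5 choice above can be exercised with nothing but the standard library; the table and column names below are illustrative assumptions, not the project's actual schema:

```python
import sqlite3

# In-memory stand-in for data/homestock.db
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT, description TEXT);
    -- External-content FTS5 index over the items table
    CREATE VIRTUAL TABLE items_fts USING fts5(
        name, description, content='items', content_rowid='id'
    );
""")
conn.execute(
    "INSERT INTO items (name, description) VALUES (?, ?)",
    ("Cordless drill", "18V drill stored in the garage toolbox"),
)
# With external content, the index must be fed explicitly
# (a real setup would keep it in sync with triggers)
conn.execute(
    "INSERT INTO items_fts (rowid, name, description) "
    "SELECT id, name, description FROM items"
)
rows = conn.execute(
    "SELECT name FROM items_fts WHERE items_fts MATCH ?", ("drill",)
).fetchall()
print(rows)
```

In a real deployment the triggers keep `items_fts` current on every insert/update/delete, so the API's search endpoint only ever issues the `MATCH` query.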
## Internal architecture

- Style (modular monolith, hexagonal, etc.): <TO BE COMPLETED BY AGENT>
- Main modules: <TO BE COMPLETED BY AGENT>
- Data-access layer: <TO BE COMPLETED BY AGENT>
- Style: Modular monolith with a clear routers → services → repositories separation (3-layer architecture) <!-- completed by codex -->
- Main modules: `routers/` (FastAPI endpoints), `services/` (business logic), `models/` (SQLAlchemy ORM), `schemas/` (Pydantic validation), `repositories/` (database access) <!-- completed by codex -->
- Data-access layer: Repository pattern with SQLAlchemy async sessions; the database abstraction eases testing and evolution <!-- completed by codex -->
## Quality & operations

- Observability (logs/metrics/traces): <TO BE COMPLETED BY AGENT>
- Tests (unit/integration): <TO BE COMPLETED BY AGENT>
- Expected performance: <TO FILL - PROJECT> (example: to be customized; to be removed)
- Observability: Structured logs with loguru (JSON format), `/health` endpoint for healthchecks, no distributed tracing (monolith) <!-- completed by codex -->
- Tests: pytest with unit tests (services/), integration tests (routers/ with TestClient), fixtures for the test database, 70%+ coverage on business logic <!-- completed by codex -->
- Expected performance: API responses <200 ms for simple GETs, <500 ms for full-text search, file uploads <5 s for 10 MB, no scalability constraint (single user) <!-- completed by codex -->
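A minimal pytest-style unit test of the kind described above; the function under test is hypothetical, standing in for pure service-layer logic:

```python
# Hypothetical business rule from the service layer: quantities never go negative
def normalize_quantity(value: int) -> int:
    return max(0, int(value))


def test_normalize_quantity() -> None:
    # pytest collects functions named test_*; plain asserts are the convention
    assert normalize_quantity(3) == 3
    assert normalize_quantity(-1) == 0


# Called directly here so the sketch runs even without pytest installed
test_normalize_quantity()
```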
## Conventions

- Code organization: <TO BE COMPLETED BY AGENT>
- Naming: <TO BE COMPLETED BY AGENT>
- Error handling: <TO BE COMPLETED BY AGENT>
- Code organization: `backend/app/` as root, subfolders by responsibility (routers/, services/, models/, schemas/, repositories/), one file per entity <!-- completed by codex -->
- Naming: snake_case everywhere (variables, functions, files), get_/create_/update_/delete_ prefixes for CRUD, _service/_repository suffixes per layer <!-- completed by codex -->
- Error handling: FastAPI HTTPException for API errors with standard codes (400/404/500), custom business exceptions deriving from a shared Exception base class, error logging with context <!-- completed by codex -->
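The error-handling convention can be sketched without FastAPI installed; `AppError` and `to_http_response` are hypothetical names standing in for the real exception hierarchy and handler:

```python
class AppError(Exception):
    """Shared base class for business exceptions, carrying an HTTP status code."""

    status_code = 500


class NotFoundError(AppError):
    status_code = 404


class ValidationError(AppError):
    status_code = 400


def to_http_response(exc: AppError) -> tuple[int, dict]:
    """What a FastAPI exception handler would turn a business error into."""
    return exc.status_code, {"detail": str(exc)}


status, body = to_http_response(NotFoundError("item 42 not found"))
print(status, body)
```

In FastAPI the same mapping would live in an `@app.exception_handler(AppError)` function returning a `JSONResponse`, so services raise domain exceptions and routers stay free of status-code logic.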
---

## Example (to be removed)

- Style: modular monolith with `users`, `billing`, `catalog` modules.
- API: REST `/api/v1` + JWT (authentication token).
- DB: PostgreSQL, migrations via native tools.

---
@@ -0,0 +1,35 @@
# syntax=docker/dockerfile:1

# Python 3.11 base image
FROM python:3.11-slim

# Python environment variables
ENV PYTHONUNBUFFERED=1 \
    PYTHONDONTWRITEBYTECODE=1 \
    PIP_NO_CACHE_DIR=1 \
    PIP_DISABLE_PIP_VERSION_CHECK=1

# Install uv (modern Python package manager)
RUN pip install uv

# Working directory
WORKDIR /app

# Copy the dependency files
COPY pyproject.toml README.md ./

# Install dependencies (prod + dev for hot-reload)
RUN uv sync

# Copy the source code
COPY . .

# Create the required directories
RUN mkdir -p /app/data /app/uploads

# Expose the FastAPI port
EXPOSE 8000

# Default command (can be overridden by docker-compose)
# Note: --reload is enabled for development
CMD ["uv", "run", "uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "8000", "--reload"]
backend/README.md (new file, 71 lines)
@@ -0,0 +1,71 @@
# HomeStock Backend

Backend API for household inventory management.

## Tech stack

- **FastAPI**: modern, high-performance web framework
- **SQLAlchemy 2.0+**: ORM with async support
- **SQLite**: embedded database
- **Alembic**: database migrations
- **Pydantic**: data validation
- **Loguru**: advanced logging

## Installation

```bash
# With uv (recommended)
uv sync

# Or with pip
pip install -e .
```

## Running

```bash
# Development mode (with hot-reload)
uv run uvicorn app.main:app --reload --host 0.0.0.0 --port 8000

# Or via the Makefile from the repository root
make dev-backend
```

## Tests

```bash
# Run the tests
uv run pytest

# With coverage
uv run pytest --cov=app --cov-report=html
```

## Migrations

```bash
# Create a new migration
uv run alembic revision --autogenerate -m "Description"

# Apply migrations
uv run alembic upgrade head

# Rollback
uv run alembic downgrade -1
```

## Structure

```
backend/
├── app/
│   ├── core/          # Configuration, database, logging
│   ├── models/        # SQLAlchemy models
│   ├── schemas/       # Pydantic schemas
│   ├── routers/       # API endpoints
│   ├── services/      # Business logic
│   ├── repositories/  # Data access
│   └── utils/         # Utilities
├── alembic/           # Migrations
└── tests/             # Tests
```
backend/alembic.ini (new file, 69 lines)
@@ -0,0 +1,69 @@
# Alembic configuration for database migrations
# Documentation: https://alembic.sqlalchemy.org/en/latest/tutorial.html

[alembic]
# Path to the migrations directory
script_location = alembic

# Template used to generate migration file names
file_template = %%(year)d%%(month).2d%%(day).2d_%%(hour).2d%%(minute).2d_%%(rev)s_%%(slug)s

# sys.path path, will be prepended to sys.path if present.
# prepend_sys_path = .

# Timezone for migration timestamps
# timezone = UTC

# Maximum length of the slug in migration file names
# truncate_slug_length = 40

# Separator for version_locations paths (use os.pathsep)
# version_path_separator = os

# Database connection URL
# Note: this value is overridden by env.py, which reads it from config.py
sqlalchemy.url = sqlite:///./data/homestock.db

# Output encoding for generated migration files
# output_encoding = utf-8

[post_write_hooks]
# Hook to automatically format generated migrations with ruff
hooks = ruff
ruff.type = console_scripts
ruff.entrypoint = ruff
ruff.options = format REVISION_SCRIPT_FILENAME

[loggers]
keys = root,sqlalchemy,alembic

[handlers]
keys = console

[formatters]
keys = generic

[logger_root]
level = WARN
handlers = console
qualname =

[logger_sqlalchemy]
level = WARN
handlers =
qualname = sqlalchemy.engine

[logger_alembic]
level = INFO
handlers =
qualname = alembic

[handler_console]
class = StreamHandler
args = (sys.stderr,)
level = NOTSET
formatter = generic

[formatter_generic]
format = %(levelname)-5.5s [%(name)s] %(message)s
datefmt = %H:%M:%S
backend/alembic/env.py (new file, 99 lines)
@@ -0,0 +1,99 @@
"""Alembic environment for database migrations.

This file configures and runs SQLAlchemy migrations with Alembic.
It supports both synchronous and asynchronous migrations.
"""

import asyncio
from logging.config import fileConfig

from sqlalchemy import pool
from sqlalchemy.engine import Connection
from sqlalchemy.ext.asyncio import async_engine_from_config

from alembic import context

# Import the application configuration and models
from app.core.config import settings
from app.core.database import Base

# Explicitly import all models so autogenerate can see them
import app.models  # noqa: F401

# Alembic configuration
config = context.config

# Interpret the config file for Python logging
if config.config_file_name is not None:
    fileConfig(config.config_file_name)

# Model metadata for autogenerate
target_metadata = Base.metadata

# Override the connection URL from settings
config.set_main_option("sqlalchemy.url", settings.DATABASE_URL)


def run_migrations_offline() -> None:
    """Run migrations in 'offline' mode.

    Configures the context with just a URL, without creating an Engine.
    SQL commands are emitted to a script file instead of being executed
    directly against the database.
    """
    url = config.get_main_option("sqlalchemy.url")
    context.configure(
        url=url,
        target_metadata=target_metadata,
        literal_binds=True,
        dialect_opts={"paramstyle": "named"},
        compare_type=True,  # Detect column type changes
        compare_server_default=True,  # Detect server default changes
    )

    with context.begin_transaction():
        context.run_migrations()


def do_run_migrations(connection: Connection) -> None:
    """Run migrations with a given connection."""
    context.configure(
        connection=connection,
        target_metadata=target_metadata,
        compare_type=True,
        compare_server_default=True,
    )

    with context.begin_transaction():
        context.run_migrations()


async def run_async_migrations() -> None:
    """Run migrations in asynchronous mode.

    Creates an async engine and runs the migrations.
    """
    connectable = async_engine_from_config(
        config.get_section(config.config_ini_section, {}),
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

    async with connectable.connect() as connection:
        await connection.run_sync(do_run_migrations)

    await connectable.dispose()


def run_migrations_online() -> None:
    """Run migrations in 'online' mode.

    Creates an Engine and binds a connection to the context.
    """
    asyncio.run(run_async_migrations())


if context.is_offline_mode():
    run_migrations_offline()
else:
    run_migrations_online()
backend/alembic/script.py.mako (new file, 26 lines)
@@ -0,0 +1,26 @@
"""${message}

Revision ID: ${up_revision}
Revises: ${down_revision | comma,n}
Create Date: ${create_date}

"""
from alembic import op
import sqlalchemy as sa
${imports if imports else ""}

# revision identifiers, used by Alembic.
revision = ${repr(up_revision)}
down_revision = ${repr(down_revision)}
branch_labels = ${repr(branch_labels)}
depends_on = ${repr(depends_on)}


def upgrade() -> None:
    """Apply the migration (move to the next revision)."""
    ${upgrades if upgrades else "pass"}


def downgrade() -> None:
    """Revert the migration (return to the previous revision)."""
    ${downgrades if downgrades else "pass"}
@@ -0,0 +1,107 @@
"""Initial migration: create all tables

Revision ID: 8ba5962640dd
Revises:
Create Date: 2026-01-27 21:22:27.022127

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = '8ba5962640dd'
down_revision = None
branch_labels = None
depends_on = None


def upgrade() -> None:
    """Apply the migration (move to the next revision)."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.create_table('categories',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(length=100), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('color', sa.String(length=7), nullable=True),
        sa.Column('icon', sa.String(length=50), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_categories_name'), 'categories', ['name'], unique=True)
    op.create_table('locations',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(length=100), nullable=False),
        sa.Column('type', sa.Enum('ROOM', 'FURNITURE', 'DRAWER', 'BOX', name='locationtype', native_enum=False, length=20), nullable=False),
        sa.Column('parent_id', sa.Integer(), nullable=True),
        sa.Column('path', sa.String(length=500), nullable=False),
        sa.Column('description', sa.String(length=500), nullable=True),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.ForeignKeyConstraint(['parent_id'], ['locations.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id')
    )
    op.create_index(op.f('ix_locations_name'), 'locations', ['name'], unique=False)
    op.create_index(op.f('ix_locations_parent_id'), 'locations', ['parent_id'], unique=False)
    op.create_index(op.f('ix_locations_path'), 'locations', ['path'], unique=False)
    op.create_table('items',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('name', sa.String(length=200), nullable=False),
        sa.Column('description', sa.Text(), nullable=True),
        sa.Column('quantity', sa.Integer(), nullable=False),
        sa.Column('status', sa.Enum('IN_STOCK', 'IN_USE', 'BROKEN', 'SOLD', 'LENT', name='itemstatus', native_enum=False, length=20), nullable=False),
        sa.Column('brand', sa.String(length=100), nullable=True),
        sa.Column('model', sa.String(length=100), nullable=True),
        sa.Column('serial_number', sa.String(length=100), nullable=True),
        sa.Column('price', sa.Numeric(precision=10, scale=2), nullable=True),
        sa.Column('purchase_date', sa.Date(), nullable=True),
        sa.Column('notes', sa.Text(), nullable=True),
        sa.Column('category_id', sa.Integer(), nullable=False),
        sa.Column('location_id', sa.Integer(), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.ForeignKeyConstraint(['category_id'], ['categories.id'], ondelete='RESTRICT'),
        sa.ForeignKeyConstraint(['location_id'], ['locations.id'], ondelete='RESTRICT'),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('serial_number')
    )
    op.create_index(op.f('ix_items_category_id'), 'items', ['category_id'], unique=False)
    op.create_index(op.f('ix_items_location_id'), 'items', ['location_id'], unique=False)
    op.create_index(op.f('ix_items_name'), 'items', ['name'], unique=False)
    op.create_table('documents',
        sa.Column('id', sa.Integer(), autoincrement=True, nullable=False),
        sa.Column('filename', sa.String(length=255), nullable=False),
        sa.Column('original_name', sa.String(length=255), nullable=False),
        sa.Column('type', sa.Enum('PHOTO', 'MANUAL', 'INVOICE', 'WARRANTY', 'OTHER', name='documenttype', native_enum=False, length=20), nullable=False),
        sa.Column('mime_type', sa.String(length=100), nullable=False),
        sa.Column('size_bytes', sa.Integer(), nullable=False),
        sa.Column('file_path', sa.String(length=500), nullable=False),
        sa.Column('description', sa.String(length=500), nullable=True),
        sa.Column('item_id', sa.Integer(), nullable=False),
        sa.Column('created_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.Column('updated_at', sa.DateTime(timezone=True), server_default=sa.text('(CURRENT_TIMESTAMP)'), nullable=False),
        sa.ForeignKeyConstraint(['item_id'], ['items.id'], ondelete='CASCADE'),
        sa.PrimaryKeyConstraint('id'),
        sa.UniqueConstraint('filename')
    )
    op.create_index(op.f('ix_documents_item_id'), 'documents', ['item_id'], unique=False)
    # ### end Alembic commands ###


def downgrade() -> None:
    """Revert the migration (return to the previous revision)."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_index(op.f('ix_documents_item_id'), table_name='documents')
    op.drop_table('documents')
    op.drop_index(op.f('ix_items_name'), table_name='items')
    op.drop_index(op.f('ix_items_location_id'), table_name='items')
    op.drop_index(op.f('ix_items_category_id'), table_name='items')
    op.drop_table('items')
    op.drop_index(op.f('ix_locations_path'), table_name='locations')
    op.drop_index(op.f('ix_locations_parent_id'), table_name='locations')
    op.drop_index(op.f('ix_locations_name'), table_name='locations')
    op.drop_table('locations')
    op.drop_index(op.f('ix_categories_name'), table_name='categories')
    op.drop_table('categories')
    # ### end Alembic commands ###
@@ -0,0 +1,30 @@
"""add_url_to_items

Revision ID: ee8035073398
Revises: 8ba5962640dd
Create Date: 2026-01-28 18:17:51.225223

"""
from alembic import op
import sqlalchemy as sa


# revision identifiers, used by Alembic.
revision = 'ee8035073398'
down_revision = '8ba5962640dd'
branch_labels = None
depends_on = None


def upgrade() -> None:
    """Apply the migration (move to the next revision)."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.add_column('items', sa.Column('url', sa.String(length=500), nullable=True))
    # ### end Alembic commands ###


def downgrade() -> None:
    """Revert the migration (return to the previous revision)."""
    # ### commands auto generated by Alembic - please adjust! ###
    op.drop_column('items', 'url')
    # ### end Alembic commands ###
backend/app/__init__.py (new file, 0 lines)
backend/app/core/__init__.py (new file, 0 lines)
backend/app/core/config.py (new file, 121 lines)
@@ -0,0 +1,121 @@
"""HomeStock application configuration.

Uses Pydantic Settings to load and validate environment variables.
Documentation: https://docs.pydantic.dev/latest/concepts/pydantic_settings/
"""

from functools import lru_cache
from typing import Literal

from pydantic import Field, field_validator
from pydantic_settings import BaseSettings, SettingsConfigDict


class Settings(BaseSettings):
    """Global application configuration.

    Values are loaded from environment variables or the .env file.
    """

    model_config = SettingsConfigDict(
        env_file=".env",
        env_file_encoding="utf-8",
        case_sensitive=False,
        extra="ignore",
    )

    # === Application ===
    APP_NAME: str = Field(default="HomeStock", description="Application name")
    APP_VERSION: str = Field(default="0.1.0", description="Application version")
    ENVIRONMENT: Literal["development", "production"] = Field(
        default="development", description="Runtime environment"
    )
    DEBUG: bool = Field(default=True, description="Debug mode")
    LOG_LEVEL: Literal["DEBUG", "INFO", "WARNING", "ERROR"] = Field(
        default="DEBUG", description="Log level"
    )

    # === Server ===
    BACKEND_HOST: str = Field(default="0.0.0.0", description="Backend server host")
    BACKEND_PORT: int = Field(default=8000, description="Backend server port")
    BACKEND_RELOAD: bool = Field(
        default=True, description="Hot reload in development"
    )

    # === Database ===
    DATABASE_URL: str = Field(
        default="sqlite+aiosqlite:///./data/homestock.db",
        description="Database connection URL",
    )

    @field_validator("DATABASE_URL")
    @classmethod
    def validate_database_url(cls, v: str) -> str:
        """Validate and normalize the database URL."""
        # For SQLite, make sure the async driver is used
        if v.startswith("sqlite:///"):
            return v.replace("sqlite:///", "sqlite+aiosqlite:///")
        return v

    # === CORS ===
    CORS_ORIGINS: str = Field(
        default="http://localhost:5173,http://10.0.0.50:5173",
        description="Allowed CORS origins (comma-separated)",
    )
    CORS_ALLOW_CREDENTIALS: bool = Field(
        default=False, description="Allow CORS credentials"
    )

    @property
    def cors_origins_list(self) -> list[str]:
        """Return the list of allowed CORS origins."""
        return [origin.strip() for origin in self.CORS_ORIGINS.split(",")]

    # === File storage ===
    UPLOAD_DIR: str = Field(default="./uploads", description="Upload directory")
    MAX_UPLOAD_SIZE_MB: int = Field(
        default=50, description="Maximum upload size in MB"
    )
    ALLOWED_EXTENSIONS: str = Field(
        default="jpg,jpeg,png,gif,pdf,doc,docx",
        description="Allowed file extensions (comma-separated)",
    )

    @property
    def allowed_extensions_list(self) -> list[str]:
        """Return the list of allowed extensions."""
        return [ext.strip().lower() for ext in self.ALLOWED_EXTENSIONS.split(",")]

    @property
    def max_upload_size_bytes(self) -> int:
        """Return the maximum upload size in bytes."""
        return self.MAX_UPLOAD_SIZE_MB * 1024 * 1024

    # === Search ===
    SEARCH_MIN_QUERY_LENGTH: int = Field(
        default=2, description="Minimum length of search queries"
    )

    # === Computed properties ===
    @property
    def is_development(self) -> bool:
        """Whether the environment is development."""
        return self.ENVIRONMENT == "development"

    @property
    def is_production(self) -> bool:
        """Whether the environment is production."""
        return self.ENVIRONMENT == "production"


@lru_cache
def get_settings() -> Settings:
    """Return the configuration singleton.

    Uses lru_cache so the configuration is loaded only once.
    """
    return Settings()


# Global configuration instance
settings = get_settings()
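The comma-separated list pattern used by `CORS_ORIGINS` and `ALLOWED_EXTENSIONS` can be exercised standalone; `split_csv` below is a sketch mirroring the properties, not the class itself:

```python
def split_csv(value: str) -> list[str]:
    # Mirror of cors_origins_list / allowed_extensions_list:
    # split on commas and strip surrounding whitespace
    return [part.strip() for part in value.split(",") if part.strip()]


origins = split_csv("http://localhost:5173, http://10.0.0.50:5173")
print(origins)
```

Storing the value as a plain string and parsing it in a property keeps the environment variable trivially settable (`CORS_ORIGINS=a,b`) without JSON quoting.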
backend/app/core/database.py (new file, 99 lines)
@@ -0,0 +1,99 @@
"""Database configuration with SQLAlchemy.

Uses SQLAlchemy 2.0+ with async support (aiosqlite for SQLite).
Documentation: https://docs.sqlalchemy.org/en/20/orm/extensions/asyncio.html
"""

from collections.abc import AsyncGenerator
from typing import Any

from sqlalchemy import event
from sqlalchemy.ext.asyncio import (
    AsyncSession,
    async_sessionmaker,
    create_async_engine,
)
from sqlalchemy.orm import DeclarativeBase

from app.core.config import settings


class Base(DeclarativeBase):
    """Base class for all SQLAlchemy models."""

    pass


# Create the async engine
engine = create_async_engine(
    settings.DATABASE_URL,
    echo=settings.DEBUG,  # Log SQL in debug mode
    future=True,
    # Connection pool settings for SQLite
    pool_pre_ping=True,  # Check the connection before using it
)


# SQLite-specific configuration (enable foreign keys)
@event.listens_for(engine.sync_engine, "connect")
def set_sqlite_pragma(dbapi_conn: Any, connection_record: Any) -> None:
    """Enable foreign-key constraints for SQLite.

    SQLite disables foreign keys by default; they must be enabled per connection.
    """
    cursor = dbapi_conn.cursor()
    cursor.execute("PRAGMA foreign_keys=ON")
    cursor.close()


# Session factory for creating async sessions
AsyncSessionLocal = async_sessionmaker(
    engine,
    class_=AsyncSession,
    expire_on_commit=False,  # Do not expire objects after commit
    autocommit=False,
    autoflush=False,
)


async def get_db() -> AsyncGenerator[AsyncSession, None]:
    """Database session generator for FastAPI.

    Used as a FastAPI dependency to inject a session into routes.

    Usage:
        @router.get("/items")
        async def get_items(db: AsyncSession = Depends(get_db)):
            ...

    Yields:
        AsyncSession: asynchronous SQLAlchemy session
    """
    async with AsyncSessionLocal() as session:
        try:
            yield session
            await session.commit()
        except Exception:
            await session.rollback()
            raise
        finally:
            await session.close()


async def init_db() -> None:
    """Initialize the database.

    Creates all tables defined in the models.
    Use only in development or for tests.
    In production, use Alembic migrations instead.
    """
    async with engine.begin() as conn:
        await conn.run_sync(Base.metadata.create_all)


async def close_db() -> None:
    """Cleanly close database connections.

    Call when the application shuts down.
    """
    await engine.dispose()
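The `PRAGMA foreign_keys=ON` listener above matters because SQLite ignores foreign-key constraints unless the pragma is set on each connection; a standalone check with the standard library:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys=ON")  # same pragma the event listener sets
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE child (id INTEGER PRIMARY KEY,"
    " parent_id INTEGER REFERENCES parent(id))"
)
try:
    # No parent row with id 999: with the pragma on, this must fail
    conn.execute("INSERT INTO child (id, parent_id) VALUES (1, 999)")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print(violated)
```

Without the pragma, the same insert silently succeeds, which is why the engine hooks it on every new connection rather than once at startup.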
backend/app/core/logging.py (new file, 105 lines)
@@ -0,0 +1,105 @@
"""Logging configuration with Loguru.

Loguru is a modern, easy-to-use logging library.
Documentation: https://loguru.readthedocs.io/
"""

import sys
from pathlib import Path
from typing import TYPE_CHECKING

from loguru import logger

from app.core.config import settings

if TYPE_CHECKING:
    from loguru import Logger

# Remove loguru's default handlers
logger.remove()


def setup_logging() -> None:
    """Configure logging for the application.

    - In development: logs to stdout plus a debug file
    - In production: logs to files with rotation
    """
    # Colored log format for stdout
    log_format = (
        "<green>{time:YYYY-MM-DD HH:mm:ss.SSS}</green> | "
        "<level>{level: <8}</level> | "
        "<cyan>{name}</cyan>:<cyan>{function}</cyan>:<cyan>{line}</cyan> - "
        "<level>{message}</level>"
    )

    # Plain log format for files
    log_format_file = (
        "{time:YYYY-MM-DD HH:mm:ss.SSS} | "
        "{level: <8} | "
        "{name}:{function}:{line} - "
        "{message}"
    )

    # === Common configuration ===
    # stdout (console) handler
    logger.add(
        sys.stdout,
        format=log_format,
        level=settings.LOG_LEVEL,
        colorize=True,
        backtrace=True,  # Show full exception tracebacks
        diagnose=settings.DEBUG,  # Show variable values in debug
    )

    # Create the log directory if it does not exist
    log_dir = Path("logs")
    log_dir.mkdir(exist_ok=True)

    # === Per-environment configuration ===
    if settings.is_development:
        # Development: detailed logs to a file
        logger.add(
            "logs/homestock_dev.log",
            format=log_format_file,
            level="DEBUG",
            rotation="10 MB",  # Rotate every 10 MB
            retention="7 days",  # Keep logs for 7 days
            compression="zip",  # Compress archived logs
            backtrace=True,
            diagnose=True,
        )
    else:
        # Production: regular logs plus a separate error log
        logger.add(
            "logs/homestock.log",
            format=log_format_file,
            level="INFO",
            rotation="50 MB",
            retention="30 days",
            compression="zip",
            backtrace=True,
            diagnose=False,
        )

        # Separate file for errors
        logger.add(
            "logs/homestock_errors.log",
            format=log_format_file,
            level="ERROR",
            rotation="50 MB",
            retention="90 days",  # Keep errors longer
            compression="zip",
            backtrace=True,
            diagnose=False,
        )

    logger.info(f"Logging configured (level={settings.LOG_LEVEL}, env={settings.ENVIRONMENT})")


def get_logger(name: str) -> "Logger":
    """Return a logger bound to a specific name.

    Args:
        name: Logger name (usually the module's __name__)

    Returns:
        Logger bound to the given name
    """
    return logger.bind(name=name)
126 backend/app/main.py Normal file
@@ -0,0 +1,126 @@
"""Entry point of the HomeStock application.

FastAPI application for managing a home inventory.
"""

from collections.abc import AsyncIterator
from contextlib import asynccontextmanager

from fastapi import FastAPI, Request
from fastapi.middleware.cors import CORSMiddleware
from fastapi.responses import JSONResponse
from loguru import logger

from app.core.config import settings
from app.core.database import close_db, init_db
from app.core.logging import setup_logging

# Configure logging
setup_logging()


@asynccontextmanager
async def lifespan(app: FastAPI) -> AsyncIterator[None]:
    """Manage the application life cycle.

    - Startup: initialize the database
    - Shutdown: close connections cleanly
    """
    logger.info("Starting the HomeStock application")
    logger.info(f"Environment: {settings.ENVIRONMENT}")
    logger.info(f"Version: {settings.APP_VERSION}")

    # Database initialization
    # Note: in production, use Alembic for migrations
    if settings.is_development:
        logger.debug("Initializing the database (development mode)")
        await init_db()

    yield

    # Cleanup
    logger.info("Stopping the HomeStock application")
    await close_db()


# Create the FastAPI application
app = FastAPI(
    title=settings.APP_NAME,
    description="API for managing a home inventory",
    version=settings.APP_VERSION,
    docs_url="/api/docs" if settings.is_development else None,  # Swagger UI
    redoc_url="/api/redoc" if settings.is_development else None,  # ReDoc
    openapi_url="/api/openapi.json" if settings.is_development else None,
    lifespan=lifespan,
)


# === CORS configuration ===
app.add_middleware(
    CORSMiddleware,
    allow_origins=settings.cors_origins_list,
    allow_credentials=settings.CORS_ALLOW_CREDENTIALS,
    allow_methods=["*"],  # Allow every method (GET, POST, PUT, DELETE, etc.)
    allow_headers=["*"],  # Allow every header
)


# === Health routes ===
@app.get("/", tags=["Health"])
async def root() -> dict[str, str]:
    """Root route - check that the API is up."""
    return {
        "app": settings.APP_NAME,
        "version": settings.APP_VERSION,
        "status": "running",
    }


@app.get("/health", tags=["Health"])
async def health() -> dict[str, str]:
    """Health endpoint for monitoring."""
    return {
        "status": "healthy",
        "environment": settings.ENVIRONMENT,
    }


# === Global error handling ===
@app.exception_handler(Exception)
async def global_exception_handler(request: Request, exc: Exception) -> JSONResponse:
    """Global handler for uncaught exceptions.

    Logs the error and returns a standardized JSON response.
    """
    # loguru has no exc_info kwarg; logger.exception() attaches the traceback
    logger.exception(f"Unhandled error: {exc}")

    # In production, hide the error details
    detail = str(exc) if settings.is_development else "Internal server error"

    return JSONResponse(
        status_code=500,
        content={
            "detail": detail,
            "type": "internal_server_error",
        },
    )


# === Router registration ===
from app.routers import categories_router, items_router, locations_router

app.include_router(categories_router, prefix="/api/v1")
app.include_router(locations_router, prefix="/api/v1")
app.include_router(items_router, prefix="/api/v1")


if __name__ == "__main__":
    import uvicorn

    uvicorn.run(
        "app.main:app",
        host=settings.BACKEND_HOST,
        port=settings.BACKEND_PORT,
        reload=settings.BACKEND_RELOAD,
        log_level=settings.LOG_LEVEL.lower(),
    )
19 backend/app/models/__init__.py Normal file
@@ -0,0 +1,19 @@
"""SQLAlchemy models package.

Imports every model so they are all available to Alembic.
"""

from app.models.category import Category
from app.models.document import Document, DocumentType
from app.models.item import Item, ItemStatus
from app.models.location import Location, LocationType

__all__ = [
    "Category",
    "Location",
    "LocationType",
    "Item",
    "ItemStatus",
    "Document",
    "DocumentType",
]
61 backend/app/models/category.py Normal file
@@ -0,0 +1,61 @@
"""SQLAlchemy model for item categories.

Categories classify items by usage domain
(DIY, computing, electronics, kitchen, etc.).
"""

from datetime import datetime
from typing import TYPE_CHECKING

from sqlalchemy import DateTime, String, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func

from app.core.database import Base

if TYPE_CHECKING:
    from app.models.item import Item


class Category(Base):
    """Item category.

    Attributes:
        id: Auto-incremented unique identifier
        name: Category name (e.g. "DIY", "Computing")
        description: Optional category description
        color: Hex color for display (e.g. "#3b82f6")
        icon: Optional icon name (e.g. "wrench", "computer")
        created_at: Creation timestamp (automatic)
        updated_at: Last-modification timestamp (automatic)
        items: Relationship to the items in this category
    """

    __tablename__ = "categories"

    # Columns
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    name: Mapped[str] = mapped_column(String(100), unique=True, nullable=False, index=True)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    color: Mapped[str | None] = mapped_column(String(7), nullable=True)  # Format: #RRGGBB
    icon: Mapped[str | None] = mapped_column(String(50), nullable=True)

    # Timestamps
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now(), nullable=False
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
    )

    # Relationships
    # No delete-orphan cascade here: items.category_id uses ondelete="RESTRICT",
    # so a category that still has items cannot be deleted.
    items: Mapped[list["Item"]] = relationship("Item", back_populates="category")

    def __repr__(self) -> str:
        """String representation of the category."""
        return f"<Category(id={self.id}, name='{self.name}')>"
85 backend/app/models/document.py Normal file
@@ -0,0 +1,85 @@
"""SQLAlchemy model for documents attached to items.

Documents can be photos, user manuals, invoices, etc.
They are stored on the local file system.
"""

import enum
from datetime import datetime
from typing import TYPE_CHECKING

from sqlalchemy import DateTime, Enum, ForeignKey, Integer, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func

from app.core.database import Base

if TYPE_CHECKING:
    from app.models.item import Item


class DocumentType(str, enum.Enum):
    """Document type."""

    PHOTO = "photo"  # Photo of the item
    MANUAL = "manual"  # User manual
    INVOICE = "invoice"  # Purchase invoice
    WARRANTY = "warranty"  # Warranty
    OTHER = "other"  # Other


class Document(Base):
    """Document attached to an item.

    Attributes:
        id: Auto-incremented unique identifier
        filename: File name on disk (UUID + extension)
        original_name: Original name of the uploaded file
        type: Document type (photo/manual/invoice/warranty/other)
        mime_type: MIME type (e.g. "image/jpeg", "application/pdf")
        size_bytes: File size in bytes
        file_path: Relative file path (e.g. "uploads/photos/uuid.jpg")
        description: Optional description
        item_id: ID of the associated item (FK)
        created_at: Creation timestamp (automatic)
        updated_at: Last-modification timestamp (automatic)
        item: Relationship to the item
    """

    __tablename__ = "documents"

    # Columns
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    filename: Mapped[str] = mapped_column(String(255), nullable=False, unique=True)
    original_name: Mapped[str] = mapped_column(String(255), nullable=False)
    type: Mapped[DocumentType] = mapped_column(
        Enum(DocumentType, native_enum=False, length=20), nullable=False
    )
    mime_type: Mapped[str] = mapped_column(String(100), nullable=False)
    size_bytes: Mapped[int] = mapped_column(Integer, nullable=False)
    file_path: Mapped[str] = mapped_column(String(500), nullable=False)
    description: Mapped[str | None] = mapped_column(String(500), nullable=True)

    # Foreign key
    item_id: Mapped[int] = mapped_column(
        Integer, ForeignKey("items.id", ondelete="CASCADE"), nullable=False, index=True
    )

    # Timestamps
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now(), nullable=False
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
    )

    # Relationships
    item: Mapped["Item"] = relationship("Item", back_populates="documents")

    def __repr__(self) -> str:
        """String representation of the document."""
        return f"<Document(id={self.id}, type={self.type.value}, filename='{self.filename}')>"
113 backend/app/models/item.py Normal file
@@ -0,0 +1,113 @@
"""SQLAlchemy model for inventory items.

Items are the central entity of the application.
Each item belongs to a category and sits at a location.
"""

import enum
from datetime import date, datetime
from decimal import Decimal
from typing import TYPE_CHECKING

from sqlalchemy import Date, DateTime, Enum, ForeignKey, Integer, Numeric, String, Text
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func

from app.core.database import Base

if TYPE_CHECKING:
    from app.models.category import Category
    from app.models.document import Document
    from app.models.location import Location


class ItemStatus(str, enum.Enum):
    """Item status."""

    IN_STOCK = "in_stock"  # In stock (not in use)
    IN_USE = "in_use"  # Currently in use
    BROKEN = "broken"  # Broken / out of order
    SOLD = "sold"  # Sold
    LENT = "lent"  # Lent out


class Item(Base):
    """Home-inventory item.

    Attributes:
        id: Auto-incremented unique identifier
        name: Item name (e.g. "Bosch cordless drill")
        description: Optional detailed description
        quantity: Quantity in stock (default: 1)
        price: Optional purchase price
        purchase_date: Optional purchase date
        status: Item status (in_stock/in_use/broken/sold/lent)
        brand: Optional brand (e.g. "Bosch", "Samsung")
        model: Optional model (e.g. "PSR 18 LI-2")
        serial_number: Optional serial number
        url: Optional link to the product page
        notes: Free-form notes
        category_id: Category ID (FK)
        location_id: Location ID (FK)
        created_at: Creation timestamp (automatic)
        updated_at: Last-modification timestamp (automatic)
        category: Relationship to the category
        location: Relationship to the location
        documents: Relationship to the documents (photos, manuals, invoices)
    """

    __tablename__ = "items"

    # Main columns
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    name: Mapped[str] = mapped_column(String(200), nullable=False, index=True)
    description: Mapped[str | None] = mapped_column(Text, nullable=True)
    quantity: Mapped[int] = mapped_column(Integer, nullable=False, default=1)
    status: Mapped[ItemStatus] = mapped_column(
        Enum(ItemStatus, native_enum=False, length=20),
        nullable=False,
        default=ItemStatus.IN_STOCK,
    )

    # Product information
    brand: Mapped[str | None] = mapped_column(String(100), nullable=True)
    model: Mapped[str | None] = mapped_column(String(100), nullable=True)
    serial_number: Mapped[str | None] = mapped_column(String(100), nullable=True, unique=True)
    url: Mapped[str | None] = mapped_column(String(500), nullable=True)  # Product page link

    # Purchase information
    price: Mapped[Decimal | None] = mapped_column(Numeric(10, 2), nullable=True)
    purchase_date: Mapped[date | None] = mapped_column(Date, nullable=True)

    # Notes
    notes: Mapped[str | None] = mapped_column(Text, nullable=True)

    # Foreign keys
    category_id: Mapped[int] = mapped_column(
        Integer, ForeignKey("categories.id", ondelete="RESTRICT"), nullable=False, index=True
    )
    location_id: Mapped[int] = mapped_column(
        Integer, ForeignKey("locations.id", ondelete="RESTRICT"), nullable=False, index=True
    )

    # Timestamps
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now(), nullable=False
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
    )

    # Relationships
    category: Mapped["Category"] = relationship("Category", back_populates="items")
    location: Mapped["Location"] = relationship("Location", back_populates="items")
    documents: Mapped[list["Document"]] = relationship(
        "Document", back_populates="item", cascade="all, delete-orphan"
    )

    def __repr__(self) -> str:
        """String representation of the item."""
        return f"<Item(id={self.id}, name='{self.name}', status={self.status.value})>"
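The `str`-backed enum pattern used by `ItemStatus` (and `DocumentType`, `LocationType`) makes members compare equal to their raw string value, which keeps JSON round-trips trivial; a self-contained sketch with a trimmed copy of the enum:

```python
import enum


class ItemStatus(str, enum.Enum):
    """Trimmed copy of the model's enum, for illustration only."""

    IN_STOCK = "in_stock"
    IN_USE = "in_use"


# Members compare equal to their plain string value...
assert ItemStatus.IN_USE == "in_use"
# ...and the raw string deserializes back to the member.
assert ItemStatus("in_stock") is ItemStatus.IN_STOCK
```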
97 backend/app/models/location.py Normal file
@@ -0,0 +1,97 @@
"""SQLAlchemy model for storage locations.

Locations are organized hierarchically:
room → furniture → drawer → box
Example: Garage → Shelf A → Drawer 2 → Screw box
"""

import enum
from datetime import datetime
from typing import TYPE_CHECKING

from sqlalchemy import DateTime, Enum, ForeignKey, Integer, String
from sqlalchemy.orm import Mapped, mapped_column, relationship
from sqlalchemy.sql import func

from app.core.database import Base

if TYPE_CHECKING:
    from app.models.item import Item


class LocationType(str, enum.Enum):
    """Location type within the hierarchy."""

    ROOM = "room"  # Room (e.g. Garage, Kitchen)
    FURNITURE = "furniture"  # Furniture (e.g. Shelf, Cabinet)
    DRAWER = "drawer"  # Drawer
    BOX = "box"  # Storage box/bin


class Location(Base):
    """Hierarchical storage location.

    Attributes:
        id: Auto-incremented unique identifier
        name: Location name (e.g. "Garage", "Shelf A")
        type: Location type (room/furniture/drawer/box)
        parent_id: Parent ID (None for a root)
        path: Computed full path (e.g. "Garage > Shelf A > Drawer 2")
        description: Optional description
        created_at: Creation timestamp (automatic)
        updated_at: Last-modification timestamp (automatic)
        parent: Relationship to the parent
        children: Relationship to the children
        items: Relationship to the items stored at this location
    """

    __tablename__ = "locations"

    # Columns
    id: Mapped[int] = mapped_column(primary_key=True, autoincrement=True)
    name: Mapped[str] = mapped_column(String(100), nullable=False, index=True)
    type: Mapped[LocationType] = mapped_column(
        Enum(LocationType, native_enum=False, length=20), nullable=False
    )
    parent_id: Mapped[int | None] = mapped_column(
        Integer, ForeignKey("locations.id", ondelete="CASCADE"), nullable=True, index=True
    )
    path: Mapped[str] = mapped_column(String(500), nullable=False, index=True)
    description: Mapped[str | None] = mapped_column(String(500), nullable=True)

    # Timestamps
    created_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True), server_default=func.now(), nullable=False
    )
    updated_at: Mapped[datetime] = mapped_column(
        DateTime(timezone=True),
        server_default=func.now(),
        onupdate=func.now(),
        nullable=False,
    )

    # Relationships
    parent: Mapped["Location | None"] = relationship(
        "Location", remote_side=[id], back_populates="children"
    )
    children: Mapped[list["Location"]] = relationship(
        "Location", back_populates="parent", cascade="all, delete-orphan"
    )
    # No delete-orphan cascade here: items.location_id uses ondelete="RESTRICT",
    # so a location that still holds items cannot be deleted.
    items: Mapped[list["Item"]] = relationship("Item", back_populates="location")

    def __repr__(self) -> str:
        """String representation of the location."""
        return f"<Location(id={self.id}, name='{self.name}', type={self.type.value})>"

    def calculate_path(self) -> str:
        """Compute the full path of the location.

        Returns:
            Full path (e.g. "Garage > Shelf A > Drawer 2")
        """
        if self.parent is None:
            return self.name
        return f"{self.parent.calculate_path()} > {self.name}"
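The recursive `calculate_path` above can be sketched outside the ORM; the `Node` dataclass below is a hypothetical stand-in for `Location`, showing how the method walks up the parent chain and joins names with " > ".

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Node:
    """Hypothetical stand-in for Location (name + parent only)."""

    name: str
    parent: "Optional[Node]" = None

    def calculate_path(self) -> str:
        # Root node: the path is just its own name.
        if self.parent is None:
            return self.name
        # Otherwise prepend the parent's full path.
        return f"{self.parent.calculate_path()} > {self.name}"


garage = Node("Garage")
shelf = Node("Shelf A", parent=garage)
drawer = Node("Drawer 2", parent=shelf)
print(drawer.calculate_path())  # Garage > Shelf A > Drawer 2
```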
15 backend/app/repositories/__init__.py Normal file
@@ -0,0 +1,15 @@
"""Data-access repositories package."""

from app.repositories.base import BaseRepository
from app.repositories.category import CategoryRepository
from app.repositories.document import DocumentRepository
from app.repositories.item import ItemRepository
from app.repositories.location import LocationRepository

__all__ = [
    "BaseRepository",
    "CategoryRepository",
    "LocationRepository",
    "ItemRepository",
    "DocumentRepository",
]
154 backend/app/repositories/base.py Normal file
@@ -0,0 +1,154 @@
"""Generic base repository.

Provides the basic CRUD operations for every model.
"""

from typing import Any, Generic, TypeVar

from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.database import Base

ModelType = TypeVar("ModelType", bound=Base)


class BaseRepository(Generic[ModelType]):
    """Generic repository with basic CRUD operations.

    Attributes:
        model: SQLAlchemy model class
        db: Database session
    """

    def __init__(self, model: type[ModelType], db: AsyncSession) -> None:
        """Initialize the repository.

        Args:
            model: SQLAlchemy model class
            db: Async database session
        """
        self.model = model
        self.db = db

    async def get(self, id: int) -> ModelType | None:
        """Fetch an element by its ID.

        Args:
            id: Element identifier

        Returns:
            The element found, or None
        """
        result = await self.db.execute(select(self.model).where(self.model.id == id))
        return result.scalar_one_or_none()

    async def get_all(
        self, skip: int = 0, limit: int = 100, **filters: Any
    ) -> list[ModelType]:
        """Fetch all elements with pagination and optional filters.

        Args:
            skip: Number of elements to skip
            limit: Maximum number of elements to return
            **filters: Extra filters (e.g. status="active")

        Returns:
            List of elements
        """
        query = select(self.model)

        # Apply the filters
        for field, value in filters.items():
            if value is not None and hasattr(self.model, field):
                query = query.where(getattr(self.model, field) == value)

        query = query.offset(skip).limit(limit)
        result = await self.db.execute(query)
        return list(result.scalars().all())

    async def count(self, **filters: Any) -> int:
        """Count elements with optional filters.

        Args:
            **filters: Extra filters

        Returns:
            Total number of elements
        """
        query = select(func.count(self.model.id))

        for field, value in filters.items():
            if value is not None and hasattr(self.model, field):
                query = query.where(getattr(self.model, field) == value)

        result = await self.db.execute(query)
        return result.scalar_one()

    async def create(self, **data: Any) -> ModelType:
        """Create a new element.

        Args:
            **data: Element data

        Returns:
            The created element
        """
        instance = self.model(**data)
        self.db.add(instance)
        await self.db.flush()
        await self.db.refresh(instance)
        return instance

    async def update(self, id: int, **data: Any) -> ModelType | None:
        """Update an existing element.

        Args:
            id: Element identifier
            **data: Fields to update (only non-None values are applied)

        Returns:
            The updated element, or None if not found
        """
        instance = await self.get(id)
        if instance is None:
            return None

        for field, value in data.items():
            if value is not None and hasattr(instance, field):
                setattr(instance, field, value)

        await self.db.flush()
        await self.db.refresh(instance)
        return instance

    async def delete(self, id: int) -> bool:
        """Delete an element.

        Args:
            id: Element identifier

        Returns:
            True if deleted, False if not found
        """
        instance = await self.get(id)
        if instance is None:
            return False

        await self.db.delete(instance)
        await self.db.flush()
        return True

    async def exists(self, id: int) -> bool:
        """Check whether an element exists.

        Args:
            id: Element identifier

        Returns:
            True if it exists, False otherwise
        """
        result = await self.db.execute(
            select(func.count(self.model.id)).where(self.model.id == id)
        )
        return result.scalar_one() > 0
85 backend/app/repositories/category.py Normal file
@@ -0,0 +1,85 @@
"""Repository for categories."""

from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload

from app.models.category import Category
from app.repositories.base import BaseRepository


class CategoryRepository(BaseRepository[Category]):
    """Repository for category operations."""

    def __init__(self, db: AsyncSession) -> None:
        """Initialize the repository."""
        super().__init__(Category, db)

    async def get_by_name(self, name: str) -> Category | None:
        """Fetch a category by its name.

        Args:
            name: Category name

        Returns:
            The category found, or None
        """
        result = await self.db.execute(
            select(Category).where(Category.name == name)
        )
        return result.scalar_one_or_none()

    async def get_with_item_count(self, id: int) -> tuple[Category, int] | None:
        """Fetch a category together with its item count.

        Args:
            id: Category ID

        Returns:
            Tuple (category, item count), or None
        """
        result = await self.db.execute(
            select(Category).options(selectinload(Category.items)).where(Category.id == id)
        )
        category = result.scalar_one_or_none()
        if category is None:
            return None
        return category, len(category.items)

    async def get_all_with_item_count(
        self, skip: int = 0, limit: int = 100
    ) -> list[tuple[Category, int]]:
        """Fetch every category together with its item count.

        Args:
            skip: Offset
            limit: Limit

        Returns:
            List of (category, item count) tuples
        """
        result = await self.db.execute(
            select(Category)
            .options(selectinload(Category.items))
            .order_by(Category.name)
            .offset(skip)
            .limit(limit)
        )
        categories = result.scalars().all()
        return [(cat, len(cat.items)) for cat in categories]

    async def name_exists(self, name: str, exclude_id: int | None = None) -> bool:
        """Check whether a category name is already taken.

        Args:
            name: Name to check
            exclude_id: ID to exclude (for updates)

        Returns:
            True if the name already exists
        """
        query = select(func.count(Category.id)).where(Category.name == name)
        if exclude_id is not None:
            query = query.where(Category.id != exclude_id)
        result = await self.db.execute(query)
        return result.scalar_one() > 0
113 backend/app/repositories/document.py Normal file
@@ -0,0 +1,113 @@
"""Repository for attached documents."""

from sqlalchemy import func, select
from sqlalchemy.ext.asyncio import AsyncSession

from app.models.document import Document, DocumentType
from app.repositories.base import BaseRepository


class DocumentRepository(BaseRepository[Document]):
    """Repository for document operations."""

    def __init__(self, db: AsyncSession) -> None:
        """Initialize the repository."""
        super().__init__(Document, db)

    async def get_by_item(self, item_id: int) -> list[Document]:
        """Fetch every document of an item.

        Args:
            item_id: Item ID

        Returns:
            List of documents
        """
        result = await self.db.execute(
            select(Document)
            .where(Document.item_id == item_id)
            .order_by(Document.type, Document.created_at)
        )
        return list(result.scalars().all())

    async def get_by_item_and_type(
        self, item_id: int, type: DocumentType
    ) -> list[Document]:
        """Fetch an item's documents of a given type.

        Args:
            item_id: Item ID
            type: Document type

        Returns:
            List of documents
        """
        result = await self.db.execute(
            select(Document)
            .where(Document.item_id == item_id, Document.type == type)
            .order_by(Document.created_at)
        )
        return list(result.scalars().all())

    async def get_by_filename(self, filename: str) -> Document | None:
        """Fetch a document by its file name.

        Args:
            filename: File name (UUID)

        Returns:
            The document found, or None
        """
        result = await self.db.execute(
            select(Document).where(Document.filename == filename)
        )
        return result.scalar_one_or_none()

    async def count_by_item(self, item_id: int) -> int:
        """Count an item's documents.

        Args:
            item_id: Item ID

        Returns:
            Number of documents
        """
        result = await self.db.execute(
            select(func.count(Document.id)).where(Document.item_id == item_id)
        )
        return result.scalar_one()

    async def get_photos(self, item_id: int) -> list[Document]:
        """Fetch an item's photos.

        Args:
            item_id: Item ID

        Returns:
            List of photos
        """
        return await self.get_by_item_and_type(item_id, DocumentType.PHOTO)

    async def get_invoices(self, item_id: int) -> list[Document]:
        """Fetch an item's invoices.

        Args:
            item_id: Item ID

        Returns:
            List of invoices
        """
        return await self.get_by_item_and_type(item_id, DocumentType.INVOICE)

    async def get_manuals(self, item_id: int) -> list[Document]:
        """Fetch an item's manuals.

        Args:
            item_id: Item ID

        Returns:
            List of manuals
        """
        return await self.get_by_item_and_type(item_id, DocumentType.MANUAL)
backend/app/repositories/item.py (new file, 247 lines)
@@ -0,0 +1,247 @@
"""Repository for inventory items."""

from decimal import Decimal
from typing import Any

from sqlalchemy import or_, select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload

from app.models.item import Item, ItemStatus
from app.repositories.base import BaseRepository


class ItemRepository(BaseRepository[Item]):
    """Repository for operations on items."""

    def __init__(self, db: AsyncSession) -> None:
        """Initialize the repository."""
        super().__init__(Item, db)

    async def get_with_relations(self, id: int) -> Item | None:
        """Fetch an item with its relations (category, location, documents).

        Args:
            id: Item ID

        Returns:
            The item with its relations, or None
        """
        result = await self.db.execute(
            select(Item)
            .options(
                selectinload(Item.category),
                selectinload(Item.location),
                selectinload(Item.documents),
            )
            .where(Item.id == id)
        )
        return result.scalar_one_or_none()

    async def get_all_with_relations(
        self, skip: int = 0, limit: int = 100
    ) -> list[Item]:
        """Fetch all items with their relations.

        Args:
            skip: Offset
            limit: Limit

        Returns:
            List of items with relations
        """
        result = await self.db.execute(
            select(Item)
            .options(
                selectinload(Item.category),
                selectinload(Item.location),
            )
            .offset(skip)
            .limit(limit)
            .order_by(Item.name)
        )
        return list(result.scalars().all())

    async def search(
        self,
        query: str,
        category_id: int | None = None,
        location_id: int | None = None,
        status: ItemStatus | None = None,
        min_price: Decimal | None = None,
        max_price: Decimal | None = None,
        skip: int = 0,
        limit: int = 100,
    ) -> list[Item]:
        """Search items with filters.

        Args:
            query: Search text (name, description, brand, model)
            category_id: Filter by category
            location_id: Filter by location
            status: Filter by status
            min_price: Minimum price
            max_price: Maximum price
            skip: Offset
            limit: Limit

        Returns:
            List of matching items
        """
        stmt = select(Item).options(
            selectinload(Item.category),
            selectinload(Item.location),
        )

        # Text search
        if query:
            search_term = f"%{query}%"
            stmt = stmt.where(
                or_(
                    Item.name.ilike(search_term),
                    Item.description.ilike(search_term),
                    Item.brand.ilike(search_term),
                    Item.model.ilike(search_term),
                    Item.notes.ilike(search_term),
                )
            )

        # Filters
        if category_id is not None:
            stmt = stmt.where(Item.category_id == category_id)
        if location_id is not None:
            stmt = stmt.where(Item.location_id == location_id)
        if status is not None:
            stmt = stmt.where(Item.status == status)
        if min_price is not None:
            stmt = stmt.where(Item.price >= min_price)
        if max_price is not None:
            stmt = stmt.where(Item.price <= max_price)

        stmt = stmt.offset(skip).limit(limit).order_by(Item.name)
        result = await self.db.execute(stmt)
        return list(result.scalars().all())

    async def count_filtered(
        self,
        query: str | None = None,
        category_id: int | None = None,
        location_id: int | None = None,
        status: ItemStatus | None = None,
        min_price: Decimal | None = None,
        max_price: Decimal | None = None,
    ) -> int:
        """Count the items matching the filters.

        Returns:
            Number of matching items
        """
        from sqlalchemy import func

        stmt = select(func.count(Item.id))

        if query:
            search_term = f"%{query}%"
            stmt = stmt.where(
                or_(
                    Item.name.ilike(search_term),
                    Item.description.ilike(search_term),
                    Item.brand.ilike(search_term),
                    Item.model.ilike(search_term),
                    Item.notes.ilike(search_term),
                )
            )

        if category_id is not None:
            stmt = stmt.where(Item.category_id == category_id)
        if location_id is not None:
            stmt = stmt.where(Item.location_id == location_id)
        if status is not None:
            stmt = stmt.where(Item.status == status)
        if min_price is not None:
            stmt = stmt.where(Item.price >= min_price)
        if max_price is not None:
            stmt = stmt.where(Item.price <= max_price)

        result = await self.db.execute(stmt)
        return result.scalar_one()

    async def get_by_category(
        self, category_id: int, skip: int = 0, limit: int = 100
    ) -> list[Item]:
        """Fetch the items of a category.

        Args:
            category_id: Category ID
            skip: Offset
            limit: Limit

        Returns:
            List of items
        """
        result = await self.db.execute(
            select(Item)
            .where(Item.category_id == category_id)
            .offset(skip)
            .limit(limit)
            .order_by(Item.name)
        )
        return list(result.scalars().all())

    async def get_by_location(
        self, location_id: int, skip: int = 0, limit: int = 100
    ) -> list[Item]:
        """Fetch the items at a location.

        Args:
            location_id: Location ID
            skip: Offset
            limit: Limit

        Returns:
            List of items
        """
        result = await self.db.execute(
            select(Item)
            .where(Item.location_id == location_id)
            .offset(skip)
            .limit(limit)
            .order_by(Item.name)
        )
        return list(result.scalars().all())

    async def get_by_status(
        self, status: ItemStatus, skip: int = 0, limit: int = 100
    ) -> list[Item]:
        """Fetch items by status.

        Args:
            status: Status to match
            skip: Offset
            limit: Limit

        Returns:
            List of items
        """
        result = await self.db.execute(
            select(Item)
            .where(Item.status == status)
            .offset(skip)
            .limit(limit)
            .order_by(Item.name)
        )
        return list(result.scalars().all())

    async def get_by_serial_number(self, serial_number: str) -> Item | None:
        """Fetch an item by its serial number.

        Args:
            serial_number: Serial number

        Returns:
            The matching item or None
        """
        result = await self.db.execute(
            select(Item).where(Item.serial_number == serial_number)
        )
        return result.scalar_one_or_none()
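The filter chaining in `ItemRepository.search` (text match first, then optional equality and range filters) can be illustrated with an in-memory sketch. This is a hedged stand-in, not the repository's actual API: `FakeItem` and the plain-list `search` below are hypothetical, and Python's case-insensitive substring match plays the role of SQL `ILIKE '%query%'`.

```python
from dataclasses import dataclass
from decimal import Decimal


@dataclass
class FakeItem:
    """Hypothetical stand-in for the Item model (illustration only)."""
    name: str
    brand: str
    price: Decimal
    category_id: int


def search(items, query="", category_id=None, min_price=None, max_price=None):
    """Mirror the repository's filter chain on a plain list."""
    out = items
    if query:  # ILIKE '%query%' becomes a case-insensitive substring test
        q = query.lower()
        out = [i for i in out if q in i.name.lower() or q in i.brand.lower()]
    if category_id is not None:
        out = [i for i in out if i.category_id == category_id]
    if min_price is not None:
        out = [i for i in out if i.price >= min_price]
    if max_price is not None:
        out = [i for i in out if i.price <= max_price]
    return sorted(out, key=lambda i: i.name)  # ORDER BY name


items = [
    FakeItem("Drill", "Bosch", Decimal("89.90"), 1),
    FakeItem("Laptop", "Dell", Decimal("999.00"), 2),
    FakeItem("Sander", "Bosch", Decimal("59.00"), 1),
]
print([i.name for i in search(items, query="bosch", max_price=Decimal("60"))])
# → ['Sander']
```

Each filter narrows the previous result, exactly as each `stmt.where(...)` adds an `AND` clause to the SELECT.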
backend/app/repositories/location.py (new file, 171 lines)
@@ -0,0 +1,171 @@
"""Repository for locations."""

from sqlalchemy import select
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.orm import selectinload

from app.models.location import Location, LocationType
from app.repositories.base import BaseRepository


class LocationRepository(BaseRepository[Location]):
    """Repository for operations on locations."""

    def __init__(self, db: AsyncSession) -> None:
        """Initialize the repository."""
        super().__init__(Location, db)

    async def get_with_children(self, id: int) -> Location | None:
        """Fetch a location with its children.

        Args:
            id: Location ID

        Returns:
            The location with its children, or None
        """
        result = await self.db.execute(
            select(Location)
            .options(selectinload(Location.children))
            .where(Location.id == id)
        )
        return result.scalar_one_or_none()

    async def get_root_locations(self) -> list[Location]:
        """Fetch all root locations (those without a parent).

        Returns:
            List of root locations
        """
        result = await self.db.execute(
            select(Location)
            .where(Location.parent_id.is_(None))
            .order_by(Location.name)
        )
        return list(result.scalars().all())

    async def get_children(self, parent_id: int) -> list[Location]:
        """Fetch the direct children of a location.

        Args:
            parent_id: Parent ID

        Returns:
            List of children
        """
        result = await self.db.execute(
            select(Location)
            .where(Location.parent_id == parent_id)
            .order_by(Location.name)
        )
        return list(result.scalars().all())

    async def get_by_type(self, type: LocationType) -> list[Location]:
        """Fetch all locations of a given type.

        Args:
            type: Location type

        Returns:
            List of locations
        """
        result = await self.db.execute(
            select(Location)
            .where(Location.type == type)
            .order_by(Location.path)
        )
        return list(result.scalars().all())

    async def get_full_tree(self) -> list[Location]:
        """Fetch the full location tree.

        Returns:
            List of root locations with their children loaded recursively
        """
        # Load every location with its children
        result = await self.db.execute(
            select(Location)
            .options(selectinload(Location.children))
            .order_by(Location.path)
        )
        all_locations = list(result.scalars().all())

        # Return only the roots (children are already loaded)
        return [loc for loc in all_locations if loc.parent_id is None]

    async def get_with_item_count(self, id: int) -> tuple[Location, int] | None:
        """Fetch a location together with its item count.

        Args:
            id: Location ID

        Returns:
            Tuple (location, item count) or None
        """
        result = await self.db.execute(
            select(Location)
            .options(selectinload(Location.items))
            .where(Location.id == id)
        )
        location = result.scalar_one_or_none()
        if location is None:
            return None
        return location, len(location.items)

    async def create_with_path(
        self,
        name: str,
        type: LocationType,
        parent_id: int | None = None,
        description: str | None = None,
    ) -> Location:
        """Create a location, computing its path automatically.

        Args:
            name: Location name
            type: Location type
            parent_id: Parent ID (None for a root)
            description: Optional description

        Returns:
            The created location
        """
        # Compute the path
        if parent_id is None:
            path = name
        else:
            parent = await self.get(parent_id)
            if parent is None:
                path = name
            else:
                path = f"{parent.path} > {name}"

        return await self.create(
            name=name,
            type=type,
            parent_id=parent_id,
            path=path,
            description=description,
        )

    async def update_paths_recursive(self, location: Location) -> None:
        """Recursively refresh paths after a change.

        Args:
            location: The modified location
        """
        # Refresh this location's path
        if location.parent_id is None:
            location.path = location.name
        else:
            parent = await self.get(location.parent_id)
            if parent:
                location.path = f"{parent.path} > {location.name}"
            else:
                location.path = location.name

        # Refresh the children
        children = await self.get_children(location.id)
        for child in children:
            child.path = f"{location.path} > {child.name}"
            await self.update_paths_recursive(child)
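The materialized-path scheme used by `create_with_path` and `update_paths_recursive` (a location's path is its ancestors' names joined with " > ") can be sketched without the ORM. The dict-of-dicts table below is a hypothetical stand-in for the `locations` rows:

```python
# Flat parent-pointer table standing in for the Location rows.
locations = {
    1: {"name": "Garage", "parent_id": None},
    2: {"name": "Shelf A", "parent_id": 1},
    3: {"name": "Box 3", "parent_id": 2},
}


def compute_path(loc_id: int) -> str:
    """Walk up the parent chain and join the names, root first."""
    parts: list[str] = []
    current: int | None = loc_id
    while current is not None:
        loc = locations[current]
        parts.append(loc["name"])
        current = loc["parent_id"]
    return " > ".join(reversed(parts))


print(compute_path(3))  # → Garage > Shelf A > Box 3
```

Storing the precomputed path on each row (as the repository does) trades this walk at write time for cheap reads and `ORDER BY path` listings; the price is the recursive refresh whenever a name or parent changes.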
backend/app/routers/__init__.py (new file, 11 lines)
@@ -0,0 +1,11 @@
"""API routers package."""

from app.routers.categories import router as categories_router
from app.routers.items import router as items_router
from app.routers.locations import router as locations_router

__all__ = [
    "categories_router",
    "items_router",
    "locations_router",
]
backend/app/routers/categories.py (new file, 168 lines)
@@ -0,0 +1,168 @@
"""API router for categories."""

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.database import get_db
from app.repositories.category import CategoryRepository
from app.schemas.category import (
    CategoryCreate,
    CategoryResponse,
    CategoryUpdate,
    CategoryWithItemCount,
)
from app.schemas.common import PaginatedResponse, SuccessResponse

router = APIRouter(prefix="/categories", tags=["Categories"])


@router.get("", response_model=PaginatedResponse[CategoryWithItemCount])
async def list_categories(
    page: int = 1,
    page_size: int = 20,
    db: AsyncSession = Depends(get_db),
) -> PaginatedResponse[CategoryWithItemCount]:
    """List every category together with its item count."""
    repo = CategoryRepository(db)
    skip = (page - 1) * page_size

    categories_with_count = await repo.get_all_with_item_count(skip=skip, limit=page_size)
    total = await repo.count()

    items = [
        CategoryWithItemCount(
            id=cat.id,
            name=cat.name,
            description=cat.description,
            color=cat.color,
            icon=cat.icon,
            created_at=cat.created_at,
            updated_at=cat.updated_at,
            item_count=count,
        )
        for cat, count in categories_with_count
    ]

    return PaginatedResponse.create(items=items, total=total, page=page, page_size=page_size)


@router.get("/{category_id}", response_model=CategoryWithItemCount)
async def get_category(
    category_id: int,
    db: AsyncSession = Depends(get_db),
) -> CategoryWithItemCount:
    """Fetch a category by its ID."""
    repo = CategoryRepository(db)
    result = await repo.get_with_item_count(category_id)

    if result is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Catégorie {category_id} non trouvée",
        )

    category, item_count = result
    return CategoryWithItemCount(
        id=category.id,
        name=category.name,
        description=category.description,
        color=category.color,
        icon=category.icon,
        created_at=category.created_at,
        updated_at=category.updated_at,
        item_count=item_count,
    )


@router.post("", response_model=CategoryResponse, status_code=status.HTTP_201_CREATED)
async def create_category(
    data: CategoryCreate,
    db: AsyncSession = Depends(get_db),
) -> CategoryResponse:
    """Create a new category."""
    repo = CategoryRepository(db)

    # Reject duplicate names
    if await repo.name_exists(data.name):
        raise HTTPException(
            status_code=status.HTTP_409_CONFLICT,
            detail=f"Une catégorie avec le nom '{data.name}' existe déjà",
        )

    category = await repo.create(
        name=data.name,
        description=data.description,
        color=data.color,
        icon=data.icon,
    )
    await db.commit()

    return CategoryResponse.model_validate(category)


@router.put("/{category_id}", response_model=CategoryResponse)
async def update_category(
    category_id: int,
    data: CategoryUpdate,
    db: AsyncSession = Depends(get_db),
) -> CategoryResponse:
    """Update a category."""
    repo = CategoryRepository(db)

    # Make sure the category exists
    existing = await repo.get(category_id)
    if existing is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Catégorie {category_id} non trouvée",
        )

    # If the name changes, reject duplicates
    if data.name and data.name != existing.name:
        if await repo.name_exists(data.name, exclude_id=category_id):
            raise HTTPException(
                status_code=status.HTTP_409_CONFLICT,
                detail=f"Une catégorie avec le nom '{data.name}' existe déjà",
            )

    category = await repo.update(
        category_id,
        name=data.name,
        description=data.description,
        color=data.color,
        icon=data.icon,
    )
    await db.commit()

    return CategoryResponse.model_validate(category)


@router.delete("/{category_id}", response_model=SuccessResponse)
async def delete_category(
    category_id: int,
    db: AsyncSession = Depends(get_db),
) -> SuccessResponse:
    """Delete a category."""
    repo = CategoryRepository(db)

    # Make sure the category exists
    result = await repo.get_with_item_count(category_id)
    if result is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Catégorie {category_id} non trouvée",
        )

    category, item_count = result

    # Block deletion while items still use this category
    if item_count > 0:
        raise HTTPException(
            status_code=status.HTTP_409_CONFLICT,
            detail=f"Impossible de supprimer : {item_count} objet(s) utilisent cette catégorie",
        )

    await repo.delete(category_id)
    await db.commit()

    return SuccessResponse(message="Catégorie supprimée avec succès", id=category_id)
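All list endpoints in these routers share the same pagination arithmetic: a 1-based `page` becomes an SQL `OFFSET`, and the total row count yields the page count. A minimal sketch, assuming a `PaginatedResponse.create`-like helper (the function below is hypothetical, not the project's schema class):

```python
import math


def paginate(total: int, page: int, page_size: int) -> dict:
    """Return the offset and page metadata a list endpoint would report."""
    skip = (page - 1) * page_size  # rows to skip, as in list_categories
    return {
        "skip": skip,
        "page": page,
        "page_size": page_size,
        "total": total,
        "pages": math.ceil(total / page_size) if page_size else 0,
    }


print(paginate(total=45, page=3, page_size=20))
# skip=40: pages 1 and 2 covered rows 0-39, so page 3 starts at row 40
```

45 rows at 20 per page gives 3 pages; page 3 holds the last 5 rows.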
backend/app/routers/items.py (new file, 264 lines)
@@ -0,0 +1,264 @@
"""API router for inventory items."""

from decimal import Decimal

from fastapi import APIRouter, Depends, HTTPException, Query, status
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.database import get_db
from app.models.item import ItemStatus
from app.repositories.category import CategoryRepository
from app.repositories.item import ItemRepository
from app.repositories.location import LocationRepository
from app.schemas.common import PaginatedResponse, SuccessResponse
from app.schemas.item import (
    ItemCreate,
    ItemResponse,
    ItemUpdate,
    ItemWithRelations,
)

router = APIRouter(prefix="/items", tags=["Items"])


@router.get("", response_model=PaginatedResponse[ItemWithRelations])
async def list_items(
    page: int = Query(default=1, ge=1),
    page_size: int = Query(default=20, ge=1, le=100),
    search: str | None = Query(default=None, min_length=2),
    category_id: int | None = None,
    location_id: int | None = None,
    status: ItemStatus | None = None,
    min_price: Decimal | None = None,
    max_price: Decimal | None = None,
    db: AsyncSession = Depends(get_db),
) -> PaginatedResponse[ItemWithRelations]:
    """List items with filters and pagination."""
    repo = ItemRepository(db)
    skip = (page - 1) * page_size

    items = await repo.search(
        query=search or "",
        category_id=category_id,
        location_id=location_id,
        status=status,
        min_price=min_price,
        max_price=max_price,
        skip=skip,
        limit=page_size,
    )

    total = await repo.count_filtered(
        query=search,
        category_id=category_id,
        location_id=location_id,
        status=status,
        min_price=min_price,
        max_price=max_price,
    )

    result_items = [ItemWithRelations.model_validate(item) for item in items]
    return PaginatedResponse.create(items=result_items, total=total, page=page, page_size=page_size)


@router.get("/{item_id}", response_model=ItemWithRelations)
async def get_item(
    item_id: int,
    db: AsyncSession = Depends(get_db),
) -> ItemWithRelations:
    """Fetch an item by its ID, including its relations."""
    repo = ItemRepository(db)
    item = await repo.get_with_relations(item_id)

    if item is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Objet {item_id} non trouvé",
        )

    return ItemWithRelations.model_validate(item)


@router.post("", response_model=ItemResponse, status_code=status.HTTP_201_CREATED)
async def create_item(
    data: ItemCreate,
    db: AsyncSession = Depends(get_db),
) -> ItemResponse:
    """Create a new item."""
    item_repo = ItemRepository(db)
    category_repo = CategoryRepository(db)
    location_repo = LocationRepository(db)

    # Make sure the category exists
    if not await category_repo.exists(data.category_id):
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Catégorie {data.category_id} non trouvée",
        )

    # Make sure the location exists
    if not await location_repo.exists(data.location_id):
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {data.location_id} non trouvé",
        )

    # If a serial number is given, enforce its uniqueness
    if data.serial_number:
        existing = await item_repo.get_by_serial_number(data.serial_number)
        if existing:
            raise HTTPException(
                status_code=status.HTTP_409_CONFLICT,
                detail=f"Un objet avec le numéro de série '{data.serial_number}' existe déjà",
            )

    item = await item_repo.create(
        name=data.name,
        description=data.description,
        quantity=data.quantity,
        status=data.status,
        brand=data.brand,
        model=data.model,
        serial_number=data.serial_number,
        price=data.price,
        purchase_date=data.purchase_date,
        notes=data.notes,
        category_id=data.category_id,
        location_id=data.location_id,
    )
    await db.commit()

    return ItemResponse.model_validate(item)


@router.put("/{item_id}", response_model=ItemResponse)
async def update_item(
    item_id: int,
    data: ItemUpdate,
    db: AsyncSession = Depends(get_db),
) -> ItemResponse:
    """Update an item."""
    item_repo = ItemRepository(db)
    category_repo = CategoryRepository(db)
    location_repo = LocationRepository(db)

    # Make sure the item exists
    existing = await item_repo.get(item_id)
    if existing is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Objet {item_id} non trouvé",
        )

    # If the category changes, make sure it exists
    if data.category_id is not None and data.category_id != existing.category_id:
        if not await category_repo.exists(data.category_id):
            raise HTTPException(
                status_code=status.HTTP_404_NOT_FOUND,
                detail=f"Catégorie {data.category_id} non trouvée",
            )

    # If the location changes, make sure it exists
    if data.location_id is not None and data.location_id != existing.location_id:
        if not await location_repo.exists(data.location_id):
            raise HTTPException(
                status_code=status.HTTP_404_NOT_FOUND,
                detail=f"Emplacement {data.location_id} non trouvé",
            )

    # If the serial number changes, enforce its uniqueness
    if data.serial_number and data.serial_number != existing.serial_number:
        existing_with_serial = await item_repo.get_by_serial_number(data.serial_number)
        if existing_with_serial and existing_with_serial.id != item_id:
            raise HTTPException(
                status_code=status.HTTP_409_CONFLICT,
                detail=f"Un objet avec le numéro de série '{data.serial_number}' existe déjà",
            )

    item = await item_repo.update(
        item_id,
        name=data.name,
        description=data.description,
        quantity=data.quantity,
        status=data.status,
        brand=data.brand,
        model=data.model,
        serial_number=data.serial_number,
        price=data.price,
        purchase_date=data.purchase_date,
        notes=data.notes,
        category_id=data.category_id,
        location_id=data.location_id,
    )
    await db.commit()

    return ItemResponse.model_validate(item)


@router.delete("/{item_id}", response_model=SuccessResponse)
async def delete_item(
    item_id: int,
    db: AsyncSession = Depends(get_db),
) -> SuccessResponse:
    """Delete an item and its attached documents."""
    repo = ItemRepository(db)

    # Make sure the item exists
    if not await repo.exists(item_id):
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Objet {item_id} non trouvé",
        )

    await repo.delete(item_id)
    await db.commit()

    return SuccessResponse(message="Objet supprimé avec succès", id=item_id)


@router.patch("/{item_id}/status", response_model=ItemResponse)
async def update_item_status(
    item_id: int,
    new_status: ItemStatus,
    db: AsyncSession = Depends(get_db),
) -> ItemResponse:
    """Update an item's status."""
    repo = ItemRepository(db)

    item = await repo.update(item_id, status=new_status)
    if item is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Objet {item_id} non trouvé",
        )

    await db.commit()
    return ItemResponse.model_validate(item)


@router.patch("/{item_id}/location", response_model=ItemResponse)
async def move_item(
    item_id: int,
    new_location_id: int,
    db: AsyncSession = Depends(get_db),
) -> ItemResponse:
    """Move an item to a new location."""
    item_repo = ItemRepository(db)
    location_repo = LocationRepository(db)

    # Make sure the target location exists
    if not await location_repo.exists(new_location_id):
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {new_location_id} non trouvé",
        )

    item = await item_repo.update(item_id, location_id=new_location_id)
    if item is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Objet {item_id} non trouvé",
        )

    await db.commit()
    return ItemResponse.model_validate(item)
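The validation order in `create_item` (referenced category and location must exist, then the serial number must be unique, and only then is the row created) can be condensed into a status-code sketch. This is a hedged illustration: `validate_new_item` and its plain sets standing in for the repositories are hypothetical, not part of the router.

```python
def validate_new_item(
    data: dict,
    categories: set[int],
    locations: set[int],
    serials: set[str],
) -> int:
    """Return the HTTP status create_item's checks would produce."""
    if data["category_id"] not in categories:
        return 404  # unknown category
    if data["location_id"] not in locations:
        return 404  # unknown location
    if data.get("serial_number") and data["serial_number"] in serials:
        return 409  # duplicate serial number
    return 201  # all checks passed: the item would be created


payload = {"category_id": 1, "location_id": 9, "serial_number": "SN-1"}
print(validate_new_item(payload, {1}, {9}, set()))
# → 201
```

Running the existence checks before the insert keeps the error responses precise (404 vs 409) instead of surfacing a generic foreign-key or unique-constraint failure from the database.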
backend/app/routers/locations.py (new file, 249 lines)
@@ -0,0 +1,249 @@
"""API router for locations."""

from fastapi import APIRouter, Depends, HTTPException, status
from sqlalchemy.ext.asyncio import AsyncSession

from app.core.database import get_db
from app.models.location import LocationType
from app.repositories.location import LocationRepository
from app.schemas.common import PaginatedResponse, SuccessResponse
from app.schemas.location import (
    LocationCreate,
    LocationResponse,
    LocationTree,
    LocationUpdate,
    LocationWithItemCount,
)

router = APIRouter(prefix="/locations", tags=["Locations"])


@router.get("", response_model=PaginatedResponse[LocationResponse])
async def list_locations(
    page: int = 1,
    page_size: int = 50,
    parent_id: int | None = None,
    type: LocationType | None = None,
    db: AsyncSession = Depends(get_db),
) -> PaginatedResponse[LocationResponse]:
    """List locations with optional filters."""
    repo = LocationRepository(db)
    skip = (page - 1) * page_size

    filters = {}
    if parent_id is not None:
        filters["parent_id"] = parent_id
    if type is not None:
        filters["type"] = type

    locations = await repo.get_all(skip=skip, limit=page_size, **filters)
    total = await repo.count(**filters)

    items = [LocationResponse.model_validate(loc) for loc in locations]
    return PaginatedResponse.create(items=items, total=total, page=page, page_size=page_size)


@router.get("/tree", response_model=list[LocationTree])
async def get_location_tree(
    db: AsyncSession = Depends(get_db),
) -> list[LocationTree]:
    """Fetch the full location tree."""
    repo = LocationRepository(db)

    # Fetch every location
    all_locations = await repo.get_all(skip=0, limit=1000)

    # Index the nodes by ID for fast lookup
    loc_dict: dict[int, LocationTree] = {}
    for loc in all_locations:
        loc_dict[loc.id] = LocationTree(
            id=loc.id,
            name=loc.name,
            type=loc.type,
            path=loc.path,
            children=[],
            item_count=0,
        )

    # Link each node to its parent
    roots: list[LocationTree] = []
    for loc in all_locations:
        tree_node = loc_dict[loc.id]
        if loc.parent_id is None:
            roots.append(tree_node)
        elif loc.parent_id in loc_dict:
            loc_dict[loc.parent_id].children.append(tree_node)

    return roots
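The two-pass tree build in `get_location_tree` (first index every row by ID, then attach each node to its parent's `children` list) works on any flat parent-pointer table. A minimal sketch, with tuples `(id, parent_id, name)` standing in for the ORM rows:

```python
# Flat rows as (id, parent_id, name); parent_id None marks a root.
rows = [
    (1, None, "House"),
    (2, 1, "Kitchen"),
    (3, 1, "Garage"),
    (4, 3, "Shelf A"),
]

# Pass 1: one node per row, keyed by id
nodes = {rid: {"name": name, "children": []} for rid, _, name in rows}

# Pass 2: wire children to parents; rows without a parent become roots
roots = []
for rid, parent_id, _ in rows:
    if parent_id is None:
        roots.append(nodes[rid])
    elif parent_id in nodes:
        nodes[parent_id]["children"].append(nodes[rid])

print([c["name"] for c in roots[0]["children"]])
# → ['Kitchen', 'Garage']
```

Two linear passes replace N recursive queries, which is why the endpoint fetches everything once (`limit=1000`) instead of walking the hierarchy in the database.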
|

@router.get("/roots", response_model=list[LocationResponse])
async def get_root_locations(
    db: AsyncSession = Depends(get_db),
) -> list[LocationResponse]:
    """Return the root locations (rooms)."""
    repo = LocationRepository(db)
    locations = await repo.get_root_locations()
    return [LocationResponse.model_validate(loc) for loc in locations]


@router.get("/{location_id}", response_model=LocationWithItemCount)
async def get_location(
    location_id: int,
    db: AsyncSession = Depends(get_db),
) -> LocationWithItemCount:
    """Return a location by its ID."""
    repo = LocationRepository(db)
    result = await repo.get_with_item_count(location_id)

    if result is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {location_id} non trouvé",
        )

    location, item_count = result
    return LocationWithItemCount(
        id=location.id,
        name=location.name,
        type=location.type,
        parent_id=location.parent_id,
        path=location.path,
        description=location.description,
        created_at=location.created_at,
        updated_at=location.updated_at,
        item_count=item_count,
    )


@router.get("/{location_id}/children", response_model=list[LocationResponse])
async def get_location_children(
    location_id: int,
    db: AsyncSession = Depends(get_db),
) -> list[LocationResponse]:
    """Return the direct children of a location."""
    repo = LocationRepository(db)

    # Check that the parent exists
    if not await repo.exists(location_id):
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {location_id} non trouvé",
        )

    children = await repo.get_children(location_id)
    return [LocationResponse.model_validate(child) for child in children]


@router.post("", response_model=LocationResponse, status_code=status.HTTP_201_CREATED)
async def create_location(
    data: LocationCreate,
    db: AsyncSession = Depends(get_db),
) -> LocationResponse:
    """Create a new location."""
    repo = LocationRepository(db)

    # Check that the parent exists when one is given
    if data.parent_id is not None:
        if not await repo.exists(data.parent_id):
            raise HTTPException(
                status_code=status.HTTP_404_NOT_FOUND,
                detail=f"Emplacement parent {data.parent_id} non trouvé",
            )

    location = await repo.create_with_path(
        name=data.name,
        type=data.type,
        parent_id=data.parent_id,
        description=data.description,
    )
    await db.commit()

    return LocationResponse.model_validate(location)


@router.put("/{location_id}", response_model=LocationResponse)
async def update_location(
    location_id: int,
    data: LocationUpdate,
    db: AsyncSession = Depends(get_db),
) -> LocationResponse:
    """Update a location."""
    repo = LocationRepository(db)

    # Check that the location exists
    existing = await repo.get(location_id)
    if existing is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {location_id} non trouvé",
        )

    # Check that the new parent exists when one is given
    if data.parent_id is not None and data.parent_id != existing.parent_id:
        if data.parent_id == location_id:
            raise HTTPException(
                status_code=status.HTTP_400_BAD_REQUEST,
                detail="Un emplacement ne peut pas être son propre parent",
            )
        if not await repo.exists(data.parent_id):
            raise HTTPException(
                status_code=status.HTTP_404_NOT_FOUND,
                detail=f"Emplacement parent {data.parent_id} non trouvé",
            )

    # Apply the update
    location = await repo.update(
        location_id,
        name=data.name,
        type=data.type,
        parent_id=data.parent_id,
        description=data.description,
    )

    # Recompute the paths if the name or the parent changed
    if data.name or data.parent_id is not None:
        await repo.update_paths_recursive(location)

    await db.commit()

    return LocationResponse.model_validate(location)


@router.delete("/{location_id}", response_model=SuccessResponse)
async def delete_location(
    location_id: int,
    db: AsyncSession = Depends(get_db),
) -> SuccessResponse:
    """Delete a location."""
    repo = LocationRepository(db)

    # Check that the location exists
    result = await repo.get_with_item_count(location_id)
    if result is None:
        raise HTTPException(
            status_code=status.HTTP_404_NOT_FOUND,
            detail=f"Emplacement {location_id} non trouvé",
        )

    location, item_count = result

    # Refuse deletion while items still use this location
    if item_count > 0:
        raise HTTPException(
            status_code=status.HTTP_409_CONFLICT,
            detail=f"Impossible de supprimer : {item_count} objet(s) utilisent cet emplacement",
        )

    # Check for children
    children = await repo.get_children(location_id)
    if children:
        raise HTTPException(
            status_code=status.HTTP_409_CONFLICT,
            detail=f"Impossible de supprimer : cet emplacement a {len(children)} sous-emplacement(s)",
        )

    await repo.delete(location_id)
    await db.commit()

    return SuccessResponse(message="Emplacement supprimé avec succès", id=location_id)
68
backend/app/schemas/__init__.py
Normal file
@@ -0,0 +1,68 @@
"""Pydantic schemas package."""

from app.schemas.category import (
    CategoryCreate,
    CategoryResponse,
    CategoryUpdate,
    CategoryWithItemCount,
)
from app.schemas.common import (
    ErrorResponse,
    PaginatedResponse,
    PaginationParams,
    SuccessResponse,
)
from app.schemas.document import (
    DocumentCreate,
    DocumentResponse,
    DocumentUpdate,
    DocumentUploadResponse,
)
from app.schemas.item import (
    ItemCreate,
    ItemFilter,
    ItemResponse,
    ItemSummary,
    ItemUpdate,
    ItemWithRelations,
)
from app.schemas.location import (
    LocationCreate,
    LocationResponse,
    LocationTree,
    LocationUpdate,
    LocationWithChildren,
    LocationWithItemCount,
)

__all__ = [
    # Category
    "CategoryCreate",
    "CategoryUpdate",
    "CategoryResponse",
    "CategoryWithItemCount",
    # Location
    "LocationCreate",
    "LocationUpdate",
    "LocationResponse",
    "LocationWithChildren",
    "LocationWithItemCount",
    "LocationTree",
    # Item
    "ItemCreate",
    "ItemUpdate",
    "ItemResponse",
    "ItemWithRelations",
    "ItemSummary",
    "ItemFilter",
    # Document
    "DocumentCreate",
    "DocumentUpdate",
    "DocumentResponse",
    "DocumentUploadResponse",
    # Common
    "PaginationParams",
    "PaginatedResponse",
    "ErrorResponse",
    "SuccessResponse",
]
48
backend/app/schemas/category.py
Normal file
@@ -0,0 +1,48 @@
"""Pydantic schemas for categories.

Defines the validation schemas for API requests and responses.
"""

from datetime import datetime

from pydantic import BaseModel, ConfigDict, Field


class CategoryBase(BaseModel):
    """Base schema for categories."""

    name: str = Field(..., min_length=1, max_length=100, description="Nom de la catégorie")
    description: str | None = Field(None, max_length=1000, description="Description optionnelle")
    color: str | None = Field(None, pattern=r"^#[0-9A-Fa-f]{6}$", description="Couleur hex (#RRGGBB)")
    icon: str | None = Field(None, max_length=50, description="Nom de l'icône")


class CategoryCreate(CategoryBase):
    """Schema for creating a category."""

    pass


class CategoryUpdate(BaseModel):
    """Schema for updating a category (all fields optional)."""

    name: str | None = Field(None, min_length=1, max_length=100)
    description: str | None = Field(None, max_length=1000)
    color: str | None = Field(None, pattern=r"^#[0-9A-Fa-f]{6}$")
    icon: str | None = Field(None, max_length=50)


class CategoryResponse(CategoryBase):
    """Response schema for a category."""

    model_config = ConfigDict(from_attributes=True)

    id: int
    created_at: datetime
    updated_at: datetime


class CategoryWithItemCount(CategoryResponse):
    """Response schema including the item count."""

    item_count: int = Field(default=0, description="Nombre d'objets dans cette catégorie")
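The `color` constraint relies on a regex. As a quick illustration of what that pattern accepts (a standalone sketch of the same expression, not the Pydantic validator itself):

```python
import re

# Same pattern as the `color` Field: "#" followed by exactly six hex digits
HEX_COLOR = re.compile(r"^#[0-9A-Fa-f]{6}$")


def is_valid_color(value: str) -> bool:
    """Return True when value is a #RRGGBB hex color."""
    return HEX_COLOR.match(value) is not None
```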
60
backend/app/schemas/common.py
Normal file
@@ -0,0 +1,60 @@
"""Shared Pydantic schemas.

Defines reusable schemas (pagination, errors, etc.).
"""

from typing import Generic, TypeVar

from pydantic import BaseModel, Field

T = TypeVar("T")


class PaginationParams(BaseModel):
    """Pagination parameters."""

    page: int = Field(default=1, ge=1, description="Numéro de page (commence à 1)")
    page_size: int = Field(default=20, ge=1, le=100, description="Nombre d'éléments par page")

    @property
    def offset(self) -> int:
        """Compute the offset for the SQL query."""
        return (self.page - 1) * self.page_size


class PaginatedResponse(BaseModel, Generic[T]):
    """Generic paginated response."""

    items: list[T]
    total: int = Field(..., description="Nombre total d'éléments")
    page: int = Field(..., description="Page actuelle")
    page_size: int = Field(..., description="Taille de la page")
    pages: int = Field(..., description="Nombre total de pages")

    @classmethod
    def create(
        cls, items: list[T], total: int, page: int, page_size: int
    ) -> "PaginatedResponse[T]":
        """Build a paginated response."""
        pages = (total + page_size - 1) // page_size if page_size > 0 else 0
        return cls(
            items=items,
            total=total,
            page=page,
            page_size=page_size,
            pages=pages,
        )


class ErrorResponse(BaseModel):
    """Error response schema."""

    detail: str = Field(..., description="Message d'erreur")
    type: str = Field(..., description="Type d'erreur")


class SuccessResponse(BaseModel):
    """Success response schema."""

    message: str = Field(..., description="Message de succès")
    id: int | None = Field(None, description="ID de l'élément concerné")
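The two pieces of pagination arithmetic above are worth checking on their own: the offset skips all rows of the preceding pages, and the page count is a ceiling division that avoids importing `math`. A standalone sketch of both expressions:

```python
def offset(page: int, page_size: int) -> int:
    # Mirrors PaginationParams.offset: rows to skip before the current page
    return (page - 1) * page_size


def page_count(total: int, page_size: int) -> int:
    # Mirrors PaginatedResponse.create: ceiling division without math.ceil
    return (total + page_size - 1) // page_size if page_size > 0 else 0
```

Note the edge cases: page 1 yields offset 0, and an empty result set yields 0 pages rather than 1.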
63
backend/app/schemas/document.py
Normal file
@@ -0,0 +1,63 @@
"""Pydantic schemas for attached documents.

Defines the validation schemas for API requests and responses.
"""

from datetime import datetime

from pydantic import BaseModel, ConfigDict, Field

from app.models.document import DocumentType


class DocumentBase(BaseModel):
    """Base schema for documents."""

    type: DocumentType = Field(..., description="Type de document")
    description: str | None = Field(None, max_length=500, description="Description optionnelle")


class DocumentCreate(DocumentBase):
    """Schema for creating a document (metadata only).

    The file itself is uploaded separately via multipart/form-data.
    """

    item_id: int = Field(..., description="ID de l'objet associé")


class DocumentUpdate(BaseModel):
    """Schema for updating a document."""

    type: DocumentType | None = None
    description: str | None = Field(None, max_length=500)


class DocumentResponse(BaseModel):
    """Response schema for a document."""

    model_config = ConfigDict(from_attributes=True)

    id: int
    filename: str
    original_name: str
    type: DocumentType
    mime_type: str
    size_bytes: int
    file_path: str
    description: str | None
    item_id: int
    created_at: datetime
    updated_at: datetime


class DocumentUploadResponse(BaseModel):
    """Response schema returned after a document upload."""

    id: int
    filename: str
    original_name: str
    type: DocumentType
    mime_type: str
    size_bytes: int
    message: str = "Document uploadé avec succès"
98
backend/app/schemas/item.py
Normal file
@@ -0,0 +1,98 @@
"""Pydantic schemas for inventory items.

Defines the validation schemas for API requests and responses.
"""

from datetime import date, datetime
from decimal import Decimal

from pydantic import BaseModel, ConfigDict, Field

from app.models.item import ItemStatus
from app.schemas.category import CategoryResponse
from app.schemas.location import LocationResponse


class ItemBase(BaseModel):
    """Base schema for items."""

    name: str = Field(..., min_length=1, max_length=200, description="Nom de l'objet")
    description: str | None = Field(None, description="Description détaillée")
    quantity: int = Field(default=1, ge=0, description="Quantité en stock")
    status: ItemStatus = Field(default=ItemStatus.IN_STOCK, description="Statut de l'objet")
    brand: str | None = Field(None, max_length=100, description="Marque")
    model: str | None = Field(None, max_length=100, description="Modèle")
    serial_number: str | None = Field(None, max_length=100, description="Numéro de série")
    url: str | None = Field(None, max_length=500, description="Lien vers page produit")
    price: Decimal | None = Field(None, ge=0, decimal_places=2, description="Prix d'achat")
    purchase_date: date | None = Field(None, description="Date d'achat")
    notes: str | None = Field(None, description="Notes libres")


class ItemCreate(ItemBase):
    """Schema for creating an item."""

    category_id: int = Field(..., description="ID de la catégorie")
    location_id: int = Field(..., description="ID de l'emplacement")


class ItemUpdate(BaseModel):
    """Schema for updating an item (all fields optional)."""

    name: str | None = Field(None, min_length=1, max_length=200)
    description: str | None = None
    quantity: int | None = Field(None, ge=0)
    status: ItemStatus | None = None
    brand: str | None = Field(None, max_length=100)
    model: str | None = Field(None, max_length=100)
    serial_number: str | None = Field(None, max_length=100)
    url: str | None = Field(None, max_length=500)
    price: Decimal | None = Field(None, ge=0)
    purchase_date: date | None = None
    notes: str | None = None
    category_id: int | None = None
    location_id: int | None = None


class ItemResponse(ItemBase):
    """Response schema for an item."""

    model_config = ConfigDict(from_attributes=True)

    id: int
    category_id: int
    location_id: int
    created_at: datetime
    updated_at: datetime


class ItemWithRelations(ItemResponse):
    """Response schema including the relations (category and location)."""

    category: CategoryResponse
    location: LocationResponse


class ItemSummary(BaseModel):
    """Summary schema for item lists."""

    model_config = ConfigDict(from_attributes=True)

    id: int
    name: str
    quantity: int
    status: ItemStatus
    brand: str | None
    category_id: int
    location_id: int


class ItemFilter(BaseModel):
    """Schema for filtering items."""

    category_id: int | None = None
    location_id: int | None = None
    status: ItemStatus | None = None
    search: str | None = Field(None, min_length=2, description="Recherche textuelle")
    min_price: Decimal | None = None
    max_price: Decimal | None = None
70
backend/app/schemas/location.py
Normal file
@@ -0,0 +1,70 @@
"""Pydantic schemas for locations.

Defines the validation schemas for API requests and responses.
"""

from datetime import datetime

from pydantic import BaseModel, ConfigDict, Field

from app.models.location import LocationType


class LocationBase(BaseModel):
    """Base schema for locations."""

    name: str = Field(..., min_length=1, max_length=100, description="Nom de l'emplacement")
    type: LocationType = Field(..., description="Type d'emplacement")
    description: str | None = Field(None, max_length=500, description="Description optionnelle")


class LocationCreate(LocationBase):
    """Schema for creating a location."""

    parent_id: int | None = Field(None, description="ID du parent (None si racine)")


class LocationUpdate(BaseModel):
    """Schema for updating a location (all fields optional)."""

    name: str | None = Field(None, min_length=1, max_length=100)
    type: LocationType | None = None
    description: str | None = Field(None, max_length=500)
    parent_id: int | None = None


class LocationResponse(LocationBase):
    """Response schema for a location."""

    model_config = ConfigDict(from_attributes=True)

    id: int
    parent_id: int | None
    path: str
    created_at: datetime
    updated_at: datetime


class LocationWithChildren(LocationResponse):
    """Response schema including the children."""

    children: list["LocationWithChildren"] = Field(default_factory=list)


class LocationWithItemCount(LocationResponse):
    """Response schema including the item count."""

    item_count: int = Field(default=0, description="Nombre d'objets à cet emplacement")


class LocationTree(BaseModel):
    """Schema for the full location tree."""

    id: int
    name: str
    type: LocationType
    path: str
    children: list["LocationTree"] = Field(default_factory=list)
    item_count: int = 0

    model_config = ConfigDict(from_attributes=True)
0
backend/app/services/__init__.py
Normal file
0
backend/app/utils/__init__.py
Normal file
142
backend/pyproject.toml
Normal file
@@ -0,0 +1,142 @@
[project]
name = "homestock-backend"
version = "0.1.0"
description = "HomeStock - Backend API pour gestion d'inventaire domestique"
authors = [
    { name = "Gilles", email = "gilles@example.com" }
]
readme = "README.md"
requires-python = ">=3.11"
license = { text = "MIT" }

dependencies = [
    # FastAPI
    "fastapi>=0.109.0",
    "uvicorn[standard]>=0.27.0",
    "python-multipart>=0.0.6",  # For file uploads

    # Database
    "sqlalchemy>=2.0.25",
    "alembic>=1.13.1",
    "aiosqlite>=0.19.0",  # Async SQLite

    # Validation
    "pydantic>=2.5.3",
    "pydantic-settings>=2.1.0",
    "email-validator>=2.1.0",

    # Logging
    "loguru>=0.7.2",

    # Utils
    "python-dotenv>=1.0.0",
    "httpx>=0.26.0",  # For API tests
]

[project.optional-dependencies]
dev = [
    # Testing
    "pytest>=7.4.4",
    "pytest-asyncio>=0.23.3",
    "pytest-cov>=4.1.0",
    "pytest-mock>=3.12.0",

    # Linting & Formatting
    "ruff>=0.1.14",
    "mypy>=1.8.0",
]

[build-system]
requires = ["hatchling"]
build-backend = "hatchling.build"

[tool.hatch.build.targets.wheel]
packages = ["app"]

# === Ruff Configuration ===
[tool.ruff]
target-version = "py311"
line-length = 100
select = [
    "E",   # pycodestyle errors
    "W",   # pycodestyle warnings
    "F",   # pyflakes
    "I",   # isort
    "B",   # flake8-bugbear
    "C4",  # flake8-comprehensions
    "UP",  # pyupgrade
]
ignore = [
    "E501",  # line too long (handled by formatter)
    "B008",  # do not perform function calls in argument defaults
    "C901",  # too complex
]

[tool.ruff.per-file-ignores]
"__init__.py" = ["F401"]  # unused imports in __init__

[tool.ruff.isort]
known-first-party = ["app"]

# === MyPy Configuration ===
[tool.mypy]
python_version = "3.11"
strict = true
warn_return_any = true
warn_unused_configs = true
disallow_untyped_defs = true
disallow_any_generics = true
disallow_subclassing_any = true
disallow_untyped_calls = true
disallow_incomplete_defs = true
check_untyped_defs = true
no_implicit_optional = true
warn_redundant_casts = true
warn_unused_ignores = true
warn_no_return = true
warn_unreachable = true
strict_equality = true

[[tool.mypy.overrides]]
module = [
    "sqlalchemy.*",
    "alembic.*",
    "loguru.*",
]
ignore_missing_imports = true

# === Pytest Configuration ===
[tool.pytest.ini_options]
minversion = "7.0"
addopts = [
    "-ra",
    "--strict-markers",
    "--cov=app",
    "--cov-report=term-missing",
    "--cov-report=html",
]
testpaths = ["tests"]
pythonpath = ["."]
asyncio_mode = "auto"

# === Coverage Configuration ===
[tool.coverage.run]
source = ["app"]
omit = [
    "*/tests/*",
    "*/__init__.py",
    "*/alembic/*",
]

[tool.coverage.report]
precision = 2
show_missing = true
skip_covered = false
exclude_lines = [
    "pragma: no cover",
    "def __repr__",
    "if TYPE_CHECKING:",
    "raise AssertionError",
    "raise NotImplementedError",
    "if __name__ == .__main__.:",
]
0
backend/tests/__init__.py
Normal file
0
backend/tests/integration/__init__.py
Normal file
0
backend/tests/unit/__init__.py
Normal file
1286
backend/uv.lock
generated
Normal file
File diff suppressed because it is too large