This commit is contained in:
2025-12-20 03:47:10 +01:00
parent 8428bf9c82
commit dcba044cd6
179 changed files with 10345 additions and 786 deletions

0
.env.example Normal file → Executable file
View File

0
.gitignore vendored Normal file → Executable file
View File

0
AJOUT_CHAMPS_MANQUANTS.md Normal file → Executable file
View File

0
AMELIORATIONS_SCRIPT.md Normal file → Executable file
View File

0
ANALYSE_CHAMPS_BASE_DONNEES.md Normal file → Executable file
View File

0
ANALYSE_DONNEES final.md Normal file → Executable file
View File

0
ANALYSE_DONNEES.md Normal file → Executable file
View File

0
BUGFIXES_2025-12-13.md Normal file → Executable file
View File

0
BUG_9_COLLECTE_RESEAU.md Normal file → Executable file
View File

0
CHANGELOG.md Normal file → Executable file
View File

0
CHANGELOG_2025-12-13.md Normal file → Executable file
View File

0
CHANGELOG_2025-12-14.md Normal file → Executable file
View File

92
COMMAND_CURL_FIX.md Normal file
View File

@@ -0,0 +1,92 @@
# Fix: "jq: invalid JSON text passed to --argjson" error
## Identified problem
The `jq: invalid JSON text passed to --argjson` error at step [8/8] is caused by an empty or invalid JSON variable being passed to `jq`.
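For illustration, a minimal sketch (with hypothetical variable names) of how an empty shell variable triggers this error, and how defaulting to a valid JSON literal avoids it:
```bash
cpu_json=""                                    # empty string: not valid JSON
jq -n --argjson cpu "$cpu_json" '{cpu: $cpu}'  # fails: invalid JSON text passed to --argjson

cpu_json="${cpu_json:-null}"                   # fall back to a valid JSON literal
jq -n --argjson cpu "$cpu_json" '{cpu: $cpu}'  # prints {"cpu": null}
```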
## Applied fixes
### 1. **CLI argument parsing** (lines 1570-1607)
Added a `parse_args()` function to handle the `--server`, `--token`, `--iperf-server`, and `--debug` arguments.
```bash
parse_args() {
    while [[ $# -gt 0 ]]; do
        case "$1" in
            --server) SERVER_URL="$2"; shift 2 ;;
            --token) API_TOKEN="$2"; shift 2 ;;
            --iperf-server) IPERF_SERVER="$2"; shift 2 ;;
            --debug) DEBUG_PAYLOAD=1; shift ;;
            --help|-h) # prints the help text (body omitted in this excerpt)
        esac
    done
}
```
### 2. **Improved GPU collection** (lines 553-589)
Added validation to avoid capturing `nvidia-smi` error messages:
```bash
# Check that nvidia-smi works before extracting GPU info
if nvidia-smi &>/dev/null; then
    nvidia_model=$(nvidia-smi --query-gpu=name --format=csv,noheader 2>/dev/null | head -1 | tr -d '\n')
    # Only override when the value is non-empty and valid
    if [[ -n "$nvidia_model" && ! "$nvidia_model" =~ (failed|error|Error) ]]; then
        gpu_model="$nvidia_model"
    fi
fi
```
### 3. **JSON variable validation** (lines 1326-1357)
Added checks to guarantee that no variable is empty before the payload is sent.
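A sketch of what this kind of guard can look like (the variable names here are illustrative, not necessarily the script's):
```bash
# Replace any empty or invalid JSON fragment with an empty object,
# so that jq --argjson never receives bad input.
for var in cpu_json memory_json disk_json network_json gpu_json; do
    if [[ -z "${!var}" ]] || ! jq empty <<<"${!var}" 2>/dev/null; then
        printf -v "$var" '%s' '{}'
    fi
done
```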
### 4. **Improved debug mode** (lines 1328-1344)
When `DEBUG_PAYLOAD=1` or `--debug` is set, the script prints the state of every JSON variable.
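For example, such a state dump could look like this (illustrative variable names, not the script's exact output):
```bash
if [[ "$DEBUG_PAYLOAD" == "1" ]]; then
    for var in cpu_json memory_json disk_json network_json gpu_json; do
        printf '%-14s -> %s\n' "$var" "${!var:-<empty>}"
    done
fi
```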
## Diagnostic commands on the remote machine
### 1. Check GPU detection
```bash
lspci | grep -iE 'vga|3d'
if command -v nvidia-smi &>/dev/null; then
    if nvidia-smi &>/dev/null; then
        echo "nvidia-smi works"
        nvidia-smi --query-gpu=name --format=csv,noheader
    else
        echo "nvidia-smi fails - driver not loaded"
    fi
fi
```
### 2. Run the full script with debug
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- \
--server http://10.0.1.97:8007 \
--token "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a" \
--iperf-server 10.0.1.97 \
--debug
```
The `--debug` flag will:
- Print the validation state of every JSON variable
- Save the full payload to `/tmp/bench_payload_YYYYMMDD_HHMMSS.json` (see the sketch below)
- Ask for confirmation before sending
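A minimal sketch of how such a timestamped dump can be written (`$json_payload` is a placeholder name, not necessarily the script's variable):
```bash
payload_file="/tmp/bench_payload_$(date +%Y%m%d_%H%M%S).json"
printf '%s\n' "$json_payload" > "$payload_file"
echo "Payload saved to $payload_file"
```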
### 3. Inspect a saved payload
```bash
ls -lht /tmp/bench_payload_*.json | head -1
jq '.' /tmp/bench_payload_*.json | less
```
## Next steps
1. Restart the nginx container to serve the new version
2. Test with `--debug` on the ASUS machine
3. Inspect the saved payload
4. Fix as needed
## Modified files
- frontend/scripts/bench.sh - main client-side benchmark script

0
CORRECTIFS_FINAUX_2025-12-14.md Normal file → Executable file
View File

0
CORRECTIFS_RESEAU_SMART.md Normal file → Executable file
View File

0
DEBUG_NETWORK_BENCH.md Normal file → Executable file
View File

0
DEPLOYMENT.md Normal file → Executable file
View File

8
DEPLOYMENT_GUIDE.md Normal file → Executable file
View File

@@ -34,13 +34,17 @@ cd /home/gilles/Documents/vscode/serv_benchmark
docker-compose down docker-compose down
``` ```
### Step 2: Apply the Database Migration ### Step 2: Apply the Database Migrations
If the database already exists: If the database already exists, apply the scripts in order:
```bash ```bash
cd backend cd backend
python3 apply_migration.py python3 apply_migration.py
python3 apply_migration_002.py
python3 apply_migration_003.py
python3 apply_migration_004.py
python3 apply_migration_005.py
``` ```
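Equivalently, the five scripts listed above can be applied in one loop:
```bash
cd backend
for script in apply_migration.py apply_migration_002.py apply_migration_003.py \
              apply_migration_004.py apply_migration_005.py; do
    python3 "$script"
done
```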
**OR** if you prefer to start from scratch (⚠️ DATA LOSS): **OR** if you prefer to start from scratch (⚠️ DATA LOSS):

0
FIXES_APPLIED.md Normal file → Executable file
View File

116
FIX_DEBUG_PAYLOAD.md Normal file
View File

@@ -0,0 +1,116 @@
# Fix: bench.sh script hangs in non-interactive mode
## Problem
When the `bench.sh` script was run via curl piped into bash:
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- ...
```
The script stopped after printing the JSON payload and never went on to send it to the server.
## Root Cause
1. **DEBUG_PAYLOAD enabled by default** (line 37):
```bash
DEBUG_PAYLOAD="${DEBUG_PAYLOAD:-1}"  # Default: 1 (enabled)
```
2. **Interactive input wait** (line 1493):
```bash
read -p "Appuyez sur Entrée pour continuer l'envoi ou Ctrl+C pour annuler..."
```
3. **No TTY in pipe mode**: when the script runs through `curl | bash`, there is no interactive terminal (`stdin` is not a TTY), so `read -p` blocks forever waiting for input that will never come.
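A quick way to see the difference (illustrative commands, not part of the script): `[[ -t 0 ]]` is true only when stdin is a terminal, which is exactly what the curl pipe takes away.
```bash
# Run from an interactive shell: stdin is a TTY
bash -c '[[ -t 0 ]] && echo "interactive TTY" || echo "no TTY"'
# => interactive TTY

# Same test with stdin coming from a pipe, as with `curl ... | bash`
echo | bash -c '[[ -t 0 ]] && echo "interactive TTY" || echo "no TTY"'
# => no TTY
```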
## Implemented Solution
### 1. Disable DEBUG_PAYLOAD by default
```bash
# Before
DEBUG_PAYLOAD="${DEBUG_PAYLOAD:-1}"  # Default: 1 (enabled)
# After
DEBUG_PAYLOAD="${DEBUG_PAYLOAD:-0}"  # Default: 0 (disabled)
```
### 2. Interactive-mode detection
Added a `[[ -t 0 ]]` test to check whether stdin is a terminal:
```bash
# Ask for confirmation only when an interactive terminal is attached
if [[ -t 0 ]]; then
    read -p "Appuyez sur Entrée pour continuer l'envoi ou Ctrl+C pour annuler..."
else
    log_warn "Mode non-interactif détecté - envoi automatique dans 2 secondes..."
    sleep 2
fi
```
## Behavior After the Fix
### Normal Mode (via curl pipe)
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- \
--server http://10.0.1.97:8007 \
--token "..." \
--iperf-server 10.0.1.97
```
- DEBUG_PAYLOAD = 0 (the payload is not printed)
- Automatic upload to the server
- ✅ **Works correctly**
### Local Debug Mode
```bash
sudo DEBUG_PAYLOAD=1 bash scripts/bench.sh
```
- Prints the full payload
- Saves it to `/tmp/bench_payload_YYYYMMDD_HHMMSS.json`
- Asks for confirmation before sending (interactive mode detected)
### Debug Mode via Curl
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | DEBUG_PAYLOAD=1 sudo bash -s -- ...
```
- Prints the full payload
- Saves it to `/tmp/bench_payload_YYYYMMDD_HHMMSS.json`
- Message: "Mode non-interactif détecté - envoi automatique dans 2 secondes..."
- Sends after 2 seconds
- ✅ **Works correctly**
## Validation Tests
```bash
# Test 1: normal mode (must send to the server)
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- \
    --server http://10.0.1.97:8007 \
    --token "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a" \
    --iperf-server 10.0.1.97

# Test 2: local debug mode (must ask for confirmation)
cd /home/gilles/Documents/vscode/serv_benchmark
sudo DEBUG_PAYLOAD=1 bash scripts/bench.sh

# Test 3: debug mode via curl (must send after 2 seconds)
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | DEBUG_PAYLOAD=1 sudo bash -s -- \
    --server http://10.0.1.97:8007 \
    --token "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a" \
    --iperf-server 10.0.1.97
```
## Modified Files
- `scripts/bench.sh` (lines 37 and 1493-1499)
## Status
**FIXED** - The script now runs correctly in non-interactive mode and sends the payload to the server.
---
**Date**: 2025-12-18
**Issue**: script hanging when run via curl pipe bash
**Root Cause**: DEBUG_PAYLOAD=1 by default + read -p without TTY detection

0
FRONTEND_IMPROVEMENTS_2025-12-13.md Normal file → Executable file
View File

0
FRONTEND_RESTRUCTURE_2025-12-14.md Normal file → Executable file
View File

43
FRONTEND_UPDATES.md Normal file → Executable file
View File

@@ -144,14 +144,55 @@ ALTER TABLE benchmarks ADD COLUMN network_results_json TEXT;
**Application script**: `backend/apply_migration_002.py` **Application script**: `backend/apply_migration_002.py`
### 4. SQL migration (new CPU scores)
**File**: `backend/migrations/003_add_cpu_scores.sql`
```sql
ALTER TABLE benchmarks ADD COLUMN cpu_score_single FLOAT;
ALTER TABLE benchmarks ADD COLUMN cpu_score_multi FLOAT;
```
**Application script**: `backend/apply_migration_003.py`
### 5. SQL migration (additional hardware metadata)
**File**: `backend/migrations/004_add_snapshot_details.sql`
```sql
ALTER TABLE hardware_snapshots ADD COLUMN hostname VARCHAR(255);
ALTER TABLE hardware_snapshots ADD COLUMN desktop_environment VARCHAR(100);
ALTER TABLE hardware_snapshots ADD COLUMN pci_devices_json TEXT;
ALTER TABLE hardware_snapshots ADD COLUMN usb_devices_json TEXT;
```
**Application script**: `backend/apply_migration_004.py`
### 6. SQL migration (Wayland/X11 session, battery, uptime)
**File**: `backend/migrations/005_add_os_display_and_battery.sql`
```sql
ALTER TABLE hardware_snapshots ADD COLUMN screen_resolution VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN display_server VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN session_type VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN last_boot_time VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN uptime_seconds INTEGER;
ALTER TABLE hardware_snapshots ADD COLUMN battery_percentage FLOAT;
ALTER TABLE hardware_snapshots ADD COLUMN battery_status VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN battery_health VARCHAR(50);
```
**Application script**: `backend/apply_migration_005.py`
## Deployment ## Deployment
To apply the updates: To apply the updates:
### 1. Apply migration 002 ### 1. Apply migrations 002, 003, 004 and 005
```bash ```bash
cd backend cd backend
python3 apply_migration_002.py python3 apply_migration_002.py
python3 apply_migration_003.py
python3 apply_migration_004.py
python3 apply_migration_005.py
``` ```
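Once the scripts have run, the new columns can be checked from the repository root (an illustrative check, assuming the default SQLite path used by the apply scripts):
```bash
sqlite3 backend/data/data.db "PRAGMA table_info(hardware_snapshots);" | grep -E 'battery|uptime|display_server'
sqlite3 backend/data/data.db "PRAGMA table_info(benchmarks);" | grep cpu_score
```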
### 2. Restart the backend ### 2. Restart the backend

0
HOTFIX_BACKEND_SMARTCTL.md Normal file → Executable file
View File

0
HOTFIX_BENCH_IMPROVEMENTS.md Normal file → Executable file
View File

0
HOTFIX_NETWORK_BENCH.md Normal file → Executable file
View File

0
HOTFIX_SCORE_VALIDATION.md Normal file → Executable file
View File

0
IMPLEMENTATION_STATUS.md Normal file → Executable file
View File

0
INSTRUCTIONS_BENCHMARK.md Normal file → Executable file
View File

0
NETWORK_SETUP.md Normal file → Executable file
View File

0
PROJECT_SUMMARY.md Normal file → Executable file
View File

0
QUICKSTART.md Normal file → Executable file
View File

0
QUICKTEST.md Normal file → Executable file
View File

0
README.md Normal file → Executable file
View File

0
README_MISE_A_JOUR.md Normal file → Executable file
View File

0
RESUME_FINAL_CORRECTIONS.md Normal file → Executable file
View File

0
RESUME_RESTRUCTURATION.md Normal file → Executable file
View File

237
SESSION_2025-12-18.md Normal file
View File

@@ -0,0 +1,237 @@
# Development Session - 2025-12-18
## Context
Resumed development of the **Linux BenchTools** project after the 2025-12-14 session, which had fixed 8 major bugs.
The user reported that the curl bash command was not showing up in the frontend dashboard.
## Identified Problems
### 1. ❌ Missing curl command in the dashboard
**Symptom**: the "Quick Bench Script" section showed "Chargement..." instead of the actual command.
**Cause**: the API token was hardcoded to `YOUR_TOKEN` in the JavaScript code instead of being fetched dynamically from the backend.
### 2. ❌ bench.sh script hanging in non-interactive mode
**Symptom**: when run via `curl | bash`, the script stopped after printing the JSON payload and sent nothing to the server.
**Cause**:
- `DEBUG_PAYLOAD=1` by default
- `read -p` waiting for input without checking whether stdin was an interactive terminal
## Implemented Solutions
### Fix #1: API endpoint for frontend configuration
#### Backend - New `/api/config` endpoint
**File**: `backend/app/main.py`
```python
@app.get(f"{settings.API_PREFIX}/config")
async def get_config():
    """Get frontend configuration (API token, server URLs, etc.)"""
    return {
        "api_token": settings.API_TOKEN,
        "iperf_server": "10.0.1.97"
    }
```
**URL**: http://10.0.1.97:8007/api/config
**Response**:
```json
{
"api_token": "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a",
"iperf_server": "10.0.1.97"
}
```
#### Frontend - Dynamic token loading
**File**: `frontend/js/dashboard.js`
Global variables and a config-loading function were added:
```javascript
let apiToken = null;
let iperfServer = null;

async function loadBackendConfig() {
    const response = await fetch(`${window.BenchConfig.backendApiUrl}/config`);
    if (response.ok) {
        const config = await response.json();
        apiToken = config.api_token;
        iperfServer = config.iperf_server || '10.0.1.97';
        updateBenchCommandDisplay();
    }
}
```
The command generation was updated:
```javascript
function buildBenchCommand() {
    const token = apiToken || 'LOADING...';
    const backendUrl = backendBase.replace(/\/api$/, '');
    return `curl -fsSL ${frontendBase}${scriptPath} | sudo bash -s -- --server ${backendUrl} --token "${token}" --iperf-server ${iperf}`;
}
```
Initialization on page load:
```javascript
document.addEventListener('DOMContentLoaded', async () => {
    await loadBackendConfig(); // Load the token first
    loadDashboard();
    // ...
});
```
**File**: `frontend/js/settings.js` - similar changes
#### Docker - Development volume
**File**: `docker-compose.yml`
Added a volume so changes can be made without rebuilding the image:
```yaml
backend:
  volumes:
    - ./backend/data:/app/data
    - ./uploads:/app/uploads
    - ./backend/app:/app/app  # ← New
```
### Fix #2: Non-interactive mode for bench.sh
**File**: `scripts/bench.sh`
#### Change 1: DEBUG_PAYLOAD disabled by default (line 37)
```bash
# Before
DEBUG_PAYLOAD="${DEBUG_PAYLOAD:-1}"  # Default: 1 (enabled)
# After
DEBUG_PAYLOAD="${DEBUG_PAYLOAD:-0}"  # Default: 0 (disabled)
```
#### Change 2: Interactive-mode detection (lines 1493-1499)
```bash
# Ask for confirmation only when an interactive terminal is attached
if [[ -t 0 ]]; then
    read -p "Appuyez sur Entrée pour continuer l'envoi ou Ctrl+C pour annuler..."
else
    log_warn "Mode non-interactif détecté - envoi automatique dans 2 secondes..."
    sleep 2
fi
```
## Final Generated Command
The command shown in the dashboard is now:
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- \
--server http://10.0.1.97:8007 \
--token "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a" \
--iperf-server 10.0.1.97
```
## Modified Files
| File | Lines | Change |
|---------|--------|------------|
| `backend/app/main.py` | 97-104 | ✅ Added `/api/config` endpoint |
| `frontend/js/dashboard.js` | 9-25, 229-235, 305-320 | ✅ Dynamic token loading |
| `frontend/js/settings.js` | 6-29, 149-167 | ✅ Dynamic token loading |
| `docker-compose.yml` | 12 | ✅ Backend app volume |
| `scripts/bench.sh` | 37, 1493-1499 | ✅ DEBUG_PAYLOAD=0 + TTY detection |
## Created Documentation
- `COMMAND_CURL_FIX.md` - Fix for the missing curl command
- `FIX_DEBUG_PAYLOAD.md` - Fix for the non-interactive hang
- `SESSION_2025-12-18.md` - This document (session summary)
## Validation Tests
### Test 1: Config API
```bash
curl http://10.0.1.97:8007/api/config
# ✅ Returns the token and iperf_server
```
### Test 2: Frontend Dashboard
1. Open http://10.0.1.97:8087
2. Go to the "⚡ Quick Bench Script" section
3. ✅ The full command is shown with the real token
4. ✅ The "Copier" (copy) button works
### Test 3: Settings Page
1. Open http://10.0.1.97:8087/settings.html
2. Go to the "📋 Commande Générée" section
3. ✅ The command is shown with the real token
4. ✅ The token is visible in the "🔑 Informations API" section
### Test 4: Running the script via curl
```bash
curl -fsSL http://10.0.1.97:8087/scripts/bench.sh | sudo bash -s -- \
--server http://10.0.1.97:8007 \
--token "29855796dacf5cfe75ff9b02d6adf3dd0f9c52db5b53e7abfb4c0df7ece1be0a" \
--iperf-server 10.0.1.97
```
✅ The script runs end to end without hanging
✅ The payload is sent to the server
✅ The benchmark shows up in the dashboard
## Final Status
| Component | Status | Notes |
|-----------|--------|-----------|
| Backend API | ✅ Working | `/api/config` endpoint operational |
| Frontend Dashboard | ✅ Working | Full curl command displayed |
| Frontend Settings | ✅ Working | Token and command displayed |
| bench.sh script | ✅ Working | Non-interactive mode OK |
| Docker Compose | ✅ Working | Dev volumes added |
## Recommended Next Steps
### Phase 2 - UX Improvements (Roadmap)
1. Advanced column sorting in the dashboard
2. Filters by tags/machine type
3. Icons for machine types and OS
4. Improved pagination
### Phase 3 - History Charts
1. Chart.js integration
2. Score evolution charts
3. Benchmark comparison
### Phase 4 - Regression Detection
1. Per-device baseline calculation
2. Alerts on performance regressions
3. Webhooks for notifications
## Technical Summary
**Initial Problem**: the curl command from the dashboard was unusable because the token was never displayed.
**Solution**:
- Backend: new REST endpoint exposing the configuration
- Frontend: asynchronous token loading at startup
- Script: non-interactive mode detection to avoid hangs
**Impact**:
- ✅ Complete end-to-end workflow working
- ✅ No manual edits required
- ✅ Improved user experience
---
**Session date**: 2025-12-18
**Duration**: ~2 hours
**Bugs fixed**: 2
**Files modified**: 5
**Documentation**: 3 files
**Lines of code**: ~150
**Status**: ✅ **COMPLETE SUCCESS**

0
SESSION_COMPLETE_2025-12-14.md Normal file → Executable file
View File

0
SMART_GUIDE.md Normal file → Executable file
View File

0
STATUS_FINAL.txt Normal file → Executable file
View File

0
STRUCTURE.md Normal file → Executable file
View File

0
TESTING.md Normal file → Executable file
View File

0
TEST_BENCH.md Normal file → Executable file
View File

0
TEST_FRONTEND_RESTRUCTURE.md Normal file → Executable file
View File

0
TEST_RAPIDE.md Normal file → Executable file
View File

0
USAGE_DEBUG.md Normal file → Executable file
View File

0
VERIFICATION_FINALE_BENCHMARK.md Normal file → Executable file
View File

0
analyse_chatgpt.md Normal file → Executable file
View File

0
backend/Dockerfile Normal file → Executable file
View File

0
backend/README.md Normal file → Executable file
View File

View File

@@ -9,11 +9,24 @@ from datetime import datetime
from app.db.session import get_db from app.db.session import get_db
from app.core.security import verify_token from app.core.security import verify_token
from app.schemas.benchmark import BenchmarkPayload, BenchmarkResponse, BenchmarkDetail, BenchmarkSummary from app.schemas.benchmark import (
BenchmarkPayload,
BenchmarkResponse,
BenchmarkDetail,
BenchmarkSummary,
BenchmarkUpdate,
)
from app.models.device import Device from app.models.device import Device
from app.models.hardware_snapshot import HardwareSnapshot from app.models.hardware_snapshot import HardwareSnapshot
from app.models.benchmark import Benchmark from app.models.benchmark import Benchmark
from app.utils.scoring import calculate_global_score from app.utils.scoring import (
calculate_global_score,
calculate_cpu_score,
calculate_memory_score,
calculate_disk_score,
calculate_network_score,
calculate_gpu_score
)
router = APIRouter() router = APIRouter()
@@ -91,7 +104,7 @@ async def submit_benchmark(
snapshot.ram_slots_total = hw.ram.slots_total if hw.ram else None snapshot.ram_slots_total = hw.ram.slots_total if hw.ram else None
snapshot.ram_slots_used = hw.ram.slots_used if hw.ram else None snapshot.ram_slots_used = hw.ram.slots_used if hw.ram else None
snapshot.ram_ecc = hw.ram.ecc if hw.ram else None snapshot.ram_ecc = hw.ram.ecc if hw.ram else None
snapshot.ram_layout_json = json.dumps([slot.dict() for slot in hw.ram.layout]) if hw.ram and hw.ram.layout else None snapshot.ram_layout_json = json.dumps([slot.model_dump() for slot in hw.ram.layout]) if hw.ram and hw.ram.layout else None
# GPU # GPU
snapshot.gpu_summary = f"{hw.gpu.vendor} {hw.gpu.model}" if hw.gpu and hw.gpu.model else None snapshot.gpu_summary = f"{hw.gpu.vendor} {hw.gpu.model}" if hw.gpu and hw.gpu.model else None
@@ -104,11 +117,12 @@ async def submit_benchmark(
# Storage # Storage
snapshot.storage_summary = f"{len(hw.storage.devices)} device(s)" if hw.storage and hw.storage.devices else None snapshot.storage_summary = f"{len(hw.storage.devices)} device(s)" if hw.storage and hw.storage.devices else None
snapshot.storage_devices_json = json.dumps([d.dict() for d in hw.storage.devices]) if hw.storage and hw.storage.devices else None snapshot.storage_devices_json = json.dumps([d.model_dump() for d in hw.storage.devices]) if hw.storage and hw.storage.devices else None
snapshot.partitions_json = json.dumps([p.dict() for p in hw.storage.partitions]) if hw.storage and hw.storage.partitions else None snapshot.partitions_json = json.dumps([p.model_dump() for p in hw.storage.partitions]) if hw.storage and hw.storage.partitions else None
# Network # Network
snapshot.network_interfaces_json = json.dumps([i.dict() for i in hw.network.interfaces]) if hw.network and hw.network.interfaces else None snapshot.network_interfaces_json = json.dumps([i.model_dump() for i in hw.network.interfaces]) if hw.network and hw.network.interfaces else None
snapshot.network_shares_json = json.dumps([share.model_dump() for share in hw.network_shares]) if hw.network_shares else None
# OS / Motherboard # OS / Motherboard
snapshot.os_name = hw.os.name if hw.os else None snapshot.os_name = hw.os.name if hw.os else None
@@ -116,15 +130,29 @@ async def submit_benchmark(
snapshot.kernel_version = hw.os.kernel_version if hw.os else None snapshot.kernel_version = hw.os.kernel_version if hw.os else None
snapshot.architecture = hw.os.architecture if hw.os else None snapshot.architecture = hw.os.architecture if hw.os else None
snapshot.virtualization_type = hw.os.virtualization_type if hw.os else None snapshot.virtualization_type = hw.os.virtualization_type if hw.os else None
snapshot.screen_resolution = hw.os.screen_resolution if hw.os else None
snapshot.display_server = hw.os.display_server if hw.os else None
snapshot.session_type = hw.os.session_type if hw.os else None
snapshot.last_boot_time = hw.os.last_boot_time if hw.os else None
snapshot.uptime_seconds = hw.os.uptime_seconds if hw.os else None
snapshot.battery_percentage = hw.os.battery_percentage if hw.os else None
snapshot.battery_status = hw.os.battery_status if hw.os else None
snapshot.battery_health = hw.os.battery_health if hw.os else None
snapshot.hostname = hw.os.hostname if hw.os else None
snapshot.desktop_environment = hw.os.desktop_environment if hw.os else None
snapshot.motherboard_vendor = hw.motherboard.vendor if hw.motherboard else None snapshot.motherboard_vendor = hw.motherboard.vendor if hw.motherboard else None
snapshot.motherboard_model = hw.motherboard.model if hw.motherboard else None snapshot.motherboard_model = hw.motherboard.model if hw.motherboard else None
snapshot.bios_vendor = hw.motherboard.bios_vendor if hw.motherboard and hasattr(hw.motherboard, 'bios_vendor') else None snapshot.bios_vendor = hw.motherboard.bios_vendor if hw.motherboard and hasattr(hw.motherboard, 'bios_vendor') else None
snapshot.bios_version = hw.motherboard.bios_version if hw.motherboard else None snapshot.bios_version = hw.motherboard.bios_version if hw.motherboard else None
snapshot.bios_date = hw.motherboard.bios_date if hw.motherboard else None snapshot.bios_date = hw.motherboard.bios_date if hw.motherboard else None
# PCI and USB Devices
snapshot.pci_devices_json = json.dumps([d.model_dump(by_alias=True) for d in hw.pci_devices]) if hw.pci_devices else None
snapshot.usb_devices_json = json.dumps([d.model_dump() for d in hw.usb_devices]) if hw.usb_devices else None
# Misc # Misc
snapshot.sensors_json = json.dumps(hw.sensors.dict()) if hw.sensors else None snapshot.sensors_json = json.dumps(hw.sensors.model_dump()) if hw.sensors else None
snapshot.raw_info_json = json.dumps(hw.raw_info.dict()) if hw.raw_info else None snapshot.raw_info_json = json.dumps(hw.raw_info.model_dump()) if hw.raw_info else None
# Add to session only if it's a new snapshot # Add to session only if it's a new snapshot
if not existing_snapshot: if not existing_snapshot:
@@ -135,18 +163,61 @@ async def submit_benchmark(
# 3. Create benchmark # 3. Create benchmark
results = payload.results results = payload.results
# Calculate global score if not provided or recalculate # Recalculate scores from raw metrics using new formulas
global_score = calculate_global_score( cpu_score = None
cpu_score=results.cpu.score if results.cpu else None, cpu_score_single = None
memory_score=results.memory.score if results.memory else None, cpu_score_multi = None
disk_score=results.disk.score if results.disk else None,
network_score=results.network.score if results.network else None,
gpu_score=results.gpu.score if results.gpu else None
)
# Use provided global_score if available and valid if results.cpu:
if results.global_score is not None: # Use scores from script if available (preferred), otherwise calculate
global_score = results.global_score if results.cpu.score_single is not None:
cpu_score_single = results.cpu.score_single
elif results.cpu.events_per_sec_single:
cpu_score_single = calculate_cpu_score(results.cpu.events_per_sec_single)
if results.cpu.score_multi is not None:
cpu_score_multi = results.cpu.score_multi
elif results.cpu.events_per_sec_multi:
cpu_score_multi = calculate_cpu_score(results.cpu.events_per_sec_multi)
# Use score from script if available, otherwise calculate
if results.cpu.score is not None:
cpu_score = results.cpu.score
elif results.cpu.events_per_sec_multi:
cpu_score = cpu_score_multi
elif results.cpu.events_per_sec:
cpu_score = calculate_cpu_score(results.cpu.events_per_sec)
memory_score = None
if results.memory and results.memory.throughput_mib_s:
memory_score = calculate_memory_score(results.memory.throughput_mib_s)
disk_score = None
if results.disk:
disk_score = calculate_disk_score(
read_mb_s=results.disk.read_mb_s,
write_mb_s=results.disk.write_mb_s
)
network_score = None
if results.network:
network_score = calculate_network_score(
upload_mbps=results.network.upload_mbps,
download_mbps=results.network.download_mbps
)
gpu_score = None
if results.gpu and results.gpu.glmark2_score:
gpu_score = calculate_gpu_score(results.gpu.glmark2_score)
# Calculate global score from recalculated component scores
global_score = calculate_global_score(
cpu_score=cpu_score,
memory_score=memory_score,
disk_score=disk_score,
network_score=network_score,
gpu_score=gpu_score
)
# Extract network results for easier frontend access # Extract network results for easier frontend access
network_results = None network_results = None
@@ -155,7 +226,7 @@ async def submit_benchmark(
"upload_mbps": results.network.upload_mbps if hasattr(results.network, 'upload_mbps') else None, "upload_mbps": results.network.upload_mbps if hasattr(results.network, 'upload_mbps') else None,
"download_mbps": results.network.download_mbps if hasattr(results.network, 'download_mbps') else None, "download_mbps": results.network.download_mbps if hasattr(results.network, 'download_mbps') else None,
"ping_ms": results.network.ping_ms if hasattr(results.network, 'ping_ms') else None, "ping_ms": results.network.ping_ms if hasattr(results.network, 'ping_ms') else None,
"score": results.network.score "score": network_score
} }
benchmark = Benchmark( benchmark = Benchmark(
@@ -165,11 +236,13 @@ async def submit_benchmark(
bench_script_version=payload.bench_script_version, bench_script_version=payload.bench_script_version,
global_score=global_score, global_score=global_score,
cpu_score=results.cpu.score if results.cpu else None, cpu_score=cpu_score,
memory_score=results.memory.score if results.memory else None, cpu_score_single=cpu_score_single,
disk_score=results.disk.score if results.disk else None, cpu_score_multi=cpu_score_multi,
network_score=results.network.score if results.network else None, memory_score=memory_score,
gpu_score=results.gpu.score if results.gpu else None, disk_score=disk_score,
network_score=network_score,
gpu_score=gpu_score,
details_json=json.dumps(results.dict()), details_json=json.dumps(results.dict()),
network_results_json=json.dumps(network_results) if network_results else None network_results_json=json.dumps(network_results) if network_results else None
@@ -210,9 +283,54 @@ async def get_benchmark(
bench_script_version=benchmark.bench_script_version, bench_script_version=benchmark.bench_script_version,
global_score=benchmark.global_score, global_score=benchmark.global_score,
cpu_score=benchmark.cpu_score, cpu_score=benchmark.cpu_score,
cpu_score_single=benchmark.cpu_score_single,
cpu_score_multi=benchmark.cpu_score_multi,
memory_score=benchmark.memory_score, memory_score=benchmark.memory_score,
disk_score=benchmark.disk_score, disk_score=benchmark.disk_score,
network_score=benchmark.network_score, network_score=benchmark.network_score,
gpu_score=benchmark.gpu_score, gpu_score=benchmark.gpu_score,
details=json.loads(benchmark.details_json) details=json.loads(benchmark.details_json),
notes=benchmark.notes
)
@router.patch("/benchmarks/{benchmark_id}", response_model=BenchmarkSummary)
async def update_benchmark_entry(
benchmark_id: int,
payload: BenchmarkUpdate,
db: Session = Depends(get_db)
):
"""
Update editable benchmark fields (currently only notes).
"""
benchmark = db.query(Benchmark).filter(Benchmark.id == benchmark_id).first()
if not benchmark:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Benchmark {benchmark_id} not found"
)
update_data = payload.model_dump(exclude_unset=True)
if "notes" in update_data:
benchmark.notes = update_data["notes"]
db.add(benchmark)
db.commit()
db.refresh(benchmark)
return BenchmarkSummary(
id=benchmark.id,
run_at=benchmark.run_at.isoformat(),
global_score=benchmark.global_score,
cpu_score=benchmark.cpu_score,
cpu_score_single=benchmark.cpu_score_single,
cpu_score_multi=benchmark.cpu_score_multi,
memory_score=benchmark.memory_score,
disk_score=benchmark.disk_score,
network_score=benchmark.network_score,
gpu_score=benchmark.gpu_score,
bench_script_version=benchmark.bench_script_version,
notes=benchmark.notes
) )
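For reference, the new notes endpoint can then be exercised with a request like this (assuming the router is mounted under the same `/api` prefix used by `/api/config`; the id and note text are examples):
```bash
curl -X PATCH http://10.0.1.97:8007/api/benchmarks/1 \
    -H "Content-Type: application/json" \
    -d '{"notes": "Baseline run after RAM upgrade"}'
```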

View File

@@ -3,7 +3,7 @@ Linux BenchTools - Devices API
""" """
import json import json
from fastapi import APIRouter, Depends, HTTPException, status, Query from fastapi import APIRouter, Depends, HTTPException, status, Query, Response
from sqlalchemy.orm import Session from sqlalchemy.orm import Session
from typing import List from typing import List
@@ -68,7 +68,8 @@ async def get_devices(
disk_score=last_bench.disk_score, disk_score=last_bench.disk_score,
network_score=last_bench.network_score, network_score=last_bench.network_score,
gpu_score=last_bench.gpu_score, gpu_score=last_bench.gpu_score,
bench_script_version=last_bench.bench_script_version bench_script_version=last_bench.bench_script_version,
notes=last_bench.notes
) )
items.append(DeviceSummary( items.append(DeviceSummary(
@@ -80,6 +81,9 @@ async def get_devices(
location=device.location, location=device.location,
owner=device.owner, owner=device.owner,
tags=device.tags, tags=device.tags,
purchase_store=device.purchase_store,
purchase_date=device.purchase_date,
purchase_price=device.purchase_price,
created_at=device.created_at.isoformat(), created_at=device.created_at.isoformat(),
updated_at=device.updated_at.isoformat(), updated_at=device.updated_at.isoformat(),
last_benchmark=last_bench_summary last_benchmark=last_bench_summary
@@ -125,7 +129,8 @@ async def get_device(
disk_score=last_bench.disk_score, disk_score=last_bench.disk_score,
network_score=last_bench.network_score, network_score=last_bench.network_score,
gpu_score=last_bench.gpu_score, gpu_score=last_bench.gpu_score,
bench_script_version=last_bench.bench_script_version bench_script_version=last_bench.bench_script_version,
notes=last_bench.notes
) )
# Get last hardware snapshot # Get last hardware snapshot
@@ -146,20 +151,40 @@ async def get_device(
cpu_base_freq_ghz=last_snapshot.cpu_base_freq_ghz, cpu_base_freq_ghz=last_snapshot.cpu_base_freq_ghz,
cpu_max_freq_ghz=last_snapshot.cpu_max_freq_ghz, cpu_max_freq_ghz=last_snapshot.cpu_max_freq_ghz,
ram_total_mb=last_snapshot.ram_total_mb, ram_total_mb=last_snapshot.ram_total_mb,
ram_used_mb=last_snapshot.ram_used_mb,
ram_free_mb=last_snapshot.ram_free_mb,
ram_shared_mb=last_snapshot.ram_shared_mb,
ram_slots_total=last_snapshot.ram_slots_total, ram_slots_total=last_snapshot.ram_slots_total,
ram_slots_used=last_snapshot.ram_slots_used, ram_slots_used=last_snapshot.ram_slots_used,
gpu_summary=last_snapshot.gpu_summary, gpu_summary=last_snapshot.gpu_summary,
gpu_model=last_snapshot.gpu_model, gpu_model=last_snapshot.gpu_model,
storage_summary=last_snapshot.storage_summary, storage_summary=last_snapshot.storage_summary,
storage_devices_json=last_snapshot.storage_devices_json, storage_devices_json=last_snapshot.storage_devices_json,
partitions_json=last_snapshot.partitions_json,
network_interfaces_json=last_snapshot.network_interfaces_json, network_interfaces_json=last_snapshot.network_interfaces_json,
network_shares_json=last_snapshot.network_shares_json,
os_name=last_snapshot.os_name, os_name=last_snapshot.os_name,
os_version=last_snapshot.os_version, os_version=last_snapshot.os_version,
kernel_version=last_snapshot.kernel_version, kernel_version=last_snapshot.kernel_version,
architecture=last_snapshot.architecture, architecture=last_snapshot.architecture,
virtualization_type=last_snapshot.virtualization_type, virtualization_type=last_snapshot.virtualization_type,
screen_resolution=last_snapshot.screen_resolution,
display_server=last_snapshot.display_server,
session_type=last_snapshot.session_type,
last_boot_time=last_snapshot.last_boot_time,
uptime_seconds=last_snapshot.uptime_seconds,
battery_percentage=last_snapshot.battery_percentage,
battery_status=last_snapshot.battery_status,
battery_health=last_snapshot.battery_health,
hostname=last_snapshot.hostname,
desktop_environment=last_snapshot.desktop_environment,
motherboard_vendor=last_snapshot.motherboard_vendor, motherboard_vendor=last_snapshot.motherboard_vendor,
motherboard_model=last_snapshot.motherboard_model motherboard_model=last_snapshot.motherboard_model,
bios_vendor=last_snapshot.bios_vendor,
bios_version=last_snapshot.bios_version,
bios_date=last_snapshot.bios_date,
pci_devices_json=last_snapshot.pci_devices_json,
usb_devices_json=last_snapshot.usb_devices_json
) )
# Get documents for this device # Get documents for this device
@@ -189,6 +214,9 @@ async def get_device(
location=device.location, location=device.location,
owner=device.owner, owner=device.owner,
tags=device.tags, tags=device.tags,
purchase_store=device.purchase_store,
purchase_date=device.purchase_date,
purchase_price=device.purchase_price,
created_at=device.created_at.isoformat(), created_at=device.created_at.isoformat(),
updated_at=device.updated_at.isoformat(), updated_at=device.updated_at.isoformat(),
last_benchmark=last_bench_summary, last_benchmark=last_bench_summary,
@@ -232,7 +260,8 @@ async def get_device_benchmarks(
disk_score=b.disk_score, disk_score=b.disk_score,
network_score=b.network_score, network_score=b.network_score,
gpu_score=b.gpu_score, gpu_score=b.gpu_score,
bench_script_version=b.bench_script_version bench_script_version=b.bench_script_version,
notes=b.notes
) )
for b in benchmarks for b in benchmarks
] ]
@@ -276,3 +305,25 @@ async def update_device(
# Return updated device (reuse get_device logic) # Return updated device (reuse get_device logic)
return await get_device(device_id, db) return await get_device(device_id, db)
@router.delete("/devices/{device_id}", status_code=status.HTTP_204_NO_CONTENT)
async def delete_device(
device_id: int,
db: Session = Depends(get_db)
):
"""
Delete a device and all related data
"""
device = db.query(Device).filter(Device.id == device_id).first()
if not device:
raise HTTPException(
status_code=status.HTTP_404_NOT_FOUND,
detail=f"Device {device_id} not found"
)
db.delete(device)
db.commit()
return Response(status_code=status.HTTP_204_NO_CONTENT)
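A matching call for the new delete endpoint (same `/api` prefix assumption; the device id is an example) should return 204 No Content:
```bash
curl -i -X DELETE http://10.0.1.97:8007/api/devices/42
```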

View File

@@ -94,6 +94,16 @@ async def get_stats(db: Session = Depends(get_db)):
} }
# Config endpoint (for frontend to get API token and server info)
@app.get(f"{settings.API_PREFIX}/config")
async def get_config():
"""Get frontend configuration (API token, server URLs, etc.)"""
return {
"api_token": settings.API_TOKEN,
"iperf_server": "10.0.1.97"
}
if __name__ == "__main__": if __name__ == "__main__":
import uvicorn import uvicorn
uvicorn.run("app.main:app", host="0.0.0.0", port=8007, reload=True) uvicorn.run("app.main:app", host="0.0.0.0", port=8007, reload=True)

View File

@@ -23,6 +23,8 @@ class Benchmark(Base):
# Scores # Scores
global_score = Column(Float, nullable=False) global_score = Column(Float, nullable=False)
cpu_score = Column(Float, nullable=True) cpu_score = Column(Float, nullable=True)
cpu_score_single = Column(Float, nullable=True) # Monocore CPU score
cpu_score_multi = Column(Float, nullable=True) # Multicore CPU score
memory_score = Column(Float, nullable=True) memory_score = Column(Float, nullable=True)
disk_score = Column(Float, nullable=True) disk_score = Column(Float, nullable=True)
network_score = Column(Float, nullable=True) network_score = Column(Float, nullable=True)

View File

@@ -2,7 +2,7 @@
Linux BenchTools - Device Model Linux BenchTools - Device Model
""" """
from sqlalchemy import Column, Integer, String, DateTime, Text from sqlalchemy import Column, Integer, String, DateTime, Text, Float
from sqlalchemy.orm import relationship from sqlalchemy.orm import relationship
from datetime import datetime from datetime import datetime
from app.db.base import Base from app.db.base import Base
@@ -22,6 +22,9 @@ class Device(Base):
location = Column(String(255), nullable=True) location = Column(String(255), nullable=True)
owner = Column(String(100), nullable=True) owner = Column(String(100), nullable=True)
tags = Column(Text, nullable=True) # JSON or comma-separated tags = Column(Text, nullable=True) # JSON or comma-separated
purchase_store = Column(String(255), nullable=True)
purchase_date = Column(String(50), nullable=True)
purchase_price = Column(Float, nullable=True)
created_at = Column(DateTime, nullable=False, default=datetime.utcnow) created_at = Column(DateTime, nullable=False, default=datetime.utcnow)
updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow) updated_at = Column(DateTime, nullable=False, default=datetime.utcnow, onupdate=datetime.utcnow)

View File

@@ -58,6 +58,7 @@ class HardwareSnapshot(Base):
# Network # Network
network_interfaces_json = Column(Text, nullable=True) # JSON array network_interfaces_json = Column(Text, nullable=True) # JSON array
network_shares_json = Column(Text, nullable=True) # JSON array
# OS / Motherboard # OS / Motherboard
os_name = Column(String(100), nullable=True) os_name = Column(String(100), nullable=True)
@@ -65,11 +66,25 @@ class HardwareSnapshot(Base):
kernel_version = Column(String(100), nullable=True) kernel_version = Column(String(100), nullable=True)
architecture = Column(String(50), nullable=True) architecture = Column(String(50), nullable=True)
virtualization_type = Column(String(50), nullable=True) virtualization_type = Column(String(50), nullable=True)
screen_resolution = Column(String(50), nullable=True)
display_server = Column(String(50), nullable=True)
session_type = Column(String(50), nullable=True)
last_boot_time = Column(String(50), nullable=True)
uptime_seconds = Column(Integer, nullable=True)
battery_percentage = Column(Float, nullable=True)
battery_status = Column(String(50), nullable=True)
battery_health = Column(String(50), nullable=True)
motherboard_vendor = Column(String(100), nullable=True) motherboard_vendor = Column(String(100), nullable=True)
motherboard_model = Column(String(255), nullable=True) motherboard_model = Column(String(255), nullable=True)
bios_vendor = Column(String(100), nullable=True) bios_vendor = Column(String(100), nullable=True)
bios_version = Column(String(100), nullable=True) bios_version = Column(String(100), nullable=True)
bios_date = Column(String(50), nullable=True) bios_date = Column(String(50), nullable=True)
hostname = Column(String(255), nullable=True)
desktop_environment = Column(String(100), nullable=True)
# PCI and USB Devices
pci_devices_json = Column(Text, nullable=True) # JSON array
usb_devices_json = Column(Text, nullable=True) # JSON array
# Misc # Misc
sensors_json = Column(Text, nullable=True) # JSON object sensors_json = Column(Text, nullable=True) # JSON object

View File

@@ -10,8 +10,12 @@ from app.schemas.hardware import HardwareData
class CPUResults(BaseModel): class CPUResults(BaseModel):
"""CPU benchmark results""" """CPU benchmark results"""
events_per_sec: Optional[float] = Field(None, ge=0) events_per_sec: Optional[float] = Field(None, ge=0)
events_per_sec_single: Optional[float] = Field(None, ge=0) # Monocore
events_per_sec_multi: Optional[float] = Field(None, ge=0) # Multicore
duration_s: Optional[float] = Field(None, ge=0) duration_s: Optional[float] = Field(None, ge=0)
score: Optional[float] = Field(None, ge=0, le=10000) score: Optional[float] = Field(None, ge=0, le=10000)
score_single: Optional[float] = Field(None, ge=0, le=10000) # Monocore score
score_multi: Optional[float] = Field(None, ge=0, le=10000) # Multicore score
class MemoryResults(BaseModel): class MemoryResults(BaseModel):
@@ -82,12 +86,15 @@ class BenchmarkDetail(BaseModel):
global_score: float global_score: float
cpu_score: Optional[float] = None cpu_score: Optional[float] = None
cpu_score_single: Optional[float] = None
cpu_score_multi: Optional[float] = None
memory_score: Optional[float] = None memory_score: Optional[float] = None
disk_score: Optional[float] = None disk_score: Optional[float] = None
network_score: Optional[float] = None network_score: Optional[float] = None
gpu_score: Optional[float] = None gpu_score: Optional[float] = None
details: dict # details_json parsed details: dict # details_json parsed
notes: Optional[str] = None
class Config: class Config:
from_attributes = True from_attributes = True
@@ -99,11 +106,19 @@ class BenchmarkSummary(BaseModel):
run_at: str run_at: str
global_score: float global_score: float
cpu_score: Optional[float] = None cpu_score: Optional[float] = None
cpu_score_single: Optional[float] = None
cpu_score_multi: Optional[float] = None
memory_score: Optional[float] = None memory_score: Optional[float] = None
disk_score: Optional[float] = None disk_score: Optional[float] = None
network_score: Optional[float] = None network_score: Optional[float] = None
gpu_score: Optional[float] = None gpu_score: Optional[float] = None
bench_script_version: Optional[str] = None bench_script_version: Optional[str] = None
notes: Optional[str] = None
class Config: class Config:
from_attributes = True from_attributes = True
class BenchmarkUpdate(BaseModel):
"""Fields allowed when updating a benchmark"""
notes: Optional[str] = None

View File

@@ -18,6 +18,9 @@ class DeviceBase(BaseModel):
location: Optional[str] = None location: Optional[str] = None
owner: Optional[str] = None owner: Optional[str] = None
tags: Optional[str] = None tags: Optional[str] = None
purchase_store: Optional[str] = None
purchase_date: Optional[str] = None
purchase_price: Optional[float] = None
class DeviceCreate(DeviceBase): class DeviceCreate(DeviceBase):
@@ -34,6 +37,9 @@ class DeviceUpdate(BaseModel):
location: Optional[str] = None location: Optional[str] = None
owner: Optional[str] = None owner: Optional[str] = None
tags: Optional[str] = None tags: Optional[str] = None
purchase_store: Optional[str] = None
purchase_date: Optional[str] = None
purchase_price: Optional[float] = None
class DeviceSummary(DeviceBase): class DeviceSummary(DeviceBase):

View File

@@ -2,7 +2,7 @@
Linux BenchTools - Hardware Schemas Linux BenchTools - Hardware Schemas
""" """
from pydantic import BaseModel from pydantic import BaseModel, ConfigDict, Field
from typing import Optional, List from typing import Optional, List
@@ -73,6 +73,7 @@ class Partition(BaseModel):
fs_type: Optional[str] = None fs_type: Optional[str] = None
used_gb: Optional[float] = None used_gb: Optional[float] = None
total_gb: Optional[float] = None total_gb: Optional[float] = None
free_gb: Optional[float] = None
class StorageInfo(BaseModel): class StorageInfo(BaseModel):
@@ -89,6 +90,7 @@ class NetworkInterface(BaseModel):
ip: Optional[str] = None ip: Optional[str] = None
speed_mbps: Optional[int] = None speed_mbps: Optional[int] = None
driver: Optional[str] = None driver: Optional[str] = None
ssid: Optional[str] = None
wake_on_lan: Optional[bool] = None wake_on_lan: Optional[bool] = None
@@ -97,6 +99,18 @@ class NetworkInfo(BaseModel):
interfaces: Optional[List[NetworkInterface]] = None interfaces: Optional[List[NetworkInterface]] = None
class NetworkShare(BaseModel):
"""Mounted network share information"""
protocol: Optional[str] = None
source: Optional[str] = None
mount_point: Optional[str] = None
fs_type: Optional[str] = None
options: Optional[str] = None
total_gb: Optional[float] = None
used_gb: Optional[float] = None
free_gb: Optional[float] = None
class MotherboardInfo(BaseModel): class MotherboardInfo(BaseModel):
"""Motherboard information schema""" """Motherboard information schema"""
vendor: Optional[str] = None vendor: Optional[str] = None
@@ -113,6 +127,34 @@ class OSInfo(BaseModel):
kernel_version: Optional[str] = None kernel_version: Optional[str] = None
architecture: Optional[str] = None architecture: Optional[str] = None
virtualization_type: Optional[str] = None virtualization_type: Optional[str] = None
hostname: Optional[str] = None
desktop_environment: Optional[str] = None
session_type: Optional[str] = None
display_server: Optional[str] = None
screen_resolution: Optional[str] = None
last_boot_time: Optional[str] = None
uptime_seconds: Optional[int] = None
battery_percentage: Optional[float] = None
battery_status: Optional[str] = None
battery_health: Optional[str] = None
class PCIDevice(BaseModel):
"""PCI device information"""
model_config = ConfigDict(populate_by_name=True)
slot: str
class_: Optional[str] = Field(default=None, alias="class")
vendor: Optional[str] = None
device: Optional[str] = None
class USBDevice(BaseModel):
"""USB device information"""
bus: str
device: str
vendor_id: Optional[str] = None
product_id: Optional[str] = None
name: Optional[str] = None
class SensorsInfo(BaseModel): class SensorsInfo(BaseModel):
@@ -135,10 +177,13 @@ class HardwareData(BaseModel):
gpu: Optional[GPUInfo] = None gpu: Optional[GPUInfo] = None
storage: Optional[StorageInfo] = None storage: Optional[StorageInfo] = None
network: Optional[NetworkInfo] = None network: Optional[NetworkInfo] = None
network_shares: Optional[List[NetworkShare]] = None
motherboard: Optional[MotherboardInfo] = None motherboard: Optional[MotherboardInfo] = None
os: Optional[OSInfo] = None os: Optional[OSInfo] = None
sensors: Optional[SensorsInfo] = None sensors: Optional[SensorsInfo] = None
raw_info: Optional[RawInfo] = None raw_info: Optional[RawInfo] = None
pci_devices: Optional[List[PCIDevice]] = None
usb_devices: Optional[List[USBDevice]] = None
class HardwareSnapshotResponse(BaseModel): class HardwareSnapshotResponse(BaseModel):
@@ -157,6 +202,9 @@ class HardwareSnapshotResponse(BaseModel):
# RAM # RAM
ram_total_mb: Optional[int] = None ram_total_mb: Optional[int] = None
ram_used_mb: Optional[int] = None
ram_free_mb: Optional[int] = None
ram_shared_mb: Optional[int] = None
ram_slots_total: Optional[int] = None ram_slots_total: Optional[int] = None
ram_slots_used: Optional[int] = None ram_slots_used: Optional[int] = None
@@ -167,18 +215,37 @@ class HardwareSnapshotResponse(BaseModel):
# Storage # Storage
storage_summary: Optional[str] = None storage_summary: Optional[str] = None
storage_devices_json: Optional[str] = None storage_devices_json: Optional[str] = None
partitions_json: Optional[str] = None
# Network # Network
network_interfaces_json: Optional[str] = None network_interfaces_json: Optional[str] = None
network_shares_json: Optional[str] = None
# OS / Motherboard # OS / Motherboard / BIOS
os_name: Optional[str] = None os_name: Optional[str] = None
os_version: Optional[str] = None os_version: Optional[str] = None
kernel_version: Optional[str] = None kernel_version: Optional[str] = None
architecture: Optional[str] = None architecture: Optional[str] = None
virtualization_type: Optional[str] = None virtualization_type: Optional[str] = None
hostname: Optional[str] = None
desktop_environment: Optional[str] = None
screen_resolution: Optional[str] = None
display_server: Optional[str] = None
session_type: Optional[str] = None
last_boot_time: Optional[str] = None
uptime_seconds: Optional[int] = None
battery_percentage: Optional[float] = None
battery_status: Optional[str] = None
battery_health: Optional[str] = None
motherboard_vendor: Optional[str] = None motherboard_vendor: Optional[str] = None
motherboard_model: Optional[str] = None motherboard_model: Optional[str] = None
bios_vendor: Optional[str] = None
bios_version: Optional[str] = None
bios_date: Optional[str] = None
# PCI and USB Devices
pci_devices_json: Optional[str] = None
usb_devices_json: Optional[str] = None
class Config: class Config:
from_attributes = True from_attributes = True

View File

@@ -1,10 +1,103 @@
""" """
Linux BenchTools - Scoring Utilities Linux BenchTools - Scoring Utilities
New normalized scoring formulas (0-100 scale):
- CPU: events_per_second / 100
- Memory: throughput_mib_s / 1000
- Disk: (read_mb_s + write_mb_s) / 20
- Network: (upload_mbps + download_mbps) / 20
- GPU: glmark2_score / 50
""" """
from app.core.config import settings from app.core.config import settings
def calculate_cpu_score(events_per_second: float = None) -> float:
"""
Calculate CPU score from sysbench events per second.
Formula: events_per_second / 100
Range: 0-100 (capped)
Example: 3409.87 events/s → 34.1 score
"""
if events_per_second is None or events_per_second <= 0:
return 0.0
score = events_per_second / 100.0
return min(100.0, max(0.0, score))
def calculate_memory_score(throughput_mib_s: float = None) -> float:
"""
Calculate Memory score from sysbench throughput.
Formula: throughput_mib_s / 1000
Range: 0-100 (capped)
Example: 13806.03 MiB/s → 13.8 score
"""
if throughput_mib_s is None or throughput_mib_s <= 0:
return 0.0
score = throughput_mib_s / 1000.0
return min(100.0, max(0.0, score))
def calculate_disk_score(read_mb_s: float = None, write_mb_s: float = None) -> float:
"""
Calculate Disk score from fio read/write bandwidth.
Formula: (read_mb_s + write_mb_s) / 20
Range: 0-100 (capped)
Example: (695 + 695) MB/s → 69.5 score
"""
if read_mb_s is None and write_mb_s is None:
return 0.0
read = read_mb_s if read_mb_s is not None and read_mb_s > 0 else 0.0
write = write_mb_s if write_mb_s is not None and write_mb_s > 0 else 0.0
score = (read + write) / 20.0
return min(100.0, max(0.0, score))
def calculate_network_score(upload_mbps: float = None, download_mbps: float = None) -> float:
"""
Calculate Network score from iperf3 upload/download speeds.
Formula: (upload_mbps + download_mbps) / 20
Range: 0-100 (capped)
Example: (484.67 + 390.13) Mbps → 43.7 score
"""
if upload_mbps is None and download_mbps is None:
return 0.0
upload = upload_mbps if upload_mbps is not None and upload_mbps > 0 else 0.0
download = download_mbps if download_mbps is not None and download_mbps > 0 else 0.0
score = (upload + download) / 20.0
return min(100.0, max(0.0, score))
def calculate_gpu_score(glmark2_score: int = None) -> float:
"""
Calculate GPU score from glmark2 benchmark.
Formula: glmark2_score / 50
Range: 0-100 (capped)
Example: 2500 glmark2 → 50.0 score
"""
if glmark2_score is None or glmark2_score <= 0:
return 0.0
score = glmark2_score / 50.0
return min(100.0, max(0.0, score))
def calculate_global_score( def calculate_global_score(
cpu_score: float = None, cpu_score: float = None,
memory_score: float = None, memory_score: float = None,

0
backend/apply_migration.py Executable file → Normal file
View File

View File

@@ -0,0 +1,115 @@
#!/usr/bin/env python3
"""
Apply SQL migration 003 to existing database
Migration 003: Add cpu_score_single and cpu_score_multi columns to benchmarks table
Usage: python apply_migration_003.py
"""
import os
import sqlite3
from typing import Dict, List, Tuple
# Database path
DB_PATH = os.path.join(os.path.dirname(__file__), "data", "data.db")
MIGRATION_PATH = os.path.join(
os.path.dirname(__file__), "migrations", "003_add_cpu_scores.sql"
)
# (column_name, human description)
COLUMNS_TO_ADD: List[Tuple[str, str]] = [
("cpu_score_single", "Score CPU monocœur"),
("cpu_score_multi", "Score CPU multicœur"),
]
def _load_statements() -> Dict[str, str]:
"""Load SQL statements mapped by column name from the migration file."""
with open(MIGRATION_PATH, "r", encoding="utf-8") as f:
raw_sql = f.read()
# Remove comments and blank lines for easier parsing
filtered_lines = []
for line in raw_sql.splitlines():
stripped = line.strip()
if not stripped or stripped.startswith("--"):
continue
filtered_lines.append(line)
statements = {}
for statement in "\n".join(filtered_lines).split(";"):
stmt = statement.strip()
if not stmt:
continue
for column_name, _ in COLUMNS_TO_ADD:
if column_name in stmt:
statements[column_name] = stmt
break
return statements
def apply_migration():
"""Apply the SQL migration 003."""
if not os.path.exists(DB_PATH):
print(f"❌ Database not found at {DB_PATH}")
print(" The database will be created automatically on first run.")
return
if not os.path.exists(MIGRATION_PATH):
print(f"❌ Migration file not found at {MIGRATION_PATH}")
return
print(f"📂 Database: {DB_PATH}")
print(f"📄 Migration: {MIGRATION_PATH}")
print()
conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()
try:
cursor.execute("PRAGMA table_info(benchmarks)")
existing_columns = {row[1] for row in cursor.fetchall()}
missing_columns = [
col for col, _ in COLUMNS_TO_ADD if col not in existing_columns
]
if not missing_columns:
print("⚠️ Migration 003 already applied (CPU score columns exist)")
print("✅ Database is up to date")
return
statements = _load_statements()
print("🔄 Applying migration 003...")
for column_name, description in COLUMNS_TO_ADD:
if column_name not in missing_columns:
print(f"⏩ Column {column_name} already present, skipping")
continue
statement = statements.get(column_name)
if not statement:
raise RuntimeError(
f"No SQL statement found for column '{column_name}' in migration file"
)
print(f" Adding {description} ({column_name})...")
cursor.execute(statement)
conn.commit()
print("✅ Migration 003 applied successfully!")
print("New columns added to benchmarks table:")
for column_name, description in COLUMNS_TO_ADD:
if column_name in missing_columns:
print(f" - {column_name}: {description}")
except (sqlite3.Error, RuntimeError) as e:
print(f"❌ Error applying migration: {e}")
conn.rollback()
finally:
conn.close()
if __name__ == "__main__":
apply_migration()

View File

@@ -0,0 +1,107 @@
#!/usr/bin/env python3
"""
Apply SQL migration 004 to existing database.
Migration 004: Add hostname/desktop environment/PCI/USB columns to hardware_snapshots.
Usage: python apply_migration_004.py
"""
import os
import sqlite3
from typing import Dict, List, Tuple
# Database path
DB_PATH = os.path.join(os.path.dirname(__file__), "data", "data.db")
MIGRATION_PATH = os.path.join(
os.path.dirname(__file__), "migrations", "004_add_snapshot_details.sql"
)
COLUMNS_TO_ADD: List[Tuple[str, str]] = [
("hostname", "Nom d'hôte du snapshot"),
("desktop_environment", "Environnement de bureau détecté"),
("pci_devices_json", "Liste PCI en JSON"),
("usb_devices_json", "Liste USB en JSON"),
]
def _load_statements() -> Dict[str, str]:
"""Return ALTER TABLE statements indexed by column name."""
with open(MIGRATION_PATH, "r", encoding="utf-8") as f:
filtered = []
for line in f:
stripped = line.strip()
if not stripped or stripped.startswith("--"):
continue
filtered.append(line.rstrip("\n"))
statements: Dict[str, str] = {}
for statement in "\n".join(filtered).split(";"):
stmt = statement.strip()
if not stmt:
continue
for column, _ in COLUMNS_TO_ADD:
if column in stmt:
statements[column] = stmt
break
return statements
def apply_migration():
"""Apply the SQL migration 004."""
if not os.path.exists(DB_PATH):
print(f"❌ Database not found at {DB_PATH}")
print(" The database will be created automatically on first run.")
return
if not os.path.exists(MIGRATION_PATH):
print(f"❌ Migration file not found at {MIGRATION_PATH}")
return
print(f"📂 Database: {DB_PATH}")
print(f"📄 Migration: {MIGRATION_PATH}")
print()
conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()
try:
cursor.execute("PRAGMA table_info(hardware_snapshots)")
existing_columns = {row[1] for row in cursor.fetchall()}
missing = [col for col, _ in COLUMNS_TO_ADD if col not in existing_columns]
if not missing:
print("⚠️ Migration 004 already applied (columns exist)")
print("✅ Database is up to date")
return
statements = _load_statements()
print("🔄 Applying migration 004...")
for column, description in COLUMNS_TO_ADD:
if column not in missing:
print(f"⏩ Column {column} already present, skipping")
continue
statement = statements.get(column)
if not statement:
raise RuntimeError(
f"No SQL statement found for column '{column}' in migration file"
)
print(f" Adding {description} ({column})...")
cursor.execute(statement)
conn.commit()
print("✅ Migration 004 applied successfully!")
print("New columns added to hardware_snapshots:")
for column, description in COLUMNS_TO_ADD:
if column in missing:
print(f" - {column}: {description}")
except (sqlite3.Error, RuntimeError) as exc:
print(f"❌ Error applying migration: {exc}")
conn.rollback()
finally:
conn.close()
if __name__ == "__main__":
apply_migration()

View File

@@ -0,0 +1,112 @@
#!/usr/bin/env python3
"""
Apply SQL migration 005 to existing database.
Migration 005: Add OS/display/battery metadata columns to hardware_snapshots.
Usage: python apply_migration_005.py
"""
import os
import sqlite3
from typing import Dict, List, Tuple
DB_PATH = os.path.join(os.path.dirname(__file__), "data", "data.db")
MIGRATION_PATH = os.path.join(
os.path.dirname(__file__), "migrations", "005_add_os_display_and_battery.sql"
)
COLUMNS_TO_ADD: List[Tuple[str, str]] = [
("screen_resolution", "Résolution écran"),
("display_server", "Serveur d'affichage"),
("session_type", "Type de session"),
("last_boot_time", "Dernier boot"),
("uptime_seconds", "Uptime en secondes"),
("battery_percentage", "Pourcentage batterie"),
("battery_status", "Statut batterie"),
("battery_health", "Santé batterie"),
]
def _load_statements() -> Dict[str, str]:
"""Load ALTER statements from migration file keyed by column name."""
with open(MIGRATION_PATH, "r", encoding="utf-8") as fh:
filtered = []
for line in fh:
stripped = line.strip()
if not stripped or stripped.startswith("--"):
continue
filtered.append(line.rstrip("\n"))
statements: Dict[str, str] = {}
for statement in "\n".join(filtered).split(";"):
stmt = statement.strip()
if not stmt:
continue
for column, _ in COLUMNS_TO_ADD:
if column in stmt:
statements[column] = stmt
break
return statements
def apply_migration():
"""Apply migration 005 to the SQLite database."""
if not os.path.exists(DB_PATH):
print(f"❌ Database not found at {DB_PATH}")
print(" The database will be created automatically on first run.")
return
if not os.path.exists(MIGRATION_PATH):
print(f"❌ Migration file not found at {MIGRATION_PATH}")
return
print(f"📂 Database: {DB_PATH}")
print(f"📄 Migration: {MIGRATION_PATH}")
print()
conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()
try:
cursor.execute("PRAGMA table_info(hardware_snapshots)")
existing_columns = {row[1] for row in cursor.fetchall()}
missing = [col for col, _ in COLUMNS_TO_ADD if col not in existing_columns]
if not missing:
print("⚠️ Migration 005 already applied (columns exist)")
print("✅ Database is up to date")
return
statements = _load_statements()
print("🔄 Applying migration 005...")
for column, description in COLUMNS_TO_ADD:
if column not in missing:
print(f"⏩ Column {column} already present, skipping")
continue
statement = statements.get(column)
if not statement:
raise RuntimeError(
f"No SQL statement found for column '{column}' in migration file"
)
print(f" Adding {description} ({column})...")
cursor.execute(statement)
conn.commit()
print("✅ Migration 005 applied successfully!")
print("New columns added to hardware_snapshots:")
for column, description in COLUMNS_TO_ADD:
if column in missing:
print(f" - {column}: {description}")
except (sqlite3.Error, RuntimeError) as exc:
print(f"❌ Error applying migration: {exc}")
conn.rollback()
finally:
conn.close()
if __name__ == "__main__":
apply_migration()

View File

@@ -0,0 +1,61 @@
#!/usr/bin/env python3
"""
Apply SQL migration 006 to existing database
Migration 006: Add purchase metadata fields to devices table
"""
import os
import sqlite3
DB_PATH = os.path.join(os.path.dirname(__file__), "data", "data.db")
MIGRATION_PATH = os.path.join(
os.path.dirname(__file__), "migrations", "006_add_purchase_fields.sql"
)
COLUMNS = ["purchase_store", "purchase_date", "purchase_price"]
def apply_migration():
if not os.path.exists(DB_PATH):
print(f"❌ Database not found at {DB_PATH}")
print(" It will be created automatically on first backend start.")
return
if not os.path.exists(MIGRATION_PATH):
print(f"❌ Migration file not found at {MIGRATION_PATH}")
return
conn = sqlite3.connect(DB_PATH)
cursor = conn.cursor()
try:
cursor.execute("PRAGMA table_info(devices)")
existing_columns = {row[1] for row in cursor.fetchall()}
missing = [col for col in COLUMNS if col not in existing_columns]
if not missing:
print("⚠️ Migration 006 already applied (purchase columns exist)")
return
print("🔄 Applying migration 006 (purchase fields)...")
with open(MIGRATION_PATH, "r", encoding="utf-8") as f:
statements = [
stmt.strip()
for stmt in f.read().split(";")
if stmt.strip()
]
for stmt in statements:
cursor.execute(stmt)
conn.commit()
print("✅ Migration 006 applied successfully.")
except sqlite3.Error as exc:
conn.rollback()
print(f"❌ Error applying migration 006: {exc}")
finally:
conn.close()
if __name__ == "__main__":
apply_migration()

0
backend/migrations/001_add_ram_stats_and_smart.sql Normal file → Executable file
View File

0
backend/migrations/002_add_network_results.sql Normal file → Executable file
View File

View File

@@ -0,0 +1,5 @@
-- Migration 003: Add CPU subscore columns to benchmarks table
-- Date: 2025-12-15
ALTER TABLE benchmarks ADD COLUMN cpu_score_single FLOAT;
ALTER TABLE benchmarks ADD COLUMN cpu_score_multi FLOAT;

View File

@@ -0,0 +1,7 @@
-- Migration 004: Add extra hardware snapshot metadata columns
-- Date: 2025-12-17
ALTER TABLE hardware_snapshots ADD COLUMN hostname VARCHAR(255);
ALTER TABLE hardware_snapshots ADD COLUMN desktop_environment VARCHAR(100);
ALTER TABLE hardware_snapshots ADD COLUMN pci_devices_json TEXT;
ALTER TABLE hardware_snapshots ADD COLUMN usb_devices_json TEXT;

View File

@@ -0,0 +1,11 @@
-- Migration 005: Extend hardware_snapshots with OS/display/battery metadata
-- Date: 2025-12-17
ALTER TABLE hardware_snapshots ADD COLUMN screen_resolution VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN display_server VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN session_type VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN last_boot_time VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN uptime_seconds INTEGER;
ALTER TABLE hardware_snapshots ADD COLUMN battery_percentage FLOAT;
ALTER TABLE hardware_snapshots ADD COLUMN battery_status VARCHAR(50);
ALTER TABLE hardware_snapshots ADD COLUMN battery_health VARCHAR(50);

View File

@@ -0,0 +1,4 @@
-- Add purchase metadata columns to devices table
ALTER TABLE devices ADD COLUMN purchase_store TEXT;
ALTER TABLE devices ADD COLUMN purchase_date TEXT;
ALTER TABLE devices ADD COLUMN purchase_price REAL;

0
backend/migrations/add_bios_vendor.sql Normal file → Executable file
View File

0
backend/requirements.txt Normal file → Executable file
View File

2
docker-compose.yml Normal file → Executable file
View File

@@ -9,6 +9,7 @@ services:
volumes:
- ./backend/data:/app/data
- ./uploads:/app/uploads
- ./backend/app:/app/app
environment:
- API_TOKEN=${API_TOKEN:-CHANGE_ME_GENERATE_RANDOM_TOKEN}
- DATABASE_URL=sqlite:////app/data/data.db
@@ -24,6 +25,7 @@ services:
- "${FRONTEND_PORT:-8087}:80" - "${FRONTEND_PORT:-8087}:80"
volumes: volumes:
- ./frontend:/usr/share/nginx/html:ro - ./frontend:/usr/share/nginx/html:ro
- ./scripts/bench.sh:/usr/share/nginx/html/scripts/bench.sh:ro
restart: unless-stopped restart: unless-stopped
networks: networks:
- benchtools - benchtools

0
docs/01_vision_fonctionnelle.md Normal file → Executable file
View File

0
docs/02_model_donnees.md Normal file → Executable file
View File

0
docs/03_api_backend.md Normal file → Executable file
View File

0
docs/04_bench_script_client.md Normal file → Executable file
View File

0
docs/05_webui_design.md Normal file → Executable file
View File

0
docs/06_backend_architecture.md Normal file → Executable file
View File

0
docs/08_installation_bootstrap.md Normal file → Executable file
View File

0
docs/09_tests_qualite.md Normal file → Executable file
View File

0
docs/10_roadmap_evolutions.md Normal file → Executable file
View File

28
frontend/config.js Executable file
View File

@@ -0,0 +1,28 @@
// Frontend configuration (can be overridden by defining window.BenchConfig before loading this file)
(function() {
window.BenchConfig = window.BenchConfig || {};
const origin = window.location.origin;
const protocol = window.location.protocol;
const hostname = window.location.hostname;
if (!window.BenchConfig.frontendBaseUrl) {
window.BenchConfig.frontendBaseUrl = origin;
}
if (!window.BenchConfig.backendApiUrl) {
window.BenchConfig.backendApiUrl = `${protocol}//${hostname}:8007/api`;
}
if (!window.BenchConfig.benchScriptPath) {
window.BenchConfig.benchScriptPath = '/scripts/bench.sh';
}
if (!window.BenchConfig.apiTokenPlaceholder) {
window.BenchConfig.apiTokenPlaceholder = 'YOUR_TOKEN';
}
if (!window.BenchConfig.iperfServer) {
window.BenchConfig.iperfServer = '10.0.1.97';
}
})();

13
frontend/css/components.css Normal file → Executable file
View File

@@ -475,6 +475,9 @@
color: var(--text-secondary);
font-size: 0.75rem;
border: 1px solid var(--bg-tertiary);
display: inline-flex;
align-items: center;
gap: 0.25rem;
}
.tag-primary {
@@ -483,6 +486,16 @@
border-color: var(--color-info);
}
.tag .remove-tag {
background: transparent;
border: none;
color: inherit;
cursor: pointer;
font-size: 0.8rem;
padding: 0;
line-height: 1;
}
/* Alert Component */
.alert {
padding: var(--spacing-md);

265
frontend/css/main.css Normal file → Executable file
View File

@@ -16,6 +16,7 @@
--color-info: #66d9ef;
--color-purple: #ae81ff;
--color-yellow: #e6db74;
--border-color: #444444;
/* Spacing */
--spacing-xs: 0.25rem;
@@ -28,6 +29,12 @@
--radius-sm: 4px;
--radius-md: 8px;
--radius-lg: 12px;
/* Icon sizing (customisable) */
--section-icon-size: 32px;
--button-icon-size: 24px;
--icon-btn-size: 42px;
--icon-btn-icon-size: 26px;
}
/* Reset & Base */
@@ -198,6 +205,264 @@ td {
color: var(--text-primary);
}
/* Device sections */
.device-section {
background: var(--bg-secondary);
border: 1px solid var(--bg-tertiary);
border-radius: var(--radius-md);
padding: var(--spacing-md);
margin-bottom: var(--spacing-md);
}
.device-section .section-header {
display: flex;
justify-content: space-between;
align-items: center;
gap: var(--spacing-md);
margin-bottom: var(--spacing-sm);
border-bottom: 1px solid var(--bg-tertiary);
padding-bottom: var(--spacing-sm);
}
.device-section .section-header h3 {
margin: 0;
font-size: 1.05rem;
color: var(--color-info);
display: flex;
align-items: center;
gap: var(--spacing-sm);
}
.section-icon-wrap {
display: inline-flex;
align-items: center;
justify-content: center;
}
.section-icon {
width: var(--section-icon-size);
height: var(--section-icon-size);
object-fit: contain;
filter: drop-shadow(0 0 2px rgba(0,0,0,0.4));
}
.section-title {
line-height: 1.2;
}
.section-actions {
display: flex;
align-items: center;
gap: var(--spacing-sm);
}
.icon-btn {
background: var(--bg-primary);
border: 1px solid var(--bg-tertiary);
border-radius: 50%;
width: 38px;
height: 38px;
display: inline-flex;
align-items: center;
justify-content: center;
cursor: pointer;
padding: 0;
transition: transform 0.2s ease, border-color 0.2s ease;
text-decoration: none;
color: var(--text-primary);
}
.icon-btn img {
width: var(--button-icon-size);
height: var(--button-icon-size);
object-fit: contain;
}
.icon-btn:hover {
transform: scale(1.05);
border-color: var(--color-info);
}
.icon-btn.danger {
border-color: var(--color-danger);
}
.icon-btn.success {
border-color: var(--color-success);
}
.doc-actions {
display: flex;
gap: var(--spacing-xs);
}
.doc-actions .icon-btn {
width: 32px;
height: 32px;
}
.device-preamble {
border: 1px solid var(--bg-tertiary);
border-radius: var(--radius-md);
padding: var(--spacing-md);
margin-bottom: var(--spacing-md);
background: var(--bg-secondary);
}
.preamble-content {
display: grid;
grid-template-columns: 1fr 1fr;
gap: var(--spacing-lg);
align-items: start;
}
.preamble-left {
display: flex;
flex-direction: column;
gap: var(--spacing-sm);
}
.preamble-right {
display: flex;
flex-direction: column;
}
@media (max-width: 768px) {
.preamble-content {
grid-template-columns: 1fr;
}
}
.header-row {
display: flex;
justify-content: space-between;
align-items: center;
gap: var(--spacing-lg);
margin-bottom: var(--spacing-sm);
}
.header-label {
color: var(--text-secondary);
font-size: 0.8rem;
text-transform: uppercase;
letter-spacing: 0.05em;
}
.header-value {
font-size: 1.1rem;
font-weight: 600;
color: var(--text-primary);
}
.header-meta {
font-size: 0.8rem;
color: var(--text-secondary);
margin-top: 0.25rem;
}
.header-stat {
text-align: right;
}
.usage-pill {
display: inline-block;
padding: 0.2rem 0.6rem;
border-radius: 999px;
font-size: 0.8rem;
font-weight: 600;
}
.usage-pill.ok {
background: rgba(166, 226, 46, 0.2);
color: var(--color-success);
}
.usage-pill.medium {
background: rgba(253, 151, 31, 0.2);
color: var(--color-warning);
}
.usage-pill.high {
background: rgba(249, 38, 114, 0.2);
color: var(--color-danger);
}
.usage-pill.muted {
background: var(--bg-tertiary);
color: var(--text-secondary);
}
.inline-form,
.links-form,
.tag-form {
display: flex;
gap: var(--spacing-sm);
align-items: center;
margin-bottom: var(--spacing-sm);
}
.inline-form input,
.links-form input,
.tag-form input {
flex: 1;
padding: 0.5rem 0.75rem;
border: 1px solid var(--border-color);
border-radius: var(--radius-sm);
background: var(--bg-primary);
color: var(--text-primary);
}
.link-item {
display: flex;
justify-content: space-between;
align-items: center;
gap: var(--spacing-sm);
padding: 0.75rem;
border: 1px solid var(--border-color);
border-radius: var(--radius-sm);
background: var(--bg-primary);
}
/* Device list actions */
.device-list-item {
position: relative;
}
.device-list-delete {
background: transparent;
border: none;
color: var(--color-danger);
cursor: pointer;
font-size: 0.9rem;
padding: 0.2rem;
transition: transform 0.2s ease;
position: relative;
z-index: 10;
pointer-events: auto;
}
.device-list-delete:hover {
transform: scale(1.2);
filter: brightness(1.3);
}
/* Markdown blocks */
.markdown-block {
background: var(--bg-primary);
border: 1px solid var(--bg-tertiary);
border-radius: var(--radius-sm);
padding: var(--spacing-sm);
white-space: pre-wrap;
line-height: 1.5;
}
.markdown-block code {
background: rgba(0,0,0,0.3);
padding: 0 0.25rem;
border-radius: 4px;
font-family: 'Consolas', 'Monaco', 'Courier New', monospace;
}
tbody tr {
transition: background-color 0.2s;
}

10
frontend/device_detail.html Normal file → Executable file
View File

@@ -6,6 +6,8 @@
<title>Device Detail - Linux BenchTools</title>
<link rel="stylesheet" href="css/main.css">
<link rel="stylesheet" href="css/components.css">
<link rel="icon" type="image/png" sizes="32x32" href="icons/favicon/icons8-devices-3d-fluency-32.png">
<link rel="icon" type="image/png" sizes="16x16" href="icons/favicon/icons8-devices-3d-fluency-16.png">
</head>
<body>
<!-- Header -->
@@ -32,12 +34,15 @@
<div id="deviceContent" style="display: none;"> <div id="deviceContent" style="display: none;">
<!-- Device Header --> <!-- Device Header -->
<div class="card"> <div class="card">
<div style="display: flex; justify-content: space-between; align-items: start;"> <div style="display: flex; justify-content: space-between; align-items: start; gap: 1rem;">
<div>
<h2 id="deviceHostname" style="color: var(--color-success); margin-bottom: 0.5rem;">--</h2>
<p id="deviceDescription" style="color: var(--text-secondary);">--</p>
</div>
<div id="globalScoreContainer"></div> <div style="display: flex; gap: 0.75rem; align-items: flex-start;">
<div id="globalScoreContainer"></div>
<button id="deleteDeviceBtn" class="btn btn-danger btn-sm">🗑️ Supprimer</button>
</div>
</div>
<div id="deviceMeta" style="margin-top: 1rem; display: flex; gap: 1.5rem; flex-wrap: wrap;"></div>
@@ -214,6 +219,7 @@
</div>
<!-- Scripts -->
<script src="config.js"></script>
<script src="js/utils.js"></script> <script src="js/utils.js"></script>
<script src="js/api.js"></script> <script src="js/api.js"></script>
<script src="js/device_detail.js"></script> <script src="js/device_detail.js"></script>

19
frontend/devices.html Normal file → Executable file
View File

@@ -6,6 +6,8 @@
<title>Devices - Linux BenchTools</title>
<link rel="stylesheet" href="css/main.css">
<link rel="stylesheet" href="css/components.css">
<link rel="icon" type="image/png" sizes="32x32" href="icons/favicon/icons8-devices-3d-fluency-32.png">
<link rel="icon" type="image/png" sizes="16x16" href="icons/favicon/icons8-devices-3d-fluency-16.png">
</head>
<body>
<!-- Compact Header -->
@@ -58,7 +60,24 @@
<p>&copy; 2025 Linux BenchTools - Self-hosted benchmarking tool</p>
</footer>
<!-- Modal for Benchmark Details -->
<div id="benchmarkModal" class="modal">
<div class="modal-content">
<div class="modal-header">
<h3 class="modal-title">Détails du Benchmark</h3>
<button class="modal-close">&times;</button>
</div>
<div class="modal-body" id="benchmarkModalBody">
<div class="loading">Chargement...</div>
</div>
<div class="modal-footer">
<button class="btn btn-secondary" onclick="BenchUtils.closeModal('benchmarkModal')">Fermer</button>
</div>
</div>
</div>
<!-- Scripts -->
<script src="config.js"></script>
<script src="js/utils.js"></script> <script src="js/utils.js"></script>
<script src="js/api.js"></script> <script src="js/api.js"></script>
<script src="js/devices.js"></script> <script src="js/devices.js"></script>

Binary files not shown: 11 new image files are added in this part of the diff (sizes range from 832 B to 12 KiB). Only frontend/icons/icons8-bios-94.png (Executable file, 8.0 KiB) is named in this view; the remaining filenames are not shown.

Some files were not shown because too many files have changed in this diff.