Update with codex

2025-12-24 16:21:23 +01:00
parent 5b96063d16
commit 6213f548c5
9 changed files with 729 additions and 110 deletions

CHANGELOG.md (new file)

@@ -0,0 +1,12 @@
# Changelog
## 2025-12-24
- Added a precise per-entity scheduler with refresh intervals in seconds or minutes.
- Optimized Modbus reads through batching, with publication only on value changes.
- Modbus reconnection with backoff, plus per-entity backoff after read errors.
- MQTT LWT and automatic availability publishing.
- Richer logs (configurable level, optional file output) and MQTT metrics.
- Support for `state_topic` and `value_map` in the configuration.
- Configuration validation at startup.
- Added text entities for SystemStatus and FurnaceStatus.
- Added a computed "Stock energie" sensor (kWh) with configurable volume and minimum temperature.

Dockerfile

@@ -12,7 +12,7 @@ RUN pip install --no-cache-dir -r requirements.txt
 # Copier seulement le fichier main.py et config.yaml dans le conteneur
 COPY ./src/main.py /usr/src/app/main.py
-COPY ./config/config.yaml /usr/src/app/config.yaml
+COPY ./src/config.yaml /usr/src/app/config.yaml
 # Commande pour exécuter l'application
 CMD ["python", "./main.py"]

PROMPT.md (new file)

@@ -0,0 +1,37 @@
# Prompt - Modbus to MQTT App
Use these development guidelines for this repository.
## Goals
- Keep the app config-driven and backward compatible.
- Avoid breaking MQTT topics or HA discovery behavior.
- Prefer readable logs and safe defaults.
## Core behavior
- Read Modbus registers and publish to MQTT.
- Support per-entity refresh intervals (seconds or minutes).
- Use batching for contiguous Modbus reads (see the sketch after this list).
- Publish only on change by default (configurable).
- Publish availability (LWT) and metrics.
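
A minimal sketch of the batching idea, assuming each entity carries `address` and `offset` as in `config.yaml` (the helper name is illustrative, not the one used in `src/main.py`):
```python
def group_contiguous(entities):
    """Group entities whose real Modbus addresses are contiguous into batches."""
    addressed = sorted(((e['address'] - e['offset'], e) for e in entities), key=lambda t: t[0])
    batches, current = [], []
    for real_address, entity in addressed:
        if current and real_address != current[-1][0] + 1:
            batches.append(current)  # gap found: close the current batch
            current = []
        current.append((real_address, entity))
    if current:
        batches.append(current)
    return batches  # each batch can then be read with a single Modbus request
```
Reading one batch per request preserves register order, so results map back to entities by index.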
## Configuration rules
- `src/config.yaml` is the active config file.
- Validate config on startup (required fields, unique_id).
- Support `state_topic` override per entity.
- Support `value_map` for text mapping.
- Allow computed entities (`input_type: computed`).
## Reliability
- Reconnect Modbus with exponential backoff.
- Back off per entity after read errors (see the sketch after this list).
- Keep MQTT connection stable and log failures.
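
A hedged sketch of both backoffs described above; the caps mirror the `reconnect_max` and `error_backoff_max` settings used in this repo, and the function names are illustrative:
```python
def next_reconnect_delay(current_delay, reconnect_max=30.0):
    # Double the wait after each failed connection attempt, capped at reconnect_max.
    return min(current_delay * 2, reconnect_max)

def backoff_interval(base_interval, error_count, error_backoff_max=300.0):
    # Healthy entities keep their configured interval; failing ones back off
    # exponentially (capped) so a broken register cannot flood the bus.
    if error_count <= 0:
        return base_interval
    return min(base_interval * (2 ** min(error_count, 5)), error_backoff_max)
```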
## Observability
- Log at INFO by default, allow DEBUG via config.
- Optional file logging via `log_file`.
- Metrics to `<topic_prefix>/metrics`.
## Changes checklist
- Update README and CHANGELOG when behavior changes.
- Keep ASCII where possible (avoid accents in new files).
- Add minimal comments only when logic is not obvious.

README.md

@@ -1,2 +1,59 @@
# froling_modbus2mqtt
Modbus -> MQTT gateway for a Froling boiler, with Home Assistant autodiscovery.
## How it works
- Reads Modbus registers (input/holding/coils/discrete) and publishes them to MQTT.
- Each entity can have its own refresh interval (seconds or minutes).
- HA autodiscovery publishes the sensors automatically.
- Optimizations: batched Modbus reads, publish on change only (optional).
- Reliability: MQTT LWT + Modbus reconnection with backoff.
## Configuration file
The application reads `src/config.yaml` (mounted into the container).
Useful global settings (see the sketch after this list):
- `refresh_rate`: default refresh interval.
- `log_level` / `log_file`: console logging and optional file logging.
- `publish_on_change_only`: publish only when the value changes.
- `metrics_interval`: how often MQTT metrics are sent.
- `buffer_volume_l`, `buffer_min_temp_c`: inputs for the "Stock energie" computation.
- `modbus.reconnect_min` / `modbus.reconnect_max`: reconnection backoff bounds.
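
As an illustration only, these globals can be loaded with safe fallbacks; the defaults below match those that appear in `src/main.py`, and the variable names are just for the example:
```python
import yaml

with open("src/config.yaml") as f:
    config = yaml.safe_load(f)

refresh_rate = float(config["refresh_rate"])  # required default interval
publish_on_change_only = bool(config.get("publish_on_change_only", True))
metrics_interval = float(config.get("metrics_interval", 60))
error_backoff_max = float(config.get("error_backoff_max", 300))
buffer_volume_l = float(config.get("buffer_volume_l", 1000))
buffer_min_temp_c = float(config.get("buffer_min_temp_c", 25))
reconnect_min = float(config.get("modbus", {}).get("reconnect_min", 1))
reconnect_max = float(config.get("modbus", {}).get("reconnect_max", 30))
```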
Per-entity settings (see the sketch below):
- `refresh` + `refresh_unit` (`s` or `m`)
- `state_topic` (otherwise generated from `topic_prefix` and `name`)
- `value_map` (SystemStatus, FurnaceStatus, etc.)
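
A small sketch (an assumed helper, not the exact function in `src/main.py`) of how `refresh` and `refresh_unit` resolve to seconds, falling back to the global `refresh_rate`:
```python
def refresh_seconds(entity, default_seconds):
    value = entity.get("refresh")
    try:
        value = float(value)
    except (TypeError, ValueError):
        return default_seconds  # missing or invalid: use the global default
    if value <= 0:
        return default_seconds
    unit = str(entity.get("refresh_unit", "s")).lower()
    return value * 60 if unit in ("m", "min", "minute", "minutes") else value

# Example: {"refresh": 1, "refresh_unit": "m"} -> 60.0 seconds
```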
## Computed entities
Use `input_type: computed` for derived values.
Example: "Stock energie" (kWh) based on Tampon Haut/Bas:
```yaml
buffer_volume_l: 1000
buffer_min_temp_c: 25
...
- name: "Stock energie"
unique_id: "stock_energie_sensor"
type: "sensor"
unit_of_measurement: "kWh"
state_topic: "froling/S3Turbo/STOCK_ENERGIE/state"
input_type: "computed"
formula: "buffer_energy_kwh"
sources:
- "tampon_haut_sensor"
- "tampon_bas_sensor"
precision: 2
```
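
For reference, the published kWh value follows the usual water heat-capacity formula applied to the average of the two buffer temperatures. This is only a sketch of the `buffer_energy_kwh` computation, treating 1 L of water as 1 kg at 4.186 kJ/kg.K:
```python
def buffer_energy_kwh(t_top_c, t_bottom_c, volume_l=1000, min_temp_c=25):
    avg_c = (t_top_c + t_bottom_c) / 2  # average of Tampon Haut / Tampon Bas
    if avg_c <= min_temp_c:
        return 0.0
    # kJ = mass(kg) * deltaT(K) * 4.186, converted to kWh by dividing by 3600
    return volume_l * (avg_c - min_temp_c) * 4.186 / 3600.0

# Example: 70 degC on top and 50 degC at the bottom of a 1000 L buffer,
# with buffer_min_temp_c = 25, gives roughly 40.7 kWh.
```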
## MQTT
- Availability: `<topic_prefix>/availability`
- Metrics: `<topic_prefix>/metrics` (reads, errors, published, uptime); see the subscriber sketch below.
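
A quick way to watch the metrics payload with paho-mqtt (1.x-style callbacks, matching those in `src/main.py`; the broker address and topic prefix are taken from the sample config and may differ on your setup):
```python
import json
import paho.mqtt.client as mqtt

def on_message(client, userdata, msg):
    # Fields published by the gateway: reads, errors, published, entities, uptime_s
    print(msg.topic, json.loads(msg.payload.decode()))

client = mqtt.Client()
client.on_message = on_message
client.connect("10.0.0.3", 1883, 60)
client.subscribe("froling/S3Turbo/metrics")  # <topic_prefix>/metrics
client.loop_forever()
```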
## Run with Docker
```bash
docker compose up -d --build
```
## Changelog
See `CHANGELOG.md`.

config/config.yaml

@@ -1,22 +1,3 @@
-# Digital Outputs
-# Function: Read Coil Status (FC=01)
-# Address Range: 00001-00158
-# Digital Inputs
-# Function: Read Input Status (FC=02)
-# Address Range: 10001-10086
-# Actual Values
-# Function: Read Input Registers(FC=04)
-# Address Range: 30001-30272
-# input_type : input
-# Parameters
-# Function: Read Holding Registers(FC=03)
-# Address Range: 40001-41094
-# input_type: holding
 mqtt:
 host: "10.0.0.3"
 port: 1883
@@ -84,8 +65,8 @@ entities:
 - name: "Etats systeme"
 unique_id: "etats_systeme_sensor"
 type: "sensor"
-#device_class: "enum"
-#state_class: "measurement"
+# device_class: "enum"
+# state_class: "measurement"
 icon: "mdi:radiator"
 unit_of_measurement: ""
 state_topic: "froling/S3Turbo/ETATS_SYSTEME/state"
@@ -97,11 +78,13 @@ entities:
 precision: 1
 value_map: "SystemStatus"
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Etat Chaudiere"
 unique_id: "etats_chaudiere_sensor"
 type: "sensor"
-#device_class: "enum"
-#state_class: "measurement"
+# device_class: "enum"
+# state_class: "measurement"
 icon: "mdi:water-boiler-alert"
 unit_of_measurement: ""
 state_topic: "froling/S3Turbo/ETATS_CHAUDIERE/state"
@@ -113,6 +96,8 @@ entities:
 precision: 0
 value_map: "FurnaceStatus"
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "T° Chaudiere"
 unique_id: "temperature_chaudiere_sensor"
 type: "sensor"
@@ -129,6 +114,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "T° Fumee"
 unique_id: "temperature_fumee_sensor"
 type: "sensor"
@@ -145,6 +132,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "T° Board"
 unique_id: "temperature_board_sensor"
 type: "sensor"
@@ -161,6 +150,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "O2 residuel"
 unique_id: "o2_residuel_sensor"
 type: "sensor"
@@ -177,6 +168,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "T° Exterieur"
 unique_id: "temp_exterieur_sensor"
 type: "sensor"
@@ -193,6 +186,8 @@ entities:
 precision: 0
 value_map: null
 signed: true
+refresh: 1
+refresh_unit: "m"
 - name: "Porte chaudiere"
 unique_id: "porte_chaudiere_sensor"
 type: "binary_sensor"
@@ -209,6 +204,8 @@ entities:
 offset: 10001
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Air primaire"
@@ -226,6 +223,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Air Secondaire"
 unique_id: "air_secondaire_sensor"
 type: "sensor"
@@ -241,6 +240,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Vitesse ventilateur"
 unique_id: "vitesse_ventilateur_sensor"
 type: "sensor"
@@ -256,6 +257,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Commande tirage"
 unique_id: "commande_tirage_sensor"
 type: "sensor"
@@ -271,6 +274,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Consigne T° fumée"
 unique_id: "consigne_temperature_fumee_sensor"
 type: "sensor"
@@ -287,6 +292,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Consigne T° chauffage"
 unique_id: "consigne_temperature_chauffage_sensor"
 type: "sensor"
@@ -303,6 +310,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Heure fonctionnement"
 unique_id: "heure_fonctionnement_sensor"
 type: "sensor"
@@ -318,6 +327,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Heure de maintien de feu"
 unique_id: "heure_maintien_de_feu_sensor"
 type: "sensor"
@@ -333,6 +344,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Heure chauffage"
 unique_id: "heure_chauffage_sensor"
 type: "sensor"
@@ -348,6 +361,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Tampon Haut"
 unique_id: "tampon_haut_sensor"
 type: "sensor"
@@ -364,6 +379,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Tampon Bas"
 unique_id: "tampon_bas_sensor"
 type: "sensor"
@@ -380,6 +397,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "T° depart chauffage"
 unique_id: "temperature_depart_chauffage_sensor"
 type: "sensor"
@@ -396,6 +415,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Charge tampon"
 unique_id: "charge_tampon_sensor"
 type: "sensor"
@@ -411,21 +432,25 @@ entities:
 precision: 0
 value_map: null
 signed: false
-# Digital Outputs
-# Function: Read Coil Status (FC=01)
-- name: pompe circuit chauffage"
-unique_id: "pompe_circuit_chauffage"
-type: "binary_sensor"
-device_class: "running"
-icon: "mdi:pump"
-state_topic: "froling/S3Turbo/pump_chauffage/state"
-value_template: "{{ value_json.charge_tampon_sensor }}"
-input_type: "coil"
-address: 30226
+refresh: 1
+refresh_unit: "m"
+- name: "pompe accumulateur"
+unique_id: "pompe_accu"
+type: "sensor"
+device_class: "battery"
+state_class: "measurement"
+icon: "mdi:percent-circle-outline"
+unit_of_measurement: "%"
+state_topic: "froling/S3Turbo/pompe_accu/state"
+value_template: "{{ value_json.pompe_accu }}"
+input_type: "input_register"
+address: 30141
 offset: 30001
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 # Digital Outputs
 # Function: Read Coil Status (FC=01)
 - name: pompe circuit_chauffage
@@ -440,6 +465,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: cc_melangeur_ouvert
 unique_id: "cc_melangeur_ouvert"
 type: "binary_sensor"
@@ -452,6 +479,8 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: cc_melangeur_ferme
 unique_id: "cc_melangeur_ferme"
 type: "binary_sensor"
@@ -464,4 +493,7 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"

docker-compose.yml

@@ -1,4 +1,4 @@
version: "3.8" #version: "3.8"
services: services:
app: app:

src/config.yaml

@@ -15,8 +15,17 @@ modbus:
 port: 502
 unit_id: 2
 timeout: 30
+reconnect_min: 1
+reconnect_max: 30
 refresh_rate: 10 # Fréquence d'actualisation en secondes
+log_level: "INFO" # DEBUG, INFO, WARNING, ERROR
+log_file: "" # Exemple: "/usr/src/app/logs/app.log"
+error_backoff_max: 300 # secondes, limite max apres erreurs Modbus
+publish_on_change_only: true
+metrics_interval: 60 # secondes
+buffer_volume_l: 1000
+buffer_min_temp_c: 25
 device:
 identifiers: "FrolingS3" # Remplacer par l'identifiant unique du dispositif
@@ -78,6 +87,9 @@ entities:
 precision: 1
 value_map: "SystemStatus"
 signed: false
+refresh: 5
+refresh_unit: "m"
 - name: "Etat Chaudiere"
 unique_id: "etats_chaudiere_sensor"
 type: "sensor"
@@ -94,6 +106,9 @@ entities:
 precision: 0
 value_map: "FurnaceStatus"
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "T° Chaudiere"
 unique_id: "temperature_chaudiere_sensor"
 type: "sensor"
@@ -110,6 +125,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "T° Fumee"
 unique_id: "temperature_fumee_sensor"
 type: "sensor"
@@ -126,6 +144,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "T° Board"
 unique_id: "temperature_board_sensor"
 type: "sensor"
@@ -142,6 +163,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "O2 residuel"
 unique_id: "o2_residuel_sensor"
 type: "sensor"
@@ -158,6 +182,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "m"
 - name: "T° Exterieur"
 unique_id: "temp_exterieur_sensor"
 type: "sensor"
@@ -174,6 +201,9 @@ entities:
 precision: 0
 value_map: null
 signed: true
+refresh: 1
+refresh_unit: "m"
 - name: "Porte chaudiere"
 unique_id: "porte_chaudiere_sensor"
 type: "binary_sensor"
@@ -190,6 +220,8 @@ entities:
 offset: 10001
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "s"
 - name: "Air primaire"
@@ -207,6 +239,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Air Secondaire"
 unique_id: "air_secondaire_sensor"
 type: "sensor"
@@ -222,6 +257,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Vitesse ventilateur"
 unique_id: "vitesse_ventilateur_sensor"
 type: "sensor"
@@ -237,6 +275,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 5
+refresh_unit: "s"
 - name: "Commande tirage"
 unique_id: "commande_tirage_sensor"
 type: "sensor"
@@ -252,6 +293,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Consigne T° fumée"
 unique_id: "consigne_temperature_fumee_sensor"
 type: "sensor"
@@ -268,6 +312,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "m"
 - name: "Consigne T° chauffage"
 unique_id: "consigne_temperature_chauffage_sensor"
 type: "sensor"
@@ -284,6 +331,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "m"
 - name: "Heure fonctionnement"
 unique_id: "heure_fonctionnement_sensor"
 type: "sensor"
@@ -299,6 +349,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 50
+refresh_unit: "m"
 - name: "Heure de maintien de feu"
 unique_id: "heure_maintien_de_feu_sensor"
 type: "sensor"
@@ -314,6 +367,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 50
+refresh_unit: "m"
 - name: "Heure chauffage"
 unique_id: "heure_chauffage_sensor"
 type: "sensor"
@@ -329,6 +385,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 50
+refresh_unit: "m"
 - name: "Tampon Haut"
 unique_id: "tampon_haut_sensor"
 type: "sensor"
@@ -345,6 +404,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
 - name: "Tampon Bas"
 unique_id: "tampon_bas_sensor"
 type: "sensor"
@@ -361,6 +423,26 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 1
+refresh_unit: "m"
+- name: "Stock energie"
+unique_id: "stock_energie_sensor"
+type: "sensor"
+device_class: "energy"
+state_class: "measurement"
+icon: "mdi:flash"
+unit_of_measurement: "kWh"
+state_topic: "froling/S3Turbo/STOCK_ENERGIE/state"
+input_type: "computed"
+formula: "buffer_energy_kwh"
+sources:
+- "tampon_haut_sensor"
+- "tampon_bas_sensor"
+precision: 2
+refresh: 1
+refresh_unit: "m"
 - name: "T° depart chauffage"
 unique_id: "temperature_depart_chauffage_sensor"
 type: "sensor"
@@ -377,6 +459,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "s"
 - name: "Charge tampon"
 unique_id: "charge_tampon_sensor"
 type: "sensor"
@@ -392,6 +477,27 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 10
+refresh_unit: "m"
+- name: "pompe accumulateur"
+unique_id: "pompe_accu"
+type: "sensor"
+device_class: "battery"
+state_class: "measurement"
+icon: "mdi:percent-circle-outline"
+unit_of_measurement: "%"
+state_topic: "froling/S3Turbo/pompe_accu/state"
+value_template: "{{ value_json.pompe_accu }}"
+input_type: "input_register"
+address: 30141
+offset: 30001
+precision: 0
+value_map: null
+signed: false
+refresh: 2
+refresh_unit: "s"
 # Digital Outputs
 # Function: Read Coil Status (FC=01)
 - name: pompe circuit_chauffage
@@ -406,6 +512,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 2
+refresh_unit: "s"
 - name: cc_melangeur_ouvert
 unique_id: "cc_melangeur_ouvert"
 type: "binary_sensor"
@@ -418,6 +527,9 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 2
+refresh_unit: "s"
 - name: cc_melangeur_ferme
 unique_id: "cc_melangeur_ferme"
 type: "binary_sensor"
@@ -430,6 +542,6 @@ entities:
 precision: 0
 value_map: null
 signed: false
+refresh: 2
+refresh_unit: "s"

src/main.py

@@ -2,6 +2,7 @@ import os
 import time
 import yaml
 import json
+import logging
 import paho.mqtt.client as mqtt
 from pyModbusTCP.client import ModbusClient
@@ -46,8 +47,98 @@ def load_config():
     config = yaml.safe_load(file)
     return config
+
+def setup_logging(level_name, log_file=None):
+    level = getattr(logging, str(level_name).upper(), logging.INFO)
+    logger = logging.getLogger("modbus2mqtt")
+    logger.setLevel(level)
+    if not logger.handlers:
+        formatter = logging.Formatter("%(asctime)s %(levelname)s %(message)s")
+        console_handler = logging.StreamHandler()
+        console_handler.setFormatter(formatter)
+        logger.addHandler(console_handler)
+        if log_file:
+            file_handler = logging.FileHandler(log_file)
+            file_handler.setFormatter(formatter)
+            logger.addHandler(file_handler)
+    return logger
+
+def get_entity_refresh_seconds(entity, default_seconds):
+    """
+    Return refresh interval in seconds for an entity.
+    Accepts per-entity "refresh" with "refresh_unit" ("s" or "m").
+    Falls back to default_seconds when missing or invalid.
+    """
+    refresh_value = entity.get('refresh')
+    if refresh_value is None:
+        return default_seconds
+    try:
+        refresh_value = float(refresh_value)
+    except (TypeError, ValueError):
+        return default_seconds
+    if refresh_value <= 0:
+        return default_seconds
+    unit = str(entity.get('refresh_unit', 's')).lower()
+    if unit in ('m', 'min', 'minute', 'minutes'):
+        return refresh_value * 60
+    return refresh_value
+
+def get_entity_interval_seconds(entity, default_seconds, error_count, error_backoff_max):
+    base_interval = get_entity_refresh_seconds(entity, default_seconds)
+    if error_count <= 0:
+        return base_interval
+    backoff = base_interval * (2 ** min(error_count, 5))
+    if backoff > error_backoff_max:
+        backoff = error_backoff_max
+    return backoff
+
+def get_state_topic(entity, config):
+    return entity.get('state_topic') or f"{config['mqtt']['topic_prefix']}/{entity['name']}/state"
+
+def apply_value_map(value, entity, config, logger):
+    map_name = entity.get('value_map')
+    if not map_name:
+        return value
+    value_maps = config.get('value_maps', {})
+    mapping = value_maps.get(map_name)
+    if mapping is None:
+        logger.error("Value map not found: %s", map_name)
+        return value
+    return mapping.get(value, value)
+
+def compute_buffer_energy_kwh(avg_temp_c, volume_l, min_temp_c):
+    if avg_temp_c <= min_temp_c:
+        return 0.0
+    delta = avg_temp_c - min_temp_c
+    # 1L water ~ 1kg, 4.186 kJ/kgK => kWh = kJ / 3600
+    return (volume_l * delta * 4.186) / 3600.0
+
+def validate_config(config, logger):
+    required_top = ['mqtt', 'modbus', 'homeassistant', 'entities']
+    for key in required_top:
+        if key not in config:
+            raise ValueError(f"Missing config section: {key}")
+    seen_ids = set()
+    valid_types = {'input_register', 'holding_register', 'input_status', 'coil', 'computed'}
+    for entity in config['entities']:
+        for field in ['name', 'unique_id', 'input_type']:
+            if field not in entity:
+                raise ValueError(f"Missing entity field: {field}")
+        if entity['input_type'] != 'computed':
+            for field in ['address', 'offset']:
+                if field not in entity:
+                    raise ValueError(f"Missing entity field: {field}")
+        else:
+            if 'formula' not in entity or 'sources' not in entity:
+                raise ValueError(f"Missing computed entity fields: {entity['unique_id']}")
+        if entity['unique_id'] in seen_ids:
+            raise ValueError(f"Duplicate unique_id: {entity['unique_id']}")
+        seen_ids.add(entity['unique_id'])
+        if entity['input_type'] not in valid_types:
+            raise ValueError(f"Invalid input_type: {entity['input_type']}")
+    logger.info("Configuration validated (%d entities).", len(config['entities']))
+
 # Ajout de la fonction de publication des messages de découverte
-def publish_discovery_messages(client, config):
+def publish_discovery_messages(client, config, logger):
     base_topic = config['homeassistant']['discovery_prefix']
     node_id = config['homeassistant']['node_id']
     device_info = config['device']
@@ -62,12 +153,10 @@ def publish_discovery_messages(client, config):
"unique_id": entity['unique_id'], "unique_id": entity['unique_id'],
"device": device_info, "device": device_info,
"name": entity['name'], "name": entity['name'],
"state_topic": f"{config['mqtt']['topic_prefix']}/{entity['name']}/state", "state_topic": get_state_topic(entity, config),
#"value_template": "{{ value }}", # Ajouter une value_template si nécessaire #"value_template": "{{ value }}", # Ajouter une value_template si nécessaire
#"device_class": entity.get('device_class', ''), #"device_class": entity.get('device_class', ''),
#"unit_of_measurement": entity.get('unit_of_measurement', '')# La valeur par défaut est '°C #"unit_of_measurement": entity.get('unit_of_measurement', '')# La valeur par défaut est '°C
} }
if "unit_of_measurement" in entity: message["unit_of_measurement"] = entity["unit_of_measurement"] if "unit_of_measurement" in entity: message["unit_of_measurement"] = entity["unit_of_measurement"]
if "state_class" in entity: message["state_class"] = entity["state_class"] if "state_class" in entity: message["state_class"] = entity["state_class"]
@@ -80,10 +169,12 @@ def publish_discovery_messages(client, config):
"value_template": entity.get('value_template', "{{ value_json.value }}"), # Ajouter une value_template si nécessaire "value_template": entity.get('value_template', "{{ value_json.value }}"), # Ajouter une value_template si nécessaire
}) })
client.publish(discovery_topic, json.dumps(message), qos=1, retain=True) client.publish(discovery_topic, json.dumps(message), qos=1, retain=True)
logger.info("HA discovery published: %s", discovery_topic)
def publish_availability(client, status, config): def publish_availability(client, status, config, logger):
availability_topic = f"{config['mqtt']['topic_prefix']}/availability" availability_topic = f"{config['mqtt']['topic_prefix']}/availability"
client.publish(availability_topic, status, qos=1, retain=True) client.publish(availability_topic, status, qos=1, retain=True)
logger.info("Availability set to %s", status)
def convert_to_signed(value): def convert_to_signed(value):
NEGATIVE_THRESHOLD = 32000 # Seuil pour déterminer les valeurs négatives NEGATIVE_THRESHOLD = 32000 # Seuil pour déterminer les valeurs négatives
@@ -96,22 +187,29 @@ def convert_to_signed(value):
 def on_connect(client, userdata, flags, rc):
     if rc == 0:
-        print("Connected to MQTT Broker!")
-        publish_availability(client, 'online', userdata)
+        logger = userdata.get('_logger')
+        if logger:
+            logger.info("Connected to MQTT broker")
+        publish_availability(client, 'online', userdata, logger)
         # Si l'autodiscovery est activé, publiez les messages de découverte
         if userdata['homeassistant']['autodiscovery']:
-            publish_discovery_messages(client, userdata)
+            publish_discovery_messages(client, userdata, logger)
     else:
-        print("Failed to connect, return code %d\n", rc)
+        logger = userdata.get('_logger')
+        if logger:
+            logger.error("Failed to connect to MQTT broker (rc=%d)", rc)
 
 def on_publish(client, userdata, result):
-    print(f"Data published to MQTT. Result: {result}")
+    logger = userdata.get('_logger')
+    if logger:
+        logger.debug("MQTT publish result: %s", result)
 
-def connect_modbus(host, port, unit_id, timeout):
+def connect_modbus(host, port, unit_id, timeout, logger, client=None):
     """
     Connect to the Modbus server.
     """
-    client = ModbusClient()
+    if client is None:
+        client = ModbusClient()
     client.host = host # Set the host
     client.port = port # Set the port
@@ -120,13 +218,13 @@ def connect_modbus(host, port, unit_id, timeout):
     connection = client.open()
     if connection:
-        print(f"Connected to Modbus server at {host}:{port}")
+        logger.info("Connected to Modbus server at %s:%s", host, port)
     else:
-        print(f"Failed to connect to Modbus server at {host}:{port}")
-    return client
+        logger.error("Failed to connect to Modbus server at %s:%s", host, port)
+    return client if connection else None
 
-def read_modbus_registers(client, input_type, address, offset, scale=1, precision=0, count=1):
+def read_modbus_registers(client, input_type, address, offset, scale=1, precision=0, count=1, logger=None):
     """
     Read registers from the Modbus server.
@@ -140,57 +238,215 @@ def read_modbus_registers(client, input_type, address, offset, scale=1, precisio
     :param value_map: a dictionary mapping register values to human-readable strings
     :return: the scaled and rounded value of the registers or None if an error occurs
     """
-    print(f"Address from call: {address}, Offset from call: {offset}")
     # Calculate the real address by subtracting the offset
     real_address = address - offset
-    print(f"Preparing to read Modbus registers with initial address: {address}, "
-          f"offset: {offset}, resulting in real address: {real_address}")
+    if logger:
+        logger.info(
+            "Modbus read: type=%s address=%s offset=%s real=%s count=%s",
+            input_type,
+            address,
+            offset,
+            real_address,
+            count,
+        )
     try:
         regs = None
         if input_type == 'input_register':
             # Use FC04 to read input registers
-            print(f"Reading {count} input register(s) starting at initial address {address} "
-                  f"(real address {real_address}, offset {offset}) using FC04")
             regs = client.read_input_registers(real_address, count)
         elif input_type == 'holding_register':
             # Use FC03 to read holding registers
-            print(f"Reading {count} holding register(s) starting at initial address {address} "
-                  f"(real address {real_address}, offset {offset}) using FC03")
             regs = client.read_holding_registers(real_address, count)
         elif input_type == 'input_status':
             # Use FC02 to read discrete inputs
-            print(f"Reading {count} input status(es) starting at initial address {address} "
-                  f"(real address {real_address}, offset {offset}) using FC02")
             regs = client.read_discrete_inputs(real_address, count)
         elif input_type == 'coil':
             # Use FC01 to read coils
-            print(f"Reading {count} coil(s) starting at initial address {address} "
-                  f"(real address {real_address}, offset {offset}) using FC01")
             regs = client.read_coils(real_address, count)
         else:
-            print(f"Invalid input type: {input_type}")
+            if logger:
+                logger.error("Invalid input type: %s", input_type)
             return None
         if regs is not None:
             # Apply scaling and rounding to the read values
             scaled_and_rounded_regs = [round(reg * scale, precision) for reg in regs]
-            print(f"Read successful: {scaled_and_rounded_regs}")
+            if logger:
+                logger.info("Modbus read success: %s", scaled_and_rounded_regs)
             return scaled_and_rounded_regs
         else:
-            print(f"Read error at initial address {address} (real address {real_address}, offset {offset})")
+            if logger:
+                logger.error(
+                    "Modbus read error at address=%s (real=%s offset=%s)",
+                    address,
+                    real_address,
+                    offset,
+                )
             return None
     except Exception as e:
-        print(f"Modbus read error at initial address {address} (real address {real_address}, offset {offset}): {e}")
+        if logger:
+            logger.exception(
+                "Modbus read exception at address=%s (real=%s offset=%s): %s",
+                address,
+                real_address,
+                offset,
+                e,
+            )
         return None
+
+def read_modbus_raw(client, input_type, real_address, count, logger=None):
+    if logger:
+        logger.info(
+            "Modbus batch read: type=%s start=%s count=%s",
+            input_type,
+            real_address,
+            count,
+        )
+    try:
+        if input_type == 'input_register':
+            return client.read_input_registers(real_address, count)
+        if input_type == 'holding_register':
+            return client.read_holding_registers(real_address, count)
+        if input_type == 'input_status':
+            return client.read_discrete_inputs(real_address, count)
+        if input_type == 'coil':
+            return client.read_coils(real_address, count)
+    except Exception as e:
+        if logger:
+            logger.exception(
+                "Modbus batch exception at start=%s count=%s: %s",
+                real_address,
+                count,
+                e,
+            )
+        return None
+    if logger:
+        logger.error("Invalid input type for batch read: %s", input_type)
+    return None
+
+def _process_batch(
+    modbus_client,
+    input_type,
+    batch,
+    now,
+    logger,
+    mqtt_client,
+    config,
+    last_read_at,
+    error_counts,
+    last_values,
+    publish_on_change_only,
+    metrics,
+):
+    start_address = batch[0][0]
+    end_address = batch[-1][0]
+    count = end_address - start_address + 1
+    regs = read_modbus_raw(modbus_client, input_type, start_address, count, logger=logger)
+    if regs is None:
+        for _, entity in batch:
+            uid = entity['unique_id']
+            error_counts[uid] = error_counts.get(uid, 0) + 1
+            last_read_at[uid] = now
+            metrics['errors'] += 1
+        return
+    for real_address, entity in batch:
+        uid = entity['unique_id']
+        idx = real_address - start_address
+        if idx < 0 or idx >= len(regs):
+            logger.error("Batch index out of range for %s", entity['name'])
+            error_counts[uid] = error_counts.get(uid, 0) + 1
+            last_read_at[uid] = now
+            metrics['errors'] += 1
+            continue
+        value = regs[idx]
+        scale = entity.get('scale', 1)
+        precision = entity.get('precision', 0)
+        value = round(value * scale, precision)
+        if entity.get('signed', False):
+            value = convert_to_signed(value)
+            logger.info("Signed conversion: %s", value)
+        value = apply_value_map(value, entity, config, logger)
+        metrics['reads'] += 1
+        previous_value = last_values.get(uid)
+        last_values[uid] = value
+        if publish_on_change_only and previous_value == value:
+            logger.debug("Unchanged value for %s, skip publish", entity['name'])
+            last_read_at[uid] = now
+            error_counts[uid] = 0
+            continue
+        topic = get_state_topic(entity, config)
+        mqtt_client.publish(topic, value)
+        logger.info("MQTT publish: %s => %s", topic, value)
+        last_read_at[uid] = now
+        error_counts[uid] = 0
+        metrics['published'] += 1
+
+def _process_computed(
+    entity,
+    now,
+    logger,
+    mqtt_client,
+    config,
+    last_read_at,
+    last_values,
+    publish_on_change_only,
+    metrics,
+):
+    sources = entity.get('sources', [])
+    source_values = []
+    for source_id in sources:
+        if source_id not in last_values:
+            logger.debug("Computed %s missing source %s", entity['name'], source_id)
+            return
+        source_values.append(last_values[source_id])
+    if entity.get('formula') == 'buffer_energy_kwh':
+        volume_l = float(config.get('buffer_volume_l', 1000))
+        min_temp_c = float(config.get('buffer_min_temp_c', 25))
+        avg_temp_c = sum(source_values) / len(source_values)
+        value = compute_buffer_energy_kwh(avg_temp_c, volume_l, min_temp_c)
+    else:
+        logger.error("Unknown formula for %s", entity['name'])
+        return
+    precision = entity.get('precision', 2)
+    value = round(value, precision)
+    metrics['reads'] += 1
+    previous_value = last_values.get(entity['unique_id'])
+    last_values[entity['unique_id']] = value
+    if publish_on_change_only and previous_value == value:
+        logger.debug("Unchanged value for %s, skip publish", entity['name'])
+        last_read_at[entity['unique_id']] = now
+        return
+    topic = get_state_topic(entity, config)
+    mqtt_client.publish(topic, value)
+    logger.info("MQTT publish: %s => %s", topic, value)
+    last_read_at[entity['unique_id']] = now
+    metrics['published'] += 1
+
 def main():
     config = load_config()
+    logger = setup_logging(config.get('log_level', 'INFO'), config.get('log_file'))
+    config['_logger'] = logger
+    validate_config(config, logger)
     refresh_rate = config['refresh_rate'] # Obtenez la fréquence d'actualisation depuis la configuration
+    default_refresh_seconds = float(refresh_rate)
+    error_backoff_max = float(config.get('error_backoff_max', 300))
+    metrics_interval = float(config.get('metrics_interval', 60))
+    metrics_topic = f"{config['mqtt']['topic_prefix']}/metrics"
+    start_time = time.monotonic()
+    next_metrics_at = time.monotonic() + metrics_interval
 
     # Configuration MQTT
     mqtt_config = config['mqtt']
@@ -200,11 +456,20 @@ def main():
     mqtt_client.on_publish = on_publish
     if mqtt_config['user'] and mqtt_config['password']:
         mqtt_client.username_pw_set(mqtt_config['user'], mqtt_config['password'])
+    availability_topic = f"{config['mqtt']['topic_prefix']}/availability"
+    mqtt_client.will_set(availability_topic, 'offline', qos=1, retain=True)
     mqtt_client.connect(mqtt_config['host'], mqtt_config['port'], 60)
     mqtt_client.loop_start()
 
-    while True:
+    last_read_at = {}
+    error_counts = {}
+    last_values = {}
+    metrics = {"reads": 0, "errors": 0, "published": 0}
     modbus_client = None
+    reconnect_delay = float(config.get('modbus', {}).get('reconnect_min', 1))
+    reconnect_max = float(config.get('modbus', {}).get('reconnect_max', 30))
+    publish_on_change_only = bool(config.get('publish_on_change_only', True))
+
+    while True:
         try:
             # etape de connexion Modbus
             # Affichage de la configuration pour la vérification
@@ -226,53 +491,157 @@ def main():
             # Étape de connexion Modbus
-            modbus_client = connect_modbus(config['modbus']['host'],
-                config['modbus']['port'],
-                config['modbus']['unit_id'],
-                config['modbus']['timeout'])
+            if modbus_client is None:
+                modbus_client = connect_modbus(
+                    config['modbus']['host'],
+                    config['modbus']['port'],
+                    config['modbus']['unit_id'],
+                    config['modbus']['timeout'],
+                    logger,
+                )
+                if modbus_client is None:
+                    logger.warning("Modbus reconnect in %.1fs", reconnect_delay)
+                    time.sleep(reconnect_delay)
+                    reconnect_delay = min(reconnect_delay * 2, reconnect_max)
+                    continue
+                reconnect_delay = float(config.get('modbus', {}).get('reconnect_min', 1))
+            elif not modbus_client.open():
+                logger.error("Failed to reconnect to the Modbus server.")
+                modbus_client = None
+                logger.warning("Modbus reconnect in %.1fs", reconnect_delay)
+                time.sleep(reconnect_delay)
+                reconnect_delay = min(reconnect_delay * 2, reconnect_max)
+                continue
 
             if modbus_client:
                 # Lecture des registres Modbus pour chaque entité
+                now = time.monotonic()
+                due_entities = []
+                due_computed = []
                 for entity in config['entities']:
-                    input_type = entity['input_type']
-                    address = entity['address']
-                    offset = entity['offset'] # Utilisation de l'offset de l'entité
-                    scale = entity.get('scale', 1)
-                    precision = entity.get('precision', 0) # Utilisation de get pour une valeur par défaut si non présente
-                    count = 1 # Supposons que count est toujours 1 pour simplifier, à ajuster si nécessaire
-                    # Affichage de l'entité et de l'offset avant la lecture
-                    print(f"Reading entity: {entity['name']} with offset: {offset}")
-                    # Lecture des registres avec l'offset correct
-                    regs = read_modbus_registers(modbus_client, input_type, address, offset, scale, precision, count)
-                    if regs is not None:
-                        value = regs[0] # Prend la première valeur si plusieurs registres sont lus
-                        if entity.get('signed', False):
-                            value = convert_to_signed(value)
-                            print(f"Valeur convertie en signée: {value}")
-                        # Publier la valeur lue sur le topic MQTT approprié
-                        topic = f"{config['mqtt']['topic_prefix']}/{entity['name']}/state"
-                        mqtt_client.publish(topic, value)
-                        print(f"Published value for {entity['name']}: {value}")
-                    else:
-                        print(f"Failed to read value for {entity['name']}")
-                # Assurez-vous de fermer la connexion Modbus lorsque vous avez terminé
-                modbus_client.close()
-            else:
-                print("Failed to connect to the Modbus server.")
+                    uid = entity['unique_id']
+                    error_count = error_counts.get(uid, 0)
+                    interval_seconds = get_entity_interval_seconds(
+                        entity,
+                        default_refresh_seconds,
+                        error_count,
+                        error_backoff_max,
+                    )
+                    last_ts = last_read_at.get(uid, 0)
+                    if now - last_ts < interval_seconds:
+                        logger.debug(
+                            "Skip entity %s (next in %.1fs)",
+                            entity['name'],
+                            interval_seconds - (now - last_ts),
+                        )
+                        continue
+                    if entity['input_type'] == 'computed':
+                        due_computed.append(entity)
+                    else:
+                        due_entities.append(entity)
+                grouped = {}
+                for entity in due_entities:
+                    real_address = entity['address'] - entity['offset']
+                    key = entity['input_type']
+                    grouped.setdefault(key, []).append((real_address, entity))
+                for input_type, items in grouped.items():
+                    items.sort(key=lambda x: x[0])
+                    batch = []
+                    prev_addr = None
+                    for real_address, entity in items:
+                        if prev_addr is None or real_address == prev_addr + 1:
+                            batch.append((real_address, entity))
+                        else:
+                            _process_batch(
+                                modbus_client,
+                                input_type,
+                                batch,
+                                now,
+                                logger,
+                                mqtt_client,
+                                config,
+                                last_read_at,
+                                error_counts,
+                                last_values,
+                                publish_on_change_only,
+                                metrics,
+                            )
+                            batch = [(real_address, entity)]
+                        prev_addr = real_address
+                    if batch:
+                        _process_batch(
+                            modbus_client,
+                            input_type,
+                            batch,
+                            now,
+                            logger,
+                            mqtt_client,
+                            config,
+                            last_read_at,
+                            error_counts,
+                            last_values,
+                            publish_on_change_only,
+                            metrics,
+                        )
+                for entity in due_computed:
+                    _process_computed(
+                        entity,
+                        now,
+                        logger,
+                        mqtt_client,
+                        config,
+                        last_read_at,
+                        last_values,
+                        publish_on_change_only,
+                        metrics,
+                    )
+                # Calcule le prochain delai selon les entites
+                now = time.monotonic()
+                if now >= next_metrics_at:
+                    payload = {
+                        "reads": metrics["reads"],
+                        "errors": metrics["errors"],
+                        "published": metrics["published"],
+                        "entities": len(config['entities']),
+                        "uptime_s": int(now - start_time),
+                    }
+                    mqtt_client.publish(metrics_topic, json.dumps(payload))
+                    logger.info("MQTT metrics: %s", payload)
+                    next_metrics_at = now + metrics_interval
+                next_sleep = None
+                for entity in config['entities']:
+                    uid = entity['unique_id']
+                    error_count = error_counts.get(uid, 0)
+                    interval_seconds = get_entity_interval_seconds(
+                        entity,
+                        default_refresh_seconds,
+                        error_count,
+                        error_backoff_max,
+                    )
+                    last_ts = last_read_at.get(entity['unique_id'], 0)
+                    due_in = interval_seconds - (now - last_ts)
+                    if due_in < 0:
+                        due_in = 0
+                    if next_sleep is None or due_in < next_sleep:
+                        next_sleep = due_in
+                if next_sleep is None:
+                    next_sleep = default_refresh_seconds
+                if next_sleep < 0.1:
+                    next_sleep = 0.1
+                logger.info("Sleeping for %.1fs until next due entity.", next_sleep)
+                time.sleep(next_sleep)
+                continue
         except Exception as e:
-            print(f"An error occurred: {e}")
-            publish_availability(mqtt_client, 'offline', config)
+            logger.exception("An error occurred: %s", e)
+            publish_availability(mqtt_client, 'offline', config, logger)
+        finally:
             if modbus_client:
                 modbus_client.close()
+                modbus_client = None
-        # Attendez la fréquence d'actualisation avant de recommencer
-        print(f"Waiting for {refresh_rate} seconds before the next update.")
-        time.sleep(refresh_rate)
+        time.sleep(default_refresh_seconds)
 
 if __name__ == '__main__':
     main()