Tags: IoT · Industrial · EU R&D · Hardware

LAUDS — Full-Stack IoT Energy Monitoring Platform

Built and deployed a production IoT energy monitoring platform across 3 FabCity Hamburg sites — from custom ESP32 sensor firmware (C++) to Node-RED data pipelines, TimescaleDB, ML-powered analytics, Digital Product Passports, and pre-provisioned Grafana dashboards. 7 Docker services, 30+ monitored 3D printers, zero-touch deployment. Live demo online.

Role: Full-Stack Developer & Site Deployer
Period: 2024–2025
Funding: EU — FabCity Hamburg

  • 3 sites deployed
  • 30+ printers monitored
  • 7 Docker services


Problem

FabCity Hamburg's urban fabrication labs had zero visibility into their energy consumption. Thirty-plus 3D printers, laser cutters, and workshop tools consumed energy without monitoring or optimization. The labs needed a system that spans the full vertical — from custom sensor hardware to ML-powered analytics — and that non-technical FabLab operators could actually use.

My Role & Responsibilities

  • Built and deployed the full platform across three FabCity sites: JUPITER OpenSpace, Fabulous St. Pauli, and FABRIC
  • Wrote custom ESP32 sensor firmware (C++) with MPU6050 (accelerometer/gyro), MAX6675 (thermocouple), and DHT22 (temperature/humidity) — MQTT publishing, OTA firmware updates, WiFi reconnection handling
  • Integrated Shelly smart plugs for equipment-level power monitoring across 30+ machines
  • Designed 6+ Node-RED data flows — Ingest Shelly MQTT, Manual Model Training, Analysis API, Live Predictor, Fetch Environment, DPP API, ESP Sensors
  • Built a Python Flask API for Digital Product Passports — per-job energy tracking with PDF report generation
  • Developed the ML Worker service for predictive analytics and model training
  • Built the Interactive Analysis engine — configurable deep-dive analytics with device selection, time ranges, and temperature drivers
  • Created pre-provisioned Grafana dashboards — zero manual configuration, everything ready on deploy
  • Integrated Raspberry Pi 4 + OctoPrint to bridge legacy 3D printers to SimplyPrint IoT cloud
  • Containerized the entire platform into 7 Docker services with zero-touch deployment
  • Conducted on-site workshops demonstrating the system to FabLab operators
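A detail worth calling out from the firmware work above is the WiFi/MQTT reconnection handling. Independent of the hardware, the retry policy can be sketched as exponential backoff with jitter — a minimal Python sketch for illustration only (the actual firmware is C++ and its constants and connect calls differ):

```python
import random

def backoff_delay(attempt: int, base: float = 1.0, cap: float = 60.0) -> float:
    """Delay before reconnect attempt N: exponential growth, capped,
    with full jitter so a fleet of sensor hubs doesn't reconnect in lockstep."""
    ceiling = min(cap, base * (2 ** attempt))
    return random.uniform(0, ceiling)

def reconnect(connect_fn, max_attempts: int = 8) -> bool:
    """Retry a (hypothetical) connect_fn until it succeeds or attempts run out."""
    for attempt in range(max_attempts):
        if connect_fn():
            return True
        _delay = backoff_delay(attempt)  # firmware would sleep here
    return False
```

The jitter matters in practice: after a workshop-wide WiFi outage, dozens of hubs reconnecting simultaneously can overwhelm the broker.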

Architecture

7 containerized services — `docker compose up` → fully operational platform:

  • Hardware Layer: ESP32 sensor hubs, Shelly plugs, Raspberry Pi + OctoPrint bridge for legacy printers
  • Ingestion Layer: Mosquitto MQTT broker + Node-RED flows (ingest, model training, live predictor, DPP, environment)
  • Backend Layer: Flask API, ML Worker, Interactive Analysis engine
  • Data Layer: PostgreSQL + TimescaleDB for high-volume time-series storage
  • Visualization Layer: pre-provisioned Grafana dashboards + Nginx-hosted UI and CRUD modules
Sensors / Smart Plugs / Printers
              ↓
       MQTT Broker + Node-RED
              ↓
 Flask API + ML Worker + Analysis Engine
              ↓
       PostgreSQL / TimescaleDB
              ↓
      Grafana + Web UI (Nginx)
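As a rough illustration of how the layers above map onto Compose services, a sketch might look like this — service names, images, and build paths are assumptions for illustration, not the project's actual configuration:

```yaml
# Hypothetical docker-compose.yml sketch — seven illustrative services
services:
  mosquitto:
    image: eclipse-mosquitto:2
    ports: ["1883:1883"]
  nodered:
    image: nodered/node-red:latest
    depends_on: [mosquitto]
  timescaledb:
    image: timescale/timescaledb:latest-pg15
    environment:
      POSTGRES_PASSWORD: example
  flask-api:
    build: ./api          # DPP API + PDF reports
    depends_on: [timescaledb]
  ml-worker:
    build: ./ml-worker    # model training + live prediction
    depends_on: [timescaledb, mosquitto]
  grafana:
    image: grafana/grafana:latest
    depends_on: [timescaledb]
  nginx:
    image: nginx:alpine   # reverse proxy + static UI
    ports: ["80:80"]
```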

Tech Stack

  • Hardware: ESP32 (custom sensor firmware in C++), Raspberry Pi 4, Shelly Plus Plugs
  • Sensors: MPU6050 (accelerometer/gyro), MAX6675 (thermocouple), DHT22 (temperature/humidity)
  • Data Pipeline: Node-RED (6+ flows), MQTT (Eclipse Mosquitto), Modbus
  • Backend: Python Flask (DPP API + reports), Python ML Worker (predictive analytics)
  • Database: PostgreSQL + TimescaleDB — time-series optimized with chunking and compression
  • Visualization: Grafana with pre-provisioned dashboards — zero manual config
  • Frontend: HTML/CSS/JavaScript with device management CRUD interface
  • 3D Print Integration: OctoPrint, SimplyPrint Cloud, Prusa Connect APIs
  • Infrastructure: Docker-Compose (7 services), Nginx reverse proxy
  • Language Mix: HTML 65.9%, Python 25.0%, JavaScript 5.2%, C++ 1.2%, PLpgSQL 1.2%

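The "chunking and compression" mentioned for TimescaleDB boils down to a few DDL statements. A hedged sketch — table and column names here are hypothetical, not the platform's actual schema:

```sql
-- Hypothetical schema sketch for power readings
CREATE TABLE power_readings (
    time      TIMESTAMPTZ      NOT NULL,
    device_id TEXT             NOT NULL,
    power_w   DOUBLE PRECISION
);

-- Turn it into a hypertable (time-partitioned chunks)
SELECT create_hypertable('power_readings', 'time');

-- Enable compression, segmented per device, and compress chunks older than a week
ALTER TABLE power_readings SET (
    timescaledb.compress,
    timescaledb.compress_segmentby = 'device_id'
);
SELECT add_compression_policy('power_readings', INTERVAL '7 days');
```

Segmenting by `device_id` keeps per-machine queries fast even on compressed chunks, which is what the Grafana dashboards mostly run.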
Demo

The live platform is running at lauds-demo.intel50001.com — real-time energy data, sensor readings, ML predictions, and equipment status from the deployed FabLab sites.

Platform Screenshots

Results & Impact

  • 3 FabLab sites fully deployed with real-time monitoring across 30+ machines — this is a production system, not a prototype
  • Full hardware-to-ML vertical — from custom C++ sensor firmware to Python ML models, all in one platform
  • Zero-touch deployment — `docker compose up` brings up all 7 services with dashboards provisioned, flows active, and no manual configuration
  • Digital Product Passports — per-job energy tracking with automated PDF reports, enabling transparency and sustainability documentation per manufactured item
  • Human-centric interfaces — dashboards designed and validated with non-technical FabLab operators through on-site workshops
  • ML-powered predictive analytics — model training, live prediction, and interactive analysis for energy optimization
  • Bridge between legacy equipment and modern IoT — Raspberry Pi + OctoPrint connecting old 3D printers to SimplyPrint cloud
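At its core, a Digital Product Passport entry reduces to integrating a job's power samples into energy per manufactured item. A minimal sketch of that accounting (function and field names are assumptions; the production logic lives inside the Flask API):

```python
def job_energy_kwh(samples: list[tuple[float, float]]) -> float:
    """Trapezoidal integration of (timestamp_seconds, power_watts) samples
    into kilowatt-hours for one print job."""
    energy_ws = 0.0  # accumulated watt-seconds (joules)
    for (t0, p0), (t1, p1) in zip(samples, samples[1:]):
        energy_ws += (p0 + p1) / 2.0 * (t1 - t0)
    return energy_ws / 3_600_000.0  # W·s -> kWh
```

A constant 100 W draw over one hour should come out to 0.1 kWh, which makes the function easy to sanity-check against a smart plug's own energy counter.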

Challenges & Lessons Learned

  • Field deployment vs. lab prototype — sensor hubs needed robust enclosures, reliable WiFi reconnection, and graceful MQTT error recovery. What works on a bench fails in a dusty workshop.
  • Legacy equipment integration — older 3D printers without network capabilities required the Raspberry Pi + OctoPrint bridge, adding complexity but achieving full fleet coverage
  • Node-RED flow complexity — 6+ processing tabs with ML training, live prediction, and DPP generation required careful flow organization and error handling to prevent cascading failures
  • User-centric dashboard design — iterating with non-technical FabLab operators taught me that simpler visualizations with clear action items outperform information-dense dashboards every time

How AI/Agents Were Used

AI-augmented development accelerated every layer: custom VS Code agents generated ESP32 firmware boilerplate (sensor reading, MQTT publishing, error handling) which I validated against hardware specs and tested on physical devices. Node-RED flow designs and Grafana dashboard configurations were generated through agentic workflows, then refined through field testing. The ML Worker itself runs predictive models trained on real sensor data.