7 Hardware Tools · 52 TOPS

Software that touches silicon

Seven tools for sensors, power, firmware, and fleet tracking across five Raspberry Pis with two Hailo-8 AI accelerators.

5 Raspberry Pis · 52 TOPS (2× Hailo-8) · 7 Hardware Tools · 1TB NVMe Storage
Tools
Seven hardware management tools
Sensors, power, firmware, fleet tracking — all Python, all local, all running on the hardware they manage.
Sensor Network
I2C and SPI sensor aggregation. Configurable polling intervals, threshold-based alerts, data logging to SQLite.
sensor_network.py
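The polling/threshold/logging loop the card describes can be sketched as below. The metric names, threshold bands, and table schema are illustrative assumptions, not sensor_network.py's actual code:

```python
import sqlite3
import time

# Hypothetical threshold bands: metric -> (low, high)
THRESHOLDS = {"temp_c": (5.0, 70.0), "humidity_pct": (10.0, 90.0)}

def check_thresholds(reading: dict) -> list[str]:
    """Return alert messages for any metric outside its (low, high) band."""
    alerts = []
    for metric, value in reading.items():
        if metric in THRESHOLDS:
            low, high = THRESHOLDS[metric]
            if not (low <= value <= high):
                alerts.append(f"{metric}={value} outside [{low}, {high}]")
    return alerts

def log_reading(db: sqlite3.Connection, sensor_id: str, reading: dict) -> None:
    """Append one timestamped row per metric to a SQLite log table."""
    db.execute(
        "CREATE TABLE IF NOT EXISTS readings (ts REAL, sensor TEXT, metric TEXT, value REAL)"
    )
    ts = time.time()
    db.executemany(
        "INSERT INTO readings VALUES (?, ?, ?, ?)",
        [(ts, sensor_id, m, v) for m, v in reading.items()],
    )
    db.commit()
```

A poller would call `check_thresholds` and `log_reading` on each I2C/SPI read at the configured interval; the alert path (log line, MQTT publish) is left out here.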
IoT Gateway
MQTT broker bridge for device provisioning and telemetry ingestion. Routes sensor data across the fleet.
iot_gateway.py
Power Manager
CPU governor control (conservative mode), voltage monitoring, thermal throttling detection. Deployed to all nodes via cron.
power_manager.py
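Throttling detection on a Pi typically means decoding the bitmask from `vcgencmd get_throttled`; the bit meanings below follow Raspberry Pi's documentation, while the function split is a sketch, not power_manager.py's actual structure:

```python
import subprocess

# Bit meanings per Raspberry Pi's vcgencmd documentation.
FLAGS = {
    0: "under-voltage detected",
    1: "arm frequency capped",
    2: "currently throttled",
    3: "soft temperature limit active",
    16: "under-voltage has occurred",
    17: "arm frequency capping has occurred",
    18: "throttling has occurred",
    19: "soft temperature limit has occurred",
}

def decode_throttled(raw: str) -> list[str]:
    """Decode the value printed by `vcgencmd get_throttled`
    (e.g. 'throttled=0x50005') into human-readable flags."""
    value = int(raw.strip().split("=")[1], 16)
    return [desc for bit, desc in FLAGS.items() if value & (1 << bit)]

def read_throttled() -> list[str]:
    """Query the firmware on a live Pi (requires vcgencmd)."""
    out = subprocess.check_output(["vcgencmd", "get_throttled"], text=True)
    return decode_throttled(out)
```

Governor control is a sysfs write (e.g. `conservative` into `/sys/devices/system/cpu/cpu*/cpufreq/scaling_governor`), which fits naturally in the same cron-driven script.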
Fleet Tracker
Device discovery via mDNS and ARP. Health monitoring with 1-minute heartbeats. Fleet-wide command execution.
fleet_tracker.py
Device Registry
125+ devices cataloged. Layer 7 deep scan for service detection. UUID assignment. Master at ~/roadnet/DEVICE-REGISTRY.md.
device_registry.py
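One way to make UUID assignment survive rescans is to derive it deterministically from a stable key such as the MAC address; the namespace string below is hypothetical, not the registry's real scheme:

```python
import uuid

# Assumed namespace; DEVICE-REGISTRY.md's actual scheme is not shown here.
REGISTRY_NS = uuid.uuid5(uuid.NAMESPACE_DNS, "roadnet.local")

def device_uuid(mac: str) -> uuid.UUID:
    """Derive a stable UUID from a device's MAC so rescans assign
    the same ID every time (uuid5 is deterministic)."""
    return uuid.uuid5(REGISTRY_NS, mac.lower())
```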
Firmware Updater
OTA firmware updates with rollback support. Version tracking across all nodes. Auto-revert on failure.
firmware_updater.py
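The auto-revert decision can be sketched as version tracking plus a health gate; dotted-integer versions and a boolean health check are assumptions here:

```python
def parse_version(v: str) -> tuple[int, ...]:
    """'1.4.2' -> (1, 4, 2); tuples compare element-wise."""
    return tuple(int(p) for p in v.split("."))

def should_rollback(installed: str, last_good: str, healthy: bool) -> bool:
    """Auto-revert when a newer image fails its post-update health check."""
    return (not healthy) and parse_version(installed) > parse_version(last_good)
```

The actual rollback (swapping back to the previous image or partition) happens only when this returns True, so a failed health probe on an unchanged version never triggers a revert loop.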
Smart Home
eero mesh integration (Thread/Matter). Apple TV AirPlay, Roku API, device automation across the LAN.
smart_home.py
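Roku's documented External Control Protocol (ECP) accepts remote-control keys as empty HTTP POSTs on port 8060; a minimal client looks like the sketch below, with the host name a placeholder:

```python
from urllib.request import Request, urlopen

ROKU_PORT = 8060  # Roku's External Control Protocol (ECP) port

def keypress_url(host: str, key: str) -> str:
    """Build the ECP keypress endpoint, e.g. /keypress/Home."""
    return f"http://{host}:{ROKU_PORT}/keypress/{key}"

def send_keypress(host: str, key: str) -> None:
    """Fire a remote-control key at a Roku on the LAN (POST, empty body)."""
    urlopen(Request(keypress_url(host, key), data=b"", method="POST"))
```

The eero (Thread/Matter) and AirPlay integrations go through their own APIs and are not shown.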
Fleet
Five nodes, all physical
Alice
Pi 400 · 33.6°C · 0.916V · 369M/3.7G RAM · 78% disk · 48+ domains · Pi-hole · PostgreSQL
192.168.4.49
Cecilia
Pi 5 · Hailo-8 · CECE API · TTS · 16 Ollama models · MinIO · UNREACHABLE
192.168.4.96
Octavia
Pi 5 · 32.0°C · 0.781V · 1.4G/7.9G RAM · 68% disk · Hailo-8 · 931G NVMe · Gitea (207 repos) · 4 containers · 11 Ollama models
192.168.4.100
Aria
Pi 5 · 54.0°C · 0.873V · 1.1G/7.9G RAM · 82% disk · Portainer · Headscale · 1 container · 6 Ollama models
192.168.4.98
Lucidia
Pi 5 · 61.2°C · 5.3G/7.9G RAM · 42% disk · 14 containers · 6 Ollama models · 334 web apps
192.168.4.38
AI Accelerators
52 TOPS of edge inference
Two Hailo-8 accelerators on PCIe, 26 TOPS each. Neural network inference at the edge, no cloud.
Cecilia · Hailo-8
HLLWM2B233704667
26 TOPS · /dev/hailo0 verified
PCIe · Pi 5 host
CECE inference · TTS pipeline
Voltage: 0.876V post-optimization
Octavia · Hailo-8
HLLWM2B233704606
26 TOPS · /dev/hailo0 verified
PCIe · Pi 5 host · 1TB NVMe
Ollama inference · Swarm workloads
Voltage: 0.781V (undervoltage)
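A presence check like the "/dev/hailo0 verified" lines above can be sketched by globbing for the driver's character devices; the directory parameter exists only to make the function testable off-device:

```python
from pathlib import Path

def hailo_devices(dev_dir: str = "/dev") -> list[str]:
    """List Hailo character devices (the driver exposes /dev/hailo0, ...)."""
    return sorted(p.name for p in Path(dev_dir).glob("hailo*"))
```

An empty list on a node that should carry an accelerator is a fleet-health alert condition.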
Related
Go deeper
AI Agents & Ollama
Fleet Security
Cloudflare & Tunnels
Docker Swarm
228 Databases
$136/mo Economics
80+ CLI Tools