Mobile robots, AGVs, inspection drones, autonomous boats and unmanned platforms. Eldric carries the kernel on-board so perception, voice command, and recovery decisions keep working when the radio link drops. Soft real-time, edge-deployed, no cloud dependency.
A factory AGV behind a steel rack. An inspection drone inside a turbine hall. A pipeline crawler in a steel pipe. A maritime vessel below deck. The shared property: the radio link is intermittent or absent, and the robot still needs to perceive, decide, and act.
Camera frames, LiDAR scans, and IMU streams flow through native inference. Object detection, anomaly detection, and classification run on the robot's compute box — no round-trip to a base station.
STT in, command parsed, action executed, TTS out — all on board. Useful for inspection crews talking to drones, factory operators directing AGVs, field technicians driving service robots.
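The parse step in that STT → parse → act → TTS chain can be sketched with a small command grammar. This is an illustrative stand-in, not Eldric's on-board NLU; the command patterns and the `parse_command` name are assumptions.

```python
import re

# Hypothetical command grammar for the parse step: map a transcript to an
# (action, argument) pair. A real deployment would use the on-board NLU;
# this only shows the shape of the step.
COMMANDS = [
    (re.compile(r"\b(go to|drive to)\s+(?P<dest>\w+)", re.I), "goto"),
    (re.compile(r"\b(stop|halt)\b", re.I), "stop"),
    (re.compile(r"\binspect\s+(?P<dest>\w+)", re.I), "inspect"),
]

def parse_command(transcript):
    """Return (action, destination-or-None), or (None, None) if no match."""
    for pattern, action in COMMANDS:
        m = pattern.search(transcript)
        if m:
            return action, m.groupdict().get("dest")
    return None, None
```

A "drive to dock3" transcript resolves to `("goto", "dock3")`; anything outside the grammar falls through to `(None, None)`, where a real system would re-prompt the operator.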
The xLSTM forecasting daemon (port 8884) handles trajectory prediction, motion planning, and recovery policy on time-series sensor data. Linear in sequence length — better fit than transformers for hours-long telemetry windows.
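The "linear in sequence length" claim is the key property: a recurrent forecaster does O(1) work per new sample against a fixed-size state, where a transformer re-attends over the whole window. The xLSTM cell itself is out of scope here; a minimal Holt-style recurrence, purely as a stand-in, shows the shape of that property.

```python
class RecurrentForecaster:
    """Minimal stand-in for a recurrent (xLSTM-style) forecaster: each new
    sample updates a fixed-size state, so cost is linear in sequence length
    and memory is constant. That is the property that makes hours-long
    telemetry windows tractable. This is NOT the xLSTM cell."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha   # smoothing factor for the level estimate
        self.level = None    # current level of the series
        self.trend = 0.0     # current per-step trend

    def observe(self, x):
        # Holt-style update: O(1) work and O(1) state per sample.
        if self.level is None:
            self.level = x
        else:
            prev = self.level
            self.level = self.alpha * x + (1 - self.alpha) * (self.level + self.trend)
            self.trend = 0.5 * (self.level - prev) + 0.5 * self.trend

    def forecast(self, steps):
        # Extrapolate the current level along the current trend.
        return [self.level + self.trend * k for k in range(1, steps + 1)]
```

Streaming a day of telemetry through this costs the same per sample at hour one and hour ten; an attention-based model's per-step cost grows with the window.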
The IoT worker buffers telemetry locally during disconnect windows and replays to fleet HQ when the link returns. No lost frames, no lost events.
Runs on a Pi 4, an industrial gateway, a Jetson, or any Linux compute box on the robot. Same binary as the data-centre Eldric; only the activated modules differ.
When the link drops, a pre-loaded policy can drive the robot to a safe state. When the link returns, the fleet HQ reconciles. Soft real-time at the line; hard real-time stays the motor controller's job.
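That link-loss behaviour is essentially a watchdog: no heartbeat within a timeout triggers the pre-loaded safe-state policy, and a returning heartbeat hands control back for reconciliation. A minimal sketch, with illustrative names (the real policy hook would live in Eldric's config):

```python
import time

class LinkWatchdog:
    """Sketch of a link-loss policy trigger. If no heartbeat arrives within
    `timeout_s`, fire the safe-state callback once; a fresh heartbeat
    re-arms it. Names are illustrative, not Eldric's API."""

    def __init__(self, timeout_s, on_link_lost, now=time.monotonic):
        self.timeout_s = timeout_s
        self.on_link_lost = on_link_lost
        self._now = now                # injectable clock, eases testing
        self._last_beat = now()
        self._degraded = False

    def heartbeat(self):
        # Link is back: clear degraded mode; fleet HQ reconciles from here.
        self._last_beat = self._now()
        self._degraded = False

    def tick(self):
        # Call from the soft real-time loop; returns True while degraded.
        if not self._degraded and self._now() - self._last_beat > self.timeout_s:
            self._degraded = True
            self.on_link_lost()        # e.g. drive to a safe stop, hold position
        return self._degraded
```

Note the division of labour the section describes: this loop is soft real-time and only selects a policy; the hard real-time execution of that policy remains the motor controller's job.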
AGVs, AMRs, factory floor delivery, parts pickers. Eldric ties into the IoT worker for OPC-UA / Modbus / MQTT, then runs perception + decision locally. Soft real-time tag-to-action ~50 ms on the demo cluster.
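The tag-to-action path can be pictured as a local dispatch table: the IoT worker normalises OPC-UA / Modbus / MQTT updates into (tag, value) pairs, and rules map them to actions without leaving the robot. A hedged sketch; the class, rule shape, and tag names are all hypothetical.

```python
class TagDispatcher:
    """Hypothetical tag-to-action dispatch: normalised (tag, value) updates
    in, matching local actions out. Illustrative only, not Eldric's API."""

    def __init__(self):
        self._rules = []   # list of (tag, predicate, action)

    def on(self, tag, predicate, action):
        """Register: when `tag` updates and predicate(value) holds, run action."""
        self._rules.append((tag, predicate, action))

    def update(self, tag, value):
        """Feed one normalised tag update; run every matching rule locally."""
        fired = []
        for rule_tag, predicate, action in self._rules:
            if rule_tag == tag and predicate(value):
                action(value)
                fired.append(action.__name__)
        return fired
```

A rule like `dispatcher.on("conveyor/jam", bool, stop_agv)` keeps the whole tag-to-action hop on the robot's compute box, which is what makes a ~50 ms budget plausible.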
Aerial inspection drones (turbine halls, transmission lines), pipeline crawlers, sewer-inspection rovers. Long disconnect windows. xLSTM forecasting predicts when the robot will lose contact and pre-plans the recovery path.
Hospital delivery robots, hotel concierge bots, retail floor assistants. Voice-in / voice-out, knowledge-base lookup (the duty roster, the policy manual), and the AI inbox to escalate to a human when needed.
On-board inference for ADAS-adjacent workloads, fleet-level forecasting, voice interface for operators. Hard real-time control stays with the dedicated controller; Eldric handles the cognitive layer above it.
What Eldric does NOT do on a robot: hard real-time control. Motor loops stay with the dedicated controller; Eldric is the cognitive layer above it.
What it does well: the cognitive layer of an autonomous platform — perception, voice, decision-making, telemetry summarisation, fleet-level coordination — all on the robot, with sane degradation when the link to base drops.