Software-defined autonomy for industrial operations.
A six-layer autonomy stack where the runtime is fixed and the models are bespoke. The same BT engine, safety monitor, and SLAM frontend power warehouse AMRs, grid inspection drones, and manufacturing cobots — configured per vertical, not rebuilt per customer.
Architecture
The stack
Six layers. Fixed runtime. Bespoke models.
Hardware Abstraction
Canonical message types decouple the stack from specific hardware. DJI and Skydio drones, warehouse AMRs — same runtime, different adapter plugins.
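A minimal sketch of the adapter pattern, assuming hypothetical names (`VelocityCmd`, `DjiAdapter`, `AmrAdapter`) and invented payload fields; the real plugin API will differ.

```python
from dataclasses import dataclass

@dataclass
class VelocityCmd:
    """Canonical body-frame velocity command (m/s, rad/s)."""
    vx: float
    vy: float
    yaw_rate: float

class HardwareAdapter:
    """Each vendor plugin translates canonical messages to its own protocol."""
    def send_velocity(self, cmd: VelocityCmd) -> dict:
        raise NotImplementedError

class DjiAdapter(HardwareAdapter):
    def send_velocity(self, cmd: VelocityCmd) -> dict:
        # Drone-style payload; field names are invented for illustration.
        return {"vx": cmd.vx, "vy": cmd.vy, "yawRate": cmd.yaw_rate}

class AmrAdapter(HardwareAdapter):
    def send_velocity(self, cmd: VelocityCmd) -> dict:
        # A non-holonomic ground robot only takes linear and angular velocity.
        return {"linear": cmd.vx, "angular": cmd.yaw_rate}

cmd = VelocityCmd(vx=0.5, vy=0.0, yaw_rate=0.1)
print(DjiAdapter().send_velocity(cmd))
print(AmrAdapter().send_velocity(cmd))
```

The planning and control layers only ever emit `VelocityCmd`; swapping hardware means swapping the adapter, nothing upstream.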
Perception
BEVFusion pipeline with calibration daemon. Multi-sensor fusion for camera, LiDAR, thermal, and radar inputs.
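Multi-sensor fusion hinges on calibrated transforms between sensor frames. A small sketch of projecting a LiDAR point into the camera frame; the 4x4 camera-from-LiDAR extrinsic below is a made-up example, not a real calibration.

```python
import numpy as np

# Illustrative camera-from-LiDAR extrinsic: a valid rotation plus translation,
# with invented values. The calibration daemon's job is to keep this current.
T_cam_lidar = np.array([
    [0.0, -1.0,  0.0,  0.10],
    [0.0,  0.0, -1.0, -0.20],
    [1.0,  0.0,  0.0,  0.05],
    [0.0,  0.0,  0.0,  1.00],
])

p_lidar = np.array([5.0, 1.0, 0.5, 1.0])  # homogeneous point in LiDAR frame
p_cam = T_cam_lidar @ p_lidar             # same point in camera frame
print(p_cam[:3])
```

If the extrinsic drifts (a bumped sensor mount, thermal expansion), fused detections smear across modalities; a calibration daemon exists to catch exactly that.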
Localization & Mapping
SLAM, GPS fusion, and HD maps. Works indoors, underground, and in other GPS-denied environments.
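One way to picture GPS fusion with a GPS-denied fallback is a weighted blend of GPS and odometry that degrades to dead reckoning when the fix drops out. The function and weight below are illustrative assumptions; a production estimator would typically be a Kalman-style filter.

```python
def fuse_position(odom_xy, gps_xy=None, gps_weight=0.3):
    """Blend odometry with GPS when available; fall back to odometry alone."""
    if gps_xy is None:  # GPS-denied: indoors, underground, under canopy
        return odom_xy
    return tuple(
        (1 - gps_weight) * o + gps_weight * g
        for o, g in zip(odom_xy, gps_xy)
    )

print(fuse_position((10.0, 5.0), (10.4, 5.2)))  # blended estimate
print(fuse_position((10.0, 5.0)))               # odometry only
```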
Planning
Behavior tree engine with LLM planner, RRT* path planning, and multi-agent coordination for fleet operations.
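The core of a behavior tree engine fits in a few lines: Sequence and Fallback nodes that tick children and propagate a status. A toy sketch following common BT conventions, not necessarily this engine's API.

```python
from enum import Enum

class Status(Enum):
    SUCCESS = 1
    FAILURE = 2
    RUNNING = 3

class Sequence:
    """Ticks children in order; fails fast, succeeds only if all succeed."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.SUCCESS:
                return status
        return Status.SUCCESS

class Fallback:
    """Ticks children in order; returns the first non-failure."""
    def __init__(self, *children):
        self.children = children
    def tick(self):
        for child in self.children:
            status = child.tick()
            if status != Status.FAILURE:
                return status
        return Status.FAILURE

class Action:
    """Leaf node wrapping a callable that returns a Status."""
    def __init__(self, fn):
        self.fn = fn
    def tick(self):
        return self.fn()

tree = Fallback(
    Action(lambda: Status.FAILURE),   # e.g. primary route blocked
    Action(lambda: Status.SUCCESS),   # e.g. fall back to replanning
)
print(tree.tick())  # prints Status.SUCCESS
```

An LLM planner slots in above this layer: it proposes or rearranges subtrees, while the deterministic tick loop stays in charge of execution.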
Controls
PID/MPC controllers, safety monitor, and trajectory executor. Human-in-the-loop confirmation for all physical parameter changes.
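A textbook PID step with a separate safety clamp illustrates the controller/safety-monitor split; the gains and limit here are made-up values, not tuned parameters from the stack.

```python
class PID:
    """Textbook PID controller; gains are illustrative only."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, error, dt):
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def safety_clamp(cmd, limit):
    """Safety-monitor pattern: bound the actuator command
    independently of whatever controller produced it."""
    return max(-limit, min(limit, cmd))

pid = PID(kp=1.0, ki=0.1, kd=0.05)
cmd = safety_clamp(pid.update(error=2.0, dt=0.1), limit=1.5)
print(cmd)  # raw output ~2.02, clamped to 1.5
```

Keeping the clamp outside the controller is the point: the safety monitor holds regardless of which controller (PID or MPC) is active.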
Platform Agent
Fleet heartbeat, OTA model updates, and OpenTelemetry-based observability. Staged canary rollouts with automatic rollback.
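Staged canary with auto-rollback, in outline: expand the fleet fraction stage by stage and revert on the first failed health check. The stage fractions and health-check shape below are assumptions, not the platform's actual rollout policy.

```python
STAGES = [0.01, 0.10, 0.50, 1.00]  # illustrative fleet fractions per stage

def rollout(fleet_size, healthy_fn):
    """Advance through canary stages; roll back on any unhealthy stage.

    healthy_fn(n) stands in for a real health check over the n robots
    running the new model (error rates, heartbeat gaps, etc.).
    """
    deployed = 0
    for fraction in STAGES:
        target = int(fleet_size * fraction)
        if not healthy_fn(target):
            return ("rolled_back", deployed)  # revert to last good stage
        deployed = target
    return ("complete", deployed)

# A health check that starts failing once more than 10 robots run the update:
print(rollout(100, lambda n: n <= 10))  # prints ('rolled_back', 10)
```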
Deploy autonomy. Configure, don't rebuild.
The same fixed runtime across every vertical. Three model artifacts, three config files, and a hardware adapter — that's a new deployment.
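What a per-vertical deployment manifest could look like in practice: three model artifacts, three config files, one adapter. Every key and artifact name below is invented for illustration.

```python
# Hypothetical per-vertical deployment manifest; all names are illustrative.
deployment = {
    "vertical": "warehouse_amr",
    "hardware_adapter": "amr_generic_v2",
    "models": {                      # the three model artifacts
        "perception": "bevfusion_warehouse.onnx",
        "planner": "bt_policy_warehouse.json",
        "controls": "mpc_params_warehouse.yaml",
    },
    "configs": [                     # the three config files
        "sensors.yaml",
        "safety_limits.yaml",
        "fleet.yaml",
    ],
}

print(deployment["vertical"], "→", deployment["hardware_adapter"])
```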
