RoboResponse EECS C106A · Spring 2026 · UC Berkeley

Introduction

Autonomous multi-robot emergency response

Emergency response operations often demand rapid, precise logistics in dangerous and unstructured environments — scenarios where autonomous robots could dramatically reduce risk to human life. RoboResponse develops a coordinated multi-robot system that demonstrates how a mobile robot and a robotic arm can work in concert to identify hazards, pack resources, and deliver them without human intervention.

A TurtleBot3 uses its camera and LiDAR to detect color-coded hazard markers and map the incident site. Upon detection, it signals a UR7e robotic arm, which uses Inverse Kinematics and vision-based picking to pack the specific resources required for that hazard. Once prepared, the TurtleBot autonomously docks for loading, then navigates back to the hazard zone for delivery — a complete, closed-loop emergency logistics pipeline.

Motivation

First responders frequently operate under high cognitive load in environments with limited situational awareness. A robotic system capable of classifying an emergency and autonomously delivering the correct supplies could free human responders to focus on higher-level decisions. This project demonstrates such a concept as a fully integrated pipeline combining the three pillars of EECS C106A: control, dynamics, and perception.

System Workflow

01

Hazard Detection

TurtleBot3 uses its RealSense camera and OpenCV to detect color-coded hazard markers, classify the emergency type, and estimate its location.
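The idea behind marker classification can be sketched as a nearest-color match. This is a minimal NumPy sketch, not the project's actual OpenCV pipeline (which thresholds in HSV); the hazard names and reference colors here are hypothetical.

```python
import numpy as np

# Hypothetical RGB reference colors for each hazard marker class.
HAZARD_COLORS = {
    "fire":     np.array([200, 30, 30]),
    "chemical": np.array([30, 180, 30]),
    "flood":    np.array([30, 60, 200]),
}

def classify_hazard(patch: np.ndarray) -> str:
    """Classify an H x W x 3 image patch by nearest mean color."""
    mean_rgb = patch.reshape(-1, 3).mean(axis=0)
    distances = {name: np.linalg.norm(mean_rgb - ref)
                 for name, ref in HAZARD_COLORS.items()}
    return min(distances, key=distances.get)

# A mostly-red patch should classify as "fire".
red_patch = np.full((10, 10, 3), (190, 40, 40), dtype=np.uint8)
print(classify_hazard(red_patch))  # fire
```

In practice, exposure control and a fixed marker palette matter more than the matching rule itself (see Lessons Learned below).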

02

Coordination & Resource Mapping

A ROS2 coordination node maps hazard type to required resources and triggers the UR7e packing routine via CycloneDDS messaging.
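The hazard-to-resource mapping is essentially a lookup table inside the coordination node. A minimal sketch, with hypothetical hazard categories and kit contents:

```python
# Hypothetical hazard-to-resource table; the real coordination node's
# categories and packing lists may differ.
RESOURCE_MAP = {
    "fire":     ["extinguisher_block", "burn_kit"],
    "chemical": ["neutralizer_block", "gloves"],
    "flood":    ["sandbag_block", "pump_block"],
}

def resources_for(hazard_type: str) -> list[str]:
    """Return the packing list for a detected hazard, or an empty list
    for an unrecognized type (so the arm is never triggered spuriously)."""
    return RESOURCE_MAP.get(hazard_type, [])

print(resources_for("fire"))
```

Returning an empty list for unknown types keeps the packing trigger conservative: the UR7e only moves for hazards the system knows how to serve.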

03

Pick & Pack

The UR7e uses its RealSense camera for pose estimation, then executes a pick-and-place routine with MoveIt2 to pack resources into a delivery box.
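A typical pick routine approaches from above before descending to the grasp. This sketch shows only the pre-grasp position offset; the real pipeline plans full 6-DoF poses and trajectories with MoveIt2, and the `Pose` type and approach height here are illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    x: float  # meters, arm base frame (assumed)
    y: float
    z: float

def pregrasp_pose(obj: Pose, approach_height: float = 0.10) -> Pose:
    """Hover approach_height above the object before descending,
    so the gripper clears neighboring objects in the workspace."""
    return Pose(obj.x, obj.y, obj.z + approach_height)

block = Pose(0.42, -0.10, 0.05)
print(pregrasp_pose(block))
```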

04

Docking & Loading

The TurtleBot navigates to a predefined docking position and the UR7e places the prepared resource box onto the TurtleBot's platform.
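Docking quality can be summarized by the robot's pose error expressed in the dock frame. A minimal sketch of that computation (planar poses assumed; the real docking node works from the full localization estimate):

```python
import math

def docking_error(robot_xytheta, dock_xytheta):
    """Pose error in the dock frame: forward and lateral offsets (m)
    plus heading misalignment (rad). Small lateral or heading errors
    here translate directly into box misplacement during loading."""
    rx, ry, rth = robot_xytheta
    dx, dy, dth = dock_xytheta
    # Rotate the world-frame offset into the dock frame.
    ox, oy = rx - dx, ry - dy
    cos_t, sin_t = math.cos(-dth), math.sin(-dth)
    forward = cos_t * ox - sin_t * oy
    lateral = sin_t * ox + cos_t * oy
    # Wrap heading error to (-pi, pi].
    heading = (rth - dth + math.pi) % (2 * math.pi) - math.pi
    return forward, lateral, heading

f, l, h = docking_error((1.05, 0.02, 0.1), (1.0, 0.0, 0.0))
print(f, l, h)
```

Thresholding these three values gives a concrete "docked" condition for the coordination node to act on.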

05

Delivery Mission

TurtleBot navigates back to the hazard zone using NAV2, completing the autonomous emergency response loop.

Design

Hardware & software architecture

Hardware

TurtleBot3

Mobile exploration platform for autonomous navigation, hazard detection, and resource delivery. Equipped with a 2D LiDAR sensor for SLAM-based mapping and a RealSense camera for visual perception.

LiDAR RealSense DYNAMIXEL


UR7e Robotic Arm

Stationary manipulation platform with parallel gripper. Receives hazard type from TurtleBot, identifies resource objects via camera, and packs them into the delivery box using IK-based motion planning.

Parallel Gripper RealSense MoveIt2

Software Architecture

Hazard Detection
Camera + OpenCV
↓ hazard info
Task Coordination Node
State Machine · ROS2 / CycloneDDS
↙ nav commands ↘ pack command
TurtleBot Navigation
NAV2 · Mapping
TurtleBot Motion
Docking Procedure
UR7e Manipulation Node
MoveIt2
Pick & Pack Resources
Resource Loading
↓ resource delivery

Sensing, Planning & Actuation

Sensing

TurtleBot's 2D LiDAR enables localization and SLAM via SLAM Toolbox. Both robots use RealSense cameras: the TurtleBot's for hazard detection and the UR7e's for object pose estimation.

Planning

TurtleBot path planning uses NAV2 with MPPI/PID. UR7e trajectory planning uses MoveIt2. A state machine coordinates asynchronous task sequencing across both robots.
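The coordinating state machine can be sketched as an event-driven transition table mirroring workflow steps 01-05. The phase names and event strings here are hypothetical; the real coordination node's states and triggers may differ.

```python
from enum import Enum, auto

class Phase(Enum):
    EXPLORE = auto()
    PACKING = auto()
    DOCKING = auto()
    LOADING = auto()
    DELIVERY = auto()

# Event-driven transitions: advance only on the expected (phase, event) pair.
TRANSITIONS = {
    (Phase.EXPLORE, "hazard_detected"): Phase.PACKING,
    (Phase.PACKING, "packing_done"):    Phase.DOCKING,
    (Phase.DOCKING, "docked"):          Phase.LOADING,
    (Phase.LOADING, "box_loaded"):      Phase.DELIVERY,
}

def step(phase: Phase, event: str) -> Phase:
    """Advance on a known (phase, event) pair; otherwise hold the phase,
    so out-of-order or duplicate messages cannot derail the sequence."""
    return TRANSITIONS.get((phase, event), phase)

p = Phase.EXPLORE
for ev in ["hazard_detected", "packing_done", "docked", "box_loaded"]:
    p = step(p, ev)
print(p)  # Phase.DELIVERY
```

Holding the phase on unknown events is one simple way to get the deadlock-avoiding handoff conditions discussed in Lessons Learned.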

Actuation

The TurtleBot's DYNAMIXEL motors execute navigation trajectories. The UR7e's joints and parallel gripper execute pick-and-place motions via Inverse Kinematics.

Communication

Inter-robot messaging uses ROS2 over CycloneDDS middleware. The TurtleBot publishes hazard detections; the UR7e subscribes and triggers packing accordingly.
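The content of a hazard detection message can be sketched as a small record. The real system publishes a typed ROS2 message over CycloneDDS; this plain dataclass with JSON serialization only illustrates the fields such a detection might carry, and the field names are hypothetical.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class HazardDetection:
    hazard_type: str   # e.g. "fire", "chemical", "flood"
    x: float           # map-frame position estimate (m)
    y: float
    confidence: float  # classifier confidence in [0, 1]

msg = HazardDetection("fire", 2.4, -1.1, 0.93)
payload = json.dumps(asdict(msg))
print(payload)
```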

Implementation

Nodes, stack, and development timeline

Technology Stack

ROS2 NAV2 SLAM Toolbox MoveIt2 OpenCV CycloneDDS Python RealSense SDK ur_robot_driver TurtleBot3 pkgs

ROS2 Node Architecture

Node | Platform | Responsibility
hazard_detection_node | TurtleBot | Detects color-coded hazard markers via OpenCV and publishes type + position
navigation_node | TurtleBot | Autonomous movement using NAV2 with SLAM-built map
coordination_node | Central | State machine — maps hazard types to resources, orchestrates task sequencing
object_detection_node | UR7e | Detects and estimates poses of resource objects via UR7e camera
pick_and_place_node | UR7e | MoveIt2-based pick-and-place to pack resources into delivery box
docking_node | TurtleBot | Autonomous docking to reach loading position near UR7e

Development Timeline

Mar 22
System Setup & Communication ROS2 workspace with TurtleBot3, UR7e, RealSense. Multi-robot communication node over CycloneDDS.
Apr 1
TurtleBot Hazard Detection & Navigation OpenCV color-marker detection. NAV2 + LiDAR mapping and autonomous navigation.
Apr 12
UR7e Resource Picking System Object pose estimation pipeline. MoveIt2 pick-and-place for grasping and packing resources.
Apr 19
Multi-Robot Coordination Hazard-to-resource mapping logic. Coordination node linking detection to UR7e packing trigger.
Apr 26
Docking & Resource Loading Autonomous docking to loading position. UR7e places box onto TurtleBot platform.
May 1
Delivery Mission TurtleBot navigates back to hazard zone with loaded resources.
May 6
Testing & Integration Module-level, integration, and end-to-end pipeline testing.

Success Criteria

Baseline

Identify one hazard type, pick one object, load onto TurtleBot, and navigate to the hazard zone.

Full Goal

Multiple hazard types with corresponding resource boxes containing multiple objects — full end-to-end pipeline.

Reach Goal

Dynamic robot-to-robot handoff while TurtleBot is in motion using visual servoing.

Conclusion

Findings, lessons, and future directions

RoboResponse demonstrates that autonomous multi-robot coordination for emergency logistics is achievable at small scale using off-the-shelf hardware and open-source software. By integrating a mobile platform with a robotic manipulator through a shared ROS2 communication layer, we show how sensing, planning, and actuation compose into a cohesive, end-to-end autonomous pipeline.

The project validated the feasibility of combining SLAM-based navigation, vision-based hazard classification, and IK-driven manipulation in a single coordinated system. The state machine architecture proved essential for managing asynchronous handoffs between the TurtleBot's detection events and the UR7e's packing routine.

Lessons Learned

Perception Robustness

Color-based hazard detection is sensitive to lighting. Consistent marker design and camera exposure control were critical for reliable classification.

Docking Precision

Autonomous docking requires accurate relative localization. Small pose estimation errors compound into misalignment during resource loading.

Multi-Robot Coordination

Clear handoff conditions between robots prevented deadlocks. State machine design is crucial at the system level.

Integration Testing

Early and frequent integration testing revealed interface mismatches before they became blocking issues in the full pipeline.

Future Work

Natural extensions include dynamic handoff via visual servoing (our reach goal), scaling to more hazard types, 3D obstacle avoidance, and multi-TurtleBot fleet coordination. The modular ROS2 architecture is designed to accommodate these extensions with minimal changes to the core coordination logic.

Team

UC Berkeley undergraduates, Spring 2026

Jaemin
Mechanical Engineering & CS
Interested in the intersection of aerospace and robotics, particularly space robotics and autonomous systems for in-space operations.
Pranav
Engineering, Math & Statistics
Interested in the automobile side of robotics and autonomous systems. Also British — enjoys Beans on Toast and Formula 1.
I Gy
Computer Science & Statistics
Interested in AI and machine learning, particularly computer vision and physical AI.
Christian
Mechanical Engineering
Exploring a variety of engineering disciplines. Recently interested in robotics. Enjoys a black coffee — no sugar, no cream.

Additional Materials

Hardware, software, and documents

Bill of Materials

Item | Qty | Source
UR7e (with parallel gripper) | 1 | Lab resource
TurtleBot3 (with LiDAR sensor) | 1 | Lab resource
Intel RealSense Cameras | 2 | Lab resource
Small delivery box / container | 1 | ESG budget ($55)
Resource objects (blocks) | 5 | ESG budget
Color-coded hazard markers | 4 | ESG budget

Software Packages