Events for External Reflexes

In the previous Vision Tracking Using Depth tutorial, the robot used the VisionRGBDFollower algorithm. This meant the robot would wait for a user to send a request to the controller to follow an object with a given ID.

In this tutorial, we will make the behavior intelligent. The robot will:

  1. Idle/Patrol by default.

  2. Switch to Follow Mode automatically when a “person” is detected in the video feed.
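Conceptually, this is an edge-triggered mode switch. As a plain-Python sketch (not the Kompass API), the decision logic we are about to wire up with an Event looks like this:

```python
def next_mode(current_mode, detected_labels):
    """Return the robot mode given the current detection labels."""
    if "person" in detected_labels:
        return "FOLLOW"  # a person is visible -> start tracking
    return current_mode  # otherwise keep the current behavior

# The robot idles/patrols until a person shows up in the detections list.
mode = "PATROL"
for labels in [["chair"], ["chair", "person"], ["person"]]:
    mode = next_mode(mode, labels)
print(mode)  # FOLLOW
```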

1. The Setup

Prerequisites: Completed Vision Tracking Using Depth.

We use the same component setup (Vision, Controller, DriveManager), but we initialize the Controller in path following mode (standard navigation) instead of Vision mode.

# Start the controller in standard navigation mode (PathFollowing)
controller = Controller(component_name="controller")
controller.algorithm = "PurePursuit" # Default mode
controller.direct_sensor = True

# Vision Component (as before)

2. Event A: Person Detected -> Follow

We want to detect if the class “person” appears in the detections list. If it does, we trigger the Controller’s Action Server to start the tracking behavior.

from kompass.ros import Event

# Trigger if "person" is in the list of detected labels
event_person_detected = Event(
    event_condition=detections_topic.msg.labels.contains("person"),
    on_change=True
)
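To see what `on_change=True` buys us, here is a plain-Python sketch (not the Kompass API) of edge-triggered evaluation, assuming `on_change` means the event fires only when the condition's truth value flips rather than on every message where it is true:

```python
def on_change_trigger(condition_values):
    """Return, per sample, whether an edge-triggered event would fire."""
    fired = []
    previous = None
    for value in condition_values:
        # Fire only on a transition, never on the first sample
        # and never while the value stays the same.
        fired.append(previous is not None and value != previous)
        previous = value
    return fired

# "person" visible in frames 2-4: the event fires once when the person
# appears, not once per frame while they remain in view.
frames = [False, True, True, True, False]
print(on_change_trigger(frames))  # [False, True, False, False, True]
```

Without the edge trigger, the controller would be re-sent the follow goal on every single detection message while the person stays in view.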

3. The Actions

We need to tell the Controller to switch behaviors. We use two actions:

  1. Switch Algorithm: Change the internal logic to VisionRGBDFollower.

  2. Trigger Action Server: Send a goal to the controller to start the active tracking loop.

Tip

If you link an Event to a set of Actions, the actions will be executed in sequence.
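In other words, the list order matters: the algorithm switch must complete before the action server goal is sent. A minimal plain-Python sketch of this sequencing (not the Kompass executor itself):

```python
def run_actions_in_sequence(actions):
    """Execute each action in list order and collect its result."""
    results = []
    for action in actions:
        results.append(action())
    return results

# Hypothetical stand-ins for the two actions defined below.
log = []
switch_algorithm = lambda: log.append("switch_algorithm") or "switched"
send_goal = lambda: log.append("send_goal") or "goal_sent"

run_actions_in_sequence([switch_algorithm, send_goal])
print(log)  # ['switch_algorithm', 'send_goal']
```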

# Import the required actions
from kompass.actions import update_parameter, send_component_action_server_goal

# Import the vision follower 'ActionServer' ROS2 message
from kompass_interfaces.action import TrackVisionTarget

# Switch to vision following
# Option 1: use set_algorithm action from the controller (same as in the fallbacks tutorial)
# switch_algorithm_action =  Action(method=controller.set_algorithm, args=(ControllersID.VISION_DEPTH,))

# Option 2: use the update_parameter action to change the 'algorithm' parameter value
# the value can be set to ControllersID.VISION_DEPTH or directly the string name of the algorithm class VisionRGBDFollower
switch_algorithm_action = update_parameter(
    component=controller,
    param_name="algorithm",
    new_value="VisionRGBDFollower"
)

# Action to trigger the action server to follow a person
action_request_msg = TrackVisionTarget.Goal()
action_request_msg.label = "person"  # Specify the target to follow
action_start_person_following = send_component_action_server_goal(
    component=controller,
    request_msg=action_request_msg,
)

# Set the event/action(s) pair
events_action = {
    event_person_detected: [switch_algorithm_action, action_start_person_following]
}

4. Complete Recipe

Here is the complete script. Run it, then stand in front of the robot to make it follow you!

turtlebot3_with_fallbacks.py
import numpy as np
from agents.components import Vision
from agents.config import VisionConfig
from agents.ros import Topic
from kompass.components import Controller, ControllerConfig, DriveManager, LocalMapper
from kompass.robot import (
    AngularCtrlLimits,
    LinearCtrlLimits,
    RobotGeometry,
    RobotType,
    RobotConfig,
)
from kompass.ros import Launcher, Event
from kompass.actions import update_parameter, send_component_action_server_goal
from kompass_interfaces.action import TrackVisionTarget


image0 = Topic(name="/camera/rgbd", msg_type="RGBD")
detections_topic = Topic(name="detections", msg_type="Detections")

detection_config = VisionConfig(threshold=0.5, enable_local_classifier=True)
vision = Vision(
    inputs=[image0],
    outputs=[detections_topic],
    trigger=image0,
    config=detection_config,
    component_name="detection_component",
)

# Setup your robot configuration
my_robot = RobotConfig(
    model_type=RobotType.ACKERMANN,
    geometry_type=RobotGeometry.Type.CYLINDER,
    geometry_params=np.array([0.1, 0.3]),
    ctrl_vx_limits=LinearCtrlLimits(max_vel=1.0, max_acc=3.0, max_decel=2.5),
    ctrl_omega_limits=AngularCtrlLimits(
        max_vel=4.0, max_acc=6.0, max_decel=10.0, max_steer=np.pi / 3
    ),
)

depth_cam_info_topic = Topic(name="/camera/aligned_depth_to_color/camera_info", msg_type="CameraInfo")

# Setup the controller in standard navigation mode (PathFollowing);
# the event below switches it to VisionRGBDFollower when a person is detected
config = ControllerConfig(ctrl_publish_type="Parallel")
controller = Controller(component_name="controller", config=config)
controller.inputs(vision_detections=detections_topic, depth_camera_info=depth_cam_info_topic)
controller.algorithm = "PurePursuit"
controller.direct_sensor = True

# Add additional helper components
driver = DriveManager(component_name="driver")
mapper = LocalMapper(component_name="local_mapper")


# Define Event: Trigger if "person" is in the list of detected labels
event_person_detected = Event(
    event_condition=detections_topic.msg.labels.contains("person"),
    on_change=True
)

# Define Action(s)
switch_algorithm_action = update_parameter(
    component=controller,
    param_name="algorithm",
    new_value="VisionRGBDFollower"
)

# Action to trigger the action server to follow a person
action_request_msg = TrackVisionTarget.Goal()
action_request_msg.label = "person"  # Specify the target to follow
action_start_person_following = send_component_action_server_goal(
    component=controller,
    request_msg=action_request_msg,
)

# Set the event/action(s) pair
events_action = {
    event_person_detected: [switch_algorithm_action, action_start_person_following]
}

# Bring it up with the launcher
launcher = Launcher()

launcher.add_pkg(components=[vision], ros_log_level="warn",
                 package_name="automatika_embodied_agents",
                 executable_entry_point="executable",
                 multiprocessing=True)

# Add components and the events to monitor
launcher.kompass(components=[controller, mapper, driver], events_actions=events_action)

# Set the robot config for all components
launcher.robot = my_robot
launcher.bringup()

Next Steps

We have handled single-topic events for both cross-component healing and external reflexes. But real-world decisions are rarely based on one factor.

In the next tutorial, we will use Logic Gates to create smarter, composed events (e.g., “If Emergency Stop AND Battery Low”).

Logic Gates & Composed Events →