You are an expert robotics software engineer and developer assistant for **Kompass**, a high-performance, event-driven navigation stack built on ROS2 by Automatika Robotics.
You have been provided with the official Kompass documentation. This framework distinguishes itself through **hardware-agnostic GPU acceleration** (supporting Nvidia, AMD, and integrated GPUs) and an event-driven architecture with a simple API that lets you build your entire application in a single **Python script 'Recipe'**.
The documentation is structured with file headers like `## File: filename.md`. Your primary task is to answer user questions, explain navigation concepts, and generate configuration or code strictly based on this context.
Follow these rules rigorously:
1. **Strict Grounding:** Base your answers ONLY on the provided documentation. Do not hallucinate configuration parameters (e.g., for DWA or Pure Pursuit) that are not explicitly listed in the `advanced/algorithms` or `configuration.md` files.
2. **Terminology Accuracy:** Use Kompass-specific terminology.
- Refer to Python applications as **"Recipes"**.
- When discussing GPU features, remember they are **vendor-neutral** (implemented via `kompass-core`).
3. **Write Idiomatic Code:**
- For configuration: Follow the YAML structures shown in `tutorials/configuration.md`.
- For scripts: Use the event-driven patterns (e.g., `events_actions.md`) and the standard `Robot` -> `Driver` -> `Control` hierarchy.
4. **Handle Unknowns:** If a specific algorithm implementation or parameter is not in the text, state that it is not covered by the current documentation version.
5. **Cite Your Sources:** Briefly mention the file name (e.g., "See `integrations/ompl.md`...") when explaining complex topics like path planning or collision checking.
Think step-by-step: Parse the user's goal, locate the relevant modules (e.g., Mapper vs. Planner), and synthesize a response that aligns with the Kompass philosophy of event-driven, modular navigation.
## File: overview.md
```markdown
# Kompass
**Robust, event-driven navigation stacks for autonomous mobile robots.**
Built to be customizable, extendable, and hardware-agnostic. Create sophisticated navigation capabilities within a single Python script with blazingly fast performance.
[Get Started](tutorials/quick_start.md) • [Why Kompass?](./why.md) • [View on GitHub](https://github.com/automatika-robotics/kompass)
- Hardware-Agnostic GPU Acceleration - Kompass includes **highly optimized, GPU-powered versions of the most cutting-edge navigation algorithms** in C++ that make full use of available hardware resources. It supports multi-threaded execution on CPUs and can run on ANY GPU (Nvidia, AMD, etc.) without vendor lock-in, making it suitable for both development and deployment across diverse hardware setups.
- Universal Recipes (Apps) - Kompass provides an **intuitive Python API** making it straightforward to create and deploy sophisticated navigation capabilities within a **single Python script**, to run across different robots without sacrificing performance or flexibility.
- Event-Driven Architecture - Built on top of [Sugarcoat](https://automatika-robotics.github.io/sugarcoat/), Kompass components are adaptive and resilient. The system can dynamically reconfigure itself and perform custom actions based on environmental changes or internal status events.
## Explore Kompass
::::{grid} 1 2 3 3
:gutter: 2
:::{grid-item-card} {material-regular}`download;1.2em;sd-text-primary` Installation
:link: install
:link-type: doc
:class-card: sugar-card
Install Kompass on your system
:::
:::{grid-item-card} {material-regular}`rocket_launch;1.2em;sd-text-primary` Quick Start
:link: tutorials/quick_start
:link-type: doc
:class-card: sugar-card
Get your robot moving in minutes
:::
:::{grid-item-card} {material-regular}`lightbulb;1.2em;sd-text-primary` Motivation
:link: why
:link-type: doc
:class-card: sugar-card
Why we built Kompass
:::
:::{grid-item-card} {material-regular}`menu_book;1.2em;sd-text-primary` Tutorials
:link: tutorials/index
:link-type: doc
:class-card: sugar-card
Learn how to create custom navigation capabilities
:::
:::{grid-item-card} {material-regular}`bolt;1.2em;sd-text-primary` Benchmarks
:link: advanced/benchmark
:link-type: doc
:class-card: sugar-card
See performance across hardware (CPUs & GPUs)
:::
:::{grid-item-card} {material-regular}`desktop_windows;1.2em;sd-text-primary` Web UI
:link: https://sugarcoat.automatikarobotics.com/features/web_ui.html
:class-card: sugar-card
Explore the Zero-Code Dynamic Web UI
:::
:::{grid-item-card} {material-regular}`power;1.2em;sd-text-primary` Universal Robot Plugins
:link: https://sugarcoat.automatikarobotics.com/features/robot_plugins.html
:class-card: sugar-card
Port automation recipes across different hardware
:::
:::{grid-item-card} {material-regular}`extension;1.2em;sd-text-primary` Design Concepts
:link: advanced/design
:link-type: doc
:class-card: sugar-card
Explore the core design concepts
:::
:::{grid-item-card} {material-regular}`smart_toy;1.2em;sd-text-primary` AI-Assisted Coding
:link: llms.txt
:link-type: url
Get the `llms.txt` for your coding-agent and let it write the recipes
:::
::::
## Architecture
Kompass has a modular, event-driven architecture. It is divided into several interacting components, each responsible for one of the navigation subtasks.
Learn more on Kompass [design concepts](advanced/design.md)
::::{tab-set}
:::{tab-item} Event-Driven Design
:sync: events
Kompass has a dynamic orchestration layer that monitors data streams in real time. Using the Pythonic API, you can define Events that inject logic into the components without changing their code, letting you express complex, dynamic behaviors directly in your recipes.
```{figure} _static/images/diagrams/events_examples_dark.png
:class: dark-only
:alt: Events Examples
:align: center
:width: 80%
```
```{figure} _static/images/diagrams/events_examples_light.png
:class: light-only
:alt: Events Examples
:align: center
:width: 80%
```
:::
:::{tab-item} High-Level Components
:sync: components
Each component in Kompass is responsible for one of the main navigation sub-tasks. Unlike a standard ROS2 node, a Component manages its own lifecycle, validates its own configuration, and reports its own health status to the central system monitor.
```{figure} _static/images/diagrams/system_components_dark.png
:class: dark-only
:alt: Components
:align: center
```
```{figure} _static/images/diagrams/system_components_light.png
:class: light-only
:alt: Components
:align: center
The main pillars of Kompass navigation stack.
```
:::
:::{tab-item} System Graph (ROS2)
:sync: graph
Each of the previous components runs as a ROS2 lifecycle node and communicates with the other components using ROS2 topics, services, or action servers.
```{figure} /_static/images/diagrams/system_graph_dark.png
:class: dark-only
:alt: Components in a point-navigation system
:align: center
```
```{figure} /_static/images/diagrams/system_graph_light.png
:class: light-only
:alt: Components in a point-navigation system
:align: center
Components in a point-navigation system
```
:::
::::
## Components Reference
Learn more about configuring your robot, and explore the functionality and configuration of each component on its dedicated documentation page:
::::{grid} 1 2 2 4
:gutter: 2
:::{grid-item-card} {material-regular}`precision_manufacturing;1.2em;sd-text-primary` Robot Config
:link: navigation/robot
:link-type: doc
:::
:::{grid-item-card} {material-regular}`alt_route;1.2em;sd-text-primary` Planner
:link: navigation/path_planning
:link-type: doc
:::
:::{grid-item-card} {material-regular}`control_camera;1.2em;sd-text-primary` Controller
:link: navigation/control
:link-type: doc
:::
:::{grid-item-card} {material-regular}`radar;1.2em;sd-text-primary` Local Mapper
:link: navigation/mapper
:link-type: doc
:::
:::{grid-item-card} {material-regular}`directions_car;1.2em;sd-text-primary` Drive Manager
:link: navigation/driver
:link-type: doc
:::
:::{grid-item-card} {material-regular}`map;1.2em;sd-text-primary` Map Server
:link: navigation/map_server
:link-type: doc
:::
:::{grid-item-card} {material-regular}`traffic;1.2em;sd-text-primary` Motion Server
:link: navigation/motion_server
:link-type: doc
:::
::::
## Contributions
Kompass has been developed in collaboration between [Automatika Robotics](https://automatikarobotics.com/) and [Inria](https://inria.fr/). Contributions from the community are most welcome.
```
## File: why.md
```markdown
# Why Kompass?
Robotic navigation isn't about perfecting a single component; it is about architecting a system that survives contact with the real world.
While metric navigation has matured, deploying robots at scale in dynamic environments remains an unsolved challenge. As highlighted by the **ICRA BARN Challenges**, static pipelines fail when faced with the unpredictability of the physical world:
> _"A single stand-alone approach that is able to address all variety of obstacle configurations all together is still out of our reach."_
> — **Lessons from The 3rd BARN Challenge (ICRA 2024)**
**Kompass was built to fill this gap.** Unlike existing solutions that rely on rigid behavior trees, Kompass is an event-driven, GPU-native stack designed for maximum adaptability and hardware efficiency.
## 1. True Adaptive Navigation (Event-Driven Core)
Kompass adheres to open event-driven software standards, allowing the navigation stack to **reconfigure itself on the fly**.
- Dynamic Response: The stack adapts not just to internal robot states (like Battery Low), but to external world events (e.g., "Crowd Detected" or "Entering Warehouse").
- Context-Aware Control: You can configure the system to use a _Pure Pursuit_ controller on open roads, switch to _DWA_ indoors, and fall back to a precise docking controller, all triggered by environmental context.
- Simplified Logic: Unlike complex Behavior Trees that can become unmanageable, Kompass allows you to define clean, event-based transitions and fallback behaviors for every component.
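The event-based transitions described above can be pictured with a few lines of plain Python. This is a conceptual sketch only: the class names (`EventBus`, `Stack`) and the event names are invented for illustration and are **not** part of the Kompass API (see `events_actions.md` for the actual `Event`/`Action` patterns).

```python
# Toy illustration (plain Python, NOT the Kompass API) of declarative
# event -> action transitions that switch the active controller without
# modifying any component code.

class EventBus:
    """Minimal publish/subscribe dispatcher."""

    def __init__(self):
        self._handlers = {}

    def on(self, event_name, action):
        """Register an action to run when `event_name` fires."""
        self._handlers.setdefault(event_name, []).append(action)

    def emit(self, event_name):
        """Fire an event, invoking all registered actions."""
        for action in self._handlers.get(event_name, []):
            action()


class Stack:
    """Stand-in for a navigation stack with a swappable controller."""

    def __init__(self):
        self.active_controller = "PurePursuit"


stack = Stack()
bus = EventBus()

# Context-aware transitions: open roads -> Pure Pursuit, indoors -> DWA
bus.on("entered_building", lambda: setattr(stack, "active_controller", "DWA"))
bus.on("exited_building", lambda: setattr(stack, "active_controller", "PurePursuit"))

bus.emit("entered_building")
print(stack.active_controller)  # prints "DWA"
```

In a real recipe the transitions are attached to component topics and status events rather than string names, but the shape of the logic (event fires, action reconfigures a component) is the same.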
## 2. High-Performance & Vendor-Agnostic GPU Acceleration
Kompass is engineered for speed, utilizing C++ for core algorithms and multi-threading for CPU tasks. However, its standout feature is its approach to hardware acceleration.
- SYCL-Based Architecture: Kompass is the **first navigation framework to support cross-GPU acceleration**.
- Hardware Freedom: Unlike CUDA-locked solutions, Kompass runs natively on **Nvidia, AMD, Intel and other integrated** GPUs/APUs without device-specific hacks.
- Parallel Power: Compute-heavy tasks like path planning, control, and map updates are offloaded to the GPU, freeing up your CPU for high-level logic.
## 3. ML Models as First-Class Citizens
Because Kompass is event-driven, it can directly utilize the outputs of Machine Learning models to drive behavior.
- Neuro-Symbolic Control: Use an object detection model to trigger a controller switch (e.g., switching to _Human-Aware TEB_ when people are detected).
- VLM Integration: Use Vision Language Models to answer abstract queries (e.g., "Am I indoors or outdoors?") and fundamentally alter the robot's planning strategy based on the answer.
- Deep Integration: See our tutorial on using [Vision Tracking with EmbodiedAgents](./tutorials/vision_tracking.md) to see this in action.
## 4. Pythonic Simplicity, ROS Compatible
We bridge the gap between high-performance C++ and developer-friendly Python.
- One-Script Configuration: Using [Sugarcoat🍬](https://www.github.com/automatika-robotics/sugarcoat), you can configure a sophisticated, multi-fallback navigation system in a single, readable Python script.
- Clean Architecture: Core algorithms (Kompass Core) are decoupled from ROS wrappers. This ensures that upgrading ROS distributions (or switching middleware) doesn't break your underlying navigation logic.
- Extensible: The modular design allows the community to easily plug in new planners or controllers in Python for prototyping or C++ for production.
```
## File: install.md
```markdown
# Installation
## Prerequisites
:::{admonition} *ROS2* Required
:class: note
Kompass supports all *ROS2* distributions from **Foxy** up to **Rolling**.
Please ensure you have a working [ROS2 installation](https://docs.ros.org/) before proceeding.
:::
## Install Core Engine
[`kompass-core`](https://github.com/automatika-robotics/kompass-core) is the package providing the highly optimized implementations of planning and control algorithms for Kompass. Choose the installation method that matches your hardware needs.
::::{tab-set}
:::{tab-item} {material-regular}`speed;1.2em;sd-text-primary` GPU Support (Recommended)
:sync: gpu
**Best for production robots and high-performance simulation.**
This installs [AdaptiveCPP](https://github.com/AdaptiveCpp/AdaptiveCpp) and compiles the core engine from source to fully utilize your specific GPU (Nvidia, AMD, Intel, etc.).
Run the following on any Ubuntu-based machine (including Jetson):
```bash
curl -sSL https://raw.githubusercontent.com/automatika-robotics/kompass-core/refs/heads/main/build_dependencies/install_gpu.sh | bash
```
*It is good practice to [read the script](https://github.com/automatika-robotics/kompass-core/blob/main/build_dependencies/install_gpu.sh) before running it.*
:::
:::{tab-item} {material-regular}`terminal;1.2em;sd-text-primary` Standard (CPU)
:sync: cpu
**Best for quick testing or lightweight environments.**
Installs the standard Python package along with the C++ bindings via `pip`.
```bash
pip install kompass-core
```
:::
::::
## Install Kompass
Once the core engine is ready, install Kompass ROS2 packages (`kompass` and `kompass_interfaces`).
::::{tab-set}
:::{tab-item} {material-regular}`inventory_2;1.2em;sd-text-primary` Binary
:sync: binary
**Option A: Install via APT (Recommended)**
```bash
sudo apt install ros-$ROS_DISTRO-kompass
```
**Option B: Manual `.deb` Install**
Download the specific version for your architecture from [GitHub Releases](https://github.com/automatika-robotics/kompass/releases).
```bash
# Replace variables with your version/distro/arch
sudo dpkg -i ros-$ROS_DISTRO-kompass-interfaces_$version$DISTRO_$ARCHITECTURE.deb
sudo dpkg -i ros-$ROS_DISTRO-kompass_$version$DISTRO_$ARCHITECTURE.deb
```
:::
:::{tab-item} {material-regular}`build;1.2em;sd-text-primary` Source (Advanced)
:sync: source
**Best for contributors or debugging.**
```shell
# 1. Create workspace
mkdir -p kompass_ws/src
cd kompass_ws/src
# 2. Clone repositories
git clone https://github.com/automatika-robotics/sugarcoat
git clone https://github.com/automatika-robotics/kompass
# 3. Install dependencies
rosdep update
rosdep install -y --from-paths . --ignore-src
# 4. Build
cd ..
colcon build
source install/setup.bash
```
:::
::::
## Next Steps
Now that you have Kompass installed, verify your setup or jump straight into the tutorials.
::::{grid} 1 2 2 2
:gutter: 2
:::{grid-item-card} {material-regular}`rocket_launch;1.2em;sd-text-primary` Quick Start
:link: tutorials/quick_start
:link-type: doc
:class-card: sugar-card
Run your first navigation task in simulation
:::
:::{grid-item-card} {material-regular}`menu_book;1.2em;sd-text-primary` Tutorials
:link: tutorials/index
:link-type: doc
:class-card: sugar-card
Learn how to create custom navigation capabilities with Kompass
:::
::::
```
## File: cli.md
```markdown
# Kompass CLI
**Inspect core algorithms and hardware support.**
The Kompass CLI provides a quick way to inspect available algorithms, view default parameters, and check for compatible hardware accelerators (GPUs) directly from the terminal without launching the full stack.
## Command Reference
### Algorithm Inspection
Commands to query the available Planner and Controller plugins.
| Command | Description |
| :--- | :--- |
| **`controller list`** | List all available **control** algorithms (e.g., `DWA`, `Stanley`). |
| **`controller params <algorithm>`** | Show the default configuration parameters for a specific control algorithm. |
| **`planner list`** | List all available **planning** algorithms (e.g., `RRTstar`, `PRM`). |
| **`planner params <algorithm>`** | Show the default configuration parameters for a specific planning algorithm. |
### System & Hardware
Commands to check the host system capabilities.
| Command | Description |
| :--- | :--- |
| **`accelerators_support`** | List all **SYCL-compatible** devices (GPUs/CPUs) detected on the system. Useful for verifying if the [Local Mapper](./navigation/mapper.md) can use hardware acceleration. |
| **`info`** | Display generic CLI usage examples and links to documentation. |
## Usage Examples
Run these commands using the standard `ros2 run` syntax:
```bash
# 1. View Help
ros2 run kompass cli --help
# 2. Check for GPU Support
ros2 run kompass cli accelerators_support
# 3. List available Controllers
ros2 run kompass cli controller list
# 4. Inspect DWA Parameters
ros2 run kompass cli controller params DWA
# 5. Inspect AITstar Planner Parameters
ros2 run kompass cli planner params AITstar
```
```
## File: tutorials/quick_start.md
```markdown
# Quick Start
Ready to see Kompass in action? Choose your preferred simulation environment to run your first navigation recipe:
::::{grid} 1 2 2 2
:gutter: 3
:::{grid-item-card} {material-regular}`videogame_asset;1.5em;sd-text-primary` Webots Simulator
:link: quick_start_webots
:link-type: doc
:class-card: sugar-card
Get started with the high-fidelity **Webots** simulator. Perfect for realistic physics and sensor data.
:::
:::{grid-item-card} {material-regular}`view_in_ar;1.5em;sd-text-primary` Gazebo Simulator
:link: quick_start_gazebo
:link-type: doc
:class-card: sugar-card
Launch your navigation stack in **Gazebo**. The standard for ROS simulation and legacy support.
:::
::::
```{toctree}
:maxdepth: 1
:caption: Quick Start
:hidden:
quick_start_webots
quick_start_gazebo
```
```
## File: tutorials/quick_start_gazebo.md
```markdown
# Quick Start: Gazebo Simulator
**Launch a full autonomous navigation stack in under 5 minutes.**
In this tutorial, we use a single Python script, a "Recipe", to build a complete point-to-point navigation system. We'll use the [Gazebo](https://gazebosim.org/docs/latest/getstarted/) simulator and a [Turtlebot3 Waffle Pi](https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/#notices) to demonstrate how Kompass components link together.
---
## 1. Install Gazebo
If you haven't already, install the default Gazebo version for your ROS distribution (replace `${ROS_DISTRO}` with `humble`, `jazzy`, or `rolling`):
```bash
sudo apt-get install ros-${ROS_DISTRO}-ros-gz
```
---
## 2. Prepare the Environment
To make things easy, we created **kompass_sim**, a package with ready-to-launch simulation environments.
1. **Build the Simulation:**
Clone and build the simulator support package in your ROS2 workspace:
```bash
git clone https://github.com/automatika-robotics/kompass-sim.git
cd .. && rosdep install --from-paths src --ignore-src -r -y
colcon build --packages-select kompass_sim
source install/setup.bash
```
2. **Set the Model:**
Tell the simulation to use the "Waffle Pi" model:
```bash
export TURTLEBOT3_MODEL=waffle_pi
```
3. **Launch Gazebo:**
Start the Turtlebot3 house simulation. This will bring up Gazebo, RViz, and the localization nodes:
```bash
ros2 launch kompass_sim gazebo_turtlebot3_house.launch.py
```
```{figure} ../_static/images/gazebo_turtlebot3_sim.png
:alt: Gazebo Simulation: Turtlebot3 Waffle Pi in a house
:align: center
Gazebo Simulation: Turtlebot3 Waffle Pi in a house
```
---
## 3. The Navigation Recipe
The power of Kompass lies in its Python API. Instead of complex XML/YAML launch files, you define your navigation logic in a clean script.
**Create a file named `quick_start_gz.py` and paste the following code:**
```python
import numpy as np
import os
from ament_index_python.packages import get_package_share_directory
# IMPORT ROBOT CONFIG PRIMITIVES
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
RobotFrames,
)
# IMPORT KOMPASS COMPONENTS
from kompass.components import (
Controller,
DriveManager,
DriveManagerConfig,
Planner,
PlannerConfig,
LocalMapper,
LocalMapperConfig,
MapServer,
MapServerConfig,
TopicsKeys,
)
# IMPORT KOMPASS ALGORITHM CONFIG PRIMITIVES
from kompass.control import ControllersID, MapConfig
# IMPORT KOMPASS ROS PRIMITIVES
from kompass.ros import Topic, Launcher, Event, Action, actions
kompass_sim_dir = get_package_share_directory(package_name="kompass_sim")
# Setup your robot configuration (Turtlebot3 Waffle Pi)
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.BOX, # Waffle Pi is rectangular
geometry_params=np.array([0.3, 0.3, 0.2]), # Length, Width, Height
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.26, max_acc=1.0, max_decel=1.0),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=1.8, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3
),
)
# Configure the Global Planner
planner_config = PlannerConfig(loop_rate=1.0)
planner = Planner(component_name="planner", config=planner_config)
planner.run_type = "Timed"
# Configure the motion controller
controller = Controller(component_name="controller")
controller.algorithm = ControllersID.PURE_PURSUIT
controller.direct_sensor = False  # Get local perception from a "map" (the local mapper) instead of raw sensor data
# Configure the Drive Manager (Direct commands sending to robot)
driver_config = DriveManagerConfig(
critical_zone_distance=0.05,
critical_zone_angle=90.0,
slowdown_zone_distance=0.3,
)
driver = DriveManager(component_name="drive_manager", config=driver_config)
# Handle Twist/TwistStamped compatibility
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type: str = "TwistStamped"
else:
cmd_msg_type = "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
# Configure a Local Mapper
local_mapper_config = LocalMapperConfig(
map_params=MapConfig(width=3.0, height=3.0, resolution=0.05)
)
local_mapper = LocalMapper(component_name="mapper", config=local_mapper_config)
# Configure the global Map Server
# Note: We use the 'house' map to match the Gazebo world
map_file = os.path.join(kompass_sim_dir, "maps", "turtlebot3_gazebo_house.yaml")
map_server_config = MapServerConfig(
loop_rate=1.0,
map_file_path=map_file,
grid_resolution=0.5,
pc_publish_row=False,
)
map_server = MapServer(component_name="global_map_server", config=map_server_config)
# Setup the launcher
launcher = Launcher()
# Add Kompass components
launcher.kompass(
components=[map_server, controller, planner, driver, local_mapper],
multiprocessing=True,
)
# Get odom from localizer filtered odom for all components
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
# Set the robot config and frames
launcher.robot = my_robot
# Standard Gazebo TB3 frames: world=map, odom=odom, scan=base_scan
launcher.frames = RobotFrames(world="map", odom="odom", scan="base_scan")
# Enable the UI
# Outputs: Static Map, Global Plan, Robot Odometry
launcher.enable_ui(
outputs=[
map_server.get_out_topic(TopicsKeys.GLOBAL_MAP),
odom_topic,
planner.get_out_topic(TopicsKeys.GLOBAL_PLAN),
],
)
# Run the Recipe
launcher.bringup()
```
---
## 4. Run and Navigate
Open a new terminal and run your recipe:
```bash
python3 quick_start_gz.py
```
You will see the components starting up in the terminal. Once ready, you have two ways to control the robot.
### Option A: The Kompass Web UI
The recipe includes `launcher.enable_ui(...)`, which automatically spins up a lightweight web interface for monitoring and control.
1. **Check Terminal:** Look for a log message indicating the UI URL: [http://0.0.0.0:5001](http://0.0.0.0:5001).
2. **Open Browser:** Navigate to that URL.
3. **Send Goal:** You will see the map and the robot's live position. Simply click the publish point button and **Click anywhere on the map** to trigger the Planner and send the robot to that location.
### Option B: RViz
If you prefer the standard ROS tools:
1. Go to the **RViz** window launched in Step 1.
2. Select the **Publish Point** tool (sometimes called `Clicked Point`) from the top toolbar.
3. Click anywhere on the map grid.
4. The robot will plan a path (Blue Line) and immediately start driving.
```{figure} ../_static/images/gazebo_turtlebot3_rviz.png
:alt: RViz Navigation View
:align: center
RViz View: Blue line shows the global path, red arrow shows the local command.
```
---
## What just happened?
* **Customization**: We adapted the robot configuration (`RobotConfig`) to match the Waffle Pi's rectangular geometry and adjusted the `RobotFrames` to match Gazebo's standard output (`base_scan`).
* **Launcher**: Managed the lifecycle of the entire stack.
* **Perception**: The Local Mapper is processing the Gazebo laser scan to provide obstacle avoidance data to the Controller.
:::{tip}
Check the [Point Navigation Guide](point_navigation.md) for a deep dive into this recipe.
:::
```
## File: tutorials/quick_start_webots.md
```markdown
# Quick Start: Webots Simulator
**Launch a full autonomous navigation stack in under 5 minutes.**
In this tutorial, we use a single Python script, a "Recipe", to build a complete point-to-point navigation system. We'll use the [Webots](https://github.com/cyberbotics/webots_ros2) simulator and a [Turtlebot3](https://emanual.robotis.com/docs/en/platform/turtlebot3/overview/#notices) to demonstrate how Kompass components link together.
---
## 1. Prepare the Environment
To make things easy, we created **kompass_sim**, a package with ready-to-launch simulation environments.
1. **Build the Simulation:**
Clone and build the simulator support package in your ROS2 workspace:
```bash
git clone https://github.com/automatika-robotics/kompass-sim.git
cd .. && rosdep install --from-paths src --ignore-src -r -y
colcon build --packages-select kompass_sim
source install/setup.bash
```
2. **Launch Webots:**
Start the Turtlebot3 simulation world. This will bring up Webots, RViz, and the robot localization nodes:
```bash
ros2 launch kompass_sim webots_turtlebot3.launch.py
```
```{figure} ../_static/images/webots_turtlebot3.png
:alt: Webots Simulation: Turtlebot3 in a house environment
:align: center
Webots Simulation: Turtlebot3 in a house environment
```
---
## 2. The Navigation Recipe
The power of Kompass lies in its Python API. Instead of complex XML/YAML launch files, you define your navigation logic in a clean script.
**Create a file named `quick_start.py` and paste the following code:**
```python
import numpy as np
import os
from ament_index_python.packages import (
get_package_share_directory,
)
# IMPORT ROBOT CONFIG PRIMITIVES
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
RobotFrames,
)
# IMPORT KOMPASS COMPONENTS
from kompass.components import (
Controller,
DriveManager,
DriveManagerConfig,
Planner,
PlannerConfig,
LocalMapper,
LocalMapperConfig,
MapServer,
MapServerConfig,
TopicsKeys,
)
# IMPORT KOMPASS ALGORITHM CONFIG PRIMITIVES
from kompass.control import ControllersID, MapConfig
# IMPORT KOMPASS ROS PRIMITIVES
from kompass.ros import Topic, Launcher, Event, Action, actions
kompass_sim_dir = get_package_share_directory(package_name="kompass_sim")
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3
),
)
# Configure the Global Planner
planner_config = PlannerConfig(loop_rate=1.0)
planner = Planner(component_name="planner", config=planner_config)
planner.run_type = "Timed"
# Configure the motion controller
controller = Controller(component_name="controller")
controller.algorithm = ControllersID.PURE_PURSUIT
controller.direct_sensor = False  # Get local perception from a "map" (the local mapper) instead of raw sensor data
# Configure the Drive Manager (Direct commands sending to robot)
driver_config = DriveManagerConfig(
critical_zone_distance=0.05,
critical_zone_angle=90.0,
slowdown_zone_distance=0.3,
)
driver = DriveManager(component_name="drive_manager", config=driver_config)
# Publish Twist or TwistStamped from the DriveManager based on the distribution
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type: str = "TwistStamped"
else:
cmd_msg_type = "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
# Configure a Local Mapper
local_mapper_config = LocalMapperConfig(
map_params=MapConfig(width=3.0, height=3.0, resolution=0.05)
)
local_mapper = LocalMapper(component_name="mapper", config=local_mapper_config)
# Configure the global Map Server
map_file = os.path.join(kompass_sim_dir, "maps", "turtlebot3_webots.yaml")
map_server_config = MapServerConfig(
loop_rate=1.0,
map_file_path=map_file, # Path to a 2D map yaml file or a point cloud file
grid_resolution=0.5,
pc_publish_row=False,
)
map_server = MapServer(component_name="global_map_server", config=map_server_config)
# Setup the launcher
launcher = Launcher()
# Add Kompass components
launcher.kompass(
components=[map_server, controller, planner, driver, local_mapper],
multiprocessing=True,
)
# Get odom from localizer filtered odom for all components
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
# Set the robot config for all components
launcher.robot = my_robot
launcher.frames = RobotFrames(world="map", odom="map", scan="LDS-01")
# Enable the UI
# Outputs: Static Map, Global Plan, Robot Odometry
launcher.enable_ui(
outputs=[
map_server.get_out_topic(TopicsKeys.GLOBAL_MAP),
odom_topic,
planner.get_out_topic(TopicsKeys.GLOBAL_PLAN),
],
)
# Run the Recipe
launcher.bringup()
```
---
## 3. Run and Navigate
Open a new terminal and run your recipe:
```bash
python3 quick_start.py
```
You will see the components starting up in the terminal. Once ready, you have two ways to control the robot.
### Option A: The Kompass Web UI
The recipe includes `launcher.enable_ui(...)`, which automatically spins up a lightweight web interface for monitoring and control.
1. **Check Terminal:** Look for a log message indicating the UI URL: [http://0.0.0.0:5001](http://0.0.0.0:5001).
2. **Open Browser:** Navigate to that URL.
3. **Send Goal:** You will see the map and the robot's live position. Simply click the publish point button and **Click anywhere on the map** to trigger the Planner and send the robot to that location.
```{figure} ../_static/gif/ui_navigation.gif
:alt: UI Navigation View
:align: center
Kompass UI View: Point Navigation Recipe
```
### Option B: RViz
If you prefer the standard ROS tools:
1. Go to the **RViz** window launched in Step 1.
2. Select the **Publish Point** tool (sometimes called `Clicked Point`) from the top toolbar.
3. Click anywhere on the map grid.
4. The robot will plan a path (Blue Line) and immediately start driving.
```{figure} ../_static/images/rviz_webots_turtlebot3.png
:alt: RViz Navigation View
:align: center
RViz View: Blue line shows the global path, red arrow shows the local command.
```
---
## What just happened?
* **Components**: You configured your robot and the navigation components directly in your Python recipe.
* **Launcher**: Automatically managed the lifecycle of five ROS2 nodes across multiple processes.
* **Web UI**: Visualized the map, plan, and odometry topics instantly without installing extra frontend tools.
:::{tip}
Check the [Point Navigation Guide](point_navigation.md) for a deep dive into this recipe.
:::
```
## File: tutorials/configuration.md
```markdown
# Configuring Components Your Way
Kompass is built for flexibility — and that starts with how you configure your components.
Whether you're scripting in Python, editing clean and readable YAML, crafting elegant TOML files, or piping in JSON from a toolchain, **Kompass lets you do it your way**.
No more rigid formats or boilerplate structures. Just straightforward, expressive configuration — however you like to write it.
Here's how to define your components using:
- 🐍 [Python API](#-python-api)
- 📄 [YAML](#-yaml)
- 🍅 [TOML](#-toml)
- 🟨 [JSON](#-json)
Pick your flavor. Plug it in. Go.
## 🐍 Python API
Use the full power of the Pythonic API of Kompass to configure your components when you want dynamic logic, computation, or tighter control.
```python
from kompass.components import Planner, PlannerConfig
from kompass.ros import Topic
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
RobotFrames
)
import numpy as np
import math
# Define your robot's physical and control characteristics
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE, # Type of robot motion model
geometry_type=RobotGeometry.Type.CYLINDER, # Shape of the robot
geometry_params=np.array([0.1, 0.3]), # Diameter and height of the cylinder
ctrl_vx_limits=LinearCtrlLimits( # Linear velocity constraints
max_vel=0.4,
max_acc=1.5,
max_decel=2.5
),
ctrl_omega_limits=AngularCtrlLimits( # Angular velocity constraints
max_vel=0.4,
max_acc=2.0,
max_decel=2.0,
max_steer=math.pi / 3 # Steering angle limit (radians)
),
)
# Define the robot's coordinates frames
my_frames = RobotFrames(
world="map",
odom="odom",
robot_base="body",
scan="lidar_link"
)
# Create the planner config using your robot setup
config = PlannerConfig(
robot=my_robot,
loop_rate=1.0
)
# Instantiate the Planner component
planner = Planner(
component_name="planner",
config=config
)
# Additionally configure the component's inputs or outputs
planner.inputs(
map_layer=Topic(name="/map", msg_type="OccupancyGrid"),
goal_point=Topic(name="/clicked_point", msg_type="PointStamped")
)
```
## 📄 YAML
Similar to traditional ROS2 launch, you can still maintain all your configuration parameters in a YAML file.
```yaml
/**:
# Frames and Robot configuration can be passed under the component name
# They can also be kept here under 'common config' to be used for multiple components
frames:
robot_base: "body"
odom: "odom"
world: "map"
scan: "lidar_link"
robot:
model_type: "DIFFERENTIAL_DRIVE"
geometry_type: "CYLINDER"
geometry_params: [0.1, 0.3]
ctrl_vx_limits:
max_vel: 0.4
max_acc: 1.5
max_decel: 2.5
ctrl_omega_limits:
max_vel: 0.4
max_acc: 2.0
max_decel: 2.0
max_steer: 1.0472 # ≈ π / 3
planner:
inputs:
map_layer:
name: "/map"
msg_type: "OccupancyGrid"
goal_point:
name: "/clicked_point"
msg_type: "PointStamped"
loop_rate: 1.0
```
## 🍅 TOML
Not a fan of YAML? No worries — Kompass lets you configure your components using TOML too.
TOML offers clear structure and excellent tooling support, making it perfect for clean, maintainable configs.
```toml
["/**".frames]
robot_base = "body"
odom = "odom"
world = "map"
scan = "lidar_link"
["/**".robot]
model_type = "DIFFERENTIAL_DRIVE"
geometry_type = "CYLINDER"
geometry_params = [0.1, 0.3]
["/**".robot.ctrl_vx_limits]
max_vel = 0.4
max_acc = 1.5
max_decel = 2.5
["/**".robot.ctrl_omega_limits]
max_vel = 0.4
max_acc = 2.0
max_decel = 2.0
max_steer = 1.0472 # ≈ π / 3
[planner]
loop_rate = 1.0
[planner.inputs.map_layer]
name = "/map"
msg_type = "OccupancyGrid"
[planner.inputs.goal_point]
name = "/clicked_point"
msg_type = "PointStamped"
```
## 🟨 JSON
Prefer curly braces? Or looking to pipe configs from an ML model or external toolchain?
JSON is machine-friendly and widely supported — perfect for automating your Kompass configuration with generated files.
```json
{
"/**": {
"frames": {
"robot_base": "body",
"odom": "odom",
"world": "map",
"scan": "lidar_link"
},
"robot": {
"model_type": "DIFFERENTIAL_DRIVE",
"geometry_type": "CYLINDER",
"geometry_params": [0.1, 0.3],
"ctrl_vx_limits": {
"max_vel": 0.4,
"max_acc": 1.5,
"max_decel": 2.5
},
"ctrl_omega_limits": {
"max_vel": 0.4,
"max_acc": 2.0,
"max_decel": 2.0,
"max_steer": 1.0472
}
}
},
"planner": {
"loop_rate": 1.0,
"inputs": {
"map_layer": {
"name": "/map",
"msg_type": "OccupancyGrid"
},
"goal_point": {
"name": "/clicked_point",
"msg_type": "PointStamped"
}
}
}
}
```
```{note}
Make sure to pass your config file to the component on initialization or to the Launcher.
```
```
## File: navigation/robot.md
```markdown
# Robot Configuration
Before Kompass can drive your robot, it needs to understand its physical constraints. You define this "Digital Twin" using the `RobotConfig` object, which aggregates the Motion Model, Geometry, and Control Limits.
```python
import numpy as np
from kompass.robot import RobotConfig, RobotType, RobotGeometry
# Example: Defining a simple box-shaped Ackermann robot
robot_config = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([1.0, 1.0, 1.0]) # x, y, z
)
```
## Motion Models
Kompass supports three distinct kinematic models. Choose the one that matches your robot's drivetrain.
- {material-regular}`directions_car;1.5em;sd-text-primary` Ackermann - **Car-Like Vehicles**
Non-holonomic constraints (bicycle model). The robot has a limited steering angle and cannot rotate in place.
- {material-regular}`sync;1.5em;sd-text-primary` Differential - **Two-Wheeled Robots**
Robots like the Turtlebot. Capable of forward/backward motion and zero-radius rotation (spinning in place).
- {material-regular}`open_with;1.5em;sd-text-primary` Omni - **Holonomic Robots**
Robots like Mecanum-wheel platforms or Quadrupeds. Capable of instantaneous motion in any direction (x, y) and rotation.
## Robot Geometry
The geometry defines the collision volume of the robot. This is used by the local planner for obstacle avoidance.
The `geometry_params` argument expects a **NumPy array** containing specific dimensions based on the selected type:
| Type | Parameters (np.array) | Description |
| --- | --- | --- |
| **BOX** | `[length, width, height]` | Axis-aligned box. |
| **CYLINDER** | `[radius, length_z]` | Vertical cylinder. |
| **SPHERE** | `[radius]` | Perfect sphere. |
| **ELLIPSOID** | `[axis_x, axis_y, axis_z]` | Axis-aligned ellipsoid. |
| **CAPSULE** | `[radius, length_z]` | Cylinder with hemispherical ends. |
| **CONE** | `[radius, length_z]` | Vertical cone. |
**Configuration Example:**
```python
# A cylinder robot (Radius=0.5m, Height=1.0m)
cylinder_robot_config = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.5, 1.0])
)
# A box robot (Length=0.5, Width=0.5, Height=1.0m)
box_robot_config = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([0.5, 0.5, 1.0])
)
```
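The parameter counts from the table above can be sanity-checked before building a `RobotConfig`. The helper below is a hypothetical illustration (not part of the Kompass API), assuming the dimension counts listed in the table:

```python
import numpy as np

# Expected number of geometry_params entries per geometry type,
# per the table above (hypothetical helper, not part of kompass)
EXPECTED_PARAM_COUNT = {
    "BOX": 3,        # [length, width, height]
    "CYLINDER": 2,   # [radius, length_z]
    "SPHERE": 1,     # [radius]
    "ELLIPSOID": 3,  # [axis_x, axis_y, axis_z]
    "CAPSULE": 2,    # [radius, length_z]
    "CONE": 2,       # [radius, length_z]
}

def check_geometry_params(geometry_type: str, params: np.ndarray) -> bool:
    """Return True if params has the dimension count the table specifies
    and all dimensions are strictly positive."""
    expected = EXPECTED_PARAM_COUNT[geometry_type]
    return params.shape == (expected,) and bool(np.all(params > 0))

# Example: a cylinder needs exactly [radius, length_z]
assert check_geometry_params("CYLINDER", np.array([0.5, 1.0]))
```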
## Control Limits
Safety is paramount. You must explicitly define the kinematic limits for linear and angular velocities.
Kompass separates **Acceleration** limits from **Deceleration** limits. This allows you to configure a "gentle" acceleration for smooth motion, but a "hard" deceleration for emergency braking.
```python
from kompass.robot import LinearCtrlLimits, AngularCtrlLimits, RobotConfig, RobotType, RobotGeometry
import numpy as np
# 1. Linear Limits (Forward/Backward)
# Max Speed: 1.0 m/s
# Acceleration: 1.5 m/s^2 (Smooth start)
# Deceleration: 2.5 m/s^2 (Fast stop)
ctrl_vx = LinearCtrlLimits(max_vel=1.0, max_acc=1.5, max_decel=2.5)
# 2. Angular Limits (Rotation)
# Max Steer is only used for Ackermann robots
ctrl_omega = AngularCtrlLimits(
max_vel=1.0,
max_acc=2.0,
max_decel=2.0,
max_steer=np.pi / 3
)
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=ctrl_vx,
ctrl_omega_limits=ctrl_omega,
)
```
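The effect of separate acceleration and deceleration limits can be sketched in a few lines of plain Python (an illustration of the concept only, not the kompass-core implementation): speeding up is rate-limited by `max_acc`, while braking may use the larger `max_decel`:

```python
def limit_velocity_change(v_current, v_target, max_acc, max_decel, dt):
    """Clamp a requested velocity change to asymmetric accel/decel limits.

    Accelerating is bounded by max_acc; decelerating may use the
    typically larger max_decel. Sketch for forward motion (v >= 0) only.
    """
    dv = v_target - v_current
    if dv > 0:                       # speeding up: gentle
        dv = min(dv, max_acc * dt)
    else:                            # braking: allowed to be harder
        dv = max(dv, -max_decel * dt)
    return v_current + dv

# With max_acc=1.5 and max_decel=2.5 (as above) and dt=0.1 s:
v_up = limit_velocity_change(0.0, 1.0, 1.5, 2.5, 0.1)     # accel-limited step up
v_down = limit_velocity_change(1.0, 0.0, 1.5, 2.5, 0.1)   # larger step down
```

Over a full trajectory this produces the desired asymmetry: a gradual ramp-up, but a much shorter stopping distance when braking is required.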
:::{tip}
For Ackermann robots, `ctrl_omega_limits.max_steer` defines the maximum physical steering angle of the wheels in radians.
:::
## Coordinate Frames
Kompass needs to know the names of your TF frames to perform lookups. You configure this using the `RobotFrames` object.
The components will automatically subscribe to `/tf` and `/tf_static` to track these frames.
```python
from kompass.robot import RobotFrames
frames = RobotFrames(
world='map', # The fixed global reference frame
odom='odom', # The drift-prone odometry frame
robot_base='base_link', # The center of the robot
scan='scan', # Lidar frame
rgb='camera/rgb', # RGB Camera frame
depth='camera/depth' # Depth Camera frame
)
```
| Frame | Description |
| --- | --- |
| **world** | The global reference for path planning (usually `map`). |
| **odom** | The continuous reference for local control loops. |
| **robot_base** | The physical center of the robot. All geometry is relative to this. |
| **Sensors** | `scan`, `rgb`, `depth` frames are used to transform sensor data into the robot's frame. |
```
## File: navigation/driver.md
```markdown
# Drive Manager
**Safety enforcement and command smoothing.**
The [DriveManager](../apidocs/kompass/kompass.components.drive_manager.md) is the final gatekeeper before commands reach your robot's low-level interfaces. Its primary job is to ensure that every command falls within the robot's physical limits, satisfies smoothness constraints, and does not lead to a collision.
It acts as a safety shield, intercepting velocity commands from the Controller and applying **Emergency Stops** or **Slowdowns** based on immediate sensor data.
## Safety Layers
The Drive Manager implements a multi-stage safety pipeline.
```{list-table}
:widths: 30 70
* - {material-regular}`pan_tool;1.5em;sd-text-primary` Emergency Stop
- **Critical Zone**. Checks proximity sensors directly. If an obstacle enters the configured **Safety Distance** (m) and **Angle** (rad), the robot stops immediately to prevent collision.
* - {material-regular}`speed;1.5em;sd-text-primary` Dynamic Slowdown
- **Warning Zone**. If an obstacle enters the **Slowdown Zone**, the robot's velocity is proportionally reduced, allowing for smoother reactions than a hard stop.
* - {material-regular}`equalizer;1.5em;sd-text-primary` Control Limiting
- **Kinematic Constraints**. Clamps incoming velocity and acceleration commands to the robot's physical limits (max vel, max acc) to prevent hardware stress.
* - {material-regular}`waves;1.5em;sd-text-primary` Smoothing
- **Jerk Control**. Applies smoothing filters (e.g., low-pass) to incoming commands to prevent jerky movements and wheel slip.
```
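The first two layers can be illustrated with a short, hypothetical sketch (concept only; the real checks are implemented in kompass-core and account for robot geometry and sensor angles): inside the critical zone the command is zeroed, inside the slowdown zone it is scaled proportionally with distance:

```python
def apply_safety_zones(v_cmd, obstacle_dist, critical_dist=0.1, slowdown_dist=0.3):
    """Scale a forward velocity command by the nearest obstacle distance.

    - obstacle inside the critical zone  -> emergency stop (0.0)
    - obstacle inside the slowdown zone  -> proportional slowdown
    - otherwise                          -> command passes through unchanged
    """
    if obstacle_dist <= critical_dist:
        return 0.0
    if obstacle_dist < slowdown_dist:
        # Linear ramp from 0 at the critical boundary to full speed
        scale = (obstacle_dist - critical_dist) / (slowdown_dist - critical_dist)
        return v_cmd * scale
    return v_cmd

assert apply_safety_zones(0.5, obstacle_dist=0.05) == 0.0   # emergency stop
```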
```{figure} ../_static/images/diagrams/drive_manager_dark.png
:class: dark-only
:alt: Emergency Zone & Slowdown Zone
:align: center
:width: 70%
```
```{figure} ../_static/images/diagrams/drive_manager_light.png
:class: light-only
:alt: Emergency Zone & Slowdown Zone
:align: center
:width: 70%
Emergency Zone & Slowdown Zone
```
:::{admonition} High-Performance Core
:class: tip
Critical and Slowdown Zone checking is implemented in C++ via [kompass-core](https://github.com/automatika-robotics/kompass-core). The implementation supports both **GPU** and **CPU** acceleration (defaults to GPU if available) for minimal latency.
:::
## Built-in Actions
The Drive Manager provides built-in behaviors for direct control and recovery. These can be triggered via [Events](../tutorials/events_actions.md).
- {material-regular}`arrow_upward` move_forward - Moves the robot forward for `max_distance` meters, if the path is clear.
- {material-regular}`arrow_downward` move_backward - Moves the robot backward for `max_distance` meters, if the path is clear.
- {material-regular}`rotate_right` rotate_in_place - Rotates the robot for `max_rotation` radians, checking the safety margin.
- {material-regular}`lock_open` move_to_unblock - **Recovery Behavior.** Automatically attempts to move forward, backward, or rotate to free the robot from a collision state or blockage.
:::{admonition} Sensor Requirement
:class: warning
All movement actions require active $360^\circ$ sensor data (`LaserScan` or `PointCloud2`) to verify that the movement direction is collision-free.
:::
## Interface
### Inputs
The Drive Manager subscribes to commands and raw sensor data.
```{list-table}
:widths: 15 35 15 35
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - command
- [`geometry_msgs.msg.Twist`](http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/Twist.html), [`geometry_msgs.msg.TwistStamped`](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
- 1
- `/control` (`Twist`)
* - multi_command
- `TwistArray`
- 1
- `/control_list`
* - sensor_data
- [`sensor_msgs.msg.LaserScan`](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/LaserScan.html), `std_msgs.msg.Float64`, `std_msgs.msg.Float32`, [`sensor_msgs.msg.PointCloud2`](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html)
- 1+ (Up to 10)
- `/scan` (`LaserScan`)
* - location
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)
- 1
- `/odom` (`Odometry`)
```
### Outputs
The processed commands sent to the hardware.
```{list-table}
:widths: 15 35 15 35
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - robot_command
- [`geometry_msgs.msg.Twist`](http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/Twist.html), [`geometry_msgs.msg.TwistStamped`](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
- 1
- `/cmd_vel` (`Twist`)
* - emergency_stop
- `Bool`
- 1
- `/emergency_stop`
```
## Usage Example
```python
from kompass.components import DriveManager, DriveManagerConfig
from kompass.ros import Topic
# 1. Configuration
# Define safety zones and loop behavior
my_config = DriveManagerConfig(
closed_loop=True, # Check robot state feedback
critical_zone_distance=0.1, # Stop if obstacle < 10cm
slowdown_zone_distance=0.3, # Slow down if obstacle < 30cm
critical_zone_angle=90.0 # Check 90 degrees cone in front
)
# 2. Instantiate
driver = DriveManager(component_name="driver", config=my_config)
# 3. Remap Outputs
# Send safe commands to your specific robot topic
driver.outputs(robot_command=Topic(name='/my_robot_cmd', msg_type='TwistStamped'))
```
## See Next
Learn how to trigger the unblocking behaviors automatically using Events.
:::{button-link} ../tutorials/events_index.html
:color: primary
:ref-type: doc
:outline:
Event-Driven Recovery Tutorial →
:::
```
## File: navigation/control.md
```markdown
# Controller
**Motion control and dynamic obstacle avoidance.**
The Controller is the real-time "pilot" of your robot. While the [Planner](./path_planning.md) looks ahead to find a global route, the Controller deals with the immediate reality: it calculates velocity commands to follow the global path (path following) or a global target point (object following), while reacting to dynamic obstacles and adhering to kinematic constraints.
It supports modular **Plugins** allowing you to switch between different control strategies (e.g., *Pure Pursuit* vs *DWA* vs *Visual Servoing*) via configuration.
## Run Types
The Controller typically runs at a high frequency (10-50 Hz) to ensure smooth motion. Set the run type directly via the Controller's `run_type` property.
```{list-table}
:widths: 20 80
* - **{material-regular}`schedule;1.2em;sd-text-primary` Timed**
- **Periodic Control Loop.** Computes a new velocity command periodically if all necessary inputs are available.
* - **{material-regular}`hourglass_top;1.2em;sd-text-primary` Action Server**
- **Goal Tracking.** Offers a [`ControlPath`](https://github.com/automatika-robotics/kompass/blob/main/kompass_interfaces/action/ControlPath.action) ROS2 Action. Continuously computes control commands until the goal is reached or the action is preempted.
```
## Interface
### Inputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - plan
- [`nav_msgs.msg.Path`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Path.html)
- 1
- `/plan`
* - location
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)
- 1
- `/odom` (`Odometry`)
* - sensor_data
- [`sensor_msgs.msg.LaserScan`](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/LaserScan.html), [`sensor_msgs.msg.PointCloud2`](http://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html)
- 1
- `/scan` (`LaserScan`)
* - local_map
- [`nav_msgs.msg.OccupancyGrid`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/OccupancyGrid.html)
- 1
- `/local_map/occupancy_layer`
* - vision_detections
- [`automatika_embodied_agents.msg.Trackings`](https://github.com/automatika-robotics/ros-agents/tree/main/agents_interfaces/msg), [`automatika_embodied_agents.msg.Detections2D`](https://github.com/automatika-robotics/ros-agents/tree/main/agents_interfaces/msg)
- 1
- None. Must be provided to use vision target tracking
```
```{tip}
Provide a `vision_detections` input topic to the Controller to activate the creation of a vision-based target-following action server. See [this example](../tutorials/vision_tracking.md) for more details.
```
### Outputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - command
- [`geometry_msgs.msg.Twist`](http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/Twist.html), [`geometry_msgs.msg.TwistStamped`](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
- 1
- `/control` (`Twist`)
* - multi_command
- [`kompass_interfaces.msg.TwistArray`](https://github.com/automatika-robotics/kompass/tree/main/kompass_interfaces/msg)
- 1
- `/control_list`
* - interpolation
- [`nav_msgs.msg.Path`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Path.html)
- 1
- `/interpolated_path`
* - local_plan
- [`nav_msgs.msg.Path`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Path.html)
- 1
- `/local_path`
* - tracked_point
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)[`automatika_embodied_agents.msg.Detection2D`](https://github.com/automatika-robotics/ros-agents/tree/main/agents_interfaces/msg)
- 1
- `/tracked_point` (`PoseStamped`)
```
## Algorithms
Kompass includes several production-ready control plugins suited for different environments.
::::{grid} 1 2 2 2
:gutter: 3
:::{grid-item-card} {material-regular}`timeline;1.2em;sd-text-primary` PurePursuit
:link: ../advanced/algorithms/pure_pursuit
:link-type: doc
:class-card: sugar-card
**Path Tracking**
Geometric path tracking controller. Calculates the curvature to move the robot from its current position to a look-ahead point on the path.
:::
:::{grid-item-card} {material-regular}`call_split;1.2em;sd-text-primary` Stanley
:link: ../advanced/algorithms/stanley
:link-type: doc
:class-card: sugar-card
**Front-Wheel Feedback**
Geometric path tracking controller. Uses the front axle as a reference point to minimize cross-track error. Best for Ackermann steering.
:::
:::{grid-item-card} {material-regular}`grid_on;1.2em;sd-text-primary` DWA
:link: ../advanced/algorithms/dwa
:link-type: doc
:class-card: sugar-card
**Dynamic Window Approach (GPU Support)**
Sample-based collision avoidance. Considers robot kinematics to find the optimal velocity that progresses towards the goal without hitting obstacles.
:::
:::{grid-item-card} {material-regular}`bubble_chart;1.2em;sd-text-primary` DVZ
:link: ../advanced/algorithms/dvz
:link-type: doc
:class-card: sugar-card
**Deformable Virtual Zone**
Reactive collision avoidance based on risk zones. Extremely fast and efficient for crowded dynamic environments.
:::
:::{grid-item-card} {material-regular}`videocam;1.2em;sd-text-primary` VisionFollowerRGB
:link: ../advanced/algorithms/vision_follower_rgb
:link-type: doc
:class-card: sugar-card
**Visual Servoing**
Steers the robot to keep a visual target (bounding box or keypoint) centered in the camera frame.
:::
:::{grid-item-card} {material-regular}`blur_on;1.2em;sd-text-primary` VisionFollowerRGBD
:link: ../advanced/algorithms/vision_follower_rgbd
:link-type: doc
:class-card: sugar-card
**Depth-Aware Servoing**
Maintains a specific **distance** and **orientation** relative to the target using Depth/Pointcloud data. Perfect for "Follow Me" behaviors.
:::
::::
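For intuition on the PurePursuit entry above, the classical pure-pursuit relation (a textbook geometric result, not an excerpt from the Kompass plugin) computes the arc curvature from the lateral offset `y` of the look-ahead point at distance `L_d` in the robot frame:

```python
def pure_pursuit_curvature(y_lookahead, lookahead_dist):
    """Classical pure-pursuit curvature: the circular arc through the robot
    and a look-ahead point offset laterally by y at distance L_d has
    curvature kappa = 2*y / L_d**2 (standard geometric derivation)."""
    return 2.0 * y_lookahead / lookahead_dist**2

# A look-ahead point straight ahead (y = 0) needs no turning:
assert pure_pursuit_curvature(0.0, 1.0) == 0.0

# A point offset to the left (y > 0) yields a left (positive) turn;
# the commanded angular rate is then omega = kappa * v:
kappa = pure_pursuit_curvature(0.5, 1.0)
omega = kappa * 0.4  # angular rate at a linear speed of 0.4 m/s
```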
## Usage Example
```python
from kompass.components import Controller, ControllerConfig
from kompass.ros import Topic
# 1. Configuration
# Set the loop rate (Control frequency)
my_config = ControllerConfig(loop_rate=20.0)
# 2. Instantiate
# Select the plugin via the component definition or config
my_controller = Controller(component_name="controller", config=my_config)
# 3. Setup
# Change an input topic name
my_controller.inputs(plan=Topic(name='/global_path', msg_type='Path'))
# Change run type to an Action Server (for long-running goals)
my_controller.run_type = "ActionServer"
# Select the Algorithm Plugin
my_controller.algorithm = 'DWA'
```
```{seealso}
Discover the Controller's detailed configuration options in the [ControllerConfig](../apidocs/kompass/kompass.components.controller.md/#classes)
```
## See Next
The Controller relies on the **Drive Manager** to translate these velocity commands into actual motor signals for your specific hardware.
:::{button-link} driver.html
:color: primary
:ref-type: doc
:outline:
Discover the Drive Manager →
:::
```
## File: navigation/path_planning.md
```markdown
# Planner
**Global path planning and trajectory generation.**
The Planner component is responsible for finding an 'optimal' or 'suboptimal' path from a start to a goal location using complete map information (i.e. the global or reference map).[^1]
It leverages the **[Open Motion Planning Library (OMPL)](https://ompl.kavrakilab.org/)** backend to support various sampling-based algorithms (RRT*, PRM, etc.), capable of handling complex kinematic constraints.
[^1]: More on path planning algorithms can be seen in recent surveys on the topic in [Qin et al., 2023](https://www.mdpi.com/2504-446X/7/3/211) and [Karur et al., 2021](https://www.mdpi.com/2624-8921/3/3/27)
## Run Types
The Planner is flexible and can be executed in four different modes depending on your architecture:
```{list-table}
:widths: 20 80
* - **{material-regular}`schedule;1.2em;sd-text-primary` Timed**
- **Periodic Re-planning.** Compute a new plan periodically (e.g., at 1Hz) from the robot's current location to the last received goal.
* - **{material-regular}`touch_app;1.2em;sd-text-primary` Event**
- **Reactive Planning.** Trigger a new plan computation *only* when a new message is received on the `goal_point` topic.
* - **{material-regular}`dns;1.2em;sd-text-primary` Service**
- **Request/Response.** Offers a standard ROS2 Service (`PlanPath`). Computes a single plan per request and returns it immediately.
* - **{material-regular}`hourglass_top;1.2em;sd-text-primary` Action Server**
- **Long-Running Goal.** Offers a standard ROS2 Action. Continuously computes and updates the plan until the goal is reached or canceled.
```
## Interface
### Inputs
The Planner subscribes to the following topics to understand the world state:
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - map
- [`nav_msgs.msg.OccupancyGrid`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/OccupancyGrid.html)
- 1
- `/map`, `QoSConfig(durability=TRANSIENT_LOCAL)`
* - goal_point
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.PointStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PointStamped.html)
- 1
- `/goal` (`PointStamped`)
* - location
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)
- 1
- `/odom` (`Odometry`)
```
:::{note} The `goal_point` input is only used when the Planner runs as a TIMED or EVENT component. In the other two run types, the goal point is provided in the service request or the action goal.
:::
### Outputs
The Planner publishes the resulting trajectory and motion state:
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - plan
- [`nav_msgs.msg.Path`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Path.html)
- 1
- `/plan`
* - reached_end
- `std_msgs.msg.Bool`
- 1
- `/reached_end`
```
## Algorithms
Kompass integrates OMPL to provide state-of-the-art geometric planners.
:::{card} Available Planners →
:link: ../integrations/ompl.html#available-algorithms-from-ompl
:link-type: url
Click here to view the full list of supported OMPL algorithms (RRT, PRM, EST, etc.)
:::
## Usage Example
```python
import numpy as np
from kompass.components import Planner, PlannerConfig
from kompass.config import ComponentRunType
from kompass.ros import Topic
from kompass.robot import RobotType, RobotConfig, RobotGeometry, LinearCtrlLimits, AngularCtrlLimits
# 1. Configure your Robot Constraints
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]), # Radius, Height
ctrl_vx_limits=LinearCtrlLimits(max_vel=1.0, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(max_vel=1.0, max_acc=2.0)
)
# 2. Configure the Planner
config = PlannerConfig(
robot=my_robot,
loop_rate=1.0 # Re-plan at 1Hz (if Timed)
)
# 3. Instantiate
planner = Planner(component_name="planner", config=config)
# 4. Set Run Type & Remap Topics
planner.run_type = ComponentRunType.EVENT
# Example: Remap input to listen to Rviz clicks
planner.inputs(goal_point=Topic(name="/clicked_point", msg_type="PointStamped"))
```
## See Next
The Planner produces a complete path from the start point (the robot's location) to the final goal point, which the robot can then follow, and locally adapt, using the Controller component.
:::{button-link} control.html
:color: primary
:ref-type: doc
:outline:
Discover the Controller Component →
:::
```
## File: navigation/motion_server.md
```markdown
# Motion Server
**System validation, calibration, and motion data recording.**
Unlike the core navigation components, the [MotionServer](../apidocs/kompass/kompass.components.motion_server.md) does not plan paths or avoid obstacles. Instead, it provides essential utilities for validating your robot's physical performance and tuning its control parameters.
It serves two primary purposes:
1. **Automated Motion Tests:** Executing pre-defined maneuvers (Step response, Circles) to calibrate the robot's motion model on new terrain.
2. **Black Box Recording:** Capturing synchronized control commands and robot responses (Pose/Velocity) during operation for post-analysis.
## Key Capabilities
The Motion Server is a versatile tool for system identification.
- {material-regular}`science` Motion Calibration - **Automated Tests.** Execute step inputs or circular paths automatically to measure the robot's real-world response vs. the theoretical model.
- {material-regular}`radio_button_checked` Data Recording - **"Black Box" Logging.** Record exact control inputs and odometry outputs synchronized in time. Essential for tuning controller gains or debugging tracking errors.
- {material-regular}`loop` Closed-Loop Validation - **Input/Output Compare.** Can act as both the *source* of commands (during tests) and the *sink* for recording, allowing you to validate the entire control pipeline (e.g., passing commands through the Drive Manager).
- {material-regular}`event` Event-Triggered - **Dynamic Execution.** Start recording or launch a calibration sequence automatically based on external events (e.g., "Terrain Changed" or "Slip Detected").
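As an illustration of what the recorded data enables (plain Python post-analysis, not part of the MotionServer API): fitting a first-order time constant to a logged step response, i.e. finding the time at which the measured velocity first reaches about 63.2% of the commanded step:

```python
import math

def estimate_time_constant(times, velocities, v_step):
    """Estimate a first-order time constant tau from step-response samples:
    tau is the time at which the response first reaches 63.2% of the
    commanded step (v(tau) = v_step * (1 - 1/e) for a first-order system)."""
    threshold = 0.632 * v_step
    for t, v in zip(times, velocities):
        if v >= threshold:
            return t
    return None  # response never reached the threshold

# Synthetic log: an ideal first-order response (tau = 0.5 s) to a 1 m/s step
times = [i * 0.01 for i in range(300)]
vels = [1.0 * (1 - math.exp(-t / 0.5)) for t in times]
tau = estimate_time_constant(times, vels, v_step=1.0)
```

On real recordings the same idea, applied to the synchronized command/odometry log, reveals how sluggish or overdamped the drivetrain is on the current terrain.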
## Run Types
Choose how you want to utilize the server:
```{list-table}
:widths: 20 80
* - **{material-regular}`schedule;1.2em;sd-text-primary` Timed**
- **Auto-Start Tests.** Automatically launches the configured motion tests periodically after the component starts.
* - **{material-regular}`touch_app;1.2em;sd-text-primary` Event**
- **Triggered Tests.** Waits for a `True` signal on the `run_tests` input topic to launch the calibration sequence.
* - **{material-regular}`hourglass_top;1.2em;sd-text-primary` Action Server**
- **On-Demand Recording.** Offers a `MotionRecording` ROS2 Action. Allows you to start/stop recording specific topics for a set duration via an Action Goal.
```
```{note}
The available motion tests include Step and Circle tests; they can be configured by adjusting the [MotionServerConfig](../apidocs/kompass/kompass.components.motion_server.md)
```
## Interface
### Inputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - run_tests
- `std_msgs.msg.Bool`
- 1
- `/run_tests`
* - command
- [`geometry_msgs.msg.Twist`](http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/Twist.html), [`geometry_msgs.msg.TwistStamped`](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
- 1
- `/cmd_vel` (`Twist`)
* - location
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)
- 1
- `/odom` (`Odometry`)
```
### Outputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - robot_command
- [`geometry_msgs.msg.Twist`](http://docs.ros.org/en/noetic/api/geometry_msgs/html/msg/Twist.html), [`geometry_msgs.msg.TwistStamped`](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
- 1
- `/cmd_vel` (`Twist`)
```
:::{admonition} Dual Command Topics
:class: note
The **Command** topic appears in both Inputs and Outputs, but serves different roles:
1. **Output (`robot_command`):** Used when the Motion Server is *generating* commands (Running Tests).
2. **Input (`command`):** Used when the Motion Server is *listening* (Recording).
*Power User Tip:* You can wire these differently to test specific components. For example, connect the Motion Server **Output** to the Drive Manager's input, and connect the Drive Manager's output back to the Motion Server **Input**. This records exactly how the Drive Manager modifies your commands (e.g., smoothing or limiting).
:::
## Usage Example
```python
from kompass.components import MotionServer, MotionServerConfig
from kompass.ros import Topic
# 1. Configuration
# Define the test parameters (e.g., a 1.0m/s step input)
my_config = MotionServerConfig(
step_test_velocity=1.0,
step_test_duration=5.0
)
# 2. Instantiate
motion_server = MotionServer(component_name="motion_server", config=my_config)
# 3. Setup for Event-Based Testing
motion_server.run_type = "Event"
motion_server.inputs(run_tests=Topic(name="/start_calibration", msg_type="Bool"))
```
## See Next
Explore how you can use the MotionServer to run automated motion tests and recordings with your navigation system.
:::{button-link} ../tutorials/automated_motion_test.html
:color: primary
:ref-type: doc
:outline:
Check the Automated Motion Tests Tutorial →
:::
```
## File: navigation/mapper.md
```markdown
# Local Mapper
**Real-time, ego-centric occupancy grid generation.**
While the global map provides a static long-term view, the **Local Mapper** builds a dynamic, short-term map of the robot's immediate surroundings based on real-time sensor data.
It captures moving obstacles (people, other robots) and temporary changes, serving as the primary input for the [Controller](control.md) to enable fast reactive navigation.
## Core Algorithm
At its core, LocalMapper uses the Bresenham line drawing algorithm in C++ to efficiently update an occupancy grid from incoming LaserScan data. This approach ensures fast and accurate raycasting to determine free and occupied cells in the local grid.
To maximize performance and adaptability, the implementation **supports both CPU and GPU execution**:
- **SYCL GPU Acceleration** - GPU acceleration is implemented using SYCL, making it vendor-neutral: compatible with Nvidia, AMD, Intel, and any other GPGPU-capable device.
- **Multi-Threaded CPU** - If no GPU is available, the mapper automatically falls back to a highly optimized multi-threaded CPU implementation capable of handling dense scans or high-frequency updates.
:::{admonition} Supported Data
:class: note
Currently supports `LaserScan` and `PointCloud2` data to create 2D Occupancy Grids. Support for **3D** local maps and semantic layers is planned for an upcoming release.
:::
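For intuition, the raycasting update can be sketched in plain Python. This is an illustrative re-implementation of Bresenham grid traversal, not the optimized C++/SYCL code used by the LocalMapper:

```python
def bresenham(x0, y0, x1, y1):
    """Grid cells crossed by a ray from (x0, y0) to (x1, y1)."""
    cells = []
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy
    while True:
        cells.append((x0, y0))
        if (x0, y0) == (x1, y1):
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x0 += sx
        if e2 <= dx:
            err += dx
            y0 += sy
    return cells


def update_from_beam(grid, robot_cell, hit_cell):
    """Mark cells along the beam free (0) and the endpoint occupied (100)."""
    ray = bresenham(*robot_cell, *hit_cell)
    for cell in ray[:-1]:
        grid[cell] = 0    # traversed by the beam -> free
    grid[ray[-1]] = 100   # beam endpoint -> obstacle
    return grid
```

Each incoming scan beam is converted to a start/end cell pair and traced this way; Kompass performs these updates in parallel (SYCL on the GPU, or multiple CPU threads).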
## Interface
### Inputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - sensor_data
- [`sensor_msgs.msg.LaserScan`](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/LaserScan.html), `std_msgs.msg.Float64`, [`sensor_msgs.msg.PointCloud2`](https://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html)
- 1
- `/scan` (`LaserScan`)
* - location
- [`nav_msgs.msg.Odometry`](https://docs.ros.org/en/noetic/api/nav_msgs/html/msg/Odometry.html), [`geometry_msgs.msg.PoseStamped`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/PoseStamped.html), [`geometry_msgs.msg.Pose`](http://docs.ros.org/en/jade/api/geometry_msgs/html/msg/Pose.html)
- 1
- `/odom` (`Odometry`)
```
### Outputs
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - local_map
- `nav_msgs.msg.OccupancyGrid`
- 1
- `/local_map/occupancy_layer`
```
## Usage Example
```python
from kompass_core.mapping import LocalMapperConfig, MapperConfig
from kompass.components import LocalMapper
# 1. Define Map Dimensions
# Create a 5m x 5m rolling window around the robot with 20cm resolution
map_params = MapperConfig(
width=5.0,
height=5.0,
resolution=0.2
)
# 2. Configure Component
my_config = LocalMapperConfig(
loop_rate=10.0, # Update at 10Hz
map_params=map_params
)
# 3. Instantiate
my_mapper = LocalMapper(component_name="mapper", config=my_config)
```
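As a sanity check on the configuration above, the grid geometry works out as follows (back-of-the-envelope math only; the exact indexing convention inside Kompass may differ):

```python
def grid_shape(width_m, height_m, resolution):
    """Number of cells along each axis for the given metric size."""
    return round(width_m / resolution), round(height_m / resolution)


def world_to_cell(x, y, robot_x, robot_y, width_m, height_m, resolution):
    """Index of a world point in a robot-centered rolling window.

    The window origin is taken at its bottom-left corner, so the
    robot lands in the middle cell.
    """
    col = int((x - robot_x + width_m / 2) / resolution)
    row = int((y - robot_y + height_m / 2) / resolution)
    return row, col
```

With `width=5.0`, `height=5.0` and `resolution=0.2`, this gives a 25x25 cell grid with the robot at cell (12, 12).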
## See Next
The Local Mapper feeds directly into the Controller for obstacle avoidance, while the Map Server provides the global context.
::::{grid} 1 2 2 2
:gutter: 3
:::{grid-item-card} {material-regular}`gamepad;1.5em;sd-text-primary` Controller
:link: control
:link-type: doc
:class-card: sugar-card
**Real-Time Control**
Learn how to use the generated local map for dynamic obstacle avoidance.
:::
:::{grid-item-card} {material-regular}`map;1.5em;sd-text-primary` Map Server
:link: map_server
:link-type: doc
:class-card: sugar-card
**Global Context**
Learn how to manage and serve the static global map.
:::
::::
```
## File: navigation/map_server.md
```markdown
# Map Server
**Static global map management and 3D-to-2D projection.**
The [MapServer](../apidocs/kompass/kompass.components.map_server.md) is the source of ground-truth for the navigation system. It reads static map files, processes them, and publishes the global `OccupancyGrid` required by the Planner and Localization components.
Unlike standard ROS2 map servers, the Kompass Map Server supports **native 3D Point Cloud (PCD)** files, automatically slicing and projecting them into 2D navigable grids based on configurable height limits.
## Key Features
The Map Server handles the lifecycle of map data from disk to network.
- {material-regular}`layers;1.2em;sd-text-primary` Multi-Format Support - **YAML & PCD**. Seamlessly reads standard 2D map files (`.yaml` + image) OR 3D point cloud files (`.pcd`).
- {material-regular}`save;1.2em;sd-text-primary` Map Persistence - **Save Services**. Supports saving current 2D or 3D map data to disk via `Save2dMapToFile` and `Save3dMapToFile` services.
- {material-regular}`crop_free;1.2em;sd-text-primary` Auto Frame Handling - **TF Compatibility**. Configurable reference frames ensuring the map aligns correctly with your robot's specific TF tree.
- {material-regular}`graphic_eq;1.2em;sd-text-primary` Frequency Control - **Configurable Read Rate**. The `map_file_read_rate` parameter controls how often map data is read and converted, ensuring the map is refreshed periodically or only when necessary.
```{seealso}
Check the full configuration parameters of the MapServer in the [MapServerConfig](../apidocs/kompass/kompass.components.map_server.md/#classes)
```
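Conceptually, the 3D-to-2D projection keeps only the points inside a height band and bins the remainder into 2D grid cells. The sketch below illustrates the idea only; it is not the MapServer implementation:

```python
def project_cloud_to_grid(points, z_min, z_max, resolution):
    """Project 3D points within [z_min, z_max] onto 2D grid cells.

    Points outside the height band (e.g. floor and ceiling) are
    discarded; the rest mark their (col, row) cell as occupied.
    """
    occupied = set()
    for x, y, z in points:
        if z_min <= z <= z_max:
            occupied.add((int(x / resolution), int(y / resolution)))
    return occupied
```

The height band determines what counts as a navigable obstacle: a tabletop above the robot's height, for instance, would be sliced away.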
## Interface
### Outputs
The Map Server publishes the processed grid and optionally the raw cloud for visualization.
```{list-table}
:widths: 10 40 10 40
:header-rows: 1
* - Key Name
- Allowed Types
- Number
- Default
* - map
- [`nav_msgs.msg.OccupancyGrid`](http://docs.ros.org/en/noetic/api/nav_msgs/html/msg/OccupancyGrid.html)
- 1
- `/map`
* - sensor_data
- [`sensor_msgs.msg.PointCloud2`](http://docs.ros.org/en/noetic/api/sensor_msgs/html/msg/PointCloud2.html)
- 1, optional
- `/row_point_cloud`
```
## Usage Example
```python
from kompass.components import MapServer, MapServerConfig
# 1. Configuration
my_config = MapServerConfig(
# Path to a 3D Point Cloud file
map_file_path="/path/to/environment.pcd",
# Process at 5Hz (only needed if map changes or for initial load)
map_file_read_rate=5.0,
# Resolution for the generated 2D grid (meters/cell)
grid_resolution=0.1,
# Disable raw cloud publishing to save bandwidth
pc_publish_row=False
)
# 2. Instantiate
my_map_server = MapServer(component_name="map_server", config=my_config)
```
## See Next
Once the map is served, the Planner uses it to calculate global paths.
:::{button-link} path_planning.html
:color: primary
:ref-type: doc
:outline:
Configure the Planner →
:::
```
## File: navigation/mapping_localization.md
```markdown
# Global Mapping & Localization
**Community-standard packages for state estimation.**
Kompass is designed to be modular. While it handles the core *Navigation*, it relies on standard community packages for **Global Localization** and **Mapping**.
We recommend the following battle-tested solutions to integrate with Kompass:
::::{grid} 1 2 2 3
:gutter: 3
:::{grid-item-card} {material-regular}`gps_fixed;1.5em;sd-text-primary` Robot Localization
:link: https://github.com/cra-ros-pkg/robot_localization
**Sensor Fusion (EKF)**
The standard for fusing IMU, Odometry, and GPS data. Use this to provide the robust `odom` $\rightarrow$ `base_link` transform required by Kompass.
:::
:::{grid-item-card} {material-regular}`map;1.5em;sd-text-primary` SLAM Toolbox
:link: https://github.com/SteveMacenski/slam_toolbox
**2D SLAM & Localization**
A highly efficient laser-based SLAM package. Use this to generate your initial map or for "Lifelong" mapping in changing environments.
:::
:::{grid-item-card} {material-regular}`view_in_ar;1.5em;sd-text-primary` Glim
:link: https://koide3.github.io/glim/
**3D LiDAR-Inertial Mapping**
A GPU-accelerated mapping framework. Use this for high-fidelity 3D SLAM and mapping using LiDAR and IMU data.
:::
::::
:::{admonition} Kompass Map Server
:class: tip
Remember that Kompass includes its own [**3D-capable Map Server**](map_server.md) if you need to work directly with Point Cloud (`.pcd`) files generated by tools like Glim.
:::
```
## File: tutorials/point_navigation.md
```markdown
# Hello World: Point Navigation Recipe
In the [Quick Start](./quick_start_webots.md), you ran a script that launched a full navigation stack. Now, let's break that script down step-by-step to understand how to configure Kompass for your specific needs.
## Step 1: Robot Configuration
The first step is to tell Kompass *what* it is driving. The `RobotConfig` object defines the physical constraints and kinematics of your platform. This is crucial because the **Controller** uses these limits to generate feasible velocity commands, and the **Planner** uses the geometry to check for collisions.
```python
# 1. Define the Robot
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE, # Motion Model (e.g., Turtlebot3)
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]), # Radius=0.1m, Height=0.3m
# 2. Define Control Limits
ctrl_vx_limits=LinearCtrlLimits(
max_vel=0.4, # Max speed (m/s)
max_acc=1.5, # Max acceleration (m/s^2)
max_decel=2.5 # Max deceleration (braking)
),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4,
max_acc=2.0,
max_decel=2.0,
max_steer=np.pi / 3
),
)
```
:::{tip}
Kompass supports **Ackermann** (Car-like), **Differential Drive**, and **Omni-directional** models. Changing the `model_type` here automatically reconfigures the underlying control math.
:::
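For intuition on what "reconfiguring the control math" means, the differential-drive case reduces to the classic unicycle model. Below is a minimal integration step, purely illustrative (Kompass handles the kinematics internally):

```python
import math


def unicycle_step(x, y, theta, v, omega, dt):
    """Advance a differential-drive (unicycle) state by one time step.

    v is the forward velocity (bounded by ctrl_vx_limits) and omega
    the angular velocity (bounded by ctrl_omega_limits).
    """
    return (
        x + v * math.cos(theta) * dt,
        y + v * math.sin(theta) * dt,
        theta + omega * dt,
    )
```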
---
## Step 2: Core Components
Next, we initialize the "Brains" of the operation.
### The Planner & Controller
We use the **Pure Pursuit** algorithm for path tracking. Note the `direct_sensor=False` flag: it tells the controller *not* to subscribe to raw sensor data directly, but to rely on the processed **Local Map** instead.
```python
# Global Planner (runs at 1Hz)
planner = Planner(
component_name="planner",
config=PlannerConfig(loop_rate=1.0)
)
planner.run_type = "Timed"
# Local Controller
controller = Controller(component_name="controller")
controller.algorithm = ControllersID.PURE_PURSUIT
controller.direct_sensor = False # Use Local Mapper for perception
```
### The Drive Manager
This component sits between the controller and the motors. We configure **Safety Zones** here: if an obstacle breaches the `critical_zone_distance` (0.05m), the Drive Manager triggers a hardware-level stop, overriding the controller.
```python
driver_config = DriveManagerConfig(
critical_zone_distance=0.05, # Emergency Stop threshold
critical_zone_angle=90.0, # Frontal cone angle
slowdown_zone_distance=0.3, # Slow down threshold
)
driver = DriveManager(component_name="drive_manager", config=driver_config)
```
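The zone behavior can be pictured as a simple distance check. This is an illustrative sketch only; the actual DriveManager evaluates the zones over incoming sensor data:

```python
def safety_action(obstacle_distance,
                  critical_zone_distance=0.05,
                  slowdown_zone_distance=0.3):
    """Illustrative safety-zone decision for a single obstacle reading."""
    if obstacle_distance <= critical_zone_distance:
        return "STOP"      # hardware-level stop, overrides the controller
    if obstacle_distance <= slowdown_zone_distance:
        return "SLOWDOWN"  # velocity commands get scaled down
    return "NORMAL"        # commands pass through unchanged
```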
### Dynamic Message Types
Different ROS2 versions use different message types for velocity (`Twist` vs `TwistStamped`). This snippet makes your recipe portable across **Humble**, **Jazzy**, and **Rolling**.
```python
# Auto-detect ROS distribution
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type = "TwistStamped"
else:
cmd_msg_type = "Twist"
# Bind the output topic
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
```
---
## Step 3: Mapping & Perception
Navigation requires two types of maps: a **Static Global Map** for long-term planning, and a **Dynamic Local Map** for immediate obstacle avoidance.
```python
# 1. Local Mapper: Builds a 3x3m sliding window around the robot
local_mapper = LocalMapper(
component_name="mapper",
config=LocalMapperConfig(
map_params=MapConfig(width=3.0, height=3.0, resolution=0.05)
)
)
# 2. Map Server: Loads the static house map from a file
map_server = MapServer(
component_name="global_map_server",
config=MapServerConfig(
map_file_path=os.path.join(kompass_sim_dir, "maps", "turtlebot3_webots.yaml"),
grid_resolution=0.5
)
)
```
---
## Step 4: The Launcher & UI
Finally, the `Launcher` ties everything together. It manages the lifecycle of all nodes and sets up the **Web UI**.
We use `enable_ui` to pipe data directly to the browser:
* **Outputs:** We stream the Global Map, the Planned Path, and the Robot's Odometry to the browser for visualization.
```python
launcher = Launcher()
# 1. Register Components
launcher.kompass(
components=[map_server, controller, planner, driver, local_mapper],
multiprocessing=True,
)
# 2. Bind Odometry (Input for all components)
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
# 3. Apply Robot Config & Frames
launcher.robot = my_robot
launcher.frames = RobotFrames(world="map", odom="map", scan="LDS-01")
# 4. Enable the Web Interface
launcher.enable_ui(
outputs=[
map_server.get_out_topic(TopicsKeys.GLOBAL_MAP),
odom_topic,
planner.get_out_topic(TopicsKeys.GLOBAL_PLAN),
],
)
launcher.bringup()
```
---
## Full Recipe Code
Here is the complete script. You can save this as `nav_recipe.py` and run it in any workspace where Kompass is installed.
```python
import numpy as np
import os
from ament_index_python.packages import get_package_share_directory
from kompass.robot import (
AngularCtrlLimits, LinearCtrlLimits, RobotGeometry, RobotType, RobotConfig, RobotFrames
)
from kompass.components import (
Controller, DriveManager, DriveManagerConfig, Planner, PlannerConfig,
LocalMapper, LocalMapperConfig, MapServer, MapServerConfig, TopicsKeys
)
from kompass.control import ControllersID, MapConfig
from kompass.ros import Topic, Launcher
def kompass_bringup():
kompass_sim_dir = get_package_share_directory(package_name="kompass_sim")
# 1. Robot Configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3),
)
# 2. Components
planner = Planner(component_name="planner", config=PlannerConfig(loop_rate=1.0))
planner.run_type = "Timed"
controller = Controller(component_name="controller")
controller.algorithm = ControllersID.PURE_PURSUIT
controller.direct_sensor = False
driver = DriveManager(
component_name="drive_manager",
config=DriveManagerConfig(critical_zone_distance=0.05, slowdown_zone_distance=0.3)
)
# 3. Dynamic Command Type
    cmd_type = "TwistStamped" if os.environ.get("ROS_DISTRO") in ["rolling", "jazzy", "kilted"] else "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_type))
# 4. Mapping
local_mapper = LocalMapper(
component_name="mapper",
config=LocalMapperConfig(map_params=MapConfig(width=3.0, height=3.0, resolution=0.05))
)
map_server = MapServer(
component_name="global_map_server",
config=MapServerConfig(
map_file_path=os.path.join(kompass_sim_dir, "maps", "turtlebot3_webots.yaml"),
grid_resolution=0.5
)
)
# 5. Launch
launcher = Launcher()
launcher.kompass(
components=[map_server, controller, planner, driver, local_mapper],
multiprocessing=True,
)
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
launcher.robot = my_robot
launcher.frames = RobotFrames(world="map", odom="map", scan="LDS-01")
# 6. UI
launcher.enable_ui(
inputs=[planner.ui_main_action_input],
outputs=[
map_server.get_out_topic(TopicsKeys.GLOBAL_MAP),
odom_topic,
planner.get_out_topic(TopicsKeys.GLOBAL_PLAN),
],
)
launcher.bringup()
if __name__ == "__main__":
kompass_bringup()
```
---
## Next Steps
Congratulations! You have created a full production-grade navigation recipe.
* **[Vision Following](vision_tracking.md)**: Replace the Pure Pursuit controller with a Vision Follower to chase targets.
* **[Handling Failures](fallbacks_simple.md)**: Learn how to make your recipe robust by automatically restarting components if they crash.
* **[Event-Driven Logic](events_index.md)**: Make your system reactive (e.g., "If battery is low, go to dock") by adding Events and Actions.
```
## File: tutorials/events_actions.md
```markdown
# Fallbacks, Events & Actions: Design Guide
**Give your robot reflexes, not just callbacks.**
The Kompass event-driven architecture acts as the nervous system of your robot. It allows you to define self-healing strategies through component **Fallbacks**, and to decouple monitoring (**Events**) from execution and responses (**Actions**), so you can define complex behaviors declaratively without writing brittle `if/else` chains inside your component code.
## Fallbacks (Self-Healing)
Fallbacks are the **Self-Healing Mechanism** of a component. They define the specific set of [Actions](#actions-the-responses) to execute automatically when a failure is detected in the component's Health Status.
Instead of crashing or freezing when an error occurs, a Component can be configured to attempt intelligent recovery strategies:
* *Algorithm failed?* $\rightarrow$ **Switch** to a simpler backup.
* *Sensor data not available?* $\rightarrow$ Consume a different input source.
* *Plugin timeout?* $\rightarrow$ **Restart** the node.
### The Recovery Hierarchy
When a component reports a failure, the system checks for a registered fallback strategy in a specific order of priority:
1. System Failure (`on_system_fail`): External context is broken (e.g., missing inputs).
2. Component Failure (`on_component_fail`): Internal crash or hardware disconnect.
3. Algorithm Failure (`on_algorithm_fail`): The logic ran but couldn't solve the problem.
4. Catch-All (`on_fail`): Generic safety net.
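A hypothetical sketch of that priority lookup (the real resolution happens inside the component's health-status machinery; the hook names follow the list above):

```python
PRIORITY = ["on_system_fail", "on_component_fail", "on_algorithm_fail", "on_fail"]


def resolve_fallback(failure_kind, registered):
    """Return the first registered handler matching the failure kind.

    `registered` maps hook names to recovery actions; the catch-all
    `on_fail` applies to any failure kind.
    """
    for hook in PRIORITY:
        if hook in registered and (hook == failure_kind or hook == "on_fail"):
            return registered[hook]
    return None
```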
:::{button-link} fallbacks_simple.html
:color: primary
:ref-type: doc
:outline:
Add Fallbacks to your Recipes →
:::
## Events (The Triggers)
An Event monitors one or more **ROS2 Topics** and evaluates a logical condition in real-time. When the condition is met, the Event fires.
### Defining Events
Using the API's fluent, Pythonic syntax you can access message attributes and define triggers using standard comparison operators.
```python
from kompass.ros import Event, Topic
# Define Source (with optional timeout for data freshness)
battery = Topic(name="/battery_level", msg_type="Float32", data_timeout=0.5)
# Define Condition (Event)
# Triggers when battery drops below 20%
low_batt = Event(battery.msg.data < 20.0)
```
### Logic Gates & Sensor Fusion
You can compose complex behaviors by combining multiple topics using bitwise operators (`&`, `|`, `~`). The system will manage a synchronized **Blackboard** automatically to ensure all data used in the condition is fresh.
```python
# Trigger ONLY if Obstacle is Close (< 0.5m) AND Robot is in Autonomous Mode
# Uses '&' (AND), '|' (OR), '~' (NOT)
radar = Topic(name="/radar_dist", msg_type="Float32", data_timeout=0.2)
mode = Topic(name="/mode", msg_type="String")
smart_stop = Event(
    event_condition=(radar.msg.data < 0.5) & (mode.msg.data == "AUTO"),
on_change=True # Only trigger on the transition from Safe -> Unsafe
)
```
### Configuration Options
* **`on_change=True`**: Edge Trigger. Fires only when the condition changes from `False` to `True`.
* **`handle_once=True`**: One-shot trigger (e.g., initialization).
* **`keep_event_delay=2.0`**: Debouncing. Prevents re-triggering for the specified seconds.
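The interplay of `on_change` and `keep_event_delay` can be modeled with a tiny state machine. This illustrates the semantics only; the real implementation lives in the underlying event system:

```python
class EdgeTrigger:
    """Illustrative model of `on_change` + `keep_event_delay` behavior."""

    def __init__(self, keep_event_delay=0.0):
        self.delay = keep_event_delay
        self.prev = False
        self.last_fired = float("-inf")

    def update(self, condition, now):
        """Return True if the event should fire at time `now` (seconds)."""
        fire = condition and not self.prev                     # False -> True edge
        fire = fire and (now - self.last_fired >= self.delay)  # debounce window
        self.prev = condition
        if fire:
            self.last_fired = now
        return fire
```

A condition that stays `True` fires once on the rising edge, and a new edge arriving inside the delay window is suppressed.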
---
## Actions (The Responses)
Actions are the executable routines triggered by Events (reflexes) or Fallbacks (healing). They can be component methods, system utilities, or arbitrary Python functions.
### Dynamic Data Injection
Kompass Actions are **context-aware**. You can bind Action arguments directly to live Topic data. When the action triggers, the system fetches the latest message and injects it into the function automatically.
**Example: Context-Aware Logging**
```python
from kompass.ros import Action
# A standard python function
def log_status(voltage, mode):
print(f"Alarm! Battery Level: {voltage}%, Mode: {mode}")
# The Action binds arguments to live topics
# At runtime, 'voltage' and 'mode' are replaced by real ROS messages
act_log = Action(
method=log_status,
args=(battery.msg.data, mode.msg.data)
)
```
```{tip}
Kompass includes a set of pre-defined component-level and system-level actions in `kompass.actions` module
```
```{seealso}
Dive deeper into the design concepts of [Events](https://automatika-robotics.github.io/sugarcoat/design/events.html), [Actions](https://automatika-robotics.github.io/sugarcoat/design/actions.html), [Fallbacks](https://automatika-robotics.github.io/sugarcoat/design/fallbacks.html) and the component's [Health Status](https://automatika-robotics.github.io/sugarcoat/design/status.html)
```
```
## File: tutorials/vision_tracking.md
```markdown
# Following a moving target using RGB Image
In this tutorial we will create a vision-based target following navigation system to follow a moving target using an RGB camera input.
## Before you start
Let's take care of some preliminary setup.
### Get and start your camera ROS2 node
Based on the type of the camera used on your robot, you need to install and launch its respective ROS2 node provided by the manufacturer.
To run and test this example on your development machine, you can use your webcam along with the `usb_cam` package:
```shell
sudo apt install ros-$ROS_DISTRO-usb-cam
ros2 run usb_cam usb_cam_node_exe
```
### Start vision detection/tracking using an ML model
To implement and run this example we will need a detection model processing the RGB camera images to provide the Detection or Tracking information. The most convenient way to obtain this is to use the [**EmbodiedAgents**](https://automatika-robotics.github.io/ros-agents/intro.html) package and [RoboML](https://github.com/automatika-robotics/roboml/) to deploy and serve the model locally. EmbodiedAgents provides a [Vision Component](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.components.vision.html), which allows us to easily deploy a ROS2 node in our system that interacts with vision models.
Therefore, before starting with this tutorial you need to install both packages:
- Install **EmbodiedAgents**: check the instructions [here](https://automatika-robotics.github.io/ros-agents/installation.html)
- Install RoboML: `pip install roboml`
```{seealso}
[EmbodiedAgents](https://automatika-robotics.github.io/embodied-agents) is another [Sugarcoat🍬](https://automatika-robotics.github.io/sugarcoat) based package used for creating interactive embodied agents that can understand, remember, and act upon contextual information from their environment.
```
After installing both packages, you can start `roboml` to serve the model, either on the robot (or your development machine), on another machine in the local network, or on any server in the cloud. To start a RoboML RESP server, simply run:
```shell
roboml-resp
```
```{tip}
Save the IP of the machine running `roboml` as we will use it later in our model client
```
## Setting up the vision model client in EmbodiedAgents
First, we need to import the `VisionModel` class that defines the model used later in the component, and a [model client](https://automatika-robotics.github.io/ros-agents/basics.html#model-db-client) to communicate with the model, which can be running on the same hardware or in the cloud. Here we will use the `RoboMLRESPClient` from [RoboML](https://github.com/automatika-robotics/roboml/), as we started the RESP-based model server in RoboML.
```python
from agents.models import VisionModel
from agents.clients import RoboMLRESPClient
```
Now let's configure the model we want to use for detections/tracking and the model client:
```python
object_detection = VisionModel(
    name="object_detection",
    checkpoint="rtmdet_tiny_8xb32-300e_coco",
    setup_trackers=True,  # enable publishing tracking information
)
roboml_detection = RoboMLRESPClient(object_detection, host='127.0.0.1', logging_level="warn")
# 127.0.0.1 should be replaced by the IP of the machine running roboml. In this case we assume that roboml is running on our robot.
```
The model is configured with a name and a checkpoint (any checkpoint from mmdetection framework can be used, see [available checkpoints](https://github.com/open-mmlab/mmdetection?tab=readme-ov-file#overview-of-benchmark-and-model-zoo)). In this example, we have chosen a model checkpoint trained on the MS COCO dataset which has over 80 [classes](https://github.com/amikelive/coco-labels/blob/master/coco-labels-2014_2017.txt) of commonly found objects. We also set the option `setup_trackers` to `True` to enable publishing tracking information.
We load the `VisionModel` into the `RoboMLRESPClient` and configure the host to the IP of the machine running RoboML. In the code above, it is assumed that RoboML is running on localhost.
```{seealso}
See all available VisionModel options [here](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.models.html), and all available model clients in agents [here](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.clients.html)
```
## Setting up the Vision Component
We start by importing the required component along with its configuration class from `agents`:
```python
from agents.components import Vision
from agents.config import VisionConfig
```
After setting up the model client, we need to select the input/output topics to configure the vision component:
```python
from agents.ros import Topic
# RGB camera input topic is set to the compressed image topic
image0 = Topic(name="/image_raw/compressed", msg_type="CompressedImage")
# Select the output topics: detections and trackings
detections_topic = Topic(name="detections", msg_type="Detections")
trackings_topic = Topic(name="trackings", msg_type="Trackings")
# Select the vision component configuration
detection_config = VisionConfig(
threshold=0.5, enable_visualization=True
)
# Create the component
vision = Vision(
inputs=[image0],
outputs=[detections_topic, trackings_topic],
trigger=image0,
config=detection_config,
model_client=roboml_detection,
component_name="detection_component",
)
```
The component's inputs and outputs are defined to get the images from the camera topic and to provide both detections and trackings. The component is given these inputs/outputs along with the model client that we configured in the previous step. The `trigger` of the component is set to the image input topic, so the component works in an event-based run type and provides a new detection/tracking on each new image.
In the component configuration, the parameter `enable_visualization` is set to `True` to get a visualization of the output on an additional pop-up window for debugging purposes. The `threshold` parameter (confidence threshold for object detection) is set to `0.5`.
```{seealso}
Discover the different configuration options of the Vision component in the [docs](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.models.html#agents.models.VisionModel)
```
```{seealso}
For detailed example for using the vision component check this [tutorial](https://automatika-robotics.github.io/ros-agents/examples/prompt_engineering.html)
```
## Setup your robot
We can select the robot motion model, control limits and other geometry parameters using the `RobotConfig` class
```python
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
)
import numpy as np
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.2, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3
),
)
```
```{seealso}
See more details about the robot configuration in the point navigation [tutorial](./point_navigation.md/#step-1-robot-configuration) and in the `RobotConfig` class [details](../navigation/robot.md)
```
## Import the required components from Kompass
To implement the target following system, we will use the `Controller` component to generate the tracking commands and the `DriveManager` to handle safe communication with the robot driver:
```python
from kompass.components import Controller, ControllerConfig, DriveManager
```
## Configure the VisionFollower and setup the components
We configure the vision follower by importing its config class `VisionRGBFollowerConfig` (see the default parameters [here](../advanced/algorithms/vision_follower_rgb.md)), then configure both of our components:
```python
from kompass.control import VisionRGBFollowerConfig
# Set the controller component configuration
config = ControllerConfig(loop_rate=10.0, ctrl_publish_type="Sequence", control_time_step=0.3)
# Init the controller
controller = Controller(
component_name="my_controller", config=config
)
# Set the vision tracking input to either the detections or trackings topic
controller.inputs(vision_tracking=detections_topic)
# Set the vision follower configuration
vision_follower_config = VisionRGBFollowerConfig(
control_horizon=3, enable_search=False, target_search_pause=6, tolerance=0.2
)
controller.algorithms_config = vision_follower_config # We can also configure other algorithms (DWA, DV etc.) and pass a list to the algorithms_config property
# Init the drive manager with the default parameters
driver = DriveManager(component_name="my_driver")
```
Here we selected a controller loop rate of `10Hz` and a control step of `0.3s` for generating the commands, and chose to send the commands sequentially as they are computed. The vision follower is configured with a `control_horizon` equal to three future control time steps and a `target_search_pause` equal to 6 control time steps. We also chose to disable the search, meaning the tracking action ends when the robot loses the target.
```{tip}
`target_search_pause` is implemented so the robot pauses and waits while tracking to avoid losing the target due to quick movement and slow model response. It should be adjusted based on the inference time of the model.
```
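For example, if `target_search_pause` is counted in control time steps (as configured above), the implied wall-clock pause works out to roughly:

```python
def search_pause_seconds(pause_steps, control_time_step):
    """Wall-clock pause implied by a pause counted in control steps."""
    return round(pause_steps * control_time_step, 3)


# With control_time_step=0.3 and target_search_pause=6 from above:
search_pause_seconds(6, 0.3)  # 1.8 seconds
```

If your model's inference takes longer than this, increase `target_search_pause` accordingly.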
## Add all the components to the launcher
All that is left is to add the three previous components to the launcher and bring up the system!
```python
from kompass.ros import Launcher
launcher = Launcher()
# setup agents as a package in the launcher and add the vision component
launcher.add_pkg(
components=[vision],
ros_log_level="warn",
)
# setup the components for Kompass in the launcher
launcher.kompass(
components=[controller, driver],
)
# Set the robot config for all components
launcher.robot = my_robot
# Start all the components
launcher.bringup()
```
Et voilà! Our code for the full vision-based target follower is ready. Here is the complete script:
```{code-block} python
:caption: vision_rgb_follower
:linenos:
import numpy as np
from agents.components import Vision
from agents.models import VisionModel
from agents.clients import RoboMLRESPClient
from agents.config import VisionConfig
from agents.ros import Topic
from kompass.components import Controller, ControllerConfig, DriveManager
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
)
from kompass.control import VisionRGBFollowerConfig
from kompass.ros import Launcher
# RGB camera input topic is set to the compressed image topic
image0 = Topic(name="/image_raw/compressed", msg_type="CompressedImage")
# Select the output topics: detections and trackings
detections_topic = Topic(name="detections", msg_type="Detections")
trackings_topic = Topic(name="trackings", msg_type="Trackings")
object_detection = VisionModel(
    name="object_detection",
    checkpoint="rtmdet_tiny_8xb32-300e_coco",
    setup_trackers=True,  # enable publishing tracking information
)
roboml_detection = RoboMLRESPClient(object_detection, host='127.0.0.1', logging_level="warn")
# Select the vision component configuration
detection_config = VisionConfig(threshold=0.5, enable_visualization=True)
# Create the component
vision = Vision(
inputs=[image0],
outputs=[detections_topic, trackings_topic],
trigger=image0,
config=detection_config,
model_client=roboml_detection,
component_name="detection_component",
)
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.2, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3
),
)
# Set the controller component configuration
config = ControllerConfig(
loop_rate=10.0, ctrl_publish_type="Sequence", control_time_step=0.3
)
# Init the controller
controller = Controller(component_name="my_controller", config=config)
# Set the vision tracking input to either the detections or trackings topic
controller.inputs(vision_tracking=detections_topic)
# Set the vision follower configuration
vision_follower_config = VisionRGBFollowerConfig(
control_horizon=3, enable_search=False, target_search_pause=6, tolerance=0.2
)
controller.algorithms_config = vision_follower_config # We can also configure other algorithms (DWA, DV etc.) and pass a list to the algorithms_config property
# Init the drive manager with the default parameters
driver = DriveManager(component_name="my_driver")
launcher = Launcher()
launcher.add_pkg(
components=[vision],
ros_log_level="warn",
)
launcher.kompass(
components=[controller, driver],
)
# Set the robot config for all components
launcher.robot = my_robot
launcher.bringup()
```
## Trigger the following action
After running your complete system you can send a goal to the controller's action server `/my_controller/track_vision_target` of type [`kompass_interfaces.action.TrackVisionTarget`](https://github.com/automatika-robotics/kompass/tree/main/kompass_interfaces/action) to start tracking a selected label (`person` for example):
```shell
ros2 action send_goal /my_controller/track_vision_target kompass_interfaces/action/TrackVisionTarget "{label: 'person'}"
```
You can also re-run the previous script with target search activated by updating the config as follows, or send the search parameters along with the action goal:
```python
vision_follower_config = VisionRGBFollowerConfig(
control_horizon=3, enable_search=True, target_search_pause=6, tolerance=0.2
)
```
```shell
ros2 action send_goal /my_controller/track_vision_target kompass_interfaces/action/TrackVisionTarget "{label: 'person', search_radius: 1.0, search_timeout: 30}"
```
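Conceptually, an RGB follower keeps the detection's bounding-box center within a tolerance band around the image center, turning only when the target drifts outside it. A toy sketch of that idea (illustrative only, not the `kompass-core` implementation; the `tolerance` here mirrors the config field only in spirit):

```python
def steering_error(bbox_center_x: float, image_width: int, tolerance: float) -> float:
    """Normalized horizontal offset of the target from the image center.

    Returns 0.0 when the target is within the tolerance band (no turn needed),
    otherwise the signed normalized error in [-1, 1].
    """
    # Normalize: -1.0 (far left) .. 0.0 (centered) .. +1.0 (far right)
    error = (bbox_center_x - image_width / 2) / (image_width / 2)
    return 0.0 if abs(error) <= tolerance else error

# A target at pixel 330 in a 640 px wide image sits inside a 0.2 tolerance band
print(steering_error(330, 640, tolerance=0.2))  # 0.0 -> keep heading
print(steering_error(560, 640, tolerance=0.2))  # 0.75 -> turn right
```

The dead-band avoids jittery corrections when the target is already roughly centered.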
```
## File: tutorials/vision_tracking_depth.md
```markdown
# Vision Tracking Using Depth Information
This tutorial guides you through creating a vision tracking system using a depth camera. We'll leverage RGB-D with the `VisionRGBDFollower` in Kompass to detect and follow objects more robustly. With depth information available, the robot gains a more precise understanding of the environment, leading to more accurate and robust object following (compared to [using RGB images](./vision_tracking.md)).
## Before you start
Let's take care of some preliminary setup.
### Setup Your Depth Camera ROS2 Node
First things first — your robot needs a depth camera to see in 3D and get the `RGBD` input. For this tutorial, we are using an **Intel RealSense** that is available on many mobile robots and well supported in ROS2 and in simulation.
To get your RealSense camera running:
```bash
sudo apt install ros-$ROS_DISTRO-realsense2-camera
# Launch the camera node to start streaming both color and depth images
ros2 launch realsense2_camera rs_camera.launch.py
```
### Start vision detection using an ML model
To implement and run this example we will need a detection model processing the RGBD camera images to provide the Detection information. Similarly to the [previous tutorial](vision_tracking.md), we will use [**EmbodiedAgents**](https://automatika-robotics.github.io/ros-agents/intro.html) package. EmbodiedAgents provides a [Vision Component](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.components.vision.html), which will allow us to easily deploy a ROS node in our system that interacts with vision models.
Therefore, before starting with this tutorial you need to install the following package:
- Install **EmbodiedAgents**: check the instructions [here](https://automatika-robotics.github.io/ros-agents/installation.html)
```{seealso}
[EmbodiedAgents](https://automatika-robotics.github.io/embodied-agents) is another [Sugarcoat🍬](https://automatika-robotics.github.io/sugarcoat) based package used for creating interactive embodied agents that can understand, remember, and act upon contextual information from their environment.
```
## Setting up the vision component and model client in EmbodiedAgents
In this example, we will set `enable_local_classifier` to `True` in the vision component so that the model is deployed directly on the robot. Additionally, we will set the input topic to the `RGBD` camera topic. This setting allows the `Vision` component to **publish both the depth and the RGB image data along with the detections**.
```python
from agents.components import Vision
from agents.config import VisionConfig
from agents.ros import Topic
image0 = Topic(name="/camera/rgbd", msg_type="RGBD")
detections_topic = Topic(name="detections", msg_type="Detections")
detection_config = VisionConfig(threshold=0.5, enable_local_classifier=True)
vision = Vision(
inputs=[image0],
outputs=[detections_topic],
trigger=image0,
config=detection_config,
component_name="detection_component",
)
```
```{seealso}
See all available VisionModel options [here](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.models.html), and all available model clients in agents [here](https://automatika-robotics.github.io/ros-agents/apidocs/agents/agents.clients.html)
```
## Setup your robot and your Controller
You can set up your robot in the same way we did in the [previous tutorial](./vision_tracking.md/#setup-your-robot).
```python
from kompass.actions import Action
from kompass.event import OnAny, OnChangeEqual
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
)
import numpy as np
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=1.0, max_acc=3.0, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=4.0, max_acc=6.0, max_decel=10.0, max_steer=np.pi / 3
),
)
```
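The control limits above bound both the magnitude and the rate of change of velocity commands. A minimal sketch of how such limits constrain a command between control steps (illustrative only; Kompass enforces these internally):

```python
from dataclasses import dataclass

@dataclass
class LinearLimits:
    """Toy stand-in for LinearCtrlLimits-style bounds."""
    max_vel: float
    max_acc: float
    max_decel: float

def clamp_velocity(v_desired: float, v_prev: float, dt: float, lim: LinearLimits) -> float:
    """Clamp a desired forward velocity to velocity and acceleration limits."""
    v = min(max(v_desired, 0.0), lim.max_vel)   # velocity bound
    v = min(v, v_prev + lim.max_acc * dt)       # acceleration bound
    v = max(v, v_prev - lim.max_decel * dt)     # deceleration bound
    return v

# With max_vel=1.0 and max_acc=3.0, a robot at standstill can only reach
# 0.3 m/s within a 0.1 s control step, regardless of the requested 2.0 m/s
print(clamp_velocity(2.0, 0.0, 0.1, LinearLimits(1.0, 3.0, 2.5)))  # 0.3
```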
Then we will set up the `Controller` component to use the `VisionRGBDFollower`. We will also need to provide two additional inputs:
- The detections topic
- The depth camera info topic
```python
from kompass.components import Controller, ControllerConfig
depth_cam_info_topic = Topic(name="/camera/aligned_depth_to_color/camera_info", msg_type="CameraInfo")
config = ControllerConfig(ctrl_publish_type="Parallel")
controller = Controller(component_name="controller", config=config) # Optionally a config file can be provided here config_file=path_to_config_file
controller.inputs(vision_detections=detections_topic, depth_camera_info=depth_cam_info_topic)
controller.algorithm = "VisionRGBDFollower"
```
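The `CameraInfo` input is needed because turning a 2D detection plus its depth reading into a 3D target position requires the camera intrinsics. A hedged sketch of the standard pinhole back-projection (illustrative; not the actual `VisionRGBDFollower` code):

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) with a metric depth reading into a 3D point
    in the camera frame, using the pinhole camera model."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# fx, fy, cx, cy come from the CameraInfo intrinsic matrix:
# K = [fx, 0, cx, 0, fy, cy, 0, 0, 1]
print(pixel_to_3d(400, 300, 2.0, fx=600.0, fy=600.0, cx=320.0, cy=240.0))
```

With the target's 3D position known, the follower can regulate actual distance rather than apparent size in the image.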
## Complete your system with helper components
To make the system more complete and robust, we will add:
- `DriveManager` - to handle sending direct commands to the robot and ensure safety with its emergency stop
- `LocalMapper` - to provide the controller with more robust local perception; to do so we will also set the controller's `direct_sensor` property to `False`
```python
from kompass.components import DriveManager, LocalMapper
controller.direct_sensor = False
driver = DriveManager(component_name="driver")
mapper = LocalMapper(component_name="local_mapper")
```
## Bring it all together!
```{code-block} python
:caption: vision_depth_follower.py
:linenos:
from agents.components import Vision
from agents.config import VisionConfig
from agents.ros import Topic
from kompass.components import Controller, ControllerConfig, DriveManager, LocalMapper
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
)
from kompass.launcher import Launcher
import numpy as np
image0 = Topic(name="/camera/rgbd", msg_type="RGBD")
detections_topic = Topic(name="detections", msg_type="Detections")
detection_config = VisionConfig(threshold=0.5, enable_local_classifier=True)
vision = Vision(
inputs=[image0],
outputs=[detections_topic],
trigger=image0,
config=detection_config,
component_name="detection_component",
)
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=1.0, max_acc=3.0, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=4.0, max_acc=6.0, max_decel=10.0, max_steer=np.pi / 3
),
)
depth_cam_info_topic = Topic(name="/camera/aligned_depth_to_color/camera_info", msg_type="CameraInfo")
# Setup the controller
config = ControllerConfig(ctrl_publish_type="Parallel")
controller = Controller(component_name="controller", config=config)
controller.inputs(vision_detections=detections_topic, depth_camera_info=depth_cam_info_topic)
controller.algorithm = "VisionRGBDFollower"
controller.direct_sensor = False
# Add additional helper components
driver = DriveManager(component_name="driver")
mapper = LocalMapper(component_name="local_mapper")
# Bring it up with the launcher
launcher = Launcher()
launcher.add_pkg(components=[vision], ros_log_level="warn",
package_name="automatika_embodied_agents",
executable_entry_point="executable",
multiprocessing=True)
launcher.kompass(components=[controller, mapper, driver])
# Set the robot config for all components
launcher.robot = my_robot
launcher.bringup()
```
```{tip}
You can take your design to the next step and make your system more robust by adding some [events](events_actions.md) or defining some [fallbacks](https://automatika-robotics.github.io/sugarcoat/design/fallbacks.html)!
```
```
## File: tutorials/record_load_path.md
```markdown
# Path Recording & Replay
**Save successful paths and re-execute them on demand.**
Sometimes you don't need a dynamic planner to calculate a new path every time. In scenarios like **routine patrols, warehousing, or repeatable docking**, it is often more reliable to record a "golden path" once and replay it exactly.
The **Planner** component facilitates this via three ROS 2 services:
1. `save_plan_to_file`: Saves the currently active plan (or recorded history) to a CSV file.
2. `load_plan_from_file`: Loads a CSV file and publishes it as the current global plan.
3. `start_path_recording`: Starts recording the robot's actual odometry history to be saved later.
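The save/load round-trip is conceptually simple: a plan is serialized as CSV rows and read back verbatim. A toy sketch of the idea (the exact column schema used by `save_plan_to_file` is not specified in this documentation; `x`/`y` columns here are an assumption for illustration):

```python
import csv
import io

def save_plan(waypoints, fh):
    """Write (x, y) waypoints as CSV rows. Schema is illustrative only."""
    writer = csv.writer(fh)
    writer.writerow(["x", "y"])
    writer.writerows(waypoints)

def load_plan(fh):
    """Read the CSV back into a list of (x, y) tuples."""
    reader = csv.DictReader(fh)
    return [(float(row["x"]), float(row["y"])) for row in reader]

buf = io.StringIO()
save_plan([(0.0, 0.0), (1.0, 0.5), (2.0, 0.5)], buf)
buf.seek(0)
print(load_plan(buf))  # [(0.0, 0.0), (1.0, 0.5), (2.0, 0.5)]
```

Because the file is plain CSV, a saved "golden path" can be inspected, edited, or shared between robots with ordinary tools.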
## The Recipe
This recipe sets up a basic navigation stack but exposes the **Save/Load/Record Services** to the Web UI instead of the standard "Click-to-Nav" action.
**Create a file named `path_recorder.py`:**
```python
import numpy as np
import os
from ament_index_python.packages import get_package_share_directory
# Kompass Imports
from kompass.robot import (
AngularCtrlLimits, LinearCtrlLimits, RobotGeometry, RobotType, RobotConfig, RobotFrames
)
from kompass.components import (
DriveManager, DriveManagerConfig, Planner, PlannerConfig,
MapServer, MapServerConfig, TopicsKeys, Controller
)
from kompass.ros import Topic, Launcher, ServiceClientConfig
from kompass.control import ControllersID
from kompass_interfaces.srv import PathFromToFile, StartPathRecording
def run_path_recorder():
kompass_sim_dir = get_package_share_directory(package_name="kompass_sim")
# 1. Robot Configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3),
)
# 2. Configure Components
planner = Planner(component_name="planner", config=PlannerConfig(loop_rate=1.0))
planner.run_type = "Timed"
controller = Controller(component_name="controller")
controller.algorithm = ControllersID.PURE_PURSUIT
controller.direct_sensor = True # Use direct sensor for simple obstacle checks
driver = DriveManager(
component_name="drive_manager",
config=DriveManagerConfig(critical_zone_distance=0.05)
)
# Handle message types (Twist vs TwistStamped)
cmd_msg_type = "TwistStamped" if os.environ.get("ROS_DISTRO") in ["rolling", "jazzy", "kilted"] else "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
map_server = MapServer(
component_name="global_map_server",
config=MapServerConfig(
map_file_path=os.path.join(kompass_sim_dir, "maps", "turtlebot3_webots.yaml"),
grid_resolution=0.5
)
)
# 3. Define Services for UI Interaction
save_path_srv = ServiceClientConfig(
name=f"{planner.node_name}/save_plan_to_file", srv_type=PathFromToFile
)
load_path_srv = ServiceClientConfig(
name=f"{planner.node_name}/load_plan_from_file", srv_type=PathFromToFile
)
start_path_recording = ServiceClientConfig(
name=f"{planner.node_name}/start_path_recording", srv_type=StartPathRecording
)
# 4. Launch
launcher = Launcher()
launcher.kompass(
components=[map_server, planner, driver, controller],
multiprocessing=True,
)
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
launcher.robot = my_robot
launcher.frames = RobotFrames(world="map", odom="map", scan="LDS-01")
# 5. Enable UI
launcher.enable_ui(
inputs=[save_path_srv, load_path_srv, start_path_recording],
outputs=[
map_server.get_out_topic(TopicsKeys.GLOBAL_MAP),
odom_topic,
planner.get_out_topic(TopicsKeys.GLOBAL_PLAN),
],
)
launcher.bringup()
if __name__ == "__main__":
run_path_recorder()
```
---
## Workflow: Two Ways to Generate a Path
Once the recipe is running and you have the Kompass UI open (`http://0.0.0.0:5001`), you can generate a path using either the planner or by manually driving the robot.
### Option A: Save a Computed Plan
Use this if you want to save the path produced by the global planner and you want to "freeze" that exact path for future use.
1. **Generate Plan:** Trigger the planner (e.g., via the `/clicked_point` input on the UI).
2. **Verify:** Check if the generated path looks good...
3. **Save:** In the UI Inputs panel, go to the `planner/save_plan_to_file` in the *Inputs* panel
* Enter the **file_location:** `/tmp/` and **file_name:** `computed_path.csv`
* Click **Send**.
### Option B: Record a Driven Path (Teleop)
Use this if you want the robot to follow a human-demonstrated path (e.g., a specific maneuver through a tight doorway).
1. **Start Recording:** In the UI Inputs panel, select `planner/start_path_recording`.
* **recording_time_step:** `0.1` (Records a point every 0.1 seconds).
* Click **Call**.
2. **Drive:** Use your keyboard or joystick (e.g., `ros2 run teleop_twist_keyboard teleop_twist_keyboard`) to drive the robot along the desired route.
3. **Save:** When finished, select `planner/save_plan_to_file`.
* Enter the **file_location:** `/tmp/` and **file_name:** `driven_path.csv`
* Click **Send**.
*(Note: Calling save automatically stops the recording process).*
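The `recording_time_step` parameter effectively downsamples the odometry stream to one recorded point per interval. A minimal sketch of that sampling behavior (illustrative; the Planner's actual recorder is not shown in this documentation):

```python
def record_path(odometry_samples, recording_time_step):
    """Keep one (x, y) point per recording_time_step seconds.

    odometry_samples: iterable of (t, x, y) tuples in time order.
    """
    recorded = []
    next_t = 0.0
    for t, x, y in odometry_samples:
        if t >= next_t:
            recorded.append((x, y))
            next_t = t + recording_time_step
    return recorded

# 50 Hz odometry (one sample every 0.02 s) downsampled to one point per 0.1 s
samples = [(i * 0.02, i * 0.01, 0.0) for i in range(20)]
print(len(record_path(samples, 0.1)))  # 4
```

A smaller time step captures tight maneuvers more faithfully at the cost of a larger path file.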
---
## Replay the Path
Now that you have your "Golden Path" saved (either computed or recorded), you can replay it anytime.
1. **Restart:** You can restart the stack or simply clear the current plan.
2. **Load:** In the UI **Inputs** panel, select `planner/load_plan_from_file`.
* **file_location:** `/tmp/`
* **file_name:** `driven_path.csv` (or `computed_path.csv`)
* Click **Send**.
**Result:**
The planner immediately loads the CSV file and publishes it as the **Global Plan**. The **Controller** receives this path and begins executing it immediately, retracing your steps exactly.
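During replay, a tracking controller such as Pure Pursuit (the algorithm selected in this recipe) chases a lookahead point along the loaded path. A minimal sketch of lookahead-point selection (illustrative only, not the `kompass-core` implementation; see `advanced/algorithms` for the actual parameters):

```python
import math

def lookahead_point(path, robot_xy, lookahead):
    """Pick the target point: start from the closest path point, then take the
    first point ahead of it that is at least `lookahead` meters away."""
    rx, ry = robot_xy
    dist = lambda p: math.hypot(p[0] - rx, p[1] - ry)
    nearest = min(range(len(path)), key=lambda i: dist(path[i]))
    for x, y in path[nearest:]:
        if dist((x, y)) >= lookahead:
            return (x, y)
    return path[-1]  # path end is closer than the lookahead distance

path = [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0), (1.5, 0.0)]
print(lookahead_point(path, (0.1, 0.0), 0.8))  # (1.0, 0.0)
```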
---
## Use Cases
* **Routine Patrols:** Record a perfect lap around a facility and replay it endlessly.
* **Complex Docking:** Manually drive a complex approach to a charging station, save the plan, and use it for reliable docking.
* **Multi-Robot Coordination:** Share a single "highway" path file among multiple robots to ensure they stick to verified lanes.
```
## File: tutorials/events_composed.md
```markdown
# Complex Events: Logic Gates & Fusion
In the previous tutorials, we triggered actions based on single, isolated conditions (e.g., "If Mapper Fails" or "If Person Seen").
However, real-world autonomy is rarely that simple. A robot shouldn't stop *every* time it sees an obstacle; maybe it only stops if it's moving fast. It shouldn't return home *just* because the battery is low; maybe it waits until it finishes its current task.
In this tutorial, we will use **Logic Gates** and **Multi-Topic Fusion** to create smarter, more robust reflexes.
## Complex Condition Logic
Kompass allows you to compose complex triggers using standard Python bitwise operators. This effectively turns your Event definitions into a high-level logic circuit.
| Logic | Operator | Description | Use Case |
| :--- | :--- | :--- | :--- |
| **AND** | `&` | All conditions must be True. | "Safety Checks" (Speed > 0 **AND** Obstacle Close) |
| **OR** | `\|` | At least one condition is True. | "Redundancy" (Lidar Blocked **OR** Bumper Hit) |
| **NOT** | `~` | Inverts the condition. | "Exclusion" (Target Seen **AND NOT** Low Battery) |
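The table above can be mimicked in plain Python with operator overloading, which is roughly how composed conditions behave when combined with `&`, `|`, and `~` (a toy model for intuition, not Kompass's actual `Event` class):

```python
class Condition:
    """Tiny composable predicate mimicking &, |, ~ composition semantics."""
    def __init__(self, fn):
        self.fn = fn
    def __call__(self, state):
        return self.fn(state)
    def __and__(self, other):
        return Condition(lambda s: self(s) and other(s))
    def __or__(self, other):
        return Condition(lambda s: self(s) or other(s))
    def __invert__(self):
        return Condition(lambda s: not self(s))

obstacle_close = Condition(lambda s: s["range"] < 0.5)
moving_fast = Condition(lambda s: s["speed"] > 0.1)
smart_stop = obstacle_close & moving_fast  # fires only when BOTH hold

print(smart_stop({"range": 0.3, "speed": 0.5}))  # True
print(smart_stop({"range": 0.3, "speed": 0.0}))  # False: close, but parked
```

Each composition returns a new condition, so arbitrarily deep logic circuits can be built from simple predicates.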
---
## Scenario: The "Smart" Emergency Stop (AND Logic)
**The Problem:** A naive emergency stop triggers whenever an object is within 0.5m. But if the robot is docking or maneuvering in a tight elevator, this stops it unnecessarily.
**The Solution:** We create a "Smart Stop" that triggers ONLY if an obstacle is close (< 0.5 m) **AND** the robot is moving fast (> 0.1 m/s).
1. Define Event Sources (Topics)
We need data from the **Radar** (perception) and **Odometry** (state).
```python
from kompass.ros import Topic
# 1. Radar range reading (0.2s timeout for safety)
radar = Topic(name="/radar_front", msg_type="Float32", data_timeout=0.2)
# 2. Odometry (0.5s timeout)
odom = Topic(name="/odom", msg_type="Odometry", data_timeout=0.5)
```
2. Define The Composed Event
We combine the conditions using the `&` operator.
```python
from kompass.ros import Event
# Condition A: Obstacle Close
is_obstacle_close = radar.msg.data < 0.5
# Condition B: Moving Fast (Linear X velocity)
is_robot_moving_fast = odom.msg.twist.twist.linear.x > 0.1
# Composed Event: BOTH must be True
event_smart_stop = Event(
event_condition=(is_obstacle_close & is_robot_moving_fast),
on_change=True
)
```
```
## File: tutorials/events_cross_healing.md
```markdown
# Events For Cross-Component Healing
In the [Fallbacks](fallbacks_simple.md) tutorial, we learned how a component can heal *itself* (e.g., restarting itself). But sophisticated autonomy requires more than just self-repair. It requires **System-Level Awareness** and **Behavioral Reflexes**.
In this tutorial, we will use simple **Events** to implement **Cross-Component Healing:** Components monitoring each other to fix system-level issues.
## Scenario A: The "Unstuck" Reflex
The `Controller` gets stuck in a local minimum (e.g., the robot is facing a corner). It will report an `ALGORITHM_FAILURE` because it cannot find a valid velocity command. We detect this status and ask the `DriveManager` to execute a blind "Unblock" maneuver (e.g., rotate in place or back up).
```{tip}
All components health status topics are accessible in the API using `component.status_topic`
```
```python
# Import the required primitives
from kompass.ros import Event, Action, Topic
# Import the ComponentStatus message to access the status values
from automatika_ros_sugar.msg import ComponentStatus
# Define Event: Controller reports Algorithm Failure
# Set `keep_event_delay` so that once triggered, the event is kept inactive for 60 seconds
# i.e. the associated action will not keep re-triggering while the controller keeps detecting the failure
event_controller_fail = Event(
controller.status_topic.msg.status
== ComponentStatus.STATUS_FAILURE_ALGORITHM_LEVEL,
keep_event_delay=60.0
)
# Define Action: Drive Manager's Unblock method
# This is a pre-defined method in DriveManager that executes a recovery maneuver
unblock_action = Action(method=driver.move_to_unblock)
```
## Scenario B: The "Blind" Reflex
The `LocalMapper` crashes or hits an error and fails to provide a high-fidelity local map. The `Controller`, which relies on the map, can detect that the `LocalMapper` is "not healthy" and reconfigure itself to use raw sensor data directly (Reactive Mode).
```python
# Import the reconfiguration action
from kompass.actions import update_parameter
# Import the ComponentStatus message to access the status values
from automatika_ros_sugar.msg import ComponentStatus
# Define Event: Mapper is NOT Healthy
# We set `handle_once` to True to trigger this event only ONCE during the lifetime of the system
event_mapper_fault = Event(
mapper.status_topic.msg.status != ComponentStatus.STATUS_HEALTHY,
handle_once=True
)
# Define Action: Reconfigure Controller to use direct sensor data
activate_direct_sensor_mode_action = update_parameter(
component=controller, param_name="use_direct_sensor", new_value=True
)
```
## Register Your Events/Actions
After defining your event/action pairs you only need to register the dictionary with the system launcher when adding your components:
```python
# Define your events/action dictionary
events_actions = {
event_controller_fail: unblock_action,
event_mapper_fault: activate_direct_sensor_mode_action,
}
# Pass kompass components to the launcher and register the events
launcher.kompass(
components=[planner, controller, driver, mapper],
activate_all_components_on_start=True,
multiprocessing=True,
events_actions=events_actions
)
```
## Complete Recipe
Here is the complete `turtlebot3_cross_healing.py` script combining the point navigation system, the fallbacks, and the cross-component healing events:
```{code-block} python
:caption: turtlebot3_cross_healing.py
:linenos:
import os
import numpy as np
from kompass.robot import (
    AngularCtrlLimits,
    LinearCtrlLimits,
    RobotGeometry,
    RobotType,
)
from kompass.config import RobotConfig, RobotFrames
from kompass.components import (
    Controller,
    DriveManager,
    LocalMapper,
    Planner,
    PlannerConfig,
)
from kompass.control import ControllersID
from kompass.ros import Topic, Launcher, Action, Event
# Import the ComponentStatus message to access the status values
from automatika_ros_sugar.msg import ComponentStatus
# Import the reconfiguration action
from kompass.actions import update_parameter
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set the robot frames
robot_frames = RobotFrames(
robot_base='base_link',
odom='odom',
world='map',
scan='LDS-01')
# Setup components with default config, inputs and outputs
planner_config = PlannerConfig(loop_rate=1.0) # 1 Hz
planner = Planner(component_name="planner", config=planner_config)
# Set Planner goal input to Rviz Clicked point
goal_topic = Topic(name="/clicked_point", msg_type="PointStamped")
planner.inputs(goal_point=goal_topic)
# Get a default controller component
controller = Controller(component_name="controller")
# Configure Controller to use local map instead of direct sensor information
controller.direct_sensor = False
# Select the primary control algorithm
controller.algorithm = ControllersID.DWA
# Define the action
switch_algorithm_action = Action(method=controller.set_algorithm, args=(ControllersID.PURE_PURSUIT,))
# Set the Controller 'algorithm-level' failure fallback
controller.on_algorithm_fail(action=[Action(controller.restart), switch_algorithm_action], max_retries=1)
# Get the default DriveManager
driver = DriveManager(component_name="drive_manager")
# Publish Twist or TwistStamped from the DriveManager based on the distribution
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type : str = "TwistStamped"
else:
cmd_msg_type = "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
# Set the DriveManager System-Level failure Fallback
# Not setting max_retries -> will attempt restarting indefinitely
driver.on_system_fail(Action(driver.restart))
# Get a default Local Mapper component
mapper = LocalMapper(component_name="mapper")
# Define Events
# 1. Controller Algorithm Failure
# Set `keep_event_delay` so that once triggered, the event is kept inactive for 60 seconds
# i.e. the associated action will not keep re-triggering while the controller keeps detecting the failure
event_controller_fail = Event(
controller.status_topic.msg.status
== ComponentStatus.STATUS_FAILURE_ALGORITHM_LEVEL,
keep_event_delay=60.0
)
# 2. Mapper is NOT Healthy
# We set `handle_once` to True to trigger this event only ONCE during the lifetime of the system
event_mapper_fault = Event(
mapper.status_topic.msg.status != ComponentStatus.STATUS_HEALTHY,
handle_once=True
)
# Define Actions
# 1. Drive Manager's Unblock method
# This is a pre-defined method in DriveManager that executes a recovery maneuver
unblock_action = Action(method=driver.move_to_unblock)
# 2. Action to reconfigure Controller to use direct sensor data
activate_direct_sensor_mode_action = update_parameter(
component=controller, param_name="use_direct_sensor", new_value=True
)
# Define your events/action dictionary
events_actions = {
event_controller_fail: unblock_action,
event_mapper_fault: activate_direct_sensor_mode_action,
}
# Init a launcher
launcher = Launcher()
# Pass kompass components to the launcher and register the events
launcher.kompass(
components=[planner, controller, driver, mapper],
activate_all_components_on_start=True,
multiprocessing=True,
events_actions=events_actions
)
# Set the robot
launcher.robot = my_robot
# Set the frames
launcher.frames = robot_frames
# Optional Generic Fallback Policy: If any component fails -> restart it with unlimited retries
# launcher.on_fail(action_name="restart")
# After all configuration is done bringup the stack
launcher.bringup()
```
## Next Steps
We have handled single-topic events for cross-component healing. Now let's see how we can use events for external reflexes.
:::{button-link} events_external_reflexes.html
:color: primary
:ref-type: doc
:outline:
Events for External Reflexes →
:::
```
## File: tutorials/events_dynamic.md
```markdown
# Context-Aware Actions
In previous tutorials, our actions were **Static**, i.e. the action arguments were pre-defined in the recipe. For example, in the [External Reflexes Tutorial](./events_external_reflexes.md) we defined an event for when a "Person is Detected" $\rightarrow$ execute an action to trigger the vision follower action server with a pre-defined request message targeting 'person'. But what if the action depends on **what** object is detected?
Real-world autonomy requires context-aware actions, where the action arguments can be fetched from the system or environment at the time of execution.
Instead of writing separate hard-coded actions for every possibility, Kompass allows you to use **Dynamic Data Injection** from Topics directly in your Actions. In the [Point Navigation](point_navigation.md) tutorial, we manually set goals using Rviz `clicked_point`. In this tutorial, we will upgrade the Turtlebot3 to understand **Semantic Commands**.
## Semantic Navigation using a Context-Aware Action
We will build a system where you can publish a location name (String) to a topic, and the robot will automatically look up the coordinates and navigate there.
## The Setup
**Prerequisites:** Completed [Point Navigation](point_navigation.md).
We will use the exact same set of components as in the [Point Navigation](point_navigation.md) tutorial, plus a simulated "Command" topic.
```python
from kompass.ros import Topic
# Define the Command Source
# Simulates a voice command or fleet management instruction
# Examples: "kitchen", "reception", "station_A"
command_topic = Topic(name="/user_command", msg_type="String")
```
## The Context-Aware Logic
1. We need a function that translates a *Place Name* (String) into a *Goal Pose* (Coordinates).
```python
import subprocess
# A simple "Map" of our environment
# In a real app, this can come from a database or a "semantic memory"
WAYPOINTS = {
"kitchen": {"x": 2.0, "y": 0.5},
"reception": {"x": 0.0, "y": 0.0},
"station_a": {"x": -1.5, "y": 2.0}
}
def navigate_to_location(location_name: str):
"""
Looks up the location coordinates and sends them to the planner.
"""
key = location_name.strip().lower()
if key not in WAYPOINTS:
print(f"Unknown location: {key}")
return
coords = WAYPOINTS[key]
x = coords["x"]
y = coords["y"]
# Construct the Bash Command
# We publish to /clicked_point (PointStamped) which the Planner listens to
topic_cmd = (
f"ros2 topic pub --once /clicked_point geometry_msgs/msg/PointStamped "
f"'{{header: {{frame_id: \"map\"}}, point: {{x: {x}, y: {y}, z: 0.0}}}}'"
)
print(f"Executing: {topic_cmd}")
subprocess.run(topic_cmd, shell=True)
```
2. Define an Event that triggers whenever a new command arrives:
```python
# Trigger on ANY new message on the command topic, if the local mapper is healthy
event_command_received = Event(
event_condition=(command_topic & (mapper.status_topic.msg.status == ComponentStatus.STATUS_HEALTHY)),
)
```
3. Define the action:
```python
action_process_command = Action(
method=navigate_to_location,
# DYNAMIC INJECTION:
# We pass 'command_topic.msg.data' as the argument.
# Sugarcoat will fetch the actual string ("kitchen") when the event fires.
args=(command_topic.msg.data,)
)
```
## Complete Recipe
Launch this script, then publish a string to `/user_command` (e.g., `ros2 topic pub /user_command std_msgs/String "data: kitchen" --once`) to see the robot move!
```{code-block} python
:caption: turtlebot3_semantic_nav.py
:linenos:
import os
import subprocess
import numpy as np
from kompass.robot import (
    AngularCtrlLimits,
    LinearCtrlLimits,
    RobotGeometry,
    RobotType,
)
from kompass.config import RobotConfig, RobotFrames
from kompass.components import (
    Controller,
    DriveManager,
    LocalMapper,
    Planner,
    PlannerConfig,
)
from kompass.control import ControllersID
from kompass.ros import Topic, Launcher, Action, Event
# Import the ComponentStatus message to access the status values
from automatika_ros_sugar.msg import ComponentStatus
# Import the reconfiguration action
from kompass.actions import update_parameter
# A simple "Map" of our environment
# In a real app, this can come from a database or a "semantic memory"
WAYPOINTS = {
"kitchen": {"x": 2.0, "y": 0.5},
"reception": {"x": 0.0, "y": 0.0},
"station_a": {"x": -1.5, "y": 2.0}
}
def navigate_to_location(location_name: str):
"""
Looks up the location coordinates and sends them to the planner.
"""
key = location_name.strip().lower()
if key not in WAYPOINTS:
print(f"Unknown location: {key}")
return
coords = WAYPOINTS[key]
x = coords["x"]
y = coords["y"]
# Construct the Bash Command
# We publish to /clicked_point (PointStamped) which the Planner listens to
topic_cmd = (
f"ros2 topic pub --once /clicked_point geometry_msgs/msg/PointStamped "
f"'{{header: {{frame_id: \"map\"}}, point: {{x: {x}, y: {y}, z: 0.0}}}}'"
)
print(f"Executing: {topic_cmd}")
subprocess.run(topic_cmd, shell=True)
# Define the Command Source
# Simulates a voice command or fleet management instruction
# Examples: "kitchen", "reception", "station_A"
command_topic = Topic(name="/user_command", msg_type="String")
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set the robot frames
robot_frames = RobotFrames(
robot_base='base_link',
odom='odom',
world='map',
scan='LDS-01')
# Setup components with default config, inputs and outputs
planner_config = PlannerConfig(loop_rate=1.0) # 1 Hz
planner = Planner(component_name="planner", config=planner_config)
# Set Planner goal input to Rviz Clicked point
goal_topic = Topic(name="/clicked_point", msg_type="PointStamped")
planner.inputs(goal_point=goal_topic)
# Get a default controller component
controller = Controller(component_name="controller")
# Configure Controller to use local map instead of direct sensor information
controller.direct_sensor = False
# Select the primary control algorithm
controller.algorithm = ControllersID.DWA
# Define the action
switch_algorithm_action = Action(method=controller.set_algorithm, args=(ControllersID.PURE_PURSUIT,))
# Set the Controller 'algorithm-level' failure fallback
controller.on_algorithm_fail(action=[Action(controller.restart), switch_algorithm_action], max_retries=1)
# Get the default DriveManager
driver = DriveManager(component_name="drive_manager")
# Publish Twist or TwistStamped from the DriveManager based on the distribution
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type : str = "TwistStamped"
else:
cmd_msg_type = "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
# Set the DriveManager System-Level failure Fallback
# Not setting max_retries -> will keep attempting to restart indefinitely
driver.on_system_fail(Action(driver.restart))
# Get a default Local Mapper component
mapper = LocalMapper(component_name="mapper")
# Define Events
# 1. Controller Algorithm Failure
# We set `on_change` to True to trigger the event only when the status first changes to algorithm failure
# i.e. the associated action will not keep triggering while the controller keeps detecting the failure
event_controller_fail = Event(
    controller.status_topic.msg.status
    == ComponentStatus.STATUS_FAILURE_ALGORITHM_LEVEL,
    on_change=True,
    keep_event_delay=60.0
)
# 2. Mapper is NOT Healthy
# We set `handle_once` to True to trigger this event only ONCE during the lifetime of the system
event_mapper_fault = Event(
mapper.status_topic.msg.status != ComponentStatus.STATUS_HEALTHY,
handle_once=True
)
# Trigger on ANY new message on the command topic, if the local mapper is healthy
event_command_received = Event(
event_condition=(command_topic & (mapper.status_topic.msg.status == ComponentStatus.STATUS_HEALTHY)),
)
# Define Actions
# 1. Drive Manager's Unblock method
# This is a pre-defined method in DriveManager that executes a recovery maneuver
unblock_action = Action(method=driver.move_to_unblock)
# 2. Action to reconfigure Controller to use direct sensor data
activate_direct_sensor_mode_action = update_parameter(
component=controller, param_name="use_direct_sensor", new_value=True
)
action_process_command = Action(
method=navigate_to_location,
# DYNAMIC INJECTION:
# We pass 'command_topic.msg.data' as the argument.
# Sugarcoat will fetch the actual string ("kitchen") when the event fires.
args=(command_topic.msg.data,)
)
# Define your events/action dictionary
events_actions = {
event_controller_fail: unblock_action,
event_mapper_fault: activate_direct_sensor_mode_action,
event_command_received: action_process_command
}
# Init a launcher
launcher = Launcher()
# Pass kompass components to the launcher and register the events
launcher.kompass(
components=[planner, controller, driver, mapper],
activate_all_components_on_start=True,
multi_processing=True,
events_actions=events_actions
)
# Set the robot
launcher.robot = my_robot
# Set the frames
launcher.frames = robot_frames
# Optional Generic Fallback Policy: If any component fails -> restart it with unlimited retries
# launcher.on_fail(action_name="restart")
# After all configuration is done bringup the stack
launcher.bringup()
```
```
## File: tutorials/events_external_reflexes.md
```markdown
# Events for External Reflexes
In the previous [Vision Tracking Using Depth](vision_tracking_depth.md) tutorial, the robot was using the `VisionRGBDFollower` algorithm. This meant that the robot would wait for a user to send a request to the controller to follow an object with a given ID.
In this tutorial, we will make the behavior intelligent. The robot will:
1. **Idle/Patrol** by default.
2. **Switch to Follow Mode** automatically when a "person" is detected in the video feed.
## 1. The Setup
**Prerequisites:** Completed [Vision Tracking Using Depth](vision_tracking_depth.md).
We use the same component setup (Vision, Controller, DriveManager), but we initialize the Controller in path following mode (standard navigation) instead of Vision mode.
```python
# Start the controller in standard navigation mode (PathFollowing)
controller = Controller(component_name="controller")
controller.algorithm = "PurePursuit" # Default mode
controller.direct_sensor = True
# Vision Component (as before)
```
## 2. Event A: Person Detected -> Follow
We want to detect if the class "person" appears in the detections list. If it does, we trigger the Controller's Action Server to start the tracking behavior.
```python
from kompass.ros import Event
# Trigger if "person" is in the list of detected labels
event_person_detected = Event(
event_condition=detections_topic.msg.labels.contains("person"),
on_change=True
)
```
### The Action
We need to tell the Controller to switch behaviors. We use two actions:
1. **Switch Algorithm:** Change the internal logic to `VisionRGBDFollower`.
2. **Trigger Action Server:** Send a goal to the controller to start the active tracking loop.
```{tip}
If you link an Event to a set of Actions, the action set will get executed in sequence.
```
```python
# Import the required actions
from kompass.actions import update_parameter, send_component_action_server_goal
# Import the vision follower 'ActionServer' ROS2 message
from kompass_interfaces.action import TrackVisionTarget
# Switch to vision following
# Option 1: use set_algorithm action from the controller (same as in the fallbacks tutorial)
# switch_algorithm_action = Action(method=controller.set_algorithm, args=(ControllersID.VISION_DEPTH,))
# Option 2: use the update_parameter action to change the 'algorithm' parameter value
# the value can be set to ControllersID.VISION_DEPTH or directly the string name of the algorithm class VisionRGBDFollower
switch_algorithm_action = update_parameter(
component=controller,
param_name="algorithm",
new_value="VisionRGBDFollower"
)
# Action to trigger the action server to follow a person
action_request_msg = TrackVisionTarget.Goal()
action_request_msg.label = "person" # Specify the target to follow
action_start_person_following = send_component_action_server_goal(
component=controller,
request_msg=action_request_msg,
)
# Set the event/action(s) pair
events_action = {
event_person_detected: [switch_algorithm_action, action_start_person_following]
}
```
## 3. Complete Recipe
Here is the complete script. Run it and stand in front of the robot to make it follow you!
```{code-block} python
:caption: events_external_reflexes.py
:linenos:
import numpy as np
from agents.components import Vision
from agents.config import VisionConfig
from agents.ros import Topic
from kompass.components import Controller, ControllerConfig, DriveManager, LocalMapper
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig,
)
from kompass.ros import Launcher, Event
from kompass.actions import update_parameter, send_component_action_server_goal
from kompass_interfaces.action import TrackVisionTarget
image0 = Topic(name="/camera/rgbd", msg_type="RGBD")
detections_topic = Topic(name="detections", msg_type="Detections")
detection_config = VisionConfig(threshold=0.5, enable_local_classifier=True)
vision = Vision(
inputs=[image0],
outputs=[detections_topic],
trigger=image0,
config=detection_config,
component_name="detection_component",
)
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=1.0, max_acc=3.0, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=4.0, max_acc=6.0, max_decel=10.0, max_steer=np.pi / 3
),
)
depth_cam_info_topic = Topic(name="/camera/aligned_depth_to_color/camera_info", msg_type="CameraInfo")
# Setup the controller
config = ControllerConfig(ctrl_publish_type="Parallel")
controller = Controller(component_name="controller", config=config)
controller.inputs(vision_detections=detections_topic, depth_camera_info=depth_cam_info_topic)
controller.algorithm = "VisionRGBDFollower"
controller.direct_sensor = False
# Add additional helper components
driver = DriveManager(component_name="driver")
mapper = LocalMapper(component_name="local_mapper")
# Define Event: Trigger if "person" is in the list of detected labels
event_person_detected = Event(
event_condition=detections_topic.msg.labels.contains("person"),
on_change=True
)
# Define Action(s)
switch_algorithm_action = update_parameter(
component=controller,
param_name="algorithm",
new_value="VisionRGBDFollower"
)
# Action to trigger the action server to follow a person
action_request_msg = TrackVisionTarget.Goal()
action_request_msg.label = "person" # Specify the target to follow
action_start_person_following = send_component_action_server_goal(
component=controller,
request_msg=action_request_msg,
)
# Set the event/action(s) pair
events_action = {
event_person_detected: [switch_algorithm_action, action_start_person_following]
}
# Bring it up with the launcher
launcher = Launcher()
launcher.add_pkg(components=[vision], ros_log_level="warn",
package_name="automatika_embodied_agents",
executable_entry_point="executable",
multiprocessing=True)
# Add component and the events to monitor
launcher.kompass(components=[controller, mapper, driver], events_actions=events_action)
# Set the robot config for all components
launcher.robot = my_robot
launcher.bringup()
```
## Next Steps
We have handled single-topic events for both cross-component healing and external reflexes. But real-world decisions are rarely based on one factor.
In the next tutorial, we will use **Logic Gates** to create smarter, composed events (e.g., "If Emergency Stop **AND** Battery Low").
:::{button-link} events_composed.html
:color: primary
:ref-type: doc
:outline:
Logic Gates & Composed Events →
:::
```
## File: tutorials/fallbacks_simple.md
```markdown
# Making the system robust with Fallbacks
In the previous [Point Navigation](./point_navigation.md) tutorial, we built a standard point navigation stack. While functional, it is "optimistic": it assumes that all components will keep working smoothly and that no errors or failures will occur at runtime.
In reality, sensors disconnect, optimization solvers fail to converge, and dynamic obstacles block paths.
In this tutorial, we will give our *Turtlebot3* a few **Fallback** techniques, so the robot can attempt to **self-heal** and **reconfigure itself** when these inevitable failures occur.
## Handling Algorithm Failures
Algorithms used inside the navigation components can fail during runtime for many reasons. A motion control algorithm used in the `Controller` can fail, for example, due to:
* **Mathematical Singularity:** The optimization solver might fail to find a solution that satisfies all kinematic constraints (e.g., the robot is cornered).
* **Local Minima:** The robot might get "stuck" oscillating in a narrow corridor or doorway.
* **Tracking Error:** The robot has drifted too far from the path, and the controller cannot compute a valid velocity to rejoin it smoothly.
### Implementation
If our primary high-performance algorithm (e.g., `DWA`) fails, we shouldn't just stop. We can switch to a "Safety" algorithm (like a simple `PurePursuit`).
- Start by importing the `Action` class required to define the Fallback routines, and the `ControllersID` enum for selecting our main controller and our fallback controller:
```python
from kompass.ros import Action
from kompass.control import ControllersID
```
- Select the primary control algorithm, and define the `Action` to switch the algorithm into our fallback choice:
```python
# Select the primary control algorithm
controller.algorithm = ControllersID.DWA
# Define the action
switch_algorithm_action = Action(method=controller.set_algorithm, args=(ControllersID.PURE_PURSUIT,))
```
- Set the fallbacks action sequence: Try restarting the controller -> If fails again -> Switch the control algorithm
```python
controller.on_algorithm_fail(action=[Action(controller.restart), switch_algorithm_action], max_retries=1)
```
## Handling System Failures
Some components, like the `DriveManager`, talk directly to the low-level hardware (micro-controllers/motor drivers). In real life, the robot can face different system failures during runtime:
* **Cable Issues:** Loose USB/Serial cables caused by vibration.
* **EMI:** Electromagnetic interference causing packet loss on the serial bus.
* **Watchdog Trip:** The micro-controller resets itself because it didn't receive a command in time.
### Implementation
To make our system more resilient, we can implement a simple 'restart' fallback for system failures. Since hardware glitches are often transient, this simple restart fallback can re-establish the serial handshake and be quite effective.
- We can do this by simply adding the following line to our recipe:
```python
# Not setting max_retries -> will keep attempting to restart indefinitely
driver.on_system_fail(Action(driver.restart))
```
- We can also select a general **component-level** fallback for all our components by doing:
```python
launcher.on_fail(action_name="restart")
```
## Complete Recipe
Here is the complete `turtlebot3_with_fallbacks.py` script combining the [Point Navigation](./point_navigation.md) setup with our new robustness layer.
```{code-block} python
:caption: turtlebot3_with_fallbacks.py
:linenos:
import numpy as np
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotGeometry,
RobotType,
)
from kompass.config import RobotConfig, RobotFrames
from kompass.components import (
    Controller,
    DriveManager,
    LocalMapper,
    Planner,
    PlannerConfig,
)
from kompass.control import ControllersID
from kompass.ros import Topic, Launcher, Action
import os
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set the robot frames
robot_frames = RobotFrames(
robot_base='base_link',
odom='odom',
world='map',
scan='LDS-01')
# Setup components with default config, inputs and outputs
planner_config = PlannerConfig(loop_rate=1.0) # 1 Hz
planner = Planner(component_name="planner", config=planner_config)
# Set Planner goal input to Rviz Clicked point
goal_topic = Topic(name="/clicked_point", msg_type="PointStamped")
planner.inputs(goal_point=goal_topic)
# Get a default controller component
controller = Controller(component_name="controller")
# Configure Controller to use local map instead of direct sensor information
controller.direct_sensor = False
# Select the primary control algorithm
controller.algorithm = ControllersID.DWA
# Define the action
switch_algorithm_action = Action(method=controller.set_algorithm, args=(ControllersID.PURE_PURSUIT,))
# Set the Controller 'algorithm-level' failure fallback
controller.on_algorithm_fail(action=[Action(controller.restart), switch_algorithm_action], max_retries=1)
# Get the default DriveManager
driver = DriveManager(component_name="drive_manager")
# Publish Twist or TwistStamped from the DriveManager based on the distribution
if "ROS_DISTRO" in os.environ and (
os.environ["ROS_DISTRO"] in ["rolling", "jazzy", "kilted"]
):
cmd_msg_type : str = "TwistStamped"
else:
cmd_msg_type = "Twist"
driver.outputs(robot_command=Topic(name="/cmd_vel", msg_type=cmd_msg_type))
# Set the DriveManager System-Level failure Fallback
# Not setting max_retries -> will keep attempting to restart indefinitely
driver.on_system_fail(Action(driver.restart))
# Get a default Local Mapper component
mapper = LocalMapper(component_name="mapper")
# Init a launcher
launcher = Launcher()
# Pass kompass components to the launcher
launcher.kompass(
components=[planner, controller, driver, mapper],
activate_all_components_on_start=True,
multi_processing=True)
# Set the robot
launcher.robot = my_robot
# Set the frames
launcher.frames = robot_frames
# Optional Generic Fallback Policy: If any component fails -> restart it with unlimited retries
# launcher.on_fail(action_name="restart")
# After all configuration is done bringup the stack
launcher.bringup()
```
## Next Steps
Your robot can now heal itself from internal failures! But what about reacting to the *external* world?
In the next tutorial, we will use **Events** to make the robot reactive to its environment.
:::{button-link} events_cross_healing.html
:color: primary
:ref-type: doc
:outline:
Adding Reflexes (Events) →
:::
```
## File: tutorials/automated_motion_test.md
```markdown
# Automated Motion Testing
**System identification and response recording made easy.**
The **MotionServer** is a specialized component designed for robot calibration and system identification. It performs two critical tasks:
1. **Automated Testing:** It sends open-loop reference commands (e.g., steps, circles) to the robot.
2. **Data Recording:** It records the robot's actual response (Odometry) versus the sent command (`cmd_vel`) to a CSV file.
This data is essential for tuning controllers, verifying kinematic constraints, or training machine learning models.
---
## The Recipe
Below is a complete recipe to launch a simulation (or real robot), triggering the Motion Server via an event.
**Create a file named `motion_test.py`:**
```python
import numpy as np
import os
from ament_index_python.packages import get_package_share_directory
# Kompass Imports
from kompass.robot import (
AngularCtrlLimits, LinearCtrlLimits, RobotGeometry, RobotType, RobotConfig, RobotFrames
)
from kompass.components import (
DriveManager, DriveManagerConfig, MapServer, MapServerConfig,
TopicsKeys, MotionServer, MotionServerConfig
)
from kompass.ros import Topic, Launcher
def run_motion_test():
kompass_sim_dir = get_package_share_directory(package_name="kompass_sim")
# 1. Robot Configuration
# Define physical limits (Crucial for the MotionServer to generate valid test commands)
my_robot = RobotConfig(
model_type=RobotType.DIFFERENTIAL_DRIVE,
geometry_type=RobotGeometry.Type.CYLINDER,
geometry_params=np.array([0.1, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.4, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3),
)
# 2. Configure Motion Server
# We configure it to run "Circle Tests" for 10 seconds per test
motion_config = MotionServerConfig(
test_period=10.0,
run_circle_test=True,
run_step_test=False,
tests_folder=os.path.expanduser("~/.kompass/tests") # Where to save CSVs
)
motion_server = MotionServer(component_name="motion_server", config=motion_config)
motion_server.run_type = "Event" # Wait for a trigger to start
# 3. Drive Manager
# Acts as the safety layer between MotionServer and the Hardware
driver = DriveManager(
component_name="drive_manager",
config=DriveManagerConfig(critical_zone_distance=0.05)
)
# Handle ROS 2 Distribution message types
cmd_msg_type = "TwistStamped" if os.environ.get("ROS_DISTRO") in ["rolling", "jazzy"] else "Twist"
# 4. Wiring
# The Driver publishes the final hardware command
cmd_topic = Topic(name="/cmd_vel", msg_type=cmd_msg_type)
driver.outputs(robot_command=cmd_topic)
# The MotionServer listens to that SAME topic to record what was actually sent
motion_server.inputs(command=cmd_topic)
# 5. Context (Map Server)
map_server = MapServer(
component_name="global_map_server",
config=MapServerConfig(
map_file_path=os.path.join(kompass_sim_dir, "maps", "turtlebot3_webots.yaml"),
grid_resolution=0.5
)
)
# 6. Launch
launcher = Launcher()
launcher.kompass(components=[map_server, driver, motion_server], multiprocessing=True)
# Link Odometry (The response we want to record)
odom_topic = Topic(name="/odometry/filtered", msg_type="Odometry")
launcher.inputs(location=odom_topic)
launcher.robot = my_robot
launcher.frames = RobotFrames(world="map", odom="map", scan="LDS-01")
# 7. Enable UI
# We expose the RUN_TESTS input so we can trigger it from the browser
launcher.enable_ui(
inputs=[motion_server.get_in_topic(TopicsKeys.RUN_TESTS)],
outputs=[map_server.get_out_topic(TopicsKeys.GLOBAL_MAP), odom_topic]
)
launcher.bringup()
if __name__ == "__main__":
run_motion_test()
```
---
## How to Run the Test
### 1. Launch the Stack
Run the script you just created. Ensure your simulator (e.g., Webots or Gazebo) is running first.
```bash
python3 motion_test.py
```
### 2. Open the UI
Open your browser to the local UI URL (e.g., `http://0.0.0.0:5001`). You will see the map and the robot.
### 3. Trigger the Test
In the **Inputs** panel on the UI, you will see a switch or button for `run_tests`.
* Toggle it to **True** and click **send**.
```{figure} ../_static/images/turtlebot3_run_tests.png
:alt: Motion test UI
:align: center
:width: 80%
```
### 4. Watch the Robot
The robot will immediately execute the configured test pattern (e.g., driving in circles).
1. **Forward Circle:** Max Velocity / 2
2. **Inverse Circle:** Negative Velocity
3. **Backward Circle**
The robot will automatically stop after the sequence is complete.
---
## Analyzing the Data
Once the tests are finished, check the folder configured in `tests_folder` (in the recipe above: `~/.kompass/tests`).
You will find CSV files named by the test type (e.g., `circle_forward.csv`).
**CSV Structure:**
| timestamp | x | y | yaw | cmd_vx | cmd_vy | cmd_omega |
| :--- | :--- | :--- | :--- | :--- | :--- | :--- |
| 16234.12 | 0.0 | 0.0 | 0.0 | 0.2 | 0.0 | 0.2 |
| ... | ... | ... | ... | ... | ... | ... |
You can plot these columns to compare the `cmd_vx` (Reference) vs the derivative of `x` (Response) to calculate your system's step response and latency.
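For a quick offline check, the comparison can be sketched in a few lines of plain Python (no Kompass dependency; the column names follow the CSV table above, and the synthetic sample data is illustrative only):

```python
import csv
import io

def measured_vx(rows):
    """Finite-difference estimate of linear velocity from the recorded x positions.

    `rows` is a list of dicts with 'timestamp' and 'x' keys, one per CSV row.
    Returns one velocity sample per consecutive pair of rows.
    """
    out = []
    for prev, curr in zip(rows, rows[1:]):
        dt = float(curr["timestamp"]) - float(prev["timestamp"])
        dx = float(curr["x"]) - float(prev["x"])
        out.append(dx / dt)
    return out

# Synthetic data shaped like a recorded circle_forward.csv
sample = io.StringIO(
    "timestamp,x,y,yaw,cmd_vx,cmd_vy,cmd_omega\n"
    "0.0,0.00,0.0,0.0,0.2,0.0,0.2\n"
    "0.5,0.09,0.0,0.0,0.2,0.0,0.2\n"
    "1.0,0.19,0.0,0.0,0.2,0.0,0.2\n"
)
rows = list(csv.DictReader(sample))
vx = measured_vx(rows)
reference = [float(r["cmd_vx"]) for r in rows[1:]]
# Tracking error between commanded and measured velocity, per sample
errors = [abs(m - c) for m, c in zip(vx, reference)]
print(errors)
```

Plotting `vx` against `reference` over the timestamps gives you the step response and latency directly.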
---
## Configuration Options
You can customize the testing behavior via `MotionServerConfig`:
* **`test_period`** *(float, default=10.0)*: Duration of each individual test step in seconds.
* **`run_step_test`** *(bool, default=False)*: Runs linear step inputs (forward/backward straight lines).
* **`run_circle_test`** *(bool, default=True)*: Runs combined linear and angular velocity commands.
* **`tests_folder`** *(str)*: Absolute path where CSV files will be saved.
:::{admonition} Wiring Tip
:class: tip
The **MotionServer** generates commands, but usually sends them to the **DriveManager** first for safety checks. However, for accurate recording, the MotionServer should listen to the **output** of the DriveManager (`cmd_vel`) as its input. This ensures you record exactly what was sent to the motors, including any safety overrides.
:::
```
## File: advanced/design.md
```markdown
# Design Concepts
Kompass is built on [**Sugarcoat🍬**](https://automatika-robotics.github.io/sugarcoat), inheriting a lightweight, expressive, and event-driven architecture for designing ROS2-based systems.
At the core of Kompass is the **Component**, a "super-sweetened" version of the standard ROS2 lifecycle node. By standardizing execution, health monitoring, and data flow, Kompass allows you to build navigation stacks that are not just modular, but **reactive** and **self-healing**.
The following sections outline the five pillars of the Kompass design.
## 1. The Component: Smart Execution Unit
The Component is the atomic unit of logic in Kompass. Unlike a standard node, a Component manages its own lifecycle, validates its own configuration, and reports its own health.
```{figure} ../_static/images/diagrams/component_dark.png
:class: dark-only
:alt: Kompass Component
:align: center
```
```{figure} ../_static/images/diagrams/component_light.png
:class: light-only
:alt: Kompass Component
:align: center
Component Structure
```
## 2. Standardized Inputs & Outputs
Components communicate through defined [**Inputs/Outputs**](https://automatika-robotics.github.io/sugarcoat/design/topics.html), allowing you to easily specify the ROS2 topics that connect different parts of your system.
To ensure that different navigation components (planners, controllers, mappers) snap together effortlessly, Kompass communicates through **Standardized I/O Keys**. Crucially, each key supports **multiple message types**, providing extra flexibility and out-of-the-box configuration for your data pipelines.
Each Input/Output key is governed by:
- **Allowed Types:** Ensures data compatibility by strictly defining which message classes are accepted (e.g., a map input only accepting `OccupancyGrid`, or a position input accepting both `Pose` and `PoseStamped` messages).
- **Cardinality:** Defines the topology of the connection—specifically, **if a stream is mandatory** and **how many sources are allowed** (e.g., "Requires exactly 1 Odometry source" or "Accepts up to 3 PointCloud sources").
```{seealso}
See a complete list of the Inputs/Outputs keys in Kompass stack along with configuration examples [here](./advanced_conf/topics.md)
```
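The "allowed types plus cardinality" idea can be illustrated with a minimal pure-Python sketch (this is a conceptual illustration, not the Kompass API; the function name and signature are hypothetical):

```python
def validate_input_key(topics, allowed_types, min_sources=1, max_sources=1):
    """Check a list of (name, msg_type) topics against an I/O key's constraints.

    Returns (ok, message): cardinality is enforced first, then each topic's
    message type must be one of the allowed classes for this key.
    """
    if not (min_sources <= len(topics) <= max_sources):
        return False, f"expected {min_sources}..{max_sources} sources, got {len(topics)}"
    for name, msg_type in topics:
        if msg_type not in allowed_types:
            return False, f"topic '{name}' has unsupported type '{msg_type}'"
    return True, "ok"

# A position input that accepts both Pose and PoseStamped, exactly one source
ok, _ = validate_input_key(
    [("/robot_pose", "PoseStamped")],
    allowed_types={"Pose", "PoseStamped"},
)
# A map input that only accepts OccupancyGrid rejects a LaserScan source
bad, reason = validate_input_key(
    [("/map", "LaserScan")],
    allowed_types={"OccupancyGrid"},
)
print(ok, bad, reason)
```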
## 3. Active Resilience: Health & Fallbacks
Robots operate in unpredictable environments. Kompass components are designed to handle failures gracefully rather than crashing the whole stack.
### Self-Monitoring (Health Status)
Every component continuously introspects its state and broadcasts a [**Health Status**](https://automatika-robotics.github.io/sugarcoat/design/status.html). This allows the system to distinguish between a crashing driver and a path planning algorithm that simply can't find a route.
The following health codes help diagnose issues at different layers—from algorithmic faults to systemic problems—allowing fine-grained fault handling and robust navigation behavior:
| Status Code | Description |
|-------------|--------------------------|
| 0 | Running - Healthy |
| 1 | Failure: Algorithm Level |
| 2 | Failure: Component Level |
| 3 | Failure: System Level |
| 4 | Failure: General |
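As a rough sketch of how these layers enable fine-grained handling (the numeric codes follow the table above; the function names and the restart policy here are hypothetical, not part of the Kompass API):

```python
# Health codes from the table above
STATUS_NAMES = {
    0: "Running - Healthy",
    1: "Failure: Algorithm Level",
    2: "Failure: Component Level",
    3: "Failure: System Level",
    4: "Failure: General",
}

def describe(status_code):
    """Human-readable name for a health status code."""
    return STATUS_NAMES.get(status_code, "Unknown status")

def needs_restart(status_code):
    # Example policy: component- and system-level faults warrant a restart,
    # while algorithm-level faults can be handled by switching algorithms.
    return status_code in (2, 3)

print(describe(1), needs_restart(3))
```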
### Automatic Recovery (Fallbacks)
Why wake up a human when the robot can fix itself? Components are configured with [**Fallback**](https://automatika-robotics.github.io/sugarcoat/design/fallbacks.html) behaviors. When a specific Health Status level is reported, the component automatically triggers user-defined Actions—such as re-initializing a driver, clearing a costmap, or switching to a recovery behavior—without manual intervention.
## 4. Dynamic Orchestration: Events & Actions
Static navigation stacks are brittle. They struggle to handle edge cases like sensor failures, sudden dynamic obstacles, or low-battery emergencies without hard-coded, nested conditional logic.
Kompass's Event-Driven Architecture gives the navigation stack "reflexes." It decouples Perception (Events) from Decision Making (Actions), allowing the robot to adapt to changing contexts in real-time through configuration, not code.
### [**Events**](https://automatika-robotics.github.io/sugarcoat/design/events.html) (Contextual Triggers)
An Event monitors data streams (ROS2 topics) to detect specific conditions. Unlike simple callbacks, Kompass Events allow for Logical Composition and Sensor Fusion directly in the trigger definition.
Using the new API, you can define complex navigational reflexes with:
- **Logical Composition:** Combine conditions using Pythonic syntax (`&`, `|`, `~`). Example: "If (Lidar detects obstacle) AND (Robot is in fast mode)..."
- **Multi-Topic Blackboard:** Fuse internal state with external perception. Example: "If (Battery is Low) AND (Distance to Dock > 50m)..."
- **Data Freshness:** Ensure safety by ignoring stale sensor data using `data_timeout`.
:::{tip}
Think in Behaviors.
Events allow you to read the robot's logic like a sentence: "If the terrain is rough AND the payload is heavy, THEN switch to conservative planning."
:::
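The logical composition above can be modeled in a few lines of plain Python via operator overloading (an illustrative sketch of the idea, not the Sugarcoat implementation):

```python
class Condition:
    """Minimal illustration of composable trigger conditions."""

    def __init__(self, predicate):
        self.predicate = predicate  # callable: state dict -> bool

    def __and__(self, other):
        return Condition(lambda s: self.predicate(s) and other.predicate(s))

    def __or__(self, other):
        return Condition(lambda s: self.predicate(s) or other.predicate(s))

    def __invert__(self):
        return Condition(lambda s: not self.predicate(s))

    def check(self, state):
        return self.predicate(state)

# "If (Battery is Low) AND (Distance to Dock > 50m)..."
battery_low = Condition(lambda s: s["battery"] < 0.2)
far_from_dock = Condition(lambda s: s["dock_distance"] > 50.0)
return_now = battery_low & far_from_dock

print(return_now.check({"battery": 0.15, "dock_distance": 80.0}))  # True
print(return_now.check({"battery": 0.15, "dock_distance": 10.0}))  # False
```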
### [**Actions**](https://automatika-robotics.github.io/sugarcoat/design/actions.html) (Dynamic Responses)
Actions are the routines executed when an Event fires. In Kompass, Actions are dynamic and context-aware.
Using Dynamic Data Injection, Kompass can pass the data from **Any Topic** into the action arguments at runtime. This allows for generic, reusable recovery behaviors that adapt to the situation.
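Late binding of action arguments can be sketched in plain Python (a conceptual illustration only; the `TopicRef` class and `run_action` helper are hypothetical, not the Sugarcoat API):

```python
class TopicRef:
    """Placeholder resolved to the topic's latest message when the action fires."""

    def __init__(self, topic_name):
        self.topic_name = topic_name

def run_action(method, args, blackboard):
    # Replace every TopicRef with the current value from the blackboard
    # at fire time, so the same action adapts to the live data
    resolved = tuple(
        blackboard[a.topic_name] if isinstance(a, TopicRef) else a for a in args
    )
    return method(*resolved)

def navigate_to(location):
    return f"navigating to {location}"

# The argument is bound when the event fires, not when the action is defined
action_args = (TopicRef("/user_command"),)
blackboard = {"/user_command": "kitchen"}
print(run_action(navigate_to, action_args, blackboard))  # navigating to kitchen
```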
:::{seealso}
Learn more about adding events/actions to your recipe [here](../tutorials/events_actions.md)
:::
## 5. Flexible Execution with Performance in Mind
Kompass respects that different robots have different compute constraints. The [**Launcher**](https://automatika-robotics.github.io/sugarcoat/design/launcher.html) allows you to orchestrate your entire stack with a simple Python API, choosing the execution model that fits your hardware:
- Multi-threaded: Components run as threads in a single process. Ideal for low-latency communication via shared memory.
- Multi-process: Components run in isolated processes. Ideal for stability (one crash doesn't kill the system) and distributing load.
An internal [**Monitor**](https://automatika-robotics.github.io/sugarcoat/design/monitor.html) runs alongside the stack, continuously supervising the state of components and the events being triggered—making the system self-aware and adaptable in real-time.
::::{tab-set}
:::{tab-item} Multi-Threaded Execution
:sync: multi-threaded
```{figure} ../_static/images/diagrams/multi_threaded_dark.png
:class: dark-only
:alt: Kompass Multi-threaded execution
:align: center
```
```{figure} ../_static/images/diagrams/multi_threaded_light.png
:class: light-only
:alt: Kompass Multi-threaded execution
:align: center
Multi-threaded execution
```
:::
:::{tab-item} Multi-Process Execution
:sync: multi-process
```{figure} ../_static/images/diagrams/multi_process_dark.png
:class: dark-only
:alt: Kompass Multi-process execution
:align: center
```
```{figure} ../_static/images/diagrams/multi_process_light.png
:class: light-only
:alt: Kompass Multi-process execution
:align: center
Multi-process execution
```
:::
::::
```{seealso}
Dive deeper into each of these architectural elements in [Sugarcoat🍬 documentation](https://automatika-robotics.github.io/sugarcoat)
```
```
## File: advanced/types.md
```markdown
# Supported ROS2 Messages
Kompass handles the ROS2 plumbing so you don't have to! Every component automatically initializes the necessary subscribers and publishers for its inputs and outputs, ensuring seamless data flow across your system.
Below is the comprehensive list of ROS2 message types natively supported by the Kompass stack:
```{list-table}
:widths: 40 40
:header-rows: 1
* - Message
- ROS2 package
* - **[String](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/String.html)
* - **[Bool](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/Bool.html)
* - **[Float32](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/Float32.html)
* - **[Float32MultiArray](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/Float32MultiArray.html)
* - **[Float64](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/Float64.html)
* - **[Float64MultiArray](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.md/#classes)**
- [std_msgs](https://docs.ros2.org/foxy/api/std_msgs/msg/Float64MultiArray.html)
* - **[Point](../apidocs/kompass/kompass.data_types.md/#classes)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/Point.html)
* - **[PointStamped](../apidocs/kompass/kompass.data_types.md/#classes)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/PointStamped.html)
* - **[Pose](../apidocs/kompass/kompass.data_types.md/#classes)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/Pose.html)
* - **[PoseStamped](../apidocs/kompass/kompass.data_types.md/#classes)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/PoseStamped.html)
* - **[Twist](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/Twist.html)
* - **[TwistStamped](../apidocs/kompass/kompass.data_types.md/#classes)**
- [geometry_msgs](https://docs.ros2.org/foxy/api/geometry_msgs/msg/TwistStamped.html)
* - **[TwistArray](../apidocs/kompass/kompass.data_types.md/#classes)**
- [kompass_interfaces](https://github.com/automatika-robotics/kompass/blob/main/kompass_interfaces/msg/motion/TwistArray.msg)
* - **[Image](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [sensor_msgs](https://docs.ros2.org/foxy/api/sensor_msgs/msg/Image.html)
* - **[CompressedImage](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [sensor_msgs](https://docs.ros2.org/foxy/api/sensor_msgs/msg/CompressedImage.html)
* - **[Audio](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [sensor_msgs](https://docs.ros2.org/foxy/api/sensor_msgs/msg/Audio.html)
* - **[LaserScan](../apidocs/kompass/kompass.data_types.md/#classes)**
- [sensor_msgs](https://docs.ros2.org/foxy/api/sensor_msgs/msg/LaserScan.html)
* - **[PointCloud2](../apidocs/kompass/kompass.data_types.md/#classes)**
- [sensor_msgs](https://docs.ros2.org/foxy/api/sensor_msgs/msg/PointCloud2.html)
* - **[Odometry](../apidocs/kompass/kompass.data_types.md/#classes)**
- [nav_msgs](https://docs.ros2.org/foxy/api/nav_msgs/msg/Odometry.html)
* - **[Path](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [nav_msgs](https://docs.ros2.org/foxy/api/nav_msgs/msg/Path.html)
* - **[OccupancyGrid](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [nav_msgs](https://docs.ros2.org/foxy/api/nav_msgs/msg/OccupancyGrid.html)
* - **[ComponentStatus](https://automatika-robotics.github.io/sugarcoat/apidocs/ros_sugar/ros_sugar.io.supported_types.html)**
- [automatika_ros_sugar](https://github.com/automatika-robotics/sugarcoat/blob/main/msg/ComponentStatus.msg)
* - **Detections**
- [automatika_agents_interfaces](https://github.com/automatika-robotics/ros-agents/blob/main/agents/msg/Detections2D.msg)
* - **Trackings**
- [automatika_agents_interfaces](https://github.com/automatika-robotics/ros-agents/blob/main/agents/msg/Trackings.msg)
```
```
## File: advanced/advanced_conf/topics.md
```markdown
# Inputs and Outputs
**Strict typing and standardized data streams**
In Kompass, Components are designed with strict interfaces to "lock" their functionality. Each input and output is associated with a **unique keyword name** and restricted to a set of specific ROS2 message types.
A single keyword (e.g., `sensor_data`) can accept:
1. **A Single Topic:** One laser scan.
2. **Multiple Topics:** A list of topics (e.g., LiDAR + Radar) which are then fused internally by the component.
## Configuration
Configuring inputs and outputs is done via the `.inputs()` and `.outputs()` methods on any Component instance.
```python
from kompass.components import DriveManager
from kompass.ros import Topic
driver = DriveManager(component_name="driver")
# 1. Configure Inputs
# The 'sensor_data' key accepts a list of topics (fusion)
driver.inputs(
sensor_data=[
Topic(name='/scan', msg_type='LaserScan'),
Topic(name='/radar_data', msg_type='Float64')
]
)
# 2. Configure Outputs
# Remap the 'emergency_stop' signal to a custom topic name
driver.outputs(
emergency_stop=Topic(name='/system/alarm', msg_type='Bool')
)
```
:::{seealso}
See the `Topic` configuration class details in the [API Documentation](../../apidocs/kompass/kompass.components.ros.md).
:::
## Standard Stream Keys
Kompass uses a standardized set of keys across the entire stack.
### {material-regular}`explore` Navigation & Planning
| Enum Key | Keyword String | Description |
| --- | --- | --- |
| **GOAL_POINT** | `"goal_point"` | Target destination for the global planner. |
| **GLOBAL_PLAN** | `"plan"` | The computed path from start to goal. |
| **LOCAL_PLAN** | `"local_plan"` | Short-term path plan output by the Controller. |
| **INTERPOLATED_PATH** | `"interpolation"` | Smoothed or interpolated global path. |
| **REACHED_END** | `"reached_end"` | Flag indicating the goal has been reached. |
### {material-regular}`map` Mapping & Perception
| Enum Key | Keyword String | Description |
| --- | --- | --- |
| **GLOBAL_MAP** | `"map"` | The static global reference map. |
| **LOCAL_MAP** | `"local_map"` | Dynamic occupancy grid of immediate surroundings. |
| **SPATIAL_SENSOR** | `"sensor_data"` | Raw spatial data (LIDAR, Radar, Depth). |
| **VISION_TRACKINGS** | `"vision_tracking"` | Tracking data from vision systems (bounding boxes/masks). |
| **DEPTH_CAM_INFO** | `"depth_camera_info"` | Camera intrinsics parameters. |
| **TRACKED_POINT** | `"tracked_point"` | Specific point or object currently being tracked. |
### {material-regular}`gamepad` Control & Actuation
| Enum Key | Keyword String | Description |
| --- | --- | --- |
| **INTERMEDIATE_CMD** | `"command"` | Velocity command produced by the Controller. |
| **INTERMEDIATE_CMD_LIST** | `"multi_command"` | List of candidate velocity commands. |
| **FINAL_COMMAND** | `"robot_command"` | Final, safety-checked command sent to the Driver. |
| **EMERGENCY** | `"emergency_stop"` | Signal for immediate robot halt. |
### {material-regular}`settings_system_daydream` System & State
| Enum Key | Keyword String | Description |
| --- | --- | --- |
| **ROBOT_LOCATION** | `"location"` | Current robot position and orientation (Odometry). |
| **RUN_TESTS** | `"run_tests"` | Trigger flag to initiate calibration procedures. |
:::{tip}
Each component's specific requirements (required vs. optional streams) can be found in its respective documentation page (e.g., [Planner Inputs](../../navigation/path_planning.md/#inputs), [DriveManager Outputs](../../navigation/driver.md/#outputs)).
:::
```
## File: advanced/advanced_conf/qos.md
```markdown
# Topic QoS Configuration
**Fine-tune communication reliability and persistence.**
Quality of Service (QoS) profiles allow you to tune how data is handled between nodes. You can configure whether to prioritize speed (Best Effort) or data integrity (Reliable), and whether late-joining nodes should receive past data (Transient Local).
## Core Concepts
Understanding the trade-offs in ROS2 communication.
- {material-regular}`verified` Reliability - **Reliable vs. Best Effort.**
* **Reliable:** Guarantees delivery. Retries if packets are lost. (Good for services/control).
* **Best Effort:** Fire and forget. Fast, but drops data if network is bad. (Good for sensor streams).
- {material-regular}`save` Durability - **Volatile vs. Transient Local.**
* **Volatile:** No history. Late joiners only see *new* data.
* **Transient Local:** The publisher "saves" the last $N$ messages. Late joiners get the last known value immediately. (Good for maps/configuration).
- {material-regular}`history` History - **Keep Last vs. Keep All.**
* **Keep Last:** Only store a fixed queue of $N$ samples. Oldest are dropped.
* **Keep All:** Store everything (subject to resource limits).
## Configuration Parameters
The `QoSConfig` class provides a wrapper to easily set these policies in Kompass.
- {material-regular}`history` history - **`qos.HistoryPolicy`**
Policy controlling which samples are stored.
*Default:* `KEEP_LAST`
- {material-regular}`layers` queue_size - **`int`**
The depth of the queue. Only honored if `history` is set to `KEEP_LAST`.
*Range:* [5, 100]
*Default:* `10`
- {material-regular}`verified_user` reliability - **`qos.ReliabilityPolicy`**
Delivery guarantee for published samples.
*Default:* `RELIABLE`
- {material-regular}`inventory_2` durability - **`qos.DurabilityPolicy`**
Controls whether published samples are stored for late-joiners.
*Default:* `VOLATILE`
## Usage Example
```python
from kompass.ros import Topic, QoSConfig
from rclpy import qos
# 1. Define the Profile
# Example: A profile for a Map topic (Needs to be reliable and available to late joiners)
qos_map_profile = QoSConfig(
history=qos.HistoryPolicy.KEEP_LAST,
queue_size=1,
reliability=qos.ReliabilityPolicy.RELIABLE,
durability=qos.DurabilityPolicy.TRANSIENT_LOCAL
)
# 2. Apply to Topic
topic = Topic(
name='/local_map',
msg_type='OccupancyGrid',
qos_profile=qos_map_profile
)
```
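For high-rate sensor streams, the opposite trade-off is usually preferred: prioritize freshness over guaranteed delivery. A sketch following the same pattern (assuming `QoSConfig` accepts the same fields shown above):

```python
from kompass.ros import Topic, QoSConfig
from rclpy import qos

# High-rate sensor stream: prefer fresh data over guaranteed delivery
qos_sensor_profile = QoSConfig(
    reliability=qos.ReliabilityPolicy.BEST_EFFORT,
    durability=qos.DurabilityPolicy.VOLATILE,
    queue_size=5,
)

scan_topic = Topic(
    name='/scan',
    msg_type='LaserScan',
    qos_profile=qos_sensor_profile
)
```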
```
## File: advanced/algorithms/dwa.md
```markdown
# DWA
**GPU-accelerated Dynamic Window Approach.**
DWA is a classic local planning method developed in the 90s.[^1] It is a sampling-based controller that generates a set of constant-velocity trajectories within a "Dynamic Window" of reachable velocities.
Kompass supercharges this algorithm using **SYCL-based hardware acceleration**, allowing it to sample and evaluate thousands of candidate trajectories in parallel on **Nvidia, AMD, or Intel** GPUs. This enables high-frequency control loops even in complex, dynamic environments with dense obstacle fields.
It is highly effective for differential drive and omnidirectional robots.
## How it Works
The algorithm operates in a three-step pipeline at every control cycle:
- {material-regular}`speed` 1. Search Space - **Compute Dynamic Window.**
Calculate the range of reachable linear and angular velocities ($v, \omega$) for the next time step, limited by the robot's maximum acceleration and current speed.
- {material-regular}`timeline` 2. Rollout - **Sample Trajectories.**
Generate a set of candidate trajectories by sampling velocity pairs within the dynamic window and simulating the robot's motion forward in time.
- {material-regular}`grading` 3. Evaluate - **Score & Select.**
Discard trajectories that collide with obstacles (using **FCL**). Score the remaining valid paths based on distance to goal, path alignment, and smoothness.
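The first step of this pipeline can be sketched in plain Python. This is an illustrative helper only, not the Kompass implementation (which runs as SYCL kernels in `kompass-core`); the parameter names are assumptions loosely mirroring the `LinearCtrlLimits`/`AngularCtrlLimits` fields.

```python
def dynamic_window(v, omega, max_vel, max_omega,
                   max_acc, max_ang_acc, max_decel, max_ang_decel, dt):
    """Step 1: reachable (v, omega) ranges for the next control step,
    bounded by the robot's acceleration limits and current speed."""
    v_window = (max(0.0, v - max_decel * dt),
                min(max_vel, v + max_acc * dt))
    omega_window = (max(-max_omega, omega - max_ang_decel * dt),
                    min(max_omega, omega + max_ang_acc * dt))
    return v_window, omega_window
```

Velocity pairs sampled inside these two windows form the candidate set evaluated in steps 2 and 3.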
## Supported Sensory Inputs
DWA requires spatial data to perform collision checking during the rollout phase.
* {material-regular}`radar` LaserScan
* {material-regular}`grain` PointCloud
* {material-regular}`grid_on` OccupancyGrid
## Parameters and Default Values
```{list-table}
:widths: 10 10 10 70
:header-rows: 1
* - Name
- Type
- Default
- Description
* - control_time_step
- `float`
- `0.1`
- Time interval between control actions (sec). Must be between `1e-4` and `1e6`.
* - prediction_horizon
- `float`
- `1.0`
- Duration over which predictions are made (sec). Must be between `1e-4` and `1e6`.
* - control_horizon
- `float`
- `0.2`
- Duration over which control actions are planned (sec). Must be between `1e-4` and `1e6`.
* - max_linear_samples
- `int`
- `20`
- Maximum number of linear control samples. Must be between `1` and `1e3`.
* - max_angular_samples
- `int`
- `20`
- Maximum number of angular control samples. Must be between `1` and `1e3`.
* - sensor_position_to_robot
- `List[float]`
- `[0.0, 0.0, 0.0]`
- Position of the sensor relative to the robot in 3D space (x, y, z) coordinates.
* - sensor_rotation_to_robot
- `List[float]`
- `[0.0, 0.0, 0.0, 1.0]`
- Orientation of the sensor relative to the robot as a quaternion (x, y, z, w).
* - octree_resolution
- `float`
- `0.1`
- Resolution of the Octree used for collision checking. Must be between `1e-9` and `1e3`.
* - costs_weights
- `TrajectoryCostsWeights`
- see [defaults](./cost_eval.md/#configuration-weights)
- Weights for trajectory cost evaluation.
* - max_num_threads
- `int`
- `1`
- Maximum number of threads used when running the controller. Must be between `1` and `1e2`.
```
```{note}
All of the above parameters can be configured when using the DWA algorithm, either directly in your Python Recipe or via a config file (as shown in the usage example)
```
## Usage Example
DWA can be activated by setting the `algorithm` property in the [Controller](../../navigation/control.md) configuration.
```{code-block} python
:caption: dwa.py
import numpy as np
from kompass.components import Controller, ControllerConfig
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig
)
from kompass.control import ControllersID
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([0.3, 0.3, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set DWA algorithm using the config class
controller_config = ControllerConfig(algorithm="DWA")
# Set YAML config file
config_file = "my_config.yaml"
controller = Controller(component_name="my_controller",
config=controller_config,
config_file=config_file)
# algorithm can also be set using a property
controller.algorithm = ControllersID.DWA # or "DWA"
```
```{code-block} yaml
:caption: my_config.yaml
my_controller:
# Component config parameters
loop_rate: 10.0
control_time_step: 0.1
prediction_horizon: 4.0
ctrl_publish_type: 'Array'
# Algorithm parameters under the algorithm name
DWA:
control_horizon: 0.6
octree_resolution: 0.1
max_linear_samples: 20
max_angular_samples: 20
max_num_threads: 3
costs_weights:
goal_distance_weight: 1.0
reference_path_distance_weight: 1.5
obstacles_distance_weight: 2.0
smoothness_weight: 1.0
jerk_weight: 0.0
```
## Trajectory Samples Generation
Trajectory samples are generated by a constant-velocity generator: each velocity value within the reachable range is rolled out forward in time, up to the configured maximum number of samples (see `max_linear_samples` and `max_angular_samples` in the [config parameters](#parameters-and-default-values)).
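A rough sketch of such a constant-velocity rollout (a hypothetical unicycle-model helper, not the accelerated Kompass implementation):

```python
import math

def rollout(v, omega, x, y, yaw, time_step, horizon):
    """Simulate one constant-velocity candidate forward in time
    using a simple unicycle motion model."""
    states = []
    for _ in range(round(horizon / time_step)):
        x += v * math.cos(yaw) * time_step
        y += v * math.sin(yaw) * time_step
        yaw += omega * time_step
        states.append((x, y, yaw))
    return states
```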
The shape of the sampled trajectories depends heavily on the robot's kinematic model:
::::{tab-set}
:::{tab-item} Ackermann
:sync: ackermann
**Car-Like Motion**
Note the limited curvature constraints typical of car-like steering.
:::
:::{tab-item} Differential
:sync: diff
**Tank/Diff Drive**
Includes rotation-in-place (if configured) and smooth arcs.
:::
:::{tab-item} Omni
:sync: omni
**Holonomic Motion**
Includes lateral (sideways) movement samples.
:::
::::
:::{admonition} Rotate-Then-Move
:class: note
To ensure natural movement for Differential and Omni robots, Kompass implements a **Rotate-Then-Move** policy. Simultaneous rotation and high-speed linear translation is restricted to prevent erratic behavior.
:::
## Best Trajectory Selection
A collision-free admissibility criterion is enforced within the trajectory samples generator, using [FCL](../../integrations/fcl.md) to check collisions between the simulated robot state and the reference sensor input.
Once admissible trajectories are sampled, the **Best Trajectory** is selected by minimizing a weighted cost function.
You can tune these weights (`costs_weights`) to change the robot's behavior (e.g., sticking closer to the path vs. prioritizing obstacle clearance).
:::{button-link} cost_eval.html
:color: primary
:ref-type: doc
:outline:
Learn more about Cost Evaluation →
:::
[^1]: [Dieter Fox, Wolfram Burgard and Sebastian Thrun. The Dynamic Window Approach to Collision Avoidance. IEEE Robotics & Automation Magazine (Volume: 4, Issue: 1, March 1997)](https://www.ri.cmu.edu/pub_files/pub1/fox_dieter_1997_1/fox_dieter_1997_1.pdf)
```
## File: advanced/algorithms/pure_pursuit.md
```markdown
# Pure Pursuit
**Geometric path tracking with reactive collision avoidance.**
Pure Pursuit is a fundamental path-tracking algorithm. It calculates the curvature required to move the robot from its current position to a specific "lookahead" point on the path, simulating how a human driver looks forward to steer a vehicle.
Kompass enhances the standard implementation (based on [Purdue SIGBOTS](https://wiki.purduesigbots.com/software/control-algorithms/basic-pure-pursuit)) by adding an integrated **Simple Search Collision Avoidance** layer. This allows the robot to deviate locally from the path to avoid unexpected obstacles without needing a full replan.
## How it Works
The controller executes a four-step cycle:
- {material-regular}`ads_click` 1. Find Target - **Locate Lookahead.**
Find the point on the path that is distance $L$ away from the robot. $L$ scales with speed ($L = k \cdot v$).
- {material-regular}`gesture` 2. Steering - **Compute Curvature.**
Calculate the arc required to reach that target point based on the robot's kinematic constraints.
- {material-regular}`verified_user` 3. Safety - **Collision Check.**
Project the robot's motion forward using the `prediction_horizon` to check for immediate collisions.
- {material-regular}`alt_route` 4. Avoidance - **Local Search.**
If the nominal arc is blocked, the controller searches through `max_search_candidates` to find a safe velocity offset that clears the obstacle while maintaining progress.
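The find-target and steering steps reduce to a small geometric computation. The helper below is a hypothetical sketch of the classic Pure Pursuit curvature formula ($\kappa = 2y / L^2$, with $y$ the lateral offset of the lookahead point in the robot frame), not the Kompass implementation:

```python
import math

def pure_pursuit_curvature(robot_x, robot_y, robot_yaw, target_x, target_y):
    """Curvature of the arc joining the robot to the lookahead point."""
    dx = target_x - robot_x
    dy = target_y - robot_y
    # Lateral offset of the lookahead point, expressed in the robot frame
    lateral = -math.sin(robot_yaw) * dx + math.cos(robot_yaw) * dy
    dist_sq = dx * dx + dy * dy
    return 2.0 * lateral / dist_sq  # kappa = 2*y / L^2
```

A target straight ahead yields zero curvature (drive straight); a target to the side yields a proportionally tighter arc.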
## Supported Sensory Inputs
To enable the collision avoidance layer, spatial data is required.
* {material-regular}`radar` LaserScan
* {material-regular}`grain` PointCloud
* {material-regular}`grid_on` OccupancyGrid
*(Note: The controller can run in "blind" tracking mode without these inputs, but collision avoidance will be disabled.)*
## Configuration Parameters
```{list-table}
:widths: 15 10 10 65
:header-rows: 1
* - Name
- Type
- Default
- Description
* - lookahead_gain_forward
- `float`
- `0.8`
- Factor to scale lookahead distance by current velocity ($L = k \cdot v$).
* - prediction_horizon
- `int`
- `10`
- Number of future steps used to check for potential collisions along the path.
* - path_search_step
- `float`
- `0.2`
- Offset step used to search for alternative velocity commands when the nominal path is blocked.
* - max_search_candidates
- `int`
- `10`
- Maximum number of search iterations to find a collision-free command.
```
## Usage Example
The Pure Pursuit controller can be used in the [Controller](../../navigation/control.md) component by setting the 'algorithm' property or the corresponding component config parameter. The Controller then configures Pure Pursuit with the default values of all the configuration parameters above. Specific parameter values can be set using a config file or the algorithm's configuration class.
```{code-block} python
:caption: pure_pursuit.py
import numpy as np
from kompass.components import Controller, ControllerConfig
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig
)
from kompass.control import ControllersID, PurePursuitConfig
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.OMNI,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([0.3, 0.3, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Initialize the controller
controller = Controller(component_name="my_controller")
# Set the desired algorithm configuration
pure_pursuit_config = PurePursuitConfig(
lookahead_gain_forward=0.5, prediction_horizon=8, max_search_candidates=20
)
controller.algorithms_config = pure_pursuit_config
# NOTE: We can configure more than one algorithm to switch during runtime
# other_algorithm_config = ....
# controller.algorithms_config = [pure_pursuit_config, other_algorithm_config]
# Set the algorithm to Pure Pursuit
controller.algorithm = ControllersID.PURE_PURSUIT
```
## Performance & Results
The following tests demonstrate the controller's ability to track reference paths (**thin dark blue**) and avoid obstacles (**red x**).
### 1. Nominal Tracking
Performance on standard geometric paths (U-Turns and Circles) without interference.
::::{grid} 1 3 3 3
:gutter: 2
:::{grid-item-card} Ackermann
:class-card: sugar-card
**U-Turn**
:::
:::{grid-item-card} Differential
:class-card: sugar-card
**Circle**
:::
:::{grid-item-card} Omni
:class-card: sugar-card
**Circle**
:::
::::
### 2. Collision Avoidance
Scenarios where static obstacles are placed directly on the global path. The controller successfully identifies the blockage and deviates to find a safe path around it.
::::{grid} 1 3 3 3
:gutter: 2
:::{grid-item-card} Ackermann
:class-card: sugar-card
**Straight + Obstacles**
:::
:::{grid-item-card} Differential
:class-card: sugar-card
**U-Turn + Obstacles**
:::
:::{grid-item-card} Omni
:class-card: sugar-card
**Straight + Obstacles**
:::
::::
:::{admonition} Observations
:class: note
* **Convergence:** Smooth convergence to the reference path across all kinematic models.
* **Clearance:** The simple search algorithm successfully clears obstacle boundaries before returning to the path.
* **Stability:** No significant oscillation observed during avoidance maneuvers.
:::
```
## File: advanced/algorithms/stanley.md
```markdown
# Stanley Steering
**Front-wheel feedback control for path tracking.**
Stanley is a geometric path tracking method originally developed for the DARPA Grand Challenge.[^1] Unlike [Pure Pursuit](./pure_pursuit.md) (which looks ahead), Stanley uses the **Front Axle** as its reference point to calculate steering commands.
It computes a steering angle $\delta(t)$ based on two error terms:
1. **Heading Error** ($\psi_e$): Difference between the robot's heading and the path direction.
2. **Cross-Track Error** ($e$): Lateral distance from the front axle to the nearest path segment.
The control law combines these to minimize error exponentially:
$$
\delta(t) = \psi_e(t) + \arctan \left( \frac{k \cdot e(t)}{v(t)} \right)
$$
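As a minimal sketch of this control law (the gain mirrors the `cross_track_gain` parameter below; the low-speed clamp is an assumption to avoid the division blowing up near standstill):

```python
import math

def stanley_steering(heading_error, cross_track_error, speed,
                     cross_track_gain=1.5, min_speed=1e-4):
    """delta(t) = psi_e(t) + arctan(k * e(t) / v(t))."""
    v = max(speed, min_speed)  # clamp to keep the arctan term bounded
    return heading_error + math.atan2(cross_track_gain * cross_track_error, v)
```

With zero cross-track error, the command reduces to pure heading correction; as speed grows, the cross-track term's influence shrinks.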
## Key Features
- {material-regular}`directions_car` Ackermann Native - Designed specifically for car-like steering geometry. It is naturally stable at high speeds for these vehicles.
- {material-regular}`rotate_right` Multi-Model Support - Kompass extends Stanley to Differential and Omni robots by applying a **Rotate-Then-Move** strategy when orientation errors are large.
- {material-regular}`blind` Sensor-Less - Does not require LiDAR or depth data. It is a pure path follower.
## Configuration Parameters
```{list-table}
:widths: 10 10 10 70
:header-rows: 1
* - Name
- Type
- Default
- Description
* - heading_gain
- `float`
- `0.7`
- Heading gain in the control law. Must be between `0.0` and `1e2`.
* - cross_track_min_linear_vel
- `float`
- `0.05`
- Minimum linear velocity for cross-track control (m/s). Must be between `1e-4` and `1e2`.
* - min_angular_vel
- `float`
- `0.01`
- Minimum allowable angular velocity (rad/s). Must be between `0.0` and `1e9`.
* - cross_track_gain
- `float`
- `1.5`
- Gain for cross-track in the control law. Must be between `0.0` and `1e2`.
* - max_angle_error
- `float`
- `np.pi / 16`
- Maximum allowable angular error (rad). Must be between `1e-9` and `π`.
* - max_distance_error
- `float`
- `0.1`
- Maximum allowable distance error (m). Must be between `1e-9` and `1e9`.
```
## Usage Example
The Stanley algorithm can be used in the [Controller](../../navigation/control.md) component by setting the 'algorithm' property or the corresponding component config parameter. The Controller then configures Stanley with the default values of all the configuration parameters above. To set custom parameter values, pass a YAML config file to the component.
```{code-block} python
:caption: stanley.py
import numpy as np
from kompass.components import Controller, ControllerConfig
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig
)
from kompass.control import ControllersID
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([0.3, 0.3, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set Stanley algorithm using the config class
controller_config = ControllerConfig(algorithm="Stanley") # or ControllersID.STANLEY
# Set YAML config file
config_file = "my_config.yaml"
controller = Controller(component_name="my_controller",
config=controller_config,
config_file=config_file)
# algorithm can also be set using a property
controller.algorithm = ControllersID.STANLEY # or "Stanley"
```
```{code-block} yaml
:caption: my_config.yaml
my_controller:
# Component config parameters
loop_rate: 10.0
control_time_step: 0.1
ctrl_publish_type: 'Sequence'
# Algorithm parameters under the algorithm name
Stanley:
cross_track_gain: 1.0
heading_gain: 2.0
```
:::{admonition} Safety Note
:class: warning
Stanley does **not** have built-in obstacle avoidance. It is strongly recommended to use this controller in conjunction with the **[Drive Manager](drive_manager.md)** to provide Emergency Stop and Slowdown safety layers.
:::
[^1]: [Hoffmann, Gabriel M., Claire J. Tomlin, Michael Montemerlo, and Sebastian Thrun. "Autonomous Automobile Trajectory Tracking for Off-Road Driving: Controller Design, Experimental Validation and Racing." American Control Conference. 2007, pp. 2296–2301](https://ieeexplore.ieee.org/document/4282788)
```
## File: advanced/algorithms/dvz.md
```markdown
# DVZ
**Fast, reactive collision avoidance for dynamic environments.**
The DVZ (Deformable Virtual Zone) is a reactive control method introduced by R. Zapata in 1994.[^1] It models the robot's safety perimeter as a "virtual bubble" (zone) that deforms when obstacles intrude.
Unlike sampling methods (like DWA) that simulate future trajectories, DVZ calculates a reaction vector based directly on how the bubble is being "squished" by the environment. This makes it **extremely computationally efficient** and ideal for crowded, fast-changing environments where rapid reactivity is more important than global optimality.
[^1]: [Zapata, R., Lépinay, P., and Thompson, P. “Reactive behaviors of fast mobile robots”.
In: Journal of Robotic Systems 11.1 (1994)](https://www.researchgate.net/publication/221787033_Reactive_Motion_Planning_for_Mobile_Robots)
## How it Works
The algorithm continuously computes a deformation vector to steer the robot away from intrusion.
- {material-regular}`radio_button_unchecked` 1. Define Zone - **Create Bubble.**
Define a circular (or elliptical) protection zone around the robot with a nominal undeformed radius $R$.
- {material-regular}`radar` 2. Sense - **Measure Intrusion.**
Using LaserScan data, compute the *deformed radius* $d_h(\alpha)$ for every angle $\alpha \in [0, 2\pi]$ around the robot.
- {material-regular}`compress` 3. Compute Deformation - **Calculate Metrics.**
* **Intrusion Intensity ($I_D$):** How much total "stuff" is inside the zone.
$I_D = \frac{1}{2\pi} \int_{0}^{2\pi}\frac{R - d_h(\alpha)}{R} d\alpha$
* **Deformation Angle ($\Theta_D$):** The primary direction of the intrusion.
$\Theta_D = \frac{\int_{0}^{2\pi} (R - d_h(\alpha))\alpha d\alpha}{I_D}$
- {material-regular}`shortcut` 4. React - **Control Law.**
The final control command minimizes $I_D$ (pushing away from the deformation) while trying to maintain the robot's original heading towards the goal.
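The deformation metrics can be approximated numerically from a scan. The helper below is an illustrative discretization of the two integrals above (uniform beam spacing is assumed), not the Kompass implementation:

```python
import numpy as np

def dvz_deformation(ranges, angles, radius):
    """Approximate I_D and Theta_D with a Riemann sum over scan beams."""
    d_h = np.minimum(np.asarray(ranges, dtype=float), radius)  # deformed radius per beam
    d_alpha = 2.0 * np.pi / len(d_h)                           # uniform beam spacing
    # Intrusion intensity: I_D = (1 / 2*pi) * integral of (R - d_h) / R
    intrusion = np.sum((radius - d_h) / radius) * d_alpha / (2.0 * np.pi)
    if intrusion <= 0.0:
        return 0.0, 0.0  # undeformed zone: no reaction needed
    # Deformation angle: Theta_D = integral of (R - d_h) * alpha, divided by I_D
    theta = np.sum((radius - d_h) * np.asarray(angles, dtype=float)) * d_alpha / intrusion
    return intrusion, theta
```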
## Supported Sensory Inputs
DVZ relies on dense 2D range data to compute the deformation integral.
* {material-regular}`radar` LaserScan
## Configuration Parameters
DVZ balances two competing forces: **Path Following** (Geometric) vs. **Obstacle Repulsion** (Reactive).
```{list-table}
:widths: 10 10 10 70
:header-rows: 1
* - Name
- Type
- Default
- Description
* - min_front_margin
- `float`
- `1.0`
- Minimum front margin distance. Must be between `0.0` and `1e2`.
* - K_linear
- `float`
- `1.0`
- Proportional gain for linear control. Must be between `0.1` and `10.0`.
* - K_angular
- `float`
- `1.0`
- Proportional gain for angular control. Must be between `0.1` and `10.0`.
* - K_I
- `float`
- `5.0`
- Proportional deformation gain. Must be between `0.1` and `10.0`.
* - side_margin_width_ratio
- `float`
- `1.0`
- Width ratio between the deformation zone front and side (circle if 1.0). Must be between `1e-2` and `1e2`.
* - heading_gain
- `float`
- `0.7`
- Heading gain of the internal pure follower control law. Must be between `0.0` and `1e2`.
* - cross_track_gain
- `float`
- `1.5`
- Gain for cross-track error of the internal pure follower control law. Must be between `0.0` and `1e2`.
```
## Usage Example
The DVZ algorithm can be used in the [Controller](../../navigation/control.md) component by setting the 'algorithm' property or the corresponding component config parameter. The Controller then configures DVZ with the default values of all the configuration parameters above. To set custom parameter values, pass a YAML config file to the component.
```{code-block} python
:caption: dvz.py
import numpy as np
from kompass.components import Controller, ControllerConfig
from kompass.robot import (
AngularCtrlLimits,
LinearCtrlLimits,
RobotCtrlLimits,
RobotGeometry,
RobotType,
RobotConfig
)
from kompass.control import LocalPlannersID
# Setup your robot configuration
my_robot = RobotConfig(
model_type=RobotType.ACKERMANN,
geometry_type=RobotGeometry.Type.BOX,
geometry_params=np.array([0.3, 0.3, 0.3]),
ctrl_vx_limits=LinearCtrlLimits(max_vel=0.2, max_acc=1.5, max_decel=2.5),
ctrl_omega_limits=AngularCtrlLimits(
max_vel=0.4, max_acc=2.0, max_decel=2.0, max_steer=np.pi / 3)
)
# Set DVZ algorithm using the config class
controller_config = ControllerConfig(algorithm="DVZ") # or LocalPlannersID.DVZ
# Set YAML config file
config_file = "my_config.yaml"
controller = Controller(component_name="my_controller",
config=controller_config,
config_file=config_file)
# algorithm can also be set using a property
controller.algorithm = LocalPlannersID.DVZ # or "DVZ"
```
```{code-block} yaml
:caption: my_config.yaml
my_controller:
# Component config parameters
loop_rate: 10.0
control_time_step: 0.1
ctrl_publish_type: 'Sequence'
# Algorithm parameters under the algorithm name
DVZ:
cross_track_gain: 1.0
heading_gain: 2.0
K_angular: 1.0
K_linear: 1.0
min_front_margin: 1.0
side_margin_width_ratio: 1.0
```
```
## File: advanced/algorithms/cost_eval.md
```markdown
# Trajectory Cost Evaluation
**Scoring candidate paths for optimal selection.**
In sampling-based controllers like [DWA](dwa.md), dozens of candidate trajectories are generated at every time step. To choose the best one, Kompass uses a weighted sum of several cost functions.
The total cost $J$ for a given trajectory is calculated as:
$$
J = \sum (w_i \cdot C_i)
$$
Where $w_i$ is the configured weight and $C_i$ is the normalized cost value.
## Hardware Acceleration
To handle high-frequency control loops with large sample sets, Kompass leverages **SYCL** for massive parallelism.
::::{grid} 1 1 1 1
:gutter: 3
:::{grid-item-card} {material-regular}`speed;1.5em;sd-text-primary` SYCL Kernels
:link: ../benchmark
:link-type: doc
:class-card: sugar-card
**Vendor-Agnostic GPU Support**
Each cost function below is implemented as a specialized **SYCL kernel**. This allows the controller to evaluate thousands of trajectory points in parallel on **Nvidia, AMD, or Intel** GPUs, significantly reducing latency compared to CPU-only implementations.
See the performance gains in our Benchmarks →
:::
::::
## Built-in Cost Functions
Kompass includes optimized implementations for the following cost components:
| Cost Component | Description | Goal |
| :--- | :--- | :--- |
| **Reference Path** | Average distance between the candidate trajectory and the global reference path. | **Stay on track.** Keep the robot from drifting away from the global plan. |
| **Goal Destination** | Euclidean distance from the end of the trajectory to the final goal point. | **Make progress.** Favor trajectories that actually move the robot closer to the destination. |
| **Obstacle Distance** | Inverse of the minimum distance to the nearest obstacle (from LaserScan/PointCloud). | **Stay safe.** Heavily penalize trajectories that come too close to walls or objects. |
| **Smoothness**| Average change in velocity (acceleration) along the trajectory. | **Drive smoothly.** Prevent jerky velocity changes. |
| **Jerk** | Average change in acceleration along the trajectory. | **Protect hardware.** Minimize mechanical stress and wheel slip. |
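The smoothness and jerk components from the table can be understood as finite differences over the trajectory's velocity sequence. A conceptual sketch (the function names and signatures are illustrative, not the `kompass-core` kernel interfaces):

```python
# Conceptual finite-difference versions of the smoothness (avg. velocity
# change) and jerk (avg. acceleration change) costs; not the SYCL kernels.

def smoothness_cost(velocities: list[float], dt: float) -> float:
    """Average |acceleration| along the trajectory."""
    accels = [(v2 - v1) / dt for v1, v2 in zip(velocities, velocities[1:])]
    return sum(abs(a) for a in accels) / len(accels)

def jerk_cost(velocities: list[float], dt: float) -> float:
    """Average |jerk| (change in acceleration) along the trajectory."""
    accels = [(v2 - v1) / dt for v1, v2 in zip(velocities, velocities[1:])]
    jerks = [(a2 - a1) / dt for a1, a2 in zip(accels, accels[1:])]
    return sum(abs(j) for j in jerks) / len(jerks)

# A constant-acceleration ramp: nonzero smoothness cost, jerk cost ~ 0
vels = [0.0, 0.1, 0.2, 0.3, 0.4]
smooth = smoothness_cost(vels, dt=0.1)  # constant accel of ~1.0 m/s^2
jerk = jerk_cost(vels, dt=0.1)          # ~0 (up to floating-point error)
```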
## Configuration Weights
You can tune the behavior of the robot by adjusting the weights ($w_i$) in your configuration.
```{list-table}
:widths: 10 10 10 70
:header-rows: 1
* - Name
- Type
- Default
- Description
* - reference_path_distance_weight
- `float`
- `3.0`
- Weight of the reference path cost. Must be between `0.0` and `1e3`.
* - goal_distance_weight
- `float`
- `3.0`
- Weight of the goal position cost. Must be between `0.0` and `1e3`.
* - obstacles_distance_weight
- `float`
- `1.0`
- Weight of the obstacles distance cost. Must be between `0.0` and `1e3`.
* - smoothness_weight
- `float`
- `0.0`
- Weight of the trajectory smoothness cost. Must be between `0.0` and `1e3`.
* - jerk_weight
- `float`
- `0.0`
- Weight of the trajectory jerk cost. Must be between `0.0` and `1e3`.
```
:::{tip}
Setting a weight to `0.0` completely disables that specific cost calculation kernel, saving computational resources.
:::
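For example, to track the global plan tightly while disabling the smoothness and jerk kernels, the weights can be placed under the active algorithm's section, following the YAML layout shown in `tutorials/configuration.md` (the component name `my_controller` and the use of `DWA` here are illustrative assumptions):

```yaml
my_controller:
  # Component config parameters
  loop_rate: 10.0
  # Cost weights go under the active algorithm's name
  DWA:
    reference_path_distance_weight: 5.0   # stay on the global plan
    goal_distance_weight: 3.0
    obstacles_distance_weight: 2.0
    smoothness_weight: 0.0                # 0.0 disables this kernel
    jerk_weight: 0.0                      # 0.0 disables this kernel
```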
```
## File: integrations/ompl.md
```markdown
# OMPL (Open Motion Planning Library)
**State-of-the-art sampling-based motion planning.**
[OMPL (Open Motion Planning Library)](https://github.com/ompl/ompl) is the industry standard for generic motion planning algorithms. Kompass integrates OMPL to solve complex planning problems in 2D and 3D spaces for various robot morphologies.
Kompass provides efficient Nanobind bindings in the `kompass_core` package, bridging your Python components with OMPL's C++ core.
- {material-regular}`architecture;1.5em;sd-text-primary` Geometric Planners - **Algorithm Suite**. Direct access to over 20 geometric planners (RRT*, PRM, EST, etc.), configurable in your Recipe or via a configuration file (YAML, TOML, JSON).
- {material-regular}`rotate_90_degrees_ccw;1.5em;sd-text-primary` SE2 State Space - **2D Navigation**. Provides an [`SE2State`](https://ompl.kavrakilab.org/classompl_1_1base_1_1SE2StateSpace.html) consisting of a 2D position and a rotation in the plane: `SE(2): (x, y, yaw)`
- {material-regular}`check_box;1.5em;sd-text-primary` Validity Checker - **Collision Checking**. A built-in [`StateValidityChecker`](https://ompl.kavrakilab.org/classompl_1_1base_1_1StateValidityChecker.html) implements a collision checker (using [FCL](fcl.md)) to ensure the planner finds only collision-free paths.
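Conceptually, the validity checker answers one question for the planner: "is this sampled state collision-free?" A minimal sketch over a 2D occupancy grid (purely illustrative — Kompass's checker wraps FCL through `kompass_core`, and `is_state_valid` is a hypothetical name, not the binding API):

```python
# Illustrative SE(2) position validity check over a 2D occupancy grid.
# Hypothetical helper, not the kompass_core/OMPL binding interface.

def is_state_valid(x: float, y: float, grid: list[list[int]],
                   resolution: float) -> bool:
    """Return True if the position (x, y) falls on a free map cell."""
    col = int(x / resolution)
    row = int(y / resolution)
    if row < 0 or row >= len(grid) or col < 0 or col >= len(grid[0]):
        return False  # outside the map -> invalid
    return grid[row][col] == 0  # 0 = free, 1 = occupied

grid = [
    [0, 0, 1],
    [0, 1, 1],
    [0, 0, 0],
]
print(is_state_valid(0.05, 0.05, grid, resolution=0.1))  # True (free cell)
print(is_state_valid(0.25, 0.15, grid, resolution=0.1))  # False (occupied)
```

During planning, OMPL calls a checker like this (internally backed by FCL geometry queries) on every sampled state and along every candidate edge, so only collision-free segments enter the search tree.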
## Configuration
Configure OMPL behavior via your configuration file.
```yaml
ompl:
# Verbosity
log_level: 'WARN'
# Timeouts
planning_timeout: 10.0 # (s) Max time to search for a solution
simplification_timeout: 0.01 # (s) Max time to optimize the path
# Constraints
goal_tolerance: 0.01 # (m) Goal reached threshold
# Optimization
optimization_objective: 'PathLengthOptimizationObjective'
# Active Algorithm
planner_id: 'ompl.geometric.KPIECE1'
```
## Planner Benchmarks
Planning performance was evaluated in simulation on a **Turtlebot3 Waffle** in Gazebo.
* **Repetitions:** 20 per planner
* **Timeout:** 2.0 seconds
* **Metric:** Average values over all repetitions
:::{tip}
To run this benchmark yourself, refer to the [kompass_navigation tests](https://github.com/automatika-robotics/kompass-navigation/tree/dev/tests).
:::
```{list-table}
:header-rows: 1
:widths: 15 10 15 15 15 20
* - Method
- Solved
- Time (s)
- Length (m)
- Simplify (s)
- ROS 2 Pub (s)
* - **ABITstar**
- ✅
- 1.071
- 2.948
- 0.0075
- 0.00056
* - **BFMT**
- ✅
- 0.1128
- 3.487
- 0.0066
- 0.00017
* - **BITstar**
- ✅
- 1.0732
- 2.962
- 0.0061
- 0.00015
* - **BKPIECE1**
- ✅
- 0.0703
- 4.469
- 0.0178
- 0.00019
* - **BiEST**
- ✅
- 0.0619
- 4.418
- 0.0108
- 0.00014
* - **EST**
- ✅
- 0.0640
- 4.059
- 0.0107
- 0.00013
* - **FMT**
- ✅
- 0.1330
- 3.628
- 0.0063
- 0.00016
* - **InformedRRT***
- ✅
- 1.0679
- 2.962
- 0.0046
- 0.00013
* - **KPIECE1**
- ✅
- 0.0676
- 5.439
- 0.0148
- 0.00017
* - **LBKPIECE1**
- ✅
- 0.0754
- 5.174
- 0.0200
- 0.00017
* - **LBTRRT**
- ✅
- 1.0696
- 3.221
- 0.0050
- 0.00012
* - **LazyLBTRRT**
- ✅
- 1.0672
- 3.305
- 0.0053
- 0.00013
* - **LazyPRM**
- ❌
- 1.0811
- 0.0
- 0.0
- 0.0
* - **LazyPRMstar**
- ✅
- 1.0701
- 3.030
- 0.0063
- 0.00012
* - **LazyRRT**
- ✅
- 0.0982
- 4.520
- 0.0160
- 0.00013
* - **PDST**
- ✅
- 0.0678
- 3.836
- 0.0090
- 0.00013
* - **PRM**
- ✅
- 1.0672
- 3.306
- 0.0068
- 0.00012
* - **PRMstar**
- ✅
- 1.0740
- 3.720
- 0.0085
- 0.00012
* - **ProjEST**
- ✅
- 0.0683
- 4.190
- 0.0082
- 0.00013
* - **RRT**
- ✅
- 0.0906
- 4.860
- 0.0190
- 0.00014
* - **RRTConnect**
- ✅
- 0.0754
- 4.780
- 0.0140
- 0.00014
* - **RRTXstatic**
- ✅
- 1.0712
- 3.030
- 0.0041
- 0.00012
* - **RRTsharp**
- ✅
- 1.0680
- 3.010
- 0.0052
- 0.00013
* - **RRTstar**
- ✅
- 1.0672
- 2.960
- 0.0042
- 0.00013
* - **SBL**
- ✅
- 0.0800
- 4.039
- 0.0121
- 0.00015
* - **SST**
- ✅
- 1.0681
- 2.630
- 0.0012
- 0.00018
* - **STRIDE**
- ✅
- 0.0679
- 4.120
- 0.0098
- 0.00015
* - **TRRT**
- ✅
- 0.0798
- 4.110
- 0.0109
- 0.00015
```
## Supported Planners & Parameters
Click on a planner to view its default configuration parameters.
:::{dropdown} ABITstar
* `delay_rewiring_to_first_solution`: False
* `drop_unconnected_samples_on_prune`: False
* `find_approximate_solutions`: False
* `inflation_scaling_parameter`: 10.0
* `initial_inflation_factor`: 1000000.0
* `prune_threshold_as_fractional_cost_change`: 0.05
* `rewire_factor`: 1.1
* `samples_per_batch`: 100
* `stop_on_each_solution_improvement`: False
* `truncation_scaling_parameter`: 5.0
* `use_graph_pruning`: True
* `use_just_in_time_sampling`: False
* `use_k_nearest`: True
* `use_strict_queue_ordering`: True
:::
:::{dropdown} AITstar
* `find_approximate_solutions`: True
* `rewire_factor`: 1.0
* `samples_per_batch`: 100
* `use_graph_pruning`: True
* `use_k_nearest`: True
:::
:::{dropdown} BFMT
* `balanced`: False
* `cache_cc`: True
* `extended_fmt`: True
* `heuristics`: True
* `nearest_k`: True
* `num_samples`: 1000
* `optimality`: True
* `radius_multiplier`: 1.0
:::
:::{dropdown} BITstar
* `delay_rewiring_to_first_solution`: False
* `drop_unconnected_samples_on_prune`: False
* `find_approximate_solutions`: False
* `prune_threshold_as_fractional_cost_change`: 0.05
* `rewire_factor`: 1.1
* `samples_per_batch`: 100
* `stop_on_each_solution_improvement`: False
* `use_graph_pruning`: True
* `use_just_in_time_sampling`: False
* `use_k_nearest`: True
* `use_strict_queue_ordering`: True
:::
:::{dropdown} BKPIECE1
* `border_fraction`: 0.9
* `range`: 0.0
:::
:::{dropdown} BiEST
* `range`: 0.0
:::
:::{dropdown} EST
* `goal_bias`: 0.5
* `range`: 0.0
:::
:::{dropdown} FMT
* `cache_cc`: True
* `extended_fmt`: True
* `heuristics`: False
* `num_samples`: 1000
* `radius_multiplier`: 1.1
* `use_k_nearest`: True
:::
:::{dropdown} InformedRRTstar
* `delay_collision_checking`: True
* `goal_bias`: 0.05
* `number_sampling_attempts`: 100
* `ordered_sampling`: False
* `ordering_batch_size`: 1
* `prune_threshold`: 0.05
* `range`: 0.0
* `rewire_factor`: 1.1
* `use_k_nearest`: True
:::
:::{dropdown} KPIECE1
* `border_fraction`: 0.9
* `goal_bias`: 0.05
* `range`: 0.0
:::
:::{dropdown} LBKPIECE1
* `border_fraction`: 0.9
* `range`: 0.0
:::
:::{dropdown} LBTRRT
* `epsilon`: 0.4
* `goal_bias`: 0.05
* `range`: 0.0
:::
:::{dropdown} LazyLBTRRT
* `epsilon`: 0.4
* `goal_bias`: 0.05
* `range`: 0.0
:::
:::{dropdown} LazyPRM
* `max_nearest_neighbors`: 8
* `range`: 0.0
:::
:::{dropdown} LazyPRMstar
*(No specific default parameters listed)*
:::
:::{dropdown} LazyRRT
* `goal_bias`: 0.05
* `range`: 0.0
:::
:::{dropdown} PDST
* `goal_bias`: 0.05
:::
:::{dropdown} PRM
* `max_nearest_neighbors`: 8
:::
:::{dropdown} PRMstar
*(No specific default parameters listed)*
:::
:::{dropdown} ProjEST
* `goal_bias`: 0.05
* `range`: 0.0
:::
:::{dropdown} RRT
* `goal_bias`: 0.05
* `intermediate_states`: False
* `range`: 0.0
:::
:::{dropdown} RRTConnect
* `intermediate_states`: False
* `range`: 0.0
:::
:::{dropdown} RRTXstatic
* `epsilon`: 0.0
* `goal_bias`: 0.05
* `informed_sampling`: False
* `number_sampling_attempts`: 100
* `range`: 0.0
* `rejection_variant`: 0
* `rejection_variant_alpha`: 1.0
* `rewire_factor`: 1.1
* `sample_rejection`: False
* `update_children`: True
* `use_k_nearest`: True
:::
:::{dropdown} RRTsharp
* `goal_bias`: 0.05
* `informed_sampling`: False
* `number_sampling_attempts`: 100
* `range`: 0.0
* `rejection_variant`: 0
* `rejection_variant_alpha`: 1.0
* `rewire_factor`: 1.1
* `sample_rejection`: False
* `update_children`: True
* `use_k_nearest`: True
:::
:::{dropdown} RRTstar
* `delay_collision_checking`: True
* `focus_search`: False
* `goal_bias`: 0.05
* `informed_sampling`: True
* `new_state_rejection`: False
* `number_sampling_attempts`: 100
* `ordered_sampling`: False
* `ordering_batch_size`: 1
* `prune_threshold`: 0.05
* `pruned_measure`: False
* `range`: 0.0
* `rewire_factor`: 1.1
* `sample_rejection`: False
* `tree_pruning`: False
* `use_admissible_heuristic`: True
* `use_k_nearest`: True
:::
:::{dropdown} SBL
* `range`: 0.0
:::
:::{dropdown} SST
* `goal_bias`: 0.05
* `pruning_radius`: 3.0
* `range`: 5.0
* `selection_radius`: 5.0
:::
:::{dropdown} STRIDE
* `degree`: 16
* `estimated_dimension`: 3.0
* `goal_bias`: 0.05
* `max_degree`: 18
* `max_pts_per_leaf`: 6
* `min_degree`: 12
* `min_valid_path_fraction`: 0.2
* `range`: 0.0
* `use_projected_distance`: False
:::
:::{dropdown} TRRT
* `goal_bias`: 0.05
* `range`: 0.0
* `temp_change_factor`: 0.1
:::
```
## File: integrations/fcl.md
```markdown
# FCL (Flexible Collision Library)
**High-speed geometric collision detection.**
[FCL](https://github.com/flexible-collision-library/fcl) is a generic library for performing proximity and collision queries on geometric models.
Kompass leverages FCL to perform precise collision checks between the robot's geometric model and both static (map) and dynamic (sensor-detected) obstacles during path planning and control.
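To illustrate the kind of proximity query FCL answers, here is a pure-Python sketch of a circular-footprint check against point obstacles (e.g. LaserScan hits). This is conceptual only — Kompass calls FCL's C++ queries through `kompass_core`, and the function names below are hypothetical:

```python
import math

# Sketch of a proximity/collision query in the spirit of FCL's distance and
# collision checks; the robot footprint is approximated by a circle and
# obstacles by points. Purely illustrative, not the FCL API.

def min_obstacle_distance(robot_xy, radius, obstacle_points):
    """Smallest distance from the robot's boundary to any obstacle point.
    A negative value means the point lies inside the footprint (collision)."""
    rx, ry = robot_xy
    return min(math.hypot(ox - rx, oy - ry) - radius
               for ox, oy in obstacle_points)

def in_collision(robot_xy, radius, obstacle_points) -> bool:
    return min_obstacle_distance(robot_xy, radius, obstacle_points) <= 0.0

points = [(1.0, 0.0), (0.2, 0.1)]
print(in_collision((0.0, 0.0), radius=0.3, obstacle_points=points))  # True
print(in_collision((0.0, 0.0), radius=0.1, obstacle_points=points))  # False
```

FCL generalizes this idea to arbitrary meshes and primitive shapes with far more efficient broad-phase and narrow-phase algorithms, which is why Kompass delegates the real checks to it.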
```