armarx_localization_and_mapping Overview

Foreword

The goal of this project is to equip a mobile robot with self-localization capabilities, enabling it to determine its position within a known environment.

The core process involves:

  • Generating a detailed map of the environment.
  • Registering the map within a global reference (i.e., world) frame.
  • Allowing the robot to localize itself within the map.

Terms and Concepts

Chain of Transformations

Since we follow a map-based approach, we define a chain of transformations (frames). This design is inspired by the Robot Operating System (ROS):

  • world → map: This transformation represents the relationship between the global world frame and the map frame. The map frame is typically static and aligned with the environment's reference points, such as walls or landmarks.
  • map → odom: This transformation bridges the map frame and the odometry frame. It accounts for drift in the odometry system and ensures that the robot's position in the map remains consistent over time.
  • odom → root: This transformation represents the robot's position and orientation relative to the odometry frame. It is continuously updated based on the robot's motion as measured by its odometry sensors, e.g., wheel encoders. While odometry provides valuable motion data, it is subject to drift over time due to sensor inaccuracies and modeling errors.

These transformations form a hierarchical chain, allowing the robot to localize itself accurately within the environment while accounting for sensor drift and maintaining a consistent global reference.
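The chaining of these frames can be sketched with 2D homogeneous transforms. This is a minimal illustration with hypothetical values, not code from this package: composing world → map, map → odom, and odom → root yields the robot's global pose world → root.

```python
import numpy as np

def se2(x: float, y: float, theta: float) -> np.ndarray:
    """Homogeneous 2D transform (SE(2)) from a translation and a rotation."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Hypothetical example transforms along the chain.
T_world_map = se2(2.0, 0.0, 0.0)        # map origin in the world frame
T_map_odom  = se2(0.1, -0.05, 0.02)     # drift correction from localization
T_odom_root = se2(1.5, 0.5, np.pi / 2)  # odometry estimate of the robot

# Chaining world -> map -> odom -> root gives the robot's global pose.
T_world_root = T_world_map @ T_map_odom @ T_odom_root
x, y = T_world_root[0, 2], T_world_root[1, 2]
theta = np.arctan2(T_world_root[1, 0], T_world_root[0, 0])
```

Note that matrix multiplication reads right to left: the robot pose is first expressed in the odometry frame, then corrected into the map frame, and finally placed in the world frame.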

In our case, the transformations are determined as follows:

  • world → map: Once the map is recorded, it is manually aligned with the world frame to establish this transformation.
  • map → odom: This transformation is dynamically updated by the algorithms in this package. Initially, the transformation map → root is computed to determine the robot's position within the map. Since the transformation odom → root is provided by the robot, the map → odom transformation is adjusted to correct localization errors and maintain consistency.
  • odom → root: This transformation is directly provided by the robot's proprioceptive sensors, such as its real-time unit, which tracks its motion and orientation through wheel encoders and integrates velocity measurements into position estimates.

This hierarchical approach ensures accurate localization by continuously refining the robot's position within the map while accounting for sensor drift and environmental changes.
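The correction step described above can be written as a single frame equation: given the localizer's estimate map → root and the robot's odometry odom → root, the drift correction is map → odom = (map → root) · (odom → root)⁻¹. A small sketch with hypothetical poses:

```python
import numpy as np

def se2(x: float, y: float, theta: float) -> np.ndarray:
    """Homogeneous 2D transform (SE(2)) from a translation and a rotation."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

# Hypothetical inputs: the localizer estimates the robot's pose in the map,
# while the robot reports its pose in the (drifting) odometry frame.
T_map_root  = se2(4.0, 1.0, 0.3)   # e.g. from laser-based localization
T_odom_root = se2(3.8, 1.1, 0.28)  # from the robot's real-time unit

# map -> odom = (map -> root) * (odom -> root)^-1
T_map_odom = T_map_root @ np.linalg.inv(T_odom_root)

# Sanity check: applying the correction to the odometry estimate
# recovers the robot's pose in the map.
assert np.allclose(T_map_odom @ T_odom_root, T_map_root)
```

Because odom → root continues to update at a high rate between localization updates, publishing the slowly changing map → odom correction keeps the chain consistent without requiring the localizer to run at odometry frequency.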

Additionally, each transformation can be updated at a different rate depending on the sensor or system providing the data. For instance, the map → root transformation is typically updated at the frequency of the laser scanner sensors, which ranges from approximately 10 Hz to 50 Hz. In contrast, the odom → root transformation is updated at a much higher frequency, around 1000 Hz, as it is provided by the robot's real-time unit. (However, this information is only available in the robot state memory at a lower rate.)

Integration into Memory System

The aforementioned transformations are available in the robot state memory within the Localization core segment.

From the robot state memory, we obtain the odometry estimates. Also, we provide the world → map transformation and continuously update the map → root transformation.

Structure of this Package

On the code level, this package contains the following libraries and components:

Libraries

  • The self_localization library provides a base class for different self-localization strategies. It also simplifies communication with the memory system.

Components

  • The component fake_localizer simulates localization data for testing purposes. It eliminates the need for actual sensor input or a physical environment. The robot's location can be configured through the component's properties, making it ideal for debugging and development scenarios.
  • The component cartographer_mapping_and_localization is responsible for simultaneous localization and mapping (SLAM) using the Cartographer library. This component allows the robot to dynamically create a map of its environment while simultaneously determining its position within that map. It supports both mapping and localization modes, making it versatile for various operational scenarios.