One of the key elements that enables a mobile robot to navigate autonomously is its ability to perceive the surrounding environment.
Among the most widespread and reliable sensors used in robotics is the 2D LiDAR (Light Detection and Ranging).
This device has revolutionized mobile robotics thanks to its accuracy and the ease with which it can be integrated into ROS 2 systems.
In this article, we will analyze how a LiDAR works, what information it provides to the robot, and how to add and simulate it in ROS 2 and Gazebo.
A 2D LiDAR operates by emitting laser beams and measuring the time it takes for the reflected light to return after hitting an object. From this measurement, the sensor computes the distance to obstacles in multiple directions, producing a polar map of the environment.
When repeated hundreds of times per second, these measurements create a two-dimensional point cloud representing the outlines of walls, furniture, and any detected object.
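The two steps involved, converting a round-trip time of flight into a distance, then converting per-beam distances into 2D points, can be sketched in a few lines of Python (an illustrative sketch, not driver code):

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_to_distance(t_seconds):
    """Round-trip time of flight -> one-way distance to the obstacle."""
    return C * t_seconds / 2.0

def polar_to_cartesian(ranges, angle_min, angle_increment):
    """Convert one scan (polar form) into 2D points in the sensor frame."""
    points = []
    for i, r in enumerate(ranges):
        a = angle_min + i * angle_increment
        points.append((r * math.cos(a), r * math.sin(a)))
    return points

# A beam whose echo returns after ~20 nanoseconds hit something ~3 m away.
print(round(tof_to_distance(20e-9), 2))  # 3.0
```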
LiDARs provide high precision, often down to a few centimeters, and are capable of scanning up to 360° depending on the model. However, they only capture data on a plane (2D) and can be less effective in outdoor environments with direct sunlight.
In ROS 2, LiDAR data is published on the /scan topic as messages of type sensor_msgs/LaserScan.
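The message carries, among other fields, the scan's angular limits, the angular increment between consecutive beams, the valid range interval, and the list of measured distances. A plain-Python stand-in (field names mirror sensor_msgs/msg/LaserScan, but this is not the real ROS 2 class) shows how the direction of each reading is recovered:

```python
from dataclasses import dataclass, field

@dataclass
class LaserScanLike:
    """Plain-Python stand-in mirroring the main sensor_msgs/LaserScan fields."""
    angle_min: float        # angle of the first beam [rad]
    angle_max: float        # angle of the last beam [rad]
    angle_increment: float  # angular distance between beams [rad]
    range_min: float        # shortest valid measurement [m]
    range_max: float        # longest valid measurement [m]
    ranges: list = field(default_factory=list)  # one distance per beam [m]

def beam_angle(scan, i):
    """Direction, in the sensor frame, of the i-th entry of scan.ranges."""
    return scan.angle_min + i * scan.angle_increment

# 720 beams swept across half a turn, values matching the plugin set up later.
scan = LaserScanLike(-1.5708, 1.5708, 3.1416 / 720, 0.12, 3.5, [1.0] * 720)
print(beam_angle(scan, 0))  # -1.5708, i.e. the first beam of the sweep
```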
To integrate a LiDAR sensor into the robot’s model, we need to extend the URDF (Unified Robot Description Format) by defining a new link that represents the sensor and a joint that attaches it to the robot’s base.
The link describes both the visual appearance of the LiDAR (using its mesh file) and its physical properties, such as collision geometry and inertial parameters.
The joint, in turn, establishes how this new link is positioned and connected to the main structure of the robot.
Below is the complete URDF snippet that defines the LiDAR link and its fixed joint connection to the base:
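A minimal version of such a snippet looks like this; the mesh filename and package path are illustrative assumptions, and the inertia values are deliberately simplified placeholders, so adapt them to your own description package:

```xml
<link name="lidar_link">
  <visual>
    <geometry>
      <!-- Mesh path is an assumption; point it at your own asset. -->
      <mesh filename="package://arduinobot_description/meshes/hokuyo.dae"/>
    </geometry>
  </visual>
  <collision>
    <!-- Simplified collision shape: 5 cm radius, 5 cm length cylinder. -->
    <geometry>
      <cylinder radius="0.05" length="0.05"/>
    </geometry>
  </collision>
  <inertial>
    <origin xyz="0 0 0" rpy="0 0 0"/>
    <mass value="0.1"/>
    <inertia ixx="1e-4" ixy="0" ixz="0" iyy="1e-4" iyz="0" izz="1e-4"/>
  </inertial>
</link>

<joint name="lidar_joint" type="fixed">
  <parent link="base_link"/>
  <child link="lidar_link"/>
  <!-- 20 cm above the base, no rotation. -->
  <origin xyz="0 0 0.2" rpy="0 0 0"/>
</joint>
```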
The <link> element defines a new link named lidar_link, representing the physical body of the sensor inside the robot model.
The <visual> block specifies the graphical appearance of the sensor, loading a 3D mesh of the Hokuyo LiDAR. This model is used in visualization tools such as RViz and Gazebo but does not affect the physics.
Here we define the simplified collision geometry: instead of the complex mesh, a cylinder with a radius and length of 5 cm is used. This reduces the computational load while still approximating the sensor's shape.
The <inertial> block assigns physical properties to the sensor. A mass of 0.1 kg is realistic for a small LiDAR. The origin is aligned with the center of the geometry, and the inertia matrix defines how the object responds to forces and rotations. Simplified values keep the simulation stable.
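As a sanity check on those values, the closed-form inertia of a solid cylinder (I_zz = m r² / 2 about its axis, I_xx = I_yy = m (3 r² + h²) / 12 across it) can be evaluated for the 0.1 kg mass and the 5 cm collision cylinder; this is a reference computation, not part of the original model:

```python
def cylinder_inertia(m, r, h):
    """Inertia of a solid cylinder about its center of mass (SI units)."""
    ixx = iyy = m * (3 * r**2 + h**2) / 12.0  # transverse axes
    izz = m * r**2 / 2.0                      # axis of symmetry
    return ixx, iyy, izz

ixx, iyy, izz = cylinder_inertia(m=0.1, r=0.05, h=0.05)
print(f"ixx=iyy={ixx:.2e}, izz={izz:.2e}")  # ixx=iyy=8.33e-05, izz=1.25e-04
```

Values of this magnitude are small enough that rounding them, as simplified URDF inertias often do, has no visible effect on the simulation.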
This section introduces a fixed joint connecting the LiDAR to the robot's base_link. The <origin> sets the position: the sensor is placed 20 cm above the base, with no rotation.
At this point, the LiDAR is fully integrated into the robot’s URDF model.
To replicate the behavior of the LiDAR in simulation and obtain data streams within ROS 2, we need to configure a dedicated Gazebo plugin. This plugin generates laser scan data based on the parameters we define, such as update rate, measurement range, angular resolution, and noise model.
The configuration below specifies the ROS topic on which the simulated LiDAR publishes, the reference frame of the sensor, and the physical characteristics of the laser beam, including horizontal scanning limits and the number of samples per scan. By tuning these parameters, we can achieve a realistic simulation that closely matches the performance of the actual hardware.
Here is the full plugin configuration, shown in the gazebo_ros_ray_sensor form used with Gazebo Classic (the element names are reconstructed around the values discussed below):

```xml
<gazebo reference="lidar_link">
  <sensor name="lidar" type="ray">
    <update_rate>30</update_rate>
    <ray>
      <scan>
        <horizontal>
          <samples>720</samples>
          <resolution>1</resolution>
          <min_angle>-1.5708</min_angle>
          <max_angle>1.5708</max_angle>
        </horizontal>
      </scan>
      <range>
        <min>0.12</min>
        <max>3.5</max>
        <resolution>0.01</resolution>
      </range>
      <noise>
        <type>gaussian</type>
        <mean>0.0</mean>
        <stddev>0.01</stddev>
      </noise>
    </ray>
    <plugin name="lidar_controller" filename="libgazebo_ros_ray_sensor.so">
      <ros>
        <namespace>/</namespace>
        <remapping>~/scan:=/scan</remapping>
      </ros>
      <output_type>sensor_msgs/LaserScan</output_type>
      <frame_name>lidar_link</frame_name>
    </plugin>
  </sensor>
</gazebo>
```
The <gazebo> element opens the section of Gazebo-specific tags inside the URDF/Xacro, while the <plugin> element loads the ROS-Gazebo plugin that simulates a LiDAR sensor.
The <ros> block, with namespace / and the remapping ~/scan:=/scan, configures the integration with ROS 2, publishing the scan on the /scan topic.
The update rate of 30 sets the sensor to 30 Hz, meaning thirty scans per second.
The topic name /scan specifies where the data is published, ensuring compatibility with standard navigation stacks.
Setting the frame to lidar_link associates the scans with the lidar_link frame defined in the URDF.
The range values 0.12, 3.5, and 0.01 define the sensor's span: a minimum of 12 cm, a maximum of 3.5 m, and a resolution of 1 cm.
The scan geometry is given by 720 samples with a resolution of 1, swept from -1.5708 to +1.5708 radians: 720 beams across 180°, for an angular resolution of ~0.25°.
A gaussian noise model with a mean of 0.0 and a standard deviation of 0.01 adds 1 cm of Gaussian noise to each reading, simulating real-world imperfections.
The closing tags complete the plugin definition. Once loaded in Gazebo, the sensor publishes sensor_msgs/LaserScan messages at 30 Hz.
After updating the URDF and plugin configuration, you can launch the simulation with:
ros2 launch arduinobot_gazebo simulated_robot.launch.py
Then, in a new terminal, check the data being published:
ros2 topic echo /scan
You should see a stream of range values corresponding to the distances measured by the LiDAR’s beams.
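A typical first use of this stream is picking out the nearest valid obstacle; beams that hit nothing report inf (or values outside the sensor's range interval). A small illustrative helper, fed with a made-up scan fragment:

```python
import math

def nearest_obstacle(ranges, range_min=0.12, range_max=3.5):
    """Return (index, distance) of the closest valid reading, or None."""
    best = None
    for i, r in enumerate(ranges):
        # Skip beams that hit nothing or fall outside the valid interval.
        if math.isinf(r) or math.isnan(r) or not (range_min <= r <= range_max):
            continue
        if best is None or r < best[1]:
            best = (i, r)
    return best

# Hypothetical fragment: the third beam sees something at 0.8 m.
print(nearest_obstacle([math.inf, 2.4, 0.8, float("nan"), 3.9]))  # (2, 0.8)
```

The beam's index, combined with angle_min and angle_increment from the same message, gives the direction of the detected obstacle.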
In this article, we explored how a 2D LiDAR works, how to integrate it into the robot’s URDF model, and how to simulate it in Gazebo to generate realistic scan data. This type of sensor is one of the most powerful and versatile tools for autonomous navigation, forming the foundation for mapping, localization, and obstacle avoidance.