Obstacle Avoidance Control with LiDAR Sensor
This document explains how to implement obstacle avoidance control using a LiDAR sensor in Python in a ROS2 Humble environment. The node publishes geometry_msgs/Twist type messages to the /kachaka/manual_control/cmd_vel topic and subscribes to sensor_msgs/LaserScan type messages from the /kachaka/lidar/scan topic.
Learning Objectives
By the end of this lesson, you will be able to:
- Understand how LiDAR sensors work and how to process their data
- Implement obstacle detection using sensor data
- Create reactive control algorithms for obstacle avoidance
- Design sector-based obstacle detection systems
Program Overview
This program demonstrates obstacle avoidance by:
- Using a LiDAR sensor to detect obstacles around the robot
- Controlling the robot’s movement based on detected obstacle information
- Detecting obstacles by dividing the sensor data into three sectors: front, right, and left
- Implementing basic movements (straight, rotate, stop) to avoid obstacles
1. What is a LiDAR Sensor
LiDAR (Light Detection and Ranging) is a sensor that measures the distance to objects using laser light. It can scan a 360-degree range and obtain the distance to obstacles in each direction.
Features of LiDAR Sensors
- High Precision: Can measure distances in millimeter units
- Wide Range: Can scan a 360-degree range (*depends on the sensor)
- High Speed: Can obtain distance data in real-time
- Robustness: Works in various lighting conditions (unlike cameras)
- Range: Typically measures distances from a few centimeters to tens of meters
Important Considerations:
- LiDAR data can be affected by reflective surfaces
- Some materials may not reflect laser light well
- The sensor has minimum and maximum range limits
- Data processing is required to interpret the raw measurements
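To make that last point concrete, here is a minimal sketch of how raw LaserScan measurements can be interpreted: each element of ranges corresponds to an angle derived from angle_min and angle_increment, and readings outside the sensor's limits are discarded. The function name and the exact filtering rules are my own choices for illustration, not part of the lesson's node.
import math

def scan_to_points(scan):
    """Convert a sensor_msgs/LaserScan into a list of (angle, distance) pairs.

    Readings that are zero, non-finite, or outside [range_min, range_max]
    are treated as invalid and skipped (an assumed convention for this sketch).
    """
    points = []
    angle = scan.angle_min
    for r in scan.ranges:
        if math.isfinite(r) and r > 0.0 and scan.range_min <= r <= scan.range_max:
            points.append((angle, r))
        # Each successive reading is one angular step further around the scan
        angle += scan.angle_increment
    return points
The node implemented later in this lesson follows the same idea, but additionally groups the (angle, distance) pairs into sectors.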
Application to Obstacle Avoidance
Using LiDAR sensors, the following obstacle avoidance functions can be realized:
- Detection and avoidance of obstacles ahead
- Detection and avoidance of surrounding obstacles
- Planning and execution of safe paths
2. Prerequisites
- ROS2 Humble is installed
- Python3 is available
- ROS2 workspace (e.g., ~/ros2_ws) is set up
Required Packages
- rclpy: ROS2 Python client library
- geometry_msgs: Package for handling geometric messages such as position and velocity
- sensor_msgs: Package for handling sensor data
- numpy: Library for numerical calculations
3. Creating a Package
First, create a package in the src directory of the ROS2 workspace.
cd ~/ros2_ws/src
ros2 pkg create --build-type ament_python kachaka_lidar_control --dependencies rclpy geometry_msgs sensor_msgs numpy --node-name kachaka_lidar_control
This command generates a package with the following structure:
kachaka_lidar_control/
├── resource/
│   └── kachaka_lidar_control
├── kachaka_lidar_control/
│   ├── __init__.py
│   └── kachaka_lidar_control.py
├── test/
├── package.xml
└── setup.py
4. Node Implementation
Create a kachaka_lidar_control.py file with the following content in the kachaka_lidar_control directory within the kachaka_lidar_control package.
#!/usr/bin/env python3
import rclpy
from rclpy.node import Node
import numpy as np
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan
# Import for QoS settings
from rclpy.qos import QoSProfile, QoSReliabilityPolicy, QoSHistoryPolicy
def normalize_angle(angle):
    """
    Normalize an angle to the range [-π, π).
    This function ensures angles are always in the standard range, which is
    important when working with LiDAR data where angles can wrap around.
    For example, an angle of 3π/2 becomes -π/2.
    Parameters:
        angle: Input angle in radians (can be any value)
    Returns:
        Normalized angle in the range [-π, π)
    """
    # The formula: (angle + π) % (2π) - π
    # This shifts the angle by π, takes modulo 2π, then shifts back
    # This ensures the result is always in [-π, π)
    return (angle + np.pi) % (2.0 * np.pi) - np.pi
class KachakaLidarControl(Node):
    def __init__(self):
        super().__init__('kachaka_lidar_control')
        # Create QoS profile for LaserScan subscription
        qos_profile = QoSProfile(
            history=QoSHistoryPolicy.KEEP_LAST,
            depth=10,
            reliability=QoSReliabilityPolicy.BEST_EFFORT
        )
        # Publisher for velocity commands
        self.cmd_vel_pub = self.create_publisher(Twist, '/kachaka/manual_control/cmd_vel', 10)
        # Subscriber for LiDAR scan data
        self.scan_sub = self.create_subscription(LaserScan, '/kachaka/lidar/scan', self.callback_scan, qos_profile)
        # Initialize minimum distance variables for each sector
        self.front_min = None
        self.right_min = None
        self.left_min = None
        # Create a timer to call process function at 10Hz (every 0.1 seconds)
        # The timer ensures we regularly check for obstacles and update robot movement
        # 10Hz is a good balance between responsiveness and computational load
        self.timer = self.create_timer(0.1, self.process)
    def callback_scan(self, data):
        """
        - Front sector: -15° to +15° ( -0.26 rad to +0.26 rad )
        - Right sector: -180° to -15° ( -3.14 rad to -0.26 rad )
        - Left sector: +15° to +180° ( +0.26 rad to +3.14 rad )
        """
        # Sector boundaries in the shifted coordinate system
        front_min_angle = np.deg2rad(-15)  # -15°
        front_max_angle = np.deg2rad(15)   # +15°
        right_min_angle = -np.pi           # -180°
        right_max_angle = np.deg2rad(-15)  # -15°
        left_min_angle = np.deg2rad(15)    # +15°
        left_max_angle = np.pi             # +180°
        # Initialize minimum distances with sensor's maximum range
        front_min = data.range_max
        right_min = data.range_max
        left_min = data.range_max
        # Iterate through each distance reading and calculate shifted angle
        angle = data.angle_min
        for r in data.ranges:
            # Skip invalid readings
            if r == 0.0 or r < data.range_min:
                angle += data.angle_increment
                continue
            # Shift angle by +π/2 so that "front" is 0 in the code
            # LiDAR sensors typically have 0° at the back or side, so we rotate
            # the coordinate system to make the front of the robot 0°
            shifted_angle = angle + np.pi / 2
            # Normalize to ensure the angle is in [-π, π)
            shifted_angle = normalize_angle(shifted_angle)
            # Check which sector the shifted angle belongs to
            if front_min_angle <= shifted_angle <= front_max_angle:
                # Front sector
                if r < front_min:
                    front_min = r
            elif right_min_angle <= shifted_angle < right_max_angle:
                # Right sector
                if r < right_min:
                    right_min = r
            elif left_min_angle < shifted_angle <= left_max_angle:
                # Left sector
                if r < left_min:
                    left_min = r
            angle += data.angle_increment
        # Save results for use in process()
        self.front_min = front_min
        self.right_min = right_min
        self.left_min = left_min
        # Debug log
        self.get_logger().info(
            f"Front: {front_min:.2f} m, Right: {right_min:.2f} m, Left: {left_min:.2f} m"
        )
    def process(self):
        """
        Determine robot movement based on obstacle detection.
        This method is called periodically by the timer (10Hz). It implements
        a simple reactive obstacle avoidance algorithm:
        Movement logic:
        - If front sector is clear (> threshold), move forward.
        - Otherwise, if right sector is blocked, rotate left.
        - Otherwise, if left sector is blocked, rotate right.
        - Otherwise, stop (all directions blocked or no data).
        This is a simple "bug algorithm" - the robot tries to go forward,
        and if blocked, turns away from obstacles.
        """
        vel = Twist()
        # Distance threshold in meters
        # If an obstacle is closer than this, we consider it a blockage
        threshold = 1.0  # meters
        # Check if front is clear
        if self.front_min is not None and self.front_min > threshold:
            # Front is clear - move forward
            vel.linear.x = 0.2   # Forward speed
            vel.angular.z = 0.0  # No rotation
        else:
            # Front is blocked or we don't have valid data
            # Try to turn away from obstacles
            if self.right_min is not None and self.right_min < threshold:
                # Obstacle detected on right side => rotate left to avoid it
                vel.linear.x = 0.0   # Stop forward motion
                vel.angular.z = 0.5  # Rotate left (positive = counterclockwise)
            elif self.left_min is not None and self.left_min < threshold:
                # Obstacle detected on left side => rotate right to avoid it
                vel.linear.x = 0.0    # Stop forward motion
                vel.angular.z = -0.5  # Rotate right (negative = clockwise)
            else:
                # No clear direction or all sectors blocked => stop for safety
                vel.linear.x = 0.0
                vel.angular.z = 0.0
        # Publish the velocity command
        self.cmd_vel_pub.publish(vel)
def main(args=None):
    rclpy.init(args=args)
    node = KachakaLidarControl()
    try:
        rclpy.spin(node)
    except KeyboardInterrupt:
        pass
    node.destroy_node()
    rclpy.shutdown()
if __name__ == '__main__':
    main()
5. Editing setup.py
If you did not use the --node-name option when creating the package, set the entry point in setup.py at the package root.
entry_points={
    'console_scripts': [
        'kachaka_lidar_control = kachaka_lidar_control.kachaka_lidar_control:main'
    ],
},
6. Building and Running the Package
Building the Package
cd ~/ros2_ws
colcon build --packages-select kachaka_lidar_control
Loading Environment Settings
source install/setup.bash
Running the Node
ros2 run kachaka_lidar_control kachaka_lidar_control
7. Detailed Code Explanation
Classes and Methods
- KachakaLidarControl: Main node class
- __init__: Node initialization, publisher and subscriber setup
- callback_scan: Callback function when receiving LiDAR scan data
- process: Function to determine robot movement based on obstacle information
Control Parameters
- threshold: Distance threshold to determine obstacles [m]
- front_min, right_min, left_min: Minimum distance in each sector [m]
- vel.linear.x: Linear velocity [m/s]
- vel.angular.z: Angular velocity [rad/s]
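These values are hard-coded in the node above. One way to tune them without editing the source is to declare them as ROS2 parameters, as in the sketch below. This is not part of the lesson's program; the class name TunableLidarControl and the parameter names (threshold, forward_speed, angular_speed) are my own choices.
from rclpy.node import Node

class TunableLidarControl(Node):
    """Sketch: the same controller idea, but with tunable ROS2 parameters."""

    def __init__(self):
        super().__init__('kachaka_lidar_control')
        # Declare the tunable values with the defaults used in this lesson
        self.declare_parameter('threshold', 1.0)      # obstacle distance threshold [m]
        self.declare_parameter('forward_speed', 0.2)  # linear velocity [m/s]
        self.declare_parameter('angular_speed', 0.5)  # angular velocity [rad/s]
        # ... publisher, subscriber, and timer would be set up as in the original node ...

    def control_values(self):
        # Read the current values each cycle so runtime changes take effect immediately
        return (
            self.get_parameter('threshold').value,
            self.get_parameter('forward_speed').value,
            self.get_parameter('angular_speed').value,
        )
With such a setup, the values could be adjusted while the node runs, for example with ros2 param set /kachaka_lidar_control threshold 0.8.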
Movement Sequence
The program executes the following movement sequence:
- If there is no obstacle ahead (front_min > threshold): Move straight
- If there is an obstacle ahead and an obstacle on the right: Rotate left
- If there is an obstacle ahead and an obstacle on the left: Rotate right
- Otherwise: Stop
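To check this sequence without driving the robot, the decision logic can be factored into a small pure function and exercised with plain asserts. The sketch below mirrors the threshold and speeds used in process(); the function name decide_velocity and the (linear_x, angular_z) return convention are my own.
def decide_velocity(front_min, right_min, left_min, threshold=1.0):
    """Return (linear_x, angular_z) following the movement sequence above.

    Any distance may be None, meaning "no valid data yet".
    """
    if front_min is not None and front_min > threshold:
        return 0.2, 0.0    # front clear: go straight
    if right_min is not None and right_min < threshold:
        return 0.0, 0.5    # blocked ahead, obstacle on the right: rotate left
    if left_min is not None and left_min < threshold:
        return 0.0, -0.5   # blocked ahead, obstacle on the left: rotate right
    return 0.0, 0.0        # no safe option (or no data): stop

# Example checks of the sequence
assert decide_velocity(2.0, 2.0, 2.0) == (0.2, 0.0)   # open space: straight
assert decide_velocity(0.5, 0.4, 2.0) == (0.0, 0.5)   # blocked ahead and right: turn left
assert decide_velocity(0.5, 2.0, 0.4) == (0.0, -0.5)  # blocked ahead and left: turn right
assert decide_velocity(0.5, 2.0, 2.0) == (0.0, 0.0)   # blocked ahead only: stop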
Implementation Points for the LiDAR Sensor
Sector Division:
- Detect obstacles by dividing into three sectors: front, right, and left
- Sector boundaries are defined by angles
Invalid Data Processing:
- Treat data with distance 0 or less than minimum range as invalid
- This reduces the impact of noise and outliers
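Because numpy is already a dependency, both points above (sector division and invalid-data filtering) can also be expressed with boolean masks instead of the per-ray loop. The sketch below is only an alternative, not what the node above uses; the function name sector_minimums is mine, and the angle_offset default assumes the same +π/2 shift.
import numpy as np

def sector_minimums(scan, angle_offset=np.pi / 2):
    """Return (front_min, right_min, left_min) from a sensor_msgs/LaserScan.

    Vectorized alternative to the loop in callback_scan; angle_offset rotates
    the scan so that the robot's front corresponds to 0 rad.
    """
    ranges = np.asarray(scan.ranges, dtype=float)
    angles = scan.angle_min + np.arange(len(ranges)) * scan.angle_increment
    # Shift and normalize angles to [-pi, pi)
    angles = (angles + angle_offset + np.pi) % (2.0 * np.pi) - np.pi
    # Mask out invalid readings (zero, below minimum range, or non-finite)
    valid = np.isfinite(ranges) & (ranges > 0.0) & (ranges >= scan.range_min)
    front = valid & (np.abs(angles) <= np.deg2rad(15))
    right = valid & (angles < -np.deg2rad(15))
    left = valid & (angles > np.deg2rad(15))

    def sector_min(mask):
        # Fall back to the sensor's maximum range when a sector has no valid readings
        return float(ranges[mask].min()) if np.any(mask) else scan.range_max

    return sector_min(front), sector_min(right), sector_min(left)
The mask-based version computes the same three minimums as the loop; it becomes especially convenient if you later want to divide the scan into more sectors.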
QoS Profile Settings:
- Set a QoS profile with BEST_EFFORT reliability for the LiDAR scan data subscriber
- This allows for data delay or loss and prioritizes real-time performance
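rclpy also provides a ready-made profile for sensor streams, qos_profile_sensor_data (best-effort reliability with a small keep-last history). If you prefer it to building the profile by hand, the subscription could be created as in this sketch; the helper function is my own and not part of the lesson's code.
from rclpy.qos import qos_profile_sensor_data
from sensor_msgs.msg import LaserScan

def create_scan_subscription(node, callback):
    """Subscribe to the LiDAR topic using rclpy's predefined sensor-data QoS."""
    return node.create_subscription(
        LaserScan,
        '/kachaka/lidar/scan',
        callback,
        qos_profile_sensor_data,  # best-effort, suited to high-rate sensor data
    )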
Exercises
Avoiding Multiple Obstacles
Let’s extend the control program so that the robot can move safely in an environment with multiple obstacles.
Task
Try implementing your favorite obstacle avoidance algorithm!
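As one possible starting point (a hint only, not the expected answer), the sector minimums could drive a proportional steering rule instead of fixed turn speeds, so the robot keeps moving while veering toward the side with more free space. The helper below is a sketch; the gains, limits, and function name are arbitrary assumptions.
def proportional_avoidance(front_min, right_min, left_min,
                           threshold=1.0, max_speed=0.2, gain=0.8):
    """Return (linear_x, angular_z): slow down near obstacles and steer
    toward the side with more free space.

    Sketch for the exercise; all constants are assumptions to tune.
    """
    if front_min is None or right_min is None or left_min is None:
        return 0.0, 0.0  # no data yet: stay still
    # Slow down as the front obstacle gets closer (stop below threshold/2)
    linear = max_speed * min(1.0, max(0.0, (front_min - 0.5 * threshold) / threshold))
    # Steer away from the nearer side: positive angular = turn left
    angular = gain * (left_min - right_min) / max(left_min, right_min, 1e-6)
    return linear, max(-0.5, min(0.5, angular))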