Chapter 2.4: Sensor Simulation

Learning Objectives

By the end of this chapter, students will be able to:

  • Configure various sensor types in Gazebo simulation
  • Understand sensor noise modeling and realistic simulation
  • Integrate simulated sensors with ROS 2 topics
  • Optimize sensor performance and accuracy
  • Troubleshoot common sensor simulation issues

Introduction

Sensors are a robot's eyes and ears, providing the data needed for perception, navigation, and interaction with the environment. In simulation, modeling sensors accurately is crucial for developing and testing perception algorithms that will eventually run on real robots. For humanoid robots operating in human environments, realistic sensor simulation is particularly important: these robots must perceive and understand the same world that humans do.

In this chapter, we'll explore how to configure and use various sensor types in Gazebo simulation, focusing on creating realistic sensor models that closely match their real-world counterparts. We'll cover cameras, LIDAR, IMUs, force/torque sensors, and other sensor types commonly used in humanoid robotics.

Camera Simulation

Basic Camera Configuration

Cameras are essential for visual perception in humanoid robots:

<sensor name="head_camera" type="camera">
<pose>0.1 0 0 0 0 0</pose>
<camera name="head_camera">
<horizontal_fov>1.047</horizontal_fov> <!-- 60 degrees -->
<image>
<width>640</width>
<height>480</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.1</near>
<far>100</far>
</clip>
</camera>
<always_on>true</always_on>
<update_rate>30</update_rate>
<visualize>true</visualize>
</sensor>

Advanced Camera Features

For more realistic camera simulation, add noise and distortion models:

<sensor name="head_camera" type="camera">
<pose>0.1 0 0 0 0 0</pose>
<camera name="head_camera">
<horizontal_fov>1.047</horizontal_fov>
<image>
<width>640</width>
<height>480</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.1</near>
<far>100</far>
</clip>

<!-- Add noise to simulate real camera sensor -->
<noise>
<type>gaussian</type>
<mean>0.0</mean>
<stddev>0.007</stddev>
</noise>

<!-- Add distortion to simulate lens effects -->
<distortion>
<k1>-0.172933</k1>
<k2>0.205327</k2>
<k3>-0.043047</k3>
<p1>-0.001798</p1>
<p2>-0.000959</p2>
<center>0.5 0.5</center>
</distortion>
</camera>
<always_on>true</always_on>
<update_rate>30</update_rate>
<visualize>true</visualize>
</sensor>
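
To build intuition for the stddev value: Gazebo applies camera noise to normalized [0, 1] pixel intensities, so 0.007 corresponds to roughly 1.8 gray levels on an 8-bit image. A quick NumPy sketch, independent of Gazebo, just to illustrate the magnitude:

# Rough magnitude check for the <stddev>0.007</stddev> setting above:
# Gazebo's camera noise is applied to normalized [0, 1] intensities.
import numpy as np

rng = np.random.default_rng(seed=42)
clean = np.full((480, 640), 128.0) / 255.0                  # mid-gray frame
noisy = np.clip(clean + rng.normal(0.0, 0.007, clean.shape), 0.0, 1.0)

print("noise stddev in 8-bit levels:", 0.007 * 255)         # ~1.8 levels
print("measured stddev in levels:", (noisy - clean).std() * 255)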

Depth Camera Configuration

Depth cameras provide 3D perception capabilities:

<sensor name="depth_camera" type="depth">
<pose>0.1 0 0.05 0 0 0</pose>
<camera name="depth_camera">
<horizontal_fov>1.047</horizontal_fov>
<image>
<width>640</width>
<height>480</height>
<format>R8G8B8</format>
</image>
<clip>
<near>0.1</near>
<far>10</far>
</clip>
</camera>
<always_on>true</always_on>
<update_rate>30</update_rate>
<visualize>true</visualize>
</sensor>

Stereo Camera Setup

Stereo cameras provide depth perception through triangulation:

<!-- Left camera -->
<sensor name="stereo_left" type="camera">
  <pose>0.05 0.06 0 0 0 0</pose>
  <camera name="stereo_left">
    <horizontal_fov>1.047</horizontal_fov>
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far>
    </clip>
  </camera>
  <always_on>true</always_on>
  <update_rate>30</update_rate>
  <visualize>true</visualize>
</sensor>

<!-- Right camera -->
<sensor name="stereo_right" type="camera">
  <pose>0.05 -0.06 0 0 0 0</pose>
  <camera name="stereo_right">
    <horizontal_fov>1.047</horizontal_fov>
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far>
    </clip>
  </camera>
  <always_on>true</always_on>
  <update_rate>30</update_rate>
  <visualize>true</visualize>
</sensor>
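
With the poses above, the two optical centers sit 0.12 m apart (y = +0.06 and y = -0.06); that separation is the stereo baseline. Under an idealized pinhole model, depth follows from disparity as Z = f·B/d, where f is the focal length in pixels implied by the horizontal field of view. A back-of-the-envelope sketch using only the numbers from the SDF above:

import math

width_px = 640
hfov_rad = 1.047                 # from <horizontal_fov>
baseline_m = 0.12                # from the two <pose> y offsets

# Pinhole focal length in pixels: fx = (w/2) / tan(hfov/2) ~ 554 px
fx = (width_px / 2.0) / math.tan(hfov_rad / 2.0)

def depth_from_disparity(disparity_px: float) -> float:
    """Depth in meters for a given left/right pixel disparity."""
    return fx * baseline_m / disparity_px

print(depth_from_disparity(66.5))   # ~1.0 m
print(depth_from_disparity(6.65))   # ~10 m: 10x less disparity, 10x the depth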

LIDAR Simulation

2D LIDAR Configuration

2D LIDAR is commonly used for navigation and obstacle detection:

<sensor name="laser_2d" type="ray">
<pose>0.1 0 0.1 0 0 0</pose>
<ray>
<scan>
<horizontal>
<samples>360</samples>
<resolution>1.0</resolution>
<min_angle>-3.14159</min_angle> <!-- -π -->
<max_angle>3.14159</max_angle> <!-- π -->
</horizontal>
</scan>
<range>
<min>0.1</min>
<max>10.0</max>
<resolution>0.01</resolution>
</range>
</ray>
<always_on>true</always_on>
<update_rate>10</update_rate>
<visualize>true</visualize>
</sensor>

3D LIDAR Configuration

3D LIDAR provides full 3D environment mapping:

<sensor name="laser_3d" type="ray">
<pose>0.1 0 0.2 0 0 0</pose>
<ray>
<scan>
<horizontal>
<samples>640</samples>
<resolution>1</resolution>
<min_angle>-3.14159</min_angle>
<max_angle>3.14159</max_angle>
</horizontal>
<vertical>
<samples>64</samples>
<resolution>1</resolution>
<min_angle>-0.2618</min_angle> <!-- -15 degrees -->
<max_angle>0.2618</max_angle> <!-- 15 degrees -->
</vertical>
</scan>
<range>
<min>0.1</min>
<max>20.0</max>
<resolution>0.01</resolution>
</range>
</ray>
<always_on>true</always_on>
<update_rate>10</update_rate>
<visualize>true</visualize>
</sensor>

Adding Noise to LIDAR

Real LIDAR sensors have measurement noise:

<sensor name="laser_2d" type="ray">
<pose>0.1 0 0.1 0 0 0</pose>
<ray>
<scan>
<horizontal>
<samples>360</samples>
<resolution>1.0</resolution>
<min_angle>-3.14159</min_angle>
<max_angle>3.14159</max_angle>
</horizontal>
</scan>
<range>
<min>0.1</min>
<max>10.0</max>
<resolution>0.01</resolution>
</range>
<noise>
<type>gaussian</type>
<mean>0.0</mean>
<stddev>0.01</stddev> <!-- 1cm standard deviation -->
</noise>
</ray>
<always_on>true</always_on>
<update_rate>10</update_rate>
<visualize>true</visualize>
</sensor>

IMU Simulation

Basic IMU Configuration

IMUs provide orientation and acceleration data:

<sensor name="imu_sensor" type="imu">
<pose>0 0 0 0 0 0</pose>
<always_on>true</always_on>
<update_rate>100</update_rate>
<imu>
<angular_velocity>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
<bias_mean>0.001</bias_mean>
<bias_stddev>0.0001</bias_stddev>
</noise>
</z>
</angular_velocity>
<linear_acceleration>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
<bias_mean>0.01</bias_mean>
<bias_stddev>0.001</bias_stddev>
</noise>
</z>
</linear_acceleration>
</imu>
</sensor>
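
The bias terms matter more than their small magnitudes suggest: a gyro bias integrates into steady orientation drift, which is exactly what attitude estimators must correct. A minimal NumPy sketch using the figures above (simple Euler integration, robot held still):

import numpy as np

rng = np.random.default_rng(0)
rate_hz, seconds = 100, 60
n = rate_hz * seconds

bias = 0.001                           # rad/s, from <bias_mean>
white = rng.normal(0.0, 2e-3, n)       # rad/s, from <stddev>
measured = 0.0 + bias + white          # true angular velocity is zero

angle = np.cumsum(measured) / rate_hz  # integrated heading estimate
print(f"heading drift after {seconds} s: {np.degrees(angle[-1]):.2f} deg")  # ~3.4 deg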

IMU Placement for Humanoid Balance

For humanoid robots, IMU placement is critical for balance:

<!-- Place the IMU at the center of mass for the best balance information -->
<joint name="imu_joint" type="fixed">
  <parent link="torso"/>
  <child link="imu_link"/>
  <origin xyz="0 0 0" rpy="0 0 0"/>
</joint>

<link name="imu_link">
  <inertial>
    <mass value="0.001"/>
    <inertia ixx="0.0001" ixy="0" ixz="0" iyy="0.0001" iyz="0" izz="0.0001"/>
  </inertial>
</link>

<gazebo reference="imu_link">
<sensor name="imu_sensor" type="imu">
<always_on>true</always_on>
<update_rate>200</update_rate> <!-- Higher rate for balance control -->
<imu>
<!-- Include orientation data -->
<orientation>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1e-3</stddev>
</noise>
</orientation>
<!-- Angular velocity and linear acceleration as before -->
<angular_velocity>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</z>
</angular_velocity>
<linear_acceleration>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</z>
</linear_acceleration>
</imu>
</sensor>
</gazebo>

Force/Torque Sensor Simulation

Joint Force/Torque Sensors

Force/torque sensors are essential for manipulation tasks. In Gazebo they attach to joints, so the reference below must name a joint (here the left wrist joint), not a link:

<gazebo reference="left_wrist">
<sensor name="left_wrist_ft" type="force_torque">
<always_on>true</always_on>
<update_rate>100</update_rate>
<force_torque>
<frame>child</frame> <!-- Measurement frame -->
<measure_direction>child_to_parent</measure_direction>
</force_torque>
</sensor>
</gazebo>

Custom Force/Torque Sensor

For fast manipulation and contact tasks, a higher update rate and a sensor-frame measurement can be configured:

<gazebo reference="end_effector">
<sensor name="end_effector_force" type="force_torque">
<always_on>true</always_on>
<update_rate>500</update_rate> <!-- High rate for manipulation -->
<force_torque>
<frame>sensor</frame>
<measure_direction>child_to_parent</measure_direction>
</force_torque>
</sensor>
</gazebo>
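
To consume these readings from ROS 2, a gazebo_ros force/torque plugin can publish them as geometry_msgs/WrenchStamped. A minimal contact-detection sketch, assuming such a plugin publishes on /left_wrist_ft (the topic name is illustrative; check ros2 topic list in your setup):

import math
import rclpy
from rclpy.node import Node
from geometry_msgs.msg import WrenchStamped

class ContactMonitor(Node):
    def __init__(self):
        super().__init__('contact_monitor')
        # Topic name is an assumption; it depends on the plugin configuration
        self.create_subscription(WrenchStamped, '/left_wrist_ft', self.on_wrench, 10)

    def on_wrench(self, msg: WrenchStamped):
        f = msg.wrench.force
        magnitude = math.sqrt(f.x**2 + f.y**2 + f.z**2)
        if magnitude > 5.0:  # newtons; tune the threshold to the task
            self.get_logger().info(f'contact: {magnitude:.1f} N')

rclpy.init()
rclpy.spin(ContactMonitor())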

GPS Simulation

GPS Sensor Configuration

GPS is useful for outdoor humanoid applications:

<sensor name="gps_sensor" type="gps">
<pose>0 0 1.0 0 0 0</pose> <!-- Position on top of robot -->
<always_on>true</always_on>
<update_rate>1</update_rate>
<gps>
<position_sensing>
<horizontal>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2.0</stddev> <!-- 2m standard deviation -->
</noise>
</horizontal>
<vertical>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>4.0</stddev> <!-- 4m standard deviation -->
</noise>
</vertical>
</position_sensing>
</gps>
</sensor>

Sensor Integration with ROS 2

Camera Sensor Integration

Cameras publish to ROS 2 topics automatically once the gazebo_ros camera plugin is attached:

<!-- Camera with ROS 2 interface -->
<sensor name="rgb_camera" type="camera">
  <pose>0.1 0 0 0 0 0</pose>
  <camera name="head_camera">
    <horizontal_fov>1.047</horizontal_fov>
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>100</far>
    </clip>
  </camera>
  <always_on>true</always_on>
  <update_rate>30</update_rate>
  <visualize>true</visualize>
  <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
    <!-- In ROS 2 the topics are derived from the camera name
         (head_camera/image_raw, head_camera/camera_info);
         remap via a <ros> block if needed -->
    <frame_name>head_camera_frame</frame_name>
    <hack_baseline>0.07</hack_baseline>
  </plugin>
</sensor>
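
Once the plugin loads, ros2 topic list and ros2 topic hz are the quickest checks, or a small subscriber can report what is actually arriving. A minimal rclpy sketch (the topic name follows from the camera name above and may differ in your setup):

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

class ImageCheck(Node):
    def __init__(self):
        super().__init__('image_check')
        self.create_subscription(Image, '/head_camera/image_raw', self.on_image, 10)

    def on_image(self, msg: Image):
        self.get_logger().info(
            f'{msg.width}x{msg.height} {msg.encoding} frame_id={msg.header.frame_id}')

rclpy.init()
rclpy.spin(ImageCheck())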

LIDAR Integration

LIDAR sensors also publish to ROS 2 topics:

<sensor name="laser_scan" type="ray">
<pose>0.1 0 0.1 0 0 0</pose>
<ray>
<scan>
<horizontal>
<samples>360</samples>
<resolution>1.0</resolution>
<min_angle>-3.14159</min_angle>
<max_angle>3.14159</max_angle>
</horizontal>
</scan>
<range>
<min>0.1</min>
<max>10.0</max>
<resolution>0.01</resolution>
</range>
</ray>
<always_on>true</always_on>
<update_rate>10</update_rate>
<visualize>true</visualize>
<plugin name="laser_controller" filename="libgazebo_ros_ray.so">
<ros>
<namespace>/humanoid_robot</namespace>
<remapping>~/out:=scan</remapping>
</ros>
<output_type>sensor_msgs/LaserScan</output_type>
<frame_name>laser_frame</frame_name>
</plugin>
</sensor>
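
A quick sanity check is a subscriber that reports the nearest valid return. A minimal rclpy sketch against the /humanoid_robot/scan topic configured above:

import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import LaserScan

class NearestObstacle(Node):
    def __init__(self):
        super().__init__('nearest_obstacle')
        self.create_subscription(LaserScan, '/humanoid_robot/scan', self.on_scan, 10)

    def on_scan(self, msg: LaserScan):
        # Keep only returns inside the sensor's valid range window
        valid = [(r, i) for i, r in enumerate(msg.ranges)
                 if msg.range_min <= r <= msg.range_max]
        if valid:
            r, i = min(valid)
            angle = msg.angle_min + i * msg.angle_increment
            self.get_logger().info(f'nearest: {r:.2f} m at {math.degrees(angle):.0f} deg')

rclpy.init()
rclpy.spin(NearestObstacle())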

IMU Integration

IMU sensors publish orientation, angular velocity, and linear acceleration:

<gazebo reference="imu_link">
<sensor name="imu_sensor" type="imu">
<always_on>true</always_on>
<update_rate>100</update_rate>
<imu>
<angular_velocity>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>2e-3</stddev>
</noise>
</z>
</angular_velocity>
<linear_acceleration>
<x>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</x>
<y>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</y>
<z>
<noise type="gaussian">
<mean>0.0</mean>
<stddev>1.7e-2</stddev>
</noise>
</z>
</linear_acceleration>
</imu>
<plugin name="imu_plugin" filename="libgazebo_ros_imu.so">
<ros>
<namespace>/humanoid_robot</namespace>
<remapping>~/out:=imu</remapping>
</ros>
<frame_name>imu_link</frame_name>
<body_name>imu_link</body_name>
<update_rate>100</update_rate>
<gaussian_noise>0.01</gaussian_noise>
</plugin>
</sensor>
</gazebo>
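
For balance work it helps to confirm the data is physically plausible: when the robot is near-static, roll and pitch can be recovered from the gravity direction in the accelerometer. A minimal rclpy sketch against the /humanoid_robot/imu topic configured above:

import math
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Imu

class TiltEstimator(Node):
    def __init__(self):
        super().__init__('tilt_estimator')
        self.create_subscription(Imu, '/humanoid_robot/imu', self.on_imu, 50)

    def on_imu(self, msg: Imu):
        # Gravity-based tilt; only valid when external accelerations are small
        a = msg.linear_acceleration
        roll = math.atan2(a.y, a.z)
        pitch = math.atan2(-a.x, math.sqrt(a.y**2 + a.z**2))
        self.get_logger().info(
            f'roll={math.degrees(roll):.1f} deg pitch={math.degrees(pitch):.1f} deg')

rclpy.init()
rclpy.spin(TiltEstimator())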

Sensor Performance Optimization

Reducing Computational Load

For better simulation performance, optimize sensor settings:

<!-- Lower resolution for faster simulation -->
<sensor name="low_res_camera" type="camera">
  <pose>0.1 0 0 0 0 0</pose>
  <camera name="low_res_camera">
    <horizontal_fov>1.047</horizontal_fov>
    <image>
      <width>320</width> <!-- Lower resolution -->
      <height>240</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far> <!-- Shorter range -->
    </clip>
  </camera>
  <always_on>true</always_on>
  <update_rate>15</update_rate> <!-- Lower update rate -->
  <visualize>false</visualize> <!-- Disable visualization for headless sims -->
</sensor>
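
The savings compound: raw RGB bandwidth scales as width × height × 3 bytes × rate, so the settings above cut the data volume by a factor of eight. Quick arithmetic:

# Raw RGB data volume before and after the optimization above
full = 640 * 480 * 3 * 30      # ~27.6 MB/s
reduced = 320 * 240 * 3 * 15   # ~3.5 MB/s
print(f'{full / 1e6:.1f} MB/s -> {reduced / 1e6:.1f} MB/s ({full // reduced}x less)')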

Multi-threading Sensor Processing

Gazebo Classic already generates sensor data in a dedicated sensor manager thread, separate from the physics update, so there is no extra plugin to enable. The practical levers are therefore the ones shown above: lower update rates, lower resolutions, and visualization turned off. For headless batch runs, launch only the server (gzserver) and skip the GUI client entirely.

Sensor Fusion and Coordination

Synchronizing Multiple Sensors

For effective perception, sensors need to be properly coordinated:

<!-- Example: Synchronized sensor configuration -->
<sdf version="1.7">
  <model name="humanoid_with_sensors">
    <!-- ... robot definition; in a complete model each sensor
         is attached to a link ... -->

    <!-- Camera with specific timing -->
    <sensor name="camera" type="camera">
      <update_rate>30</update_rate>
      <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
        <!-- Topics derive from the camera name: camera/image_raw, camera/camera_info -->
        <frame_name>camera_frame</frame_name>
      </plugin>
    </sensor>

    <!-- IMU with higher frequency for fusion -->
    <sensor name="imu" type="imu">
      <update_rate>200</update_rate>
      <plugin name="imu_controller" filename="libgazebo_ros_imu_sensor.so">
        <ros>
          <remapping>~/out:=imu/data</remapping>
        </ros>
        <frame_name>imu_frame</frame_name>
      </plugin>
    </sensor>

    <!-- LIDAR with appropriate frequency -->
    <sensor name="lidar" type="ray">
      <update_rate>10</update_rate>
      <plugin name="lidar_controller" filename="libgazebo_ros_ray_sensor.so">
        <ros>
          <remapping>~/out:=scan</remapping>
        </ros>
        <output_type>sensor_msgs/LaserScan</output_type>
        <frame_name>lidar_frame</frame_name>
      </plugin>
    </sensor>
  </model>
</sdf>
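
Because the sensors run at different rates, fusion nodes typically pair messages by timestamp rather than assuming lockstep arrival. A minimal sketch using message_filters (topic names mirror the configuration above):

import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, Imu
from message_filters import Subscriber, ApproximateTimeSynchronizer

class FusionNode(Node):
    def __init__(self):
        super().__init__('fusion_node')
        image_sub = Subscriber(self, Image, 'camera/image_raw')
        imu_sub = Subscriber(self, Imu, 'imu/data')
        # queue_size=10, slop=0.05 s: accept pairs stamped within 50 ms
        self.sync = ApproximateTimeSynchronizer([image_sub, imu_sub], 10, 0.05)
        self.sync.registerCallback(self.on_pair)

    def on_pair(self, image: Image, imu: Imu):
        dt = abs((image.header.stamp.sec + image.header.stamp.nanosec * 1e-9)
                 - (imu.header.stamp.sec + imu.header.stamp.nanosec * 1e-9))
        self.get_logger().info(f'paired camera frame and IMU sample, dt={dt * 1000:.1f} ms')

rclpy.init()
rclpy.spin(FusionNode())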

Troubleshooting Sensor Simulation

Common Issues and Solutions

Sensor Not Publishing Data

<!-- Solution: Ensure proper plugin configuration -->
<sensor name="camera" type="camera">
  <!-- ... camera definition ... -->
  <plugin name="camera_controller" filename="libgazebo_ros_camera.so">
    <!-- Make sure the namespace is correct -->
    <ros>
      <namespace>/my_robot</namespace>
    </ros>
    <frame_name>camera_frame</frame_name>
  </plugin>
</sensor>

High CPU Usage from Sensors

<!-- Solution: Reduce update rates and resolution -->
<sensor name="camera" type="camera">
  <update_rate>10</update_rate> <!-- Lower update rate -->
  <camera name="camera">
    <image>
      <width>320</width> <!-- Lower resolution -->
      <height>240</height>
    </image>
  </camera>
  <visualize>false</visualize> <!-- Disable visualization -->
</sensor>

Sensor Noise Too High/Low

<!-- Solution: Adjust noise parameters to match real sensors -->
<sensor name="depth_camera" type="depth">
  <camera name="depth_camera">
    <noise>
      <type>gaussian</type>
      <mean>0.0</mean>
      <stddev>0.01</stddev> <!-- Adjust to match the real sensor -->
    </noise>
  </camera>
</sensor>

Best Practices for Sensor Simulation

1. Match Real Sensor Characteristics

Configure simulated sensors to match real hardware:

<!-- Example: approximating an Intel RealSense D435 depth camera -->
<sensor name="realsense_camera" type="depth">
  <camera name="realsense_camera">
    <horizontal_fov>1.518</horizontal_fov> <!-- ~87 degrees, the D435 depth FOV -->
    <image>
      <width>640</width>
      <height>480</height>
      <format>R8G8B8</format>
    </image>
    <clip>
      <near>0.1</near>
      <far>10</far>
    </clip>
    <!-- Noise parameters approximating the real sensor -->
    <noise>
      <type>gaussian</type>
      <mean>0.0</mean>
      <stddev>0.007</stddev>
    </noise>
  </camera>
</sensor>

2. Proper Sensor Placement

Place sensors where they would be on the real robot:

<!-- Camera in the head for humanoid-like vision -->
<joint name="head_camera_joint" type="fixed">
  <parent link="head"/>
  <child link="head_camera_frame"/>
  <origin xyz="0.05 0 0.05" rpy="0 0 0"/> <!-- Forward and slightly up -->
</joint>

<link name="head_camera_frame">
  <inertial>
    <mass value="0.001"/>
    <inertia ixx="0.0001" ixy="0" ixz="0" iyy="0.0001" iyz="0" izz="0.0001"/>
  </inertial>
</link>

<gazebo reference="head_camera_frame">
  <sensor name="head_camera" type="camera">
    <!-- Camera configuration -->
  </sensor>
</gazebo>

3. Realistic Noise Modeling

Include appropriate noise models for realistic simulation:

<!-- IMU with realistic noise parameters -->
<sensor name="imu_sensor" type="imu">
  <always_on>true</always_on>
  <update_rate>100</update_rate>
  <imu>
    <angular_velocity>
      <x>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>0.02</stddev> <!-- 0.02 rad/s = ~1.1 deg/s -->
          <bias_mean>0.01</bias_mean> <!-- 0.01 rad/s = ~0.6 deg/s bias -->
          <bias_stddev>0.005</bias_stddev>
        </noise>
      </x>
      <!-- Similar for y and z axes -->
    </angular_velocity>
    <linear_acceleration>
      <x>
        <noise type="gaussian">
          <mean>0.0</mean>
          <stddev>0.017</stddev> <!-- m/s²; SDF accelerometer noise is in m/s², not g -->
          <bias_mean>0.05</bias_mean> <!-- m/s² turn-on bias -->
          <bias_stddev>0.01</bias_stddev>
        </noise>
      </x>
      <!-- Similar for y and z axes -->
    </linear_acceleration>
  </imu>
</sensor>

Hands-On Exercise: Sensor Integration

Objective

Integrate multiple sensors into a humanoid robot model and verify proper data publication.

Prerequisites

  • Completed previous chapters
  • Working humanoid robot model
  • Gazebo simulation environment

Steps

  1. Add a camera, IMU, and 2D LIDAR to your humanoid robot
  2. Configure realistic noise models for each sensor
  3. Set up ROS 2 plugins to publish sensor data
  4. Launch the simulation and verify that sensor topics are publishing
  5. Use ROS 2 tools to inspect the sensor data quality (a rate-checking sketch follows this list)
  6. Optimize sensor parameters for performance while maintaining quality
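
For step 5, a small node that measures per-topic publish rates is often handier than juggling several ros2 topic hz sessions. A minimal sketch (the topic names are the ones configured earlier in this chapter and may differ in your setup):

import time
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image, Imu, LaserScan

TOPICS = [(Image, '/head_camera/image_raw'),
          (Imu, '/humanoid_robot/imu'),
          (LaserScan, '/humanoid_robot/scan')]

class RateCheck(Node):
    def __init__(self):
        super().__init__('rate_check')
        self.counts = {topic: 0 for _, topic in TOPICS}
        self.start = time.time()
        for msg_type, topic in TOPICS:
            # Bind the topic name as a default arg so each lambda counts its own topic
            self.create_subscription(msg_type, topic,
                                     lambda _msg, t=topic: self.tick(t), 10)
        self.create_timer(5.0, self.report)

    def tick(self, topic):
        self.counts[topic] += 1

    def report(self):
        elapsed = time.time() - self.start
        for topic, n in self.counts.items():
            self.get_logger().info(f'{topic}: {n / elapsed:.1f} Hz')

rclpy.init()
rclpy.spin(RateCheck())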

Expected Result

Students will have a humanoid robot with properly configured sensors publishing realistic data to ROS 2 topics.

Assessment Questions

Multiple Choice

Q1: What is the primary purpose of adding noise models to simulated sensors?

  • a) To make the simulation run faster
  • b) To make the simulation more realistic by matching real sensor characteristics
  • c) To reduce the accuracy of the simulation
  • d) To increase the number of sensor topics

Answer: b
Explanation: Adding noise models to simulated sensors makes the simulation more realistic by matching the characteristics of real sensors, which have inherent noise and inaccuracies.

Short Answer

Q2: Explain the importance of proper sensor placement on a humanoid robot for effective perception.

Practical Exercise

Q3: Create a robot model with a camera, IMU, and LIDAR. Configure each sensor with realistic parameters and noise models. Verify that each sensor publishes data to the appropriate ROS 2 topics with the correct frame IDs and data quality.

Further Reading

  1. "Gazebo Sensor Tutorial" - Official Gazebo sensor documentation
  2. "Robot Sensors and Perception" - Comprehensive guide to robot sensing
  3. "Simulation-Based Sensor Development" - Best practices for sensor simulation

Summary

In this chapter, we've explored sensor simulation in Gazebo, covering the configuration and use of various sensor types including cameras, LIDAR, IMUs, and force/torque sensors. We've learned how to add realistic noise models, integrate sensors with ROS 2, and optimize sensor performance.

Accurate sensor simulation is crucial for humanoid robotics as it enables the development and testing of perception algorithms that will eventually run on real robots. By properly configuring simulated sensors to match their real-world counterparts, we can bridge the gap between simulation and reality, making the transition to physical robots more successful.

In the next chapter, we'll explore Unity for high-fidelity rendering and how it can complement Gazebo simulation for humanoid robotics applications.