Enhance Autonomous Robot Localization Precision with Advanced IMUs and Sensor Fusion

By Stephen Evanczuk

Contributed By DigiKey's North American Editors

Inertial measurement units (IMUs) are fundamental to a broad range of mobile systems, including industrial robotics, humanoid robots, unmanned aerial vehicles (UAVs), and immersive mixed-reality systems. Although the specific demands vary with each application, designers are consistently challenged to provide increasingly accurate, real-time orientation and motion data, particularly for the broad class of applications known as autonomous mobile robots (AMRs).

This article briefly discusses the unique challenges of AMR localization. It then introduces advanced IMUs from Analog Devices and shows how they can be used to address these challenges in indoor, global positioning system (GPS)-denied environments, while drawing lessons from broader cross-domain use.

Why localization is a challenge for AMR developers

AMRs are central to the productivity of smart factories and warehouses where they help streamline material flow, reduce waste, and improve utilization. Ensuring their accurate localization within the facility is critical to success. In purpose-built facilities, AMR localization challenges can be mitigated through well-placed fiducials (reference markers) or optimized layouts, but most AMRs operate in legacy facilities. In such facilities, the combination of varying lighting, reflective surfaces, and complex geometry makes localization much harder.

Furthermore, the lack of consistent infrastructure, such as standardized aisle widths or predictable floor markings, means that robots confront more complex navigation and mapping tasks.

The nature of the navigational environment results in two key operational challenges.1

  • First, the robot must perform efficient path planning to determine the optimal route through its environment based on current conditions.
  • Second, it must execute precise localization, continuously updating its own position and orientation in real time as it moves.

In GPS-denied indoor environments, these two objectives must be met entirely with on-board sensing capabilities and computational resources.

To meet these challenges, AMRs use a mix of sensor modalities. Visual perception systems, including cameras, light detection and ranging (LiDAR), and radar, provide rich environmental data. Odometry systems, such as wheel encoders and IMUs, track motion directly from the robot’s movement. Each sensor type offers distinct advantages: Some excel at long-range detection, others at precise detection, but each also has limitations. By combining them intelligently, AMRs can achieve the redundancy and coverage needed to maintain accuracy in dynamic, unpredictable conditions.

What an IMU measures and why it matters

An IMU integrates microelectromechanical systems (MEMS) sensors to measure acceleration and angular velocity in three dimensions. A triaxial accelerometer measures motion along the x, y, and z axes relative to Earth’s gravity, capturing both static forces, such as tilt, and dynamic forces, such as acceleration during motion (Figure 1).

Figure 1: A triaxial accelerometer measures acceleration along the x, y, and z axes, providing both dynamic motion data and a static gravity reference. (Image source: Analog Devices)

A triaxial gyroscope measures angular velocity (ωx, ωy, ωz) about each axis (Figure 2), enabling the robot to track orientation changes.

Figure 2: A triaxial gyroscope measures angular velocity about each axis, enabling accurate tracking of orientation changes. (Image source: Analog Devices)

At the core of both accelerometers and gyroscopes in modern IMUs, MEMS structures deflect or vibrate when subjected to acceleration or rotation, and the resulting changes in capacitance or vibration frequency are converted into electrical signals. The advantage of MEMS-based IMUs is their combination of small size, low power consumption, and high measurement rates, making them practical for integration into mobile platforms.

Some IMUs also include additional sensors that expand their capability. A high-performance magnetometer provides magnetic field measurements that aid orientation estimation in challenging environments, although magnetometers are more common in legacy IMUs. An integrated temperature sensor enables thermal compensation of accelerometer and gyroscope data. A barometer may also be included to measure atmospheric pressure and estimate altitude.
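To illustrate how a magnetometer aids orientation estimation, the sketch below computes a tilt-compensated magnetic heading from hypothetical triaxial magnetometer and accelerometer readings. Axis and sign conventions vary between devices, and `tilt_compensated_heading` is an illustrative helper, not a vendor API:

```python
import math

def tilt_compensated_heading(mag, accel):
    """Estimate magnetic heading (radians) from triaxial magnetometer and
    accelerometer readings, compensating for pitch and roll.
    mag and accel are (x, y, z) tuples in the sensor frame; the
    accelerometer is assumed to sense only gravity (robot not accelerating).
    """
    ax, ay, az = accel
    # Pitch and roll from the gravity vector measured by the accelerometer
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    mx, my, mz = mag
    # Project the magnetic field vector onto the horizontal plane
    xh = (mx * math.cos(pitch)
          + my * math.sin(roll) * math.sin(pitch)
          + mz * math.cos(roll) * math.sin(pitch))
    yh = my * math.cos(roll) - mz * math.sin(roll)
    return math.atan2(-yh, xh)
```

When the sensor is level, the compensation terms vanish and the heading reduces to the planar compass bearing; under tilt, the gravity-derived pitch and roll keep the projected field components consistent.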

Beyond their sensor array, advanced IMUs also integrate extensive data acquisition signal chains for analog-to-digital conversion, preliminary finite impulse response filtering, and factory calibration to correct sensor biases and axis misalignment (Figure 3). These devices often allow the measurement axes to be rotated from the IMU’s internal coordinate frame to match the robot’s frame before output, reducing the computational load on the main processor.

Figure 3: A functional block diagram of an advanced IMU shows an extensive sensor signal chain providing sensing, calibration, compensation, and filtering integrated into a single compact device. (Image source: Analog Devices)

How IMUs strengthen localization when other sensors falter

Certain characteristics of different physical environments can impact the effectiveness of individual sensor modalities. To mitigate the limitations of different sensory systems, a typical AMR relies on a diverse sensor stack that can include vision sensors, time-of-flight (ToF) systems, LiDAR, radar, wheel encoders, and an IMU (Figure 4).

Figure 4: An AMR’s sensor stack typically combines vision sensors, an IMU, and wheel encoders to provide complementary information for localization. (Image source: Analog Devices)

In a feature-sparse corridor, for example (Figure 5), the long stretch of walls lacks the distinctive elements needed for visual simultaneous localization and mapping (SLAM) algorithms to match frames to a stored map. Without unique visual cues, the robot’s pose estimate can drift quickly, causing an AMR to lose its position. In this scenario, the heading and orientation information provided by an IMU can sustain robot navigation despite the loss of visual odometry.

Figure 5: In a long, featureless corridor, robot visual odometry may fail quickly, causing the AMR to lose its position if heading and orientation information from an IMU are lacking. (Image source: Analog Devices)

In large open spaces, such as a 50 m × 50 m warehouse, many visual features are beyond the effective range of LiDAR (Figure 6), which typically provides a maximum reach of 10 m to 15 m. Uniform layouts, such as evenly spaced shelving or storage racks, can confuse visual odometry because multiple locations look nearly identical. In this scenario, the combination of IMU measurements and wheel encoder data allows the robot to maintain local pose estimates.

Figure 6: In a large open area, where sensor range limitations and lack of distinguishing visual features degrade visual sensing, IMU measurements and wheel odometry can sustain localization. (Image source: Analog Devices)

Sloped surfaces present another challenge (Figure 7). Standard two-dimensional LiDAR captures points in a flat plane; therefore, a slope can appear to be a vertical obstacle. This misinterpretation can disrupt navigation or cause the robot to avoid traversable paths. Here, IMU pitch and roll data can provide gradient information to mitigate this LiDAR misinterpretation, enabling SLAM algorithms to resolve the gradient and distinguish between traversable slopes and true obstacles.

Figure 7: IMU pitch and roll readings can reveal the gradient of a slope, correcting 2D SLAM misinterpretations and enabling safe AMR navigation. (Image source: Analog Devices)
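The pitch and roll that resolve this ambiguity can be derived from the accelerometer’s gravity reference whenever the robot is not otherwise accelerating. A minimal sketch, with a hypothetical traversability threshold (the 10° grade limit is illustrative, not a specification):

```python
import math

def pitch_roll_from_accel(ax, ay, az):
    """Estimate pitch and roll (radians) from a static accelerometer
    reading, assuming the only sensed acceleration is gravity (robot at
    rest or moving at constant velocity)."""
    pitch = math.atan2(-ax, math.hypot(ay, az))
    roll = math.atan2(ay, az)
    return pitch, roll

def is_traversable(pitch, max_grade_deg=10.0):
    """Flag a surface as a traversable slope rather than an obstacle when
    the sensed pitch stays below a configurable grade limit."""
    return abs(math.degrees(pitch)) <= max_grade_deg
```

Feeding these angles to the SLAM front end lets it reinterpret the apparent “wall” in the 2D LiDAR scan as a ramp whenever the measured gradient is within the robot’s climbing capability.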

Environmental factors also degrade localization performance of different sensor modalities (Table 1). Factors such as poor lighting, dynamic environments, reflective surfaces, and a need for rich scene geometry can impact most sensory modalities.

Sensor modality | Affected by poor lighting | Affected by dynamic movers | Affected by reflective surfaces | Reliant on rich scene geometry
Standard RGB camera | Yes | Yes | No | No
Time of flight | No | Yes | Yes | Yes
LiDAR | No | Yes | Yes | Yes
Radar | No | Yes | Yes | Yes
Wheel odometry | No | No | No | No
IMU | No | No | No | No

Table 1: Shown is the impact of various environmental factors on sensor effectiveness. (Table source: Analog Devices)

How IMUs’ unique performance capabilities benefit AMRs

IMUs update at higher rates than perception sensors, enabling rapid response to dynamic changes in the environment. While perception systems typically operate at 10 Hz to 30 Hz, IMUs can provide processed data at 200 Hz and raw data at up to 4 kHz. With an update rate roughly 10x faster than perception sensors, an IMU can keep pose estimates current during the longer intervals between perception measurements. This higher update rate leads to faster reaction to sudden changes in motion and enhances system reliability in dynamic environments.

IMUs provide the foundation for AMR dead reckoning, where an AMR estimates its current position from a known starting position based on integration of IMU acceleration and angular measurements. By providing data needed to update position, orientation, and speed continually, IMUs enable precise pose estimation for reliable AMR navigation.
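A planar dead-reckoning step can be sketched as follows. This illustrative example assumes gravity-compensated body-frame acceleration and ignores sensor bias, which is exactly why the drift-mitigation techniques discussed later matter in practice:

```python
import math

def dead_reckon(pose, gyro_z, accel_x, accel_y, dt):
    """Propagate a 2D pose (x, y, heading, vx, vy) through one IMU sample
    by integrating body-frame acceleration and yaw rate.
    Simplified planar sketch: gravity already removed, no bias model."""
    x, y, heading, vx, vy = pose
    # Integrate yaw rate to update heading
    heading += gyro_z * dt
    # Rotate body-frame acceleration into the world frame
    c, s = math.cos(heading), math.sin(heading)
    ax_w = c * accel_x - s * accel_y
    ay_w = s * accel_x + c * accel_y
    # Integrate acceleration into velocity, then velocity into position
    vx += ax_w * dt
    vy += ay_w * dt
    x += vx * dt
    y += vy * dt
    return (x, y, heading, vx, vy)
```

Because position is obtained by double integration, any uncorrected accelerometer bias grows quadratically in the position estimate, which is why dead reckoning alone is only reliable between periodic corrections.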

Compact size and light weight also favor IMU integration in AMRs. For example, the Analog Devices ADIS16500AMLZ IMU (Figure 8) comes in a BGA package measuring only 15 × 15 × 5 millimeters (mm), yet it integrates a gyroscope, accelerometer, temperature sensor, and a complete signal chain for data acquisition and signal conditioning. This level of integration allows it to deliver comprehensive motion data to the host processor while enabling its use in space-constrained mechanical layouts without compromising the robot’s maneuverability.

Figure 8: The ADIS16500AMLZ IMU integrates a gyroscope, accelerometer, temperature sensor, and a complete signal chain for data acquisition and signal conditioning. (Image source: Analog Devices)

With its ±2000° per second (°/s) gyroscope dynamic range, the ADIS16500AMLZ captures rapid turns without saturation, which is essential for AMRs navigating tight spaces or performing quick obstacle avoidance. The ±392 meters per second squared (m/s²) accelerometer dynamic range captures both smooth movement and high-impact shocks. Its 8.1° per hour (°/hr) gyroscope bias stability and 125 micrometers per second squared (μm/s²) accelerometer bias stability reduce drift to enhance dead-reckoning accuracy between corrections.

Factory calibration provides built-in correction for sensitivity, bias, and axis alignment, while dynamic offset correction compensates for temperature shifts, supply voltage changes, magnetic interference, and noise.2 The IMU’s mechanical shock tolerance of 19,600 m/s² and operating temperature range of −25°C to +85°C enable deployment in demanding environments, while its low-noise, high-bandwidth ADCs ensure accurate data capture at the high update rates needed in responsive control systems.

IMUs in general are also relatively resistant to electromagnetic interference (EMI) and can operate in varied lighting and environmental conditions. As a result, these devices can serve in a broad array of applications.

Mitigating the performance limitations of IMUs

Despite their performance benefits, IMUs present some inherent limitations.3 Unfiltered noise can affect IMU measurements, reducing navigation accuracy. Bias in accelerometer and gyroscope sensors accumulates over time, leading to drift in orientation and motion estimates. Nonlinear sensor behavior distorts measurements, and thermo-mechanical noise leads to angle random walk (ARW) errors in gyroscopes and velocity random walk (VRW) errors in accelerometers that further degrade long-term performance. Unmitigated, these issues reduce localization reliability over time.

Sensor fusion can overcome these limitations by integrating IMU data with other sensor inputs to increase the quality and reliability of the data, improve estimation of unmeasured states, and increase coverage to enhance safety. State estimation techniques such as extended Kalman filtering (EKF) (Figure 9) can correct for noise, random walk, and bias instability during normal AMR operation. Using the accelerometer’s measurement of Earth’s gravity as a reference, pitch and roll errors from the gyroscope can be corrected, while bias drift can be tracked and compensated. In operation, the EKF enables estimation of past, present, and future states even without complete knowledge of the modeled system.

Figure 9: A simplified EKF algorithm processes noisy sensor measurements over time to produce a corrected, continuous estimate of robot pose and motion. (Image source: Analog Devices)

EKF has gained widespread use because it can model system dynamics and measurement uncertainties, then update the state estimate when new data arrives. Measurements that may contain Gaussian white noise or other inaccuracies are observed over time and used for correction. The filter estimates the true value of measurements by synchronizing measurements between sensors, predicting pose and error estimates, and estimating and updating the uncertainty of the predicted value.
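The predict-correct cycle at the heart of the filter can be illustrated with a scalar Kalman filter tracking a slowly varying quantity such as a gyroscope bias. The EKF generalizes this cycle by linearizing nonlinear motion and measurement models at each step; the noise values below are illustrative, not tuned for any particular sensor:

```python
def kalman_step(x, p, z, q, r):
    """One predict-correct cycle of a scalar Kalman filter estimating a
    slowly varying state x (variance p) from a noisy measurement z, with
    process noise q and measurement noise r."""
    # Predict: the state model is "unchanged", so uncertainty grows by q
    p = p + q
    # Correct: the Kalman gain k weights the measurement against the
    # prediction according to their relative uncertainties
    k = p / (p + r)
    x = x + k * (z - x)
    p = (1.0 - k) * p
    return x, p
```

Fed a stream of noisy measurements, the estimate converges toward the true value while the variance p settles at a steady state set by the ratio of process to measurement noise; large r makes the filter trust its prediction, large q makes it trust new measurements.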

Sensor fusion algorithms are embedded in the Robot Operating System (ROS) open-source robot_localization package,4 which implements EKF-based fusion (Figure 10).

Figure 10: A typical ROS-based sensor fusion software architecture combines multiple sensor inputs through the robot_localization package to produce a robust, continuous pose estimate. (Image source: Analog Devices)

This ROS package enables the fusion of an unrestricted number of sensors and can accept a variety of input types, including IMU data, wheel velocity, and odometry. The fused output includes full 3D position and orientation, linear and angular velocities, and acceleration, which feed directly into navigation and SLAM algorithms. Using these inputs, robot_localization generates an estimated pose state expressed as a vector of actual and derived measurements:

Pose State = (X, Y, Z, roll, pitch, yaw, Ẋ, Ẏ, Ż, roll rate, pitch rate, yaw rate, Ẍ, Ÿ, Z̈)
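In robot_localization, each sensor input is accompanied by a boolean configuration vector indicating which of these fifteen state components that sensor measures. The sketch below illustrates the mapping; the example mask is illustrative, not a complete ROS configuration:

```python
# Index layout of the 15-element state vector (position, orientation,
# their rates, and linear acceleration), matching the pose state above.
STATE = ["X", "Y", "Z", "roll", "pitch", "yaw",
         "X_dot", "Y_dot", "Z_dot", "roll_dot", "pitch_dot", "yaw_dot",
         "X_ddot", "Y_ddot", "Z_ddot"]

def fusion_mask(measured):
    """Build a per-sensor boolean vector telling the filter which state
    components a sensor contributes, e.g. a planar IMU providing yaw,
    yaw rate, and x/y acceleration."""
    return [name in measured for name in STATE]

# Hypothetical planar IMU contribution
imu_mask = fusion_mask({"yaw", "yaw_dot", "X_ddot", "Y_ddot"})
```

Wheel odometry would typically contribute the velocity components instead, and the filter fuses whichever components each sensor marks as measured.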

Accelerate development of precise AMR localization

The ADIS16500AMLZ IMU demonstrates how precision sensing and integrated processing can improve AMR localization performance. To help developers accelerate application development, Analog Devices provides the ADIS16500/PCBZ breakout board (Figure 11, left) and accompanying EVAL-ADIS-FX3Z evaluation system (Figure 11, right).

Figure 11: The ADIS16500/PCBZ breakout board (left) and the EVAL-ADIS-FX3Z evaluation kit (right) enable the rapid development of applications based on the ADIS16500 IMU. (Image source: Analog Devices)

The breakout board comprises the IMU and a 16-pin header that mates to 2 mm ribbon cables to connect to the evaluation system. The evaluation system allows real-time sampling of the IMU at full sample rate and is powered via its USB port. All required software is downloadable from the resource page.

Conclusion

IMUs are essential for maintaining precise localization in AMRs, providing orientation estimates and motion tracking at high update rates even when other sensory modalities fail due to environmental conditions. By using sensor fusion to compensate for limitations across different sensor types, AMRs can perform precise navigation even in dynamic environments that normally confuse AMR localization. With the availability of highly integrated IMUs and associated breakout boards and evaluation systems, developers can quickly design AMRs able to achieve the accurate, reliable localization required for precise navigation.

References

  1. Shoudong Huang and Gamini Dissanayake, Robot Localization: An Introduction, John Wiley & Sons, August 2016.
  2. Randy Carver and Mark Looney, “MEMS Accelerometer Calibration Optimizes Accuracy for Industrial Applications,” EE Times, October 2007.
  3. Oliver J. Woodman, “An Introduction to Inertial Navigation,” University of Cambridge, August 2007.
  4. Tom Moore, robot_localization Documentation, v2.6.12, 2016.

Disclaimer: The opinions, beliefs, and viewpoints expressed by the various authors and/or forum participants on this website do not necessarily reflect the opinions, beliefs, and viewpoints of DigiKey or official policies of DigiKey.

About this author


Stephen Evanczuk

Stephen Evanczuk has more than 20 years of experience writing for and about the electronics industry on a wide range of topics including hardware, software, systems, and applications including the IoT. He received his Ph.D. in neuroscience on neuronal networks and worked in the aerospace industry on massively distributed secure systems and algorithm acceleration methods. Currently, when he's not writing articles on technology and engineering, he's working on applications of deep learning to recognition and recommendation systems.

About this publisher

DigiKey's North American Editors