AUTO: Multiple Imaging Radars Integration with INS/GNSS for Reliable and Accurate Positioning for Autonomous Vehicles and Robots
- Sep 20, 2021
- 2 min read
Updated: Sep 23, 2024

Autonomous driving has attracted much interest in recent years, with significant research directed at solving the localization problem. To enable a fully autonomous platform, the navigation system must provide accurate solutions at high rates, and be reliable and always available in all types of environments. These requirements necessitate the use of multiple sensors, while maintaining an affordable cost to enable widespread adoption.
Inertial Measurement Units (IMUs) are commonly used sensors because they are self-contained and always available. However, a pure Inertial Navigation System (INS) accumulates errors over time, which limits its use to short durations. To overcome this limitation, a conventional approach has been to integrate the INS with GNSS and odometry measurements to constrain the error growth. Such integration schemes can achieve lane-level accuracy under open-sky conditions, even when using low-cost Micro-Electro-Mechanical Systems (MEMS) based IMUs.
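To illustrate the effect described above, here is a minimal one-dimensional sketch (not AUTO's actual algorithm, and deliberately simplified): pure inertial dead reckoning drifts quadratically under a constant, uncompensated accelerometer bias, while a small Kalman filter that also estimates the bias and applies periodic GNSS position fixes keeps the error bounded. All parameter values (bias magnitude, noise levels, rates) are illustrative assumptions.

```python
import numpy as np

dt = 0.01              # 100 Hz IMU
steps = 6000           # 60 s trajectory
accel_bias = 0.05      # m/s^2 uncompensated accelerometer bias (assumed)
gnss_sigma = 1.0       # m GNSS position noise, 1-sigma (assumed)
rng = np.random.default_rng(0)

# Truth: vehicle at rest, so the IMU measures only its bias.
# Filter state x = [position, velocity, accel bias].
# Mechanization: p' = p + v*dt + 0.5*(a_meas - b)*dt^2, v' = v + (a_meas - b)*dt
F = np.array([[1.0, dt, -0.5 * dt**2],
              [0.0, 1.0, -dt],
              [0.0, 0.0, 1.0]])
B = np.array([0.5 * dt**2, dt, 0.0])     # input matrix for measured accel
H = np.array([[1.0, 0.0, 0.0]])          # GNSS observes position only

x = np.zeros(3)                          # fused estimate
P = np.diag([1.0, 1.0, 0.01])            # initial uncertainty (bias std 0.1)
Q = np.diag([1e-8, 1e-6, 1e-8])          # small process noise
R = np.array([[gnss_sigma**2]])

x_ins = np.zeros(2)                      # pure INS [pos, vel], no corrections

for k in range(steps):
    a_meas = accel_bias                  # true accel is 0, sensor reads bias
    # Pure INS mechanization: integrates the biased measurement directly.
    x_ins = np.array([x_ins[0] + x_ins[1] * dt + 0.5 * a_meas * dt**2,
                      x_ins[1] + a_meas * dt])
    # Filter prediction: mechanize with the current bias estimate removed.
    x = F @ x + B * a_meas
    P = F @ P @ F.T + Q
    # GNSS position update at 1 Hz constrains the error growth.
    if (k + 1) % 100 == 0:
        z = rng.normal(0.0, gnss_sigma)  # true position is 0
        y = z - H @ x                    # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + (K @ y).ravel()
        P = (np.eye(3) - K @ H) @ P

print(f"pure INS position error after 60 s:  {abs(x_ins[0]):.1f} m")
print(f"INS/GNSS position error after 60 s:  {abs(x[0]):.2f} m")
```

Even a tiny 0.05 m/s² bias produces roughly 0.5·b·t² ≈ 90 m of drift in one minute of pure inertial navigation, while the GNSS-aided filter stays within a few meters and learns the bias; real integration schemes, including the tight nonlinear scheme in the paper, use full 3D error-state models but rely on the same principle.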
Unfortunately, such systems are not sufficient to provide accurate and reliable positioning in all environments, particularly in GNSS-challenged or GNSS-denied areas like tunnels and urban canyons. To maintain accurate positioning in these areas, perception sensors such as cameras, Lidar, or radar can be used to provide another source of absolute positioning information.
This paper presents a multi-radar integrated version of AUTO, a real-time integrated navigation system that provides an accurate, reliable, high-rate, and continuous (always available) navigation solution for autonomous platforms by integrating INS, GNSS-RTK, odometer, and multiple radar sensors with HD maps. AUTO uses a tight nonlinear integration scheme to fuse information from multiple imaging radars with the INS/GNSS/odometer solution. The HD maps may come from a map provider or be crowdsourced from radar data.
The results in this paper compare multi-radar configurations of one to five imaging radars on a vehicle and demonstrate the accurate solution achieved through the tightly integrated system. Furthermore, Key Performance Indicators (KPIs) are presented for a multi-radar configuration of AUTO on both a vehicle and a robot. The results show how radar data, combined with the other sensors, contributes significantly to a high-rate, accurate, reliable, and robust navigation solution in GNSS-degraded or GNSS-denied environments and adverse weather conditions.
Read the full paper here.