Multiple Imaging Radars Integrate with INS/GNSS via AUTO Software for Reliable and Accurate Positioning of Autonomous Vehicles and Robots
- Mar 13, 2022
- 1 min read
Updated: Sep 23, 2024

Difficult GNSS environments and adverse weather conditions require the fusion of many sensors to maintain lane-level accuracy for autonomous platforms, without incurring the high costs that would inhibit widespread adoption. Radars are an attractive option in a multi-sensor integration scheme because they are robust to adverse weather and insensitive to lighting variations.
The multi-radar integrated version of AUTO fuses inertial navigation, real-time kinematic GNSS, an odometer, and multiple radar sensors with high-definition maps in a tight non-linear integration scheme. AUTO can reliably produce accurate, high-rate navigation outputs in real time across all urban environments. Key performance indices evaluated during simulated GNSS outages quantify the accuracy of the solution over prolonged outage periods.
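To illustrate the general idea of fusing inertial prediction with position corrections from GNSS or radar-to-map matching, below is a minimal sketch of a generic Kalman-filter fusion loop. This is only a toy illustration under assumed models: the actual AUTO system uses a tightly coupled, non-linear integration, and the class name `FusionEKF`, the state layout, and all noise values here are hypothetical.

```python
import numpy as np

class FusionEKF:
    """Toy fusion filter: an IMU/odometer motion model drives the prediction,
    while GNSS or radar-to-HD-map position fixes provide corrections.
    (Illustrative only; not the AUTO implementation.)"""

    def __init__(self, x0, P0):
        self.x = np.asarray(x0, dtype=float)   # e.g. [east, north, v_east, v_north]
        self.P = np.asarray(P0, dtype=float)   # state covariance

    def predict(self, F, Q):
        # Propagate state and covariance with the motion model.
        self.x = F @ self.x
        self.P = F @ self.P @ F.T + Q

    def update(self, z, H, R):
        # Correct with a position measurement (GNSS fix or radar/map match).
        y = z - H @ self.x                      # innovation
        S = H @ self.P @ H.T + R                # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)     # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(len(self.x)) - K @ H) @ self.P

# Usage: constant-velocity prediction followed by a 2-D position correction.
dt = 0.01
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
Q = 1e-3 * np.eye(4)                  # assumed process noise
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]])          # measure position only
R = 0.25 * np.eye(2)                  # assumed measurement noise

ekf = FusionEKF(x0=[0, 0, 5, 0], P0=np.eye(4))
ekf.predict(F, Q)
ekf.update(np.array([0.05, -0.02]), H, R)   # position fix in metres
print(ekf.x)
```

During a GNSS outage, the same loop simply keeps calling `predict` from the inertial/odometer model while radar-to-map corrections continue to bound the drift, which is the behaviour the paper's outage experiments quantify.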
Read the full paper here.



