Multiple Imaging Radars Integrate with INS/GNSS via AUTO Software: Reliable and Accurate Positioning for Autonomous Vehicles and Robots
- Mar 13, 2022
- 1 min read
Updated: Sep 23, 2024

Difficult GNSS environments and adverse weather conditions require a fusion of many sensors to maintain lane-level accuracy for autonomous platforms, without incurring costs so high that they would inhibit widespread adoption. Radars are an attractive option in a multi-sensor integration scheme because they are robust to adverse weather and insensitive to lighting variations.
The multi-radar version of AUTO fuses inertial navigation, real-time kinematic GNSS, an odometer, and multiple radar sensors with high-definition maps in a tight non-linear integration scheme. AUTO can reliably produce accurate, high-rate navigation outputs in real time across all urban environments. Key performance indices during simulated GNSS outages quantify the accuracy of the solution over prolonged periods.
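The post does not detail AUTO's fusion algorithm, but the general idea of combining independent position estimates weighted by their uncertainties can be sketched with a simple inverse-covariance (information-form) fusion step. All names, numbers, and covariances below are hypothetical illustrations, not AUTO's actual method:

```python
import numpy as np

def fuse_position(estimates):
    """Fuse independent 2-D position estimates (mean, covariance)
    by inverse-covariance weighting -- a simplified stand-in for
    the tight non-linear multi-sensor integration described above."""
    info = np.zeros((2, 2))       # accumulated information matrix
    info_vec = np.zeros(2)        # accumulated information vector
    for mean, cov in estimates:
        w = np.linalg.inv(cov)    # more certain sensors get more weight
        info += w
        info_vec += w @ mean
    fused_cov = np.linalg.inv(info)
    return fused_cov @ info_vec, fused_cov

# Hypothetical fixes: GNSS (degraded in an urban canyon) and a
# radar-to-HD-map match (tighter covariance).
gnss = (np.array([10.0, 4.0]), np.diag([4.0, 4.0]))
radar = (np.array([10.6, 4.2]), np.diag([0.25, 0.25]))
pos, cov = fuse_position([gnss, radar])
```

The fused estimate lands close to the lower-variance radar fix, and its covariance is smaller than either input's, which is the intuition behind why adding radar measurements sustains accuracy when GNSS degrades.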
Read the full paper here.