Advancing Autonomous Navigation Research and Development: Insights from the NavINST Multisensor Dataset
The Navigation and Instrumentation (NavINST) Laboratory has created a comprehensive multisensor dataset derived from various road-test trajectories in urban environments, featuring diverse lighting conditions, including indoor garage scenarios with dense 3D maps. The dataset encompasses multiple commercial-grade inertial measurement units (IMUs), a high-end tactical-grade IMU, and different global navigation satellite system (GNSS) receivers. It also incorporates a wide array of perception sensors, such as a solid-state LiDAR (making it one of the first datasets to include one), a mechanical LiDAR, four electronically scanning RADARs, a monocular camera, and two stereo cameras. In addition, the dataset includes forward speed measurements from the vehicle's odometer, along with accurately post-processed data from a high-end GNSS/IMU system, providing precise ground-truth position and navigation information. The NavINST dataset is designed to support advanced research in high-precision positioning, navigation, mapping, computer vision, and multisensor fusion, offering rich multisensor data ideal for developing and validating robust algorithms for autonomous vehicles. Lastly, it is fully integrated with the Robot Operating System (ROS), ensuring ease of use and accessibility for the research community.
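Because the dataset is ROS-integrated, its recordings can be consumed with standard ROS tooling. The sketch below, written against the ROS 1 rosbag Python API, shows how IMU and GNSS messages might be iterated from a recorded trajectory; the bag filename and topic names ("/imu/data", "/gnss/fix") are illustrative assumptions, not confirmed names from the dataset documentation.

```python
# Minimal sketch: iterating IMU and GNSS messages from a NavINST-style ROS bag.
# The filename and topic names are hypothetical; consult the dataset docs
# for the actual bag layout.
import rosbag

with rosbag.Bag('navinst_trajectory.bag') as bag:
    for topic, msg, t in bag.read_messages(topics=['/imu/data', '/gnss/fix']):
        if topic == '/imu/data':
            # sensor_msgs/Imu: angular velocity (rad/s), linear acceleration (m/s^2)
            print(t.to_sec(), msg.angular_velocity.z, msg.linear_acceleration.x)
        elif topic == '/gnss/fix':
            # sensor_msgs/NavSatFix: WGS84 latitude/longitude in degrees
            print(t.to_sec(), msg.latitude, msg.longitude)
```

Reading directly from bags like this keeps the sensor streams time-stamped on a common clock, which is what makes the dataset convenient for multisensor fusion experiments.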