Computer vision for autonomous vehicles: Problems, datasets and state of the art
Recent years have witnessed enormous progress in AI-related fields such as computer
vision, machine learning, and autonomous vehicles. As with any rapidly growing field, it …
Factor graphs for robot perception
F Dellaert, M Kaess - Foundations and Trends® in Robotics, 2017 - nowpublishers.com
We review the use of factor graphs for the modeling and solving of large-scale inference
problems in robotics. Factor graphs are a family of probabilistic graphical models, other …
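A minimal illustrative sketch (not taken from the monograph), assuming the GTSAM Python bindings are installed: GTSAM, co-developed by Dellaert, implements the factor-graph machinery surveyed above. The toy 2D pose-graph below, with made-up poses and noise values, shows how unknown states become variables and measurements become factors that are jointly optimized.

    import numpy as np
    import gtsam

    # Noise models: a prior on the first pose and odometry between consecutive poses.
    prior_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.1, 0.1, 0.05]))
    odom_noise = gtsam.noiseModel.Diagonal.Sigmas(np.array([0.2, 0.2, 0.1]))

    # The factor graph: unknowns are 2D poses, factors encode measurements.
    graph = gtsam.NonlinearFactorGraph()
    graph.add(gtsam.PriorFactorPose2(1, gtsam.Pose2(0.0, 0.0, 0.0), prior_noise))
    graph.add(gtsam.BetweenFactorPose2(1, 2, gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))
    graph.add(gtsam.BetweenFactorPose2(2, 3, gtsam.Pose2(2.0, 0.0, 0.0), odom_noise))

    # Initial guesses, deliberately perturbed away from the truth.
    initial = gtsam.Values()
    initial.insert(1, gtsam.Pose2(0.5, 0.0, 0.2))
    initial.insert(2, gtsam.Pose2(2.3, 0.1, -0.2))
    initial.insert(3, gtsam.Pose2(4.1, 0.1, 0.1))

    # Nonlinear least-squares inference over the factor graph.
    result = gtsam.LevenbergMarquardtOptimizer(graph, initial).optimize()
    print(result)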
Visual-lidar odometry and mapping: Low-drift, robust, and fast
Here, we present a general framework for combining visual odometry and lidar odometry in
a fundamental, first-principles manner. The method shows improvements in performance …
Lic-fusion: Lidar-inertial-camera odometry
This paper presents a tightly-coupled multi-sensor fusion algorithm termed LiDAR-inertial-
camera fusion (LIC-Fusion), which efficiently fuses IMU measurements, sparse visual …
Limo: Lidar-monocular visual odometry
Higher-level functionality in autonomous driving depends strongly on a precise motion
estimate of the vehicle. Powerful algorithms have been developed. However, their great …
Laser–visual–inertial odometry and mapping with high robustness and low drift
We present a data processing pipeline to estimate ego-motion online and build a map of the
traversed environment, leveraging data from a 3D laser scanner, a camera, and an inertial …
A review of slam techniques and security in autonomous driving
A Singandhupe, HM La - 2019 Third IEEE International …, 2019 - ieeexplore.ieee.org
Simultaneous localization and mapping (SLAM) is a widely researched topic in the fields of
robotics, augmented/virtual reality and, most prominently, self-driving cars. SLAM is a …
Review on LiDAR-based SLAM techniques
L Huang - 2021 International Conference on Signal Processing …, 2021 - ieeexplore.ieee.org
LiDAR-based Simultaneous Localization and Mapping (LiDAR-SLAM) uses a LiDAR sensor to
localize the robot by observing environmental features and to incrementally build the …
Efficient and accurate tightly-coupled visual-lidar slam
CC Chou, CF Chou - IEEE Transactions on Intelligent …, 2021 - ieeexplore.ieee.org
We investigate a novel way to integrate visual SLAM and lidar SLAM. Instead of enhancing
visual odometry via lidar depths or using visual odometry as the initial motion guess for lidar …
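A minimal illustrative sketch (not this paper's algorithm) of the "visual odometry enhanced via lidar depths" baseline mentioned in the abstract above: lidar points are projected into the image using assumed intrinsics K and extrinsics T_cam_lidar, and each tracked visual feature adopts the depth of the nearest projected point. All names, parameters, and thresholds are hypothetical.

    import numpy as np

    def lidar_depths_for_features(features_uv, lidar_xyz, K, T_cam_lidar, max_px_dist=3.0):
        """Assign each 2D visual feature the depth of the nearest projected lidar point.

        features_uv  : (N, 2) pixel coordinates of tracked visual features
        lidar_xyz    : (M, 3) lidar points in the lidar frame
        K            : (3, 3) camera intrinsic matrix
        T_cam_lidar  : (4, 4) rigid transform from lidar frame to camera frame
        Returns      : (N,) depths in meters; NaN where no lidar point is close enough
        """
        # Transform lidar points into the camera frame.
        pts_h = np.hstack([lidar_xyz, np.ones((lidar_xyz.shape[0], 1))])
        pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]

        # Keep only points in front of the camera.
        pts_cam = pts_cam[pts_cam[:, 2] > 0.1]
        depths = np.full(features_uv.shape[0], np.nan)
        if pts_cam.shape[0] == 0:
            return depths

        # Project surviving points into pixel coordinates.
        proj = (K @ pts_cam.T).T
        uv = proj[:, :2] / proj[:, 2:3]

        # Nearest-neighbor association in the image plane, gated by a pixel radius.
        for i, f in enumerate(features_uv):
            d2 = np.sum((uv - f) ** 2, axis=1)
            j = np.argmin(d2)
            if d2[j] <= max_px_dist ** 2:
                depths[i] = pts_cam[j, 2]
        return depths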
Multi-modal feature constraint based tightly coupled monocular visual-LiDAR odometry and mapping
C Shu, Y Luo - IEEE Transactions on Intelligent Vehicles, 2022 - ieeexplore.ieee.org
In this paper, we present a novel multi-sensor fusion framework for tightly coupled
monocular visual-LiDAR odometry and mapping. Compared to previous visual-LiDAR fusion …