A review of multi-sensor fusion SLAM systems based on 3D LiDAR
The demand for intelligent unmanned platforms to achieve autonomous navigation and
positioning in large-scale environments has been increasing, in which …
Camera, LiDAR and multi-modal SLAM systems for autonomous ground vehicles: a survey
M Chghaf, S Rodriguez, AE Ouardi - Journal of Intelligent & Robotic …, 2022 - Springer
Abstract: Simultaneous Localization and Mapping (SLAM) has been widely studied over
recent years for autonomous vehicles. SLAM achieves its purpose by constructing a map of the …
R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual Tightly-Coupled State Estimation and Mapping Package
In this paper, we propose a novel LiDAR-Inertial-Visual sensor fusion framework termed R³LIVE,
which takes advantage of measurements from LiDAR, inertial, and visual sensors to …
R²LIVE: A Robust, Real-Time, LiDAR-Inertial-Visual Tightly-Coupled State Estimator and Mapping
In this letter, we propose a robust, real-time tightly-coupled multi-sensor fusion framework,
which fuses measurements from LiDAR, inertial sensor, and visual camera to achieve robust …
Super Odometry: IMU-centric LiDAR-visual-inertial estimator for challenging environments
We propose Super Odometry, a high-precision multi-modal sensor fusion framework,
providing a simple but effective way to fuse multiple sensors such as LiDAR, camera, and …
FAST-LIVO: Fast and tightly-coupled sparse-direct LiDAR-inertial-visual odometry
To achieve accurate and robust pose estimation in Simultaneous Localization and Mapping
(SLAM) tasks, multi-sensor fusion has proven to be an effective solution and thus provides great …
Unified multi-modal landmark tracking for tightly coupled LiDAR-visual-inertial odometry
We present an efficient multi-sensor odometry system for mobile platforms that jointly
optimizes visual, lidar, and inertial information within a single integrated factor graph. This …
Efficient and accurate tightly-coupled visual-LiDAR SLAM
CC Chou, CF Chou - IEEE Transactions on Intelligent …, 2021 - ieeexplore.ieee.org
We investigate a novel way to integrate visual SLAM and lidar SLAM. Instead of enhancing
visual odometry via lidar depths or using visual odometry as the motion initial guess of lidar …
PyPose: A library for robot learning with physics-based optimization
Deep learning has had remarkable success in robotic perception, but its data-centric nature
suffers when it comes to generalizing to ever-changing environments. By contrast, physics …
Edge robotics: Edge-computing-accelerated multirobot simultaneous localization and mapping
With the wide penetration of smart robots in multifarious fields, the simultaneous localization
and mapping (SLAM) technique in robotics has attracted growing attention in the community …