A survey on global LiDAR localization: Challenges, advances and open problems

H Yin, X Xu, S Lu, X Chen, R Xiong, S Shen… - International Journal of …, 2024 - Springer
Abstract Knowledge of one's own pose is key for all mobile robot applications. Thus, pose
estimation is part of the core functionalities of mobile robots. Over the last two decades …

Brain-inspired computing: A systematic survey and future trends

G Li, L Deng, H Tang, G Pan, Y Tian… - Proceedings of the …, 2024 - ieeexplore.ieee.org
Brain-inspired computing (BIC) is an emerging research field that aims to build fundamental
theories, models, hardware architectures, and application systems toward more general …

Benchmarking Neural Radiance Fields for Autonomous Robots: An Overview

Y Ming, X Yang, W Wang, Z Chen, J Feng… - arXiv preprint arXiv …, 2024 - arxiv.org
Neural Radiance Fields (NeRF) have emerged as a powerful paradigm for 3D scene
representation, offering high-fidelity renderings and reconstructions from a set of sparse and …

LONER: LiDAR-only neural representations for real-time SLAM

S Isaacson, PC Kung, M Ramanagopal… - IEEE Robotics and …, 2023 - ieeexplore.ieee.org
This letter proposes LONER, the first real-time LiDAR SLAM algorithm that uses a neural
implicit scene representation. Existing implicit mapping methods for LiDAR show promising …

Towards robust robot 3D perception in urban environments: The UT Campus Object Dataset

A Zhang, C Eranki, C Zhang, JH Park… - IEEE Transactions …, 2024 - ieeexplore.ieee.org
We introduce the UT Campus Object Dataset (CODa), a mobile robot egocentric perception
dataset collected on the University of Texas Austin Campus. Our dataset contains 8.5 hours …

LIV-GaussMap: LiDAR-inertial-visual fusion for real-time 3D radiance field map rendering

S Hong, J He, X Zheng, C Zheng… - IEEE Robotics and …, 2024 - ieeexplore.ieee.org
We introduce an integrated, precise LiDAR, Inertial, and Visual (LIV) multimodal sensor-
fused mapping system that builds on differentiable Gaussians to improve the mapping …

ECMD: An event-centric multisensory driving dataset for SLAM

P Chen, W Guan, F Huang, Y Zhong… - IEEE Transactions …, 2023 - ieeexplore.ieee.org
Leveraging multiple sensors enhances complex environmental perception and increases
resilience to varying luminance conditions and high-speed motion patterns, achieving …

Mapping of potential fuel regions using uncrewed aerial vehicles for wildfire prevention

ME Andrada, D Russell, T Arevalo-Ramirez, W Kuang… - Forests, 2023 - mdpi.com
This paper presents a comprehensive forest mapping system using a customized drone
payload equipped with Light Detection and Ranging (LiDAR), cameras, a Global Navigation …

TAIL: A terrain-aware multi-modal SLAM dataset for robot locomotion in deformable granular environments

C Yao, Y Ge, G Shi, Z Wang, N Yang… - IEEE Robotics and …, 2024 - ieeexplore.ieee.org
Terrain-aware perception holds the potential to improve the robustness and accuracy of
autonomous robot navigation in the wild, thereby facilitating effective off-road traversals …