Event-based feature tracking with probabilistic data association
Asynchronous event-based sensors present new challenges in basic robot vision problems
like feature tracking. The few existing approaches rely on grouping events into models and …
Unsupervised learning of dense optical flow, depth and egomotion with event-based sensors
We present an unsupervised learning pipeline for dense depth, optical flow and egomotion
estimation for autonomous driving applications, using the event-based output of the …
On event-based optical flow detection
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-
energy, high dynamic range, and sparse sensing. This stands in contrast to whole image …
Unsupervised learning of dense optical flow, depth and egomotion from sparse event data
In this work we present a lightweight, unsupervised learning pipeline for dense depth,
optical flow and egomotion estimation from sparse event output of the Dynamic Vision …
Ground moving vehicle detection and movement tracking based on the neuromorphic vision sensor
Moving-objects detection is a critical ability for an autonomous vehicle. Facing the high
detection requirements and the slow target-extraction problem of a common camera, this …
MOMS with Events: Multi-object motion segmentation with monocular event cameras
Segmentation of moving objects in dynamic scenes is a key process in scene understanding
for both navigation and video recognition tasks. Without prior knowledge of the object …
Calibration of event-based camera and 3d lidar
R Song, Z Jiang, Y Li, Y Shan… - 2018 WRC Symposium …, 2018 - ieeexplore.ieee.org
Calibration is an important step before fusing the data of multi-sensors together. The
methods which had been proposed to calibrate a regular camera and 3D LiDAR cannot be …
Event-based optical flow on neuromorphic hardware
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-
energy, high dynamic range, and sparse sensing. This stands in contrast to whole image …
Asynchronous event-based motion processing: From visual events to probabilistic sensory representation
In this work, we propose a two-layered descriptive model for motion processing from retina
to the cortex, with an event-based input from the asynchronous time-based image sensor …
Estimating visual motion using an event-based artificial retina
LI Abdul-Kreem, H Neumann - … 2015, Berlin, Germany, March 11–14 …, 2016 - Springer
Biologically inspired computational models of visual processing often utilize conventional
frame-based cameras for data acquisition. Instead, the Dynamic Vision Sensor (DVS) …