Fusion of passive and active electro-optical sensor data for enhanced scene understanding in challenging conditions
Abstract
Passive electro-optical systems, such as visible-light and infrared cameras, and active systems, such as ladar or LiDAR, can acquire detailed two- and three-dimensional images of a scene. This paper presents a sensor fusion framework that combines passive and active electro-optical sensor data to reveal subtle patterns in the scene. It creates fused data structures that merge 3D coordinates and light intensities, and adapts recent methods for classification of high-dimensional irregular data to segment the fused structures into different object classes. The framework can also be used to discriminate weak laser return pulses from noise by extracting point clusters in the higher-dimensional data under certain constraints. Methods for estimating and compensating for motion during data acquisition can be integrated into the framework to prevent misalignments. Experiments on IR images and 3D point clouds acquired by a ladar demonstrate scene segmentation, object recognition, and motion estimation under various challenging conditions.
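As a rough illustration of the fused data structure described in the abstract (a conceptual sketch, not the paper's actual implementation), the snippet below stacks ladar 3D coordinates with co-registered IR intensities into a single higher-dimensional point set; the function name is hypothetical, and it assumes the IR image has already been registered so that each 3D point has a corresponding intensity sample. Segmentation methods for high-dimensional irregular data would then operate on rows of this fused array.

import numpy as np

def fuse_points_with_intensity(points_xyz, ir_intensity):
    # Combine ladar 3D coordinates (N x 3) with co-registered IR
    # intensities (N,) into one higher-dimensional point set (N x 4).
    # Assumption: ir_intensity[i] is the intensity sampled at points_xyz[i].
    points_xyz = np.asarray(points_xyz, dtype=float)
    ir_intensity = np.asarray(ir_intensity, dtype=float)
    return np.column_stack([points_xyz, ir_intensity])

# Hypothetical usage with synthetic data in place of real sensor output
pts = np.random.rand(1000, 3)   # 3D coordinates from the ladar
ir = np.random.rand(1000)       # co-registered IR intensities
fused = fuse_points_with_intensity(pts, ir)   # shape (1000, 4)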
Description
Bae, Egil.
Fusion of passive and active electro-optical sensor data for enhanced scene understanding in challenging conditions. Proceedings of SPIE, the International Society for Optical Engineering, 2024, Volume 13200.