CBILR: Camera Bi-Directional LiDAR-Radar Fusion for Robust Perception in Autonomous Driving

EasyChair Preprint 13660 • 7 pages • Date: June 14, 2024

Abstract

Safe and reliable autonomous driving hinges on
robust perception under challenging environments. Multi-sensor
fusion, particularly camera-LiDAR-Radar integration, plays a
pivotal role in achieving this goal. Different sensors have specific advantages and disadvantages. Existing pipelines are often
constrained by adverse weather conditions, where cameras and
LiDAR suffer significant degradation. This paper introduces the
Camera Bi-directional LiDAR-Radar (CBILR) fusion pipeline,
which leverages the complementary strengths of each sensor to enhance LiDAR
and Radar point clouds. CBILR innovates with a bi-directional
prefusion step between LiDAR and Radar, leading to richer
feature representations. First, prefusion combines LiDAR and
Radar points to compensate for individual sensor weaknesses.
Next, the pipeline fuses the pre-fused features with camera
features in the bird’s eye view (BEV) space, resulting in a
comprehensive multi-modal representation. Experiments have
demonstrated that CBILR outperforms state-of-the-art pipelines,
achieving superior robustness in challenging weather scenarios.
The code is available at https://github.com/Artimipt/CBILR.

Keyphrases: Camera, Fusion, LiDAR, Radar, autonomous vehicle, self-driving
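The two-stage design described above (bi-directional LiDAR-Radar prefusion followed by fusion with camera features in BEV space) can be illustrated with a minimal sketch. This is not the paper's implementation: the feature shapes, the additive cross-exchange standing in for the learned bi-directional prefusion, and the channel concatenation standing in for the BEV fusion module are all assumptions made for illustration only.

```python
import numpy as np

def bidirectional_prefusion(lidar_feat, radar_feat):
    """Hypothetical bi-directional LiDAR-Radar prefusion.

    Each modality is enriched with information from the other
    (a simple additive cross-exchange stands in for whatever
    learned fusion the paper actually uses), then the enriched
    maps are stacked along the channel axis.
    """
    lidar_enriched = lidar_feat + 0.5 * radar_feat  # Radar -> LiDAR direction
    radar_enriched = radar_feat + 0.5 * lidar_feat  # LiDAR -> Radar direction
    return np.concatenate([lidar_enriched, radar_enriched], axis=0)

def bev_fusion(prefused, camera_bev):
    """Fuse pre-fused LiDAR-Radar features with camera BEV features.

    Channel concatenation is used here as a placeholder for the
    paper's BEV-space fusion module.
    """
    return np.concatenate([prefused, camera_bev], axis=0)

# Toy BEV feature grids, shaped (channels, height, width).
rng = np.random.default_rng(0)
lidar = rng.random((8, 64, 64))
radar = rng.random((8, 64, 64))
camera = rng.random((16, 64, 64))

prefused = bidirectional_prefusion(lidar, radar)  # -> (16, 64, 64)
fused = bev_fusion(prefused, camera)              # -> (32, 64, 64)
print(fused.shape)
```

The point of the sketch is only the data flow: both point-cloud modalities inform each other before the camera stream is merged, so the final BEV representation carries all three sensors.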