PLFF-SLAM: A POINT AND LINE FEATURE FUSED VISUAL SLAM ALGORITHM FOR DYNAMIC ILLUMINATION ENVIRONMENTS

Mobile robots rely on Visual Simultaneous Localization and Mapping (SLAM) as a core technology. However, in environments with dynamic illumination changes, current state-of-the-art visual SLAM techniques are vulnerable to trajectory drift, which leads to poor robustness and low positioning accuracy. This work proposes PLFF-SLAM, a point and line feature fused visual SLAM method, as a solution to these issues.

First, we design a point feature extraction method based on the GCNv2 network, which improves the extraction performance of the point feature network by modifying the encoding layer structure and reconstructing the loss function. Then, we design an enhanced LSD algorithm to extract and match line features. Finally, to address trajectory drift, we fuse the point and line features, which significantly improves the visual SLAM system's robustness and positioning accuracy.
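To make the fusion step concrete, point-line visual SLAM systems typically stack two kinds of residuals into one pose-optimization cost: a reprojection error for each point feature, and, for each line feature, the distance of the projected 3D endpoints to the observed 2D line. The sketch below illustrates this joint residual with NumPy; the intrinsics `K`, the specific residual form, and all variable names are illustrative assumptions, not PLFF-SLAM's actual implementation.

```python
import numpy as np

# Assumed pinhole intrinsics for illustration only.
K = np.array([[500.0,   0.0, 320.0],
              [  0.0, 500.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(K, R, t, X):
    """Project a 3D point X into the image with camera pose (R, t)."""
    x = K @ (R @ X + t)
    return x[:2] / x[2]

def point_residual(K, R, t, X, uv_obs):
    """Point reprojection error: observed pixel minus projected pixel."""
    return uv_obs - project(K, R, t, X)

def line_residual(K, R, t, P1, P2, line_obs):
    """Line error: distances of the two projected 3D endpoints to the
    observed 2D line (a, b, c), normalized so a^2 + b^2 = 1, making
    a*u + b*v + c a signed point-to-line distance."""
    a, b, c = line_obs
    d = []
    for P in (P1, P2):
        u, v = project(K, R, t, P)
        d.append(a * u + b * v + c)
    return np.array(d)

# Synthetic consistency check with an identity pose: observations are
# generated from the same geometry, so all residuals should vanish.
R, t = np.eye(3), np.zeros(3)
X = np.array([0.2, -0.1, 2.0])
uv = project(K, R, t, X)
P1, P2 = np.array([0.0, 0.0, 2.0]), np.array([0.5, 0.0, 2.0])
u1, v1 = project(K, R, t, P1)
u2, v2 = project(K, R, t, P2)
# Normalized line coefficients through the two projected endpoints.
a, b = v1 - v2, u2 - u1
c = -(a * u1 + b * v1)
line = np.array([a, b, c]) / np.hypot(a, b)

# A fused optimizer would minimize the stacked cost vector over (R, t).
residuals = np.concatenate([point_residual(K, R, t, X, uv),
                            line_residual(K, R, t, P1, P2, line)])
```

Stacking both residual types lets line features constrain the pose in low-texture or illumination-degraded regions where point matches alone are scarce, which is the intuition behind the fusion step above.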

We validate the approach on the public UMA-VI and EuRoC datasets. The results show that, compared with existing methods, the proposed PLFF-SLAM algorithm achieves a smaller absolute trajectory error and higher pose estimation accuracy, effectively improving the accuracy and robustness of visual SLAM in dynamic illumination environments.
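For reference, absolute trajectory error (ATE) is conventionally computed by rigidly aligning the estimated trajectory to the ground truth (Horn's closed-form method via SVD) and taking the RMSE of the translational differences. A minimal sketch, assuming both trajectories are already time-associated (N, 3) position arrays:

```python
import numpy as np

def ate_rmse(est, gt):
    """RMSE of translational error after rigid SE(3) alignment.
    est, gt: (N, 3) time-associated positions. Alignment is Horn's
    closed-form method via SVD, without scale estimation."""
    est_c = est - est.mean(axis=0)
    gt_c = gt - gt.mean(axis=0)
    U, _, Vt = np.linalg.svd(est_c.T @ gt_c)
    # Sign correction keeps R a proper rotation (det = +1).
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = (U @ S @ Vt).T          # rotation mapping est into the gt frame
    t = gt.mean(axis=0) - R @ est.mean(axis=0)
    aligned = est @ R.T + t
    return float(np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1))))

# An estimate that differs from ground truth only by a rigid motion
# should have (near-)zero ATE after alignment.
gt = np.array([[0.0, 0, 0], [1, 0, 0], [1, 1, 0], [2, 1, 0.5]])
theta = 0.3
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
est = gt @ Rz.T + np.array([0.5, -0.2, 0.1])
err = ate_rmse(est, gt)
```

This is the metric underlying the "smaller absolute trajectory error" claim; evaluation toolchains for UMA-VI and EuRoC follow the same align-then-RMSE recipe.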
