This research proposes GNN-FT-SLAM, a disturbance-tolerant sensor-fusion framework for autonomous robots that couples a Graph Neural Network (GNN) at the perception layer with an uncertainty-aware factor-graph SLAM backend. The GNN builds a multi-sensor graph over camera, LiDAR, IMU, and odometry measurements to model measurement reliability in context and to predict adaptive covariances, which are then used as factor weights in the SLAM optimization. The pipeline comprises multi-sensor synchronization, dynamic graph construction, reliability-focused message passing, probabilistic (aleatoric/epistemic) output heads, and fault detection–isolation with modality reconfiguration (fallback and dynamic factor activation). Evaluations on nominal, synthetic-stress (motion blur, glare/low light, LiDAR sparsity, IMU bias), and real-world fault scenarios demonstrate gains over strong baselines (ORB-SLAM3, LIO-SAM, VINS-Mono): a 32–55% reduction in ATE, improved RPE, fault-detection AUROC up to 0.92, and better uncertainty calibration (lower NLL and ECE). The system runs in real time at ~27 Hz on an edge GPU with an average latency of 37 ms. These findings confirm that combining deep graph representations with probabilistic inference yields adaptive, uncertainty-aware, fault-tolerant sensor fusion suited to autonomous robot operation in dynamic, cluttered environments.
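The central coupling described above, learned per-measurement covariances reused as inverse-covariance factor weights in the estimation backend, can be illustrated with a minimal sketch. The toy scalar fusion problem, the predicted_variance stand-in for the GNN's aleatoric head, and all feature values below are illustrative assumptions, not the authors' implementation.

import numpy as np

rng = np.random.default_rng(0)

def predicted_variance(features: np.ndarray) -> np.ndarray:
    """Stand-in for the GNN's aleatoric head: maps per-measurement
    context features to a positive variance via a softplus of a
    linear score. The real model is a message-passing network."""
    w = np.array([0.8, 1.5])          # illustrative weights (assumed)
    score = features @ w
    return np.log1p(np.exp(score))    # softplus keeps variance > 0

# Toy problem: four modalities each measure the same scalar pose x.
true_x = 2.0
quality = np.array([[0.1, 0.1],   # camera, nominal conditions
                    [0.2, 0.1],   # LiDAR, nominal conditions
                    [3.0, 2.0],   # IMU, biased (high-stress features)
                    [0.3, 0.2]])  # wheel odometry
sigma2 = predicted_variance(quality)             # adaptive covariances
z = true_x + rng.normal(0.0, np.sqrt(sigma2))    # noisy measurements

# Uncertainty-weighted least squares: each measurement's factor weight
# is the inverse of its predicted covariance, so unreliable modalities
# are down-weighted rather than trusted uniformly.
weights = 1.0 / sigma2
x_fused = np.sum(weights * z) / np.sum(weights)
x_naive = z.mean()

print(f"naive mean    : {x_naive:.3f}")
print(f"weighted fuse : {x_fused:.3f}  (true {true_x})")

Running the sketch shows the weighted estimate staying near the true pose while the unweighted mean is pulled off by the degraded IMU channel, which is the behavior the adaptive covariances are meant to provide inside the factor-graph optimization.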