Vo, Chi Thanh
Unknown Affiliation

Published: 2 Documents

Path planning and obstacle avoidance for UAVs using Theta* and modulated velocity obstacle avoidance with 2D LiDAR
Tran, Hoang Thuan; Tran, Dong LT.; Vo, Chi Thanh
Bulletin of Electrical Engineering and Informatics Vol 14, No 6: December 2025
Publisher: Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i6.10594

Abstract

This paper proposes a novel framework for autonomous unmanned aerial vehicle (UAV) navigation in complex environments, seamlessly integrating Theta* for global path planning with a simplified modulated velocity obstacle avoidance (MVOA) algorithm for local obstacle avoidance. Theta* generates optimal, smooth paths, while MVOA processes 2D LiDAR data as a single obstacle block to compute modulated velocities, enabling efficient avoidance of static and dynamic obstacles with minimal computational overhead. Compared to MVOA-only navigation, the integration of Theta* and MVOA produced shorter trajectories and faster mission completion with smoother velocity adjustments, demonstrating clear improvements in efficiency and stability. Simulation results show the framework maintains a 0.6 m safety distance and operates at 10 Hz, underscoring its robustness and reliability. The resulting control velocity is transmitted to an ArduPilot-based flight controller via MAVLink, ensuring precise, real-time execution. The current implementation focuses on 2D navigation in a planar environment as a foundation for future 3D expansion, with all results obtained through high-fidelity simulation. Building on these findings, the framework shows strong potential for real-time applications such as swarm UAV coordination, terrain surveying, and indoor navigation, offering a scalable solution for autonomous systems in dynamic settings.
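The abstract describes the pipeline only at a high level, so the sketch below illustrates one plausible form of the velocity-modulation step: a nominal velocity toward the current Theta* waypoint is blended with a repulsive term derived from the nearest 2D LiDAR return, with the scan treated as a single obstacle block. Only the 0.6 m safety distance comes from the abstract; the helper names, influence radius, cruise speed, and blending law are assumptions for illustration, not the paper's MVOA formulation.

```python
import math

SAFETY_DIST = 0.6        # m, safety distance stated in the abstract
INFLUENCE_RADIUS = 3.0   # m, assumed range at which obstacles start to matter
V_MAX = 1.0              # m/s, assumed cruise speed

def velocity_to_waypoint(pos, waypoint):
    """Nominal velocity toward the next Theta* waypoint (hypothetical helper)."""
    dx, dy = waypoint[0] - pos[0], waypoint[1] - pos[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-6:
        return (0.0, 0.0)
    return (V_MAX * dx / dist, V_MAX * dy / dist)

def modulate_velocity(v_des, ranges, angles):
    """Blend the desired velocity with a repulsive term pointing away from the
    nearest 2D LiDAR return, treating the scan as a single obstacle block.
    Illustration of the idea only, not the paper's exact MVOA law."""
    i = min(range(len(ranges)), key=lambda k: ranges[k])
    r_min, a_min = ranges[i], angles[i]
    if r_min > INFLUENCE_RADIUS:
        return v_des  # nothing close enough to influence the command
    # Repulsion weight grows from 0 at the influence radius to 1 at the safety distance
    w = min(1.0, (INFLUENCE_RADIUS - r_min) / (INFLUENCE_RADIUS - SAFETY_DIST))
    away = (-math.cos(a_min), -math.sin(a_min))  # unit vector away from the obstacle
    return ((1.0 - w) * v_des[0] + w * V_MAX * away[0],
            (1.0 - w) * v_des[1] + w * V_MAX * away[1])
```

In the described system the resulting (vx, vy) command would then be sent to the ArduPilot flight controller over MAVLink at 10 Hz; that transport layer is omitted from the sketch.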
3D mapping for unmanned aerial vehicle combining LiDAR and depth camera in indoor environments
Tran, Hoang Thuan; Vo, Chi Thanh; Ha, My Duyen; Tu, Nong Trong; Ngan, Du Van; Le, Nam Hoai; Hoa, Duong Van
Bulletin of Electrical Engineering and Informatics Vol 14, No 6: December 2025
Publisher: Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v14i6.10573

Abstract

Indoor reconnaissance missions for unmanned aerial vehicles (UAVs) pose significant challenges in scene reconstruction, mapping, and environmental feature extraction. Relying on a single type of sensor often results in limited accuracy, increased susceptibility to environmental noise, and a lack of comprehensive spatial information. To address these issues, this study proposes a mapping method that combines light detection and ranging (LiDAR) and depth camera data. The method collects data from both LiDAR and a depth camera integrated on the UAV, then performs preprocessing on both data sources to construct local 3D maps using the real-time appearance-based mapping (RTAB-Map) algorithm. Subsequently, the local maps are merged using a filtering method to generate a detailed and complete global map. Real-time experiments conducted on Ubuntu 20.04 using the robot operating system (ROS) Noetic libraries demonstrate that this multi-sensor fusion approach provides richer and more comprehensive environmental information, thereby enhancing the effectiveness of mapping tasks in unknown indoor environments.
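As a rough illustration of the map-merging step, the sketch below fuses two point clouds, one from the LiDAR and one from the depth camera, assumed to be already expressed in a common frame, and applies a simple voxel-grid filter to produce a single downsampled global map. The function names, voxel size, and NumPy-only filtering are assumptions for illustration; the paper's actual pipeline builds the local maps with RTAB-Map under ROS Noetic and merges them with its own filtering method.

```python
import numpy as np

def voxel_filter(points, voxel_size=0.05):
    """Keep one representative point per occupied voxel (simple grid filter)."""
    keys = np.floor(points / voxel_size).astype(np.int64)
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]

def merge_local_maps(lidar_points, depth_points, voxel_size=0.05):
    """Concatenate two local clouds (N x 3, metres, common frame) and filter them
    into one global map. Illustrative only; not the paper's RTAB-Map pipeline."""
    combined = np.vstack([lidar_points, depth_points])
    return voxel_filter(combined, voxel_size)

# Hypothetical usage with random stand-in clouds
lidar = np.random.rand(1000, 3) * 5.0   # placeholder for a LiDAR-derived local map
depth = np.random.rand(1000, 3) * 5.0   # placeholder for a depth-camera local map
global_map = merge_local_maps(lidar, depth)
print(global_map.shape)
```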