Articles

Found 3 Documents
Journal: JOURNAL OF APPLIED INFORMATICS AND COMPUTING

A Design and Implementation of a 3-Axis UAV Drone Gimbal Rig for Testing Stability and Performance Parameters in the Laboratory
Wijaya, Ryan Satria; Zulpriadi, Zulpriadi; Prayoga, Senanjung; Fatekha, Rifqi Amalya
Journal of Applied Informatics and Computing Vol. 9 No. 3 (2025): June 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i3.9577

Abstract

This study presents the design of a 3-axis UAV gimbal rig for testing stability and performance before deployment in real-world flight conditions. The gimbal rig simulates motion about the vertical, lateral, and longitudinal axes to ensure reliable operation in various scenarios. Made from lightweight aluminum alloy, the structure minimizes vibrations and maintains rigidity during testing. For precise motion tracking, each axis is equipped with an LPD3806-600BM-G5 rotary encoder, providing accurate feedback on movement. An Arduino Nano processes the encoder data and displays real-time results on a 16x2 LCD with an I2C interface for easy monitoring. Additionally, a push-button system lets users switch between the readings for each axis. This setup helps researchers analyze UAV dynamics and refine both firmware and hardware. Future enhancements may include wireless data logging and the integration of machine learning techniques to predict maintenance needs, further supporting UAV stability testing in applications including aerospace, defense, and commercial use.
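
As a rough illustration of the data flow the abstract describes (encoder counts read by the microcontroller and shown per axis on the display), the following Python sketch converts quadrature-encoder counts to axis angles and formats them as 16-character display lines. The 600 PPR resolution, the 4x decoding factor, and all names are assumptions for illustration, not the authors' firmware.

```python
# Sketch of the encoder -> angle -> display pipeline described in the abstract.
# Assumptions: 600 PPR encoders decoded at 4x; all names are illustrative.

PULSES_PER_REV = 600           # LPD3806-600BM-G5 nominal resolution (assumed)
QUADRATURE_FACTOR = 4          # 4x decoding of the A/B channels
COUNTS_PER_REV = PULSES_PER_REV * QUADRATURE_FACTOR

AXES = ("roll", "pitch", "yaw")

def counts_to_degrees(counts: int) -> float:
    """Map a signed encoder count to an angle in degrees, wrapping at 360."""
    return (counts % COUNTS_PER_REV) * 360.0 / COUNTS_PER_REV

def format_lcd_line(axis: str, counts: int) -> str:
    """Render one axis reading as a 16-character line, as on a 16x2 LCD."""
    return f"{axis.upper():<5}{counts_to_degrees(counts):7.2f} deg"[:16]

if __name__ == "__main__":
    # Simulated counts for each axis; in hardware a push button would cycle
    # through the axes instead of printing them all at once.
    readings = {"roll": 300, "pitch": -150, "yaw": 1200}
    for axis in AXES:
        print(format_lcd_line(axis, readings[axis]))
```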
A Real-Time Hand Gesture Control of a Quadcopter Swarm Implemented in the Gazebo Simulation Environment
Wijaya, Ryan Satria; Prayoga, Senanjung; Fatekha, Rifqi Amalya; Mubarak, Muhammad Thoriq
Journal of Applied Informatics and Computing Vol. 9 No. 3 (2025): June 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i3.9578

Abstract

With the advancement of technology, human-robot interaction (HRI) is becoming more intuitive, including through hand gesture-based control. This study develops a real-time hand gesture recognition system to control a quadcopter swarm within a simulated environment using ROS and Gazebo. The system uses Google's MediaPipe framework to detect 21 hand landmarks, which are then processed by a custom-trained neural network to classify 13 predefined gestures. Each gesture corresponds to a specific command such as basic motion, rotation, or swarm formation, and the corresponding command is published to the /cmd_vel topic through the ROS communication framework. Simulation tests performed in Gazebo covered both individual drone maneuvers and simple swarm formations. The results demonstrated a gesture classification accuracy of 90%, low latency, and stable responses across multiple drones. This approach offers a scalable and efficient solution for real-time, gesture-based swarm control, contributing to future applications in human-drone interaction systems.
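
To make the command pipeline concrete, here is a minimal Python (rospy, ROS 1) sketch that maps a classified gesture label to a geometry_msgs/Twist message and publishes it on /cmd_vel. The gesture names, velocity values, and node name are hypothetical placeholders; the paper's actual 13-gesture command set is not reproduced here.

```python
# Sketch: publish a classified gesture as a velocity command on /cmd_vel,
# following the pipeline in the abstract (MediaPipe landmarks -> classifier
# -> ROS). Gesture labels and speeds below are illustrative assumptions.
import rospy
from geometry_msgs.msg import Twist

# Hypothetical mapping from gesture label to (linear x, linear z, angular z).
GESTURE_TO_VELOCITY = {
    "forward":   (0.5, 0.0, 0.0),
    "ascend":    (0.0, 0.5, 0.0),
    "rotate_cw": (0.0, 0.0, -0.5),
    "hover":     (0.0, 0.0, 0.0),
}

def gesture_to_twist(label: str) -> Twist:
    """Translate a gesture label into a Twist velocity command."""
    lx, lz, az = GESTURE_TO_VELOCITY.get(label, (0.0, 0.0, 0.0))
    msg = Twist()
    msg.linear.x, msg.linear.z, msg.angular.z = lx, lz, az
    return msg

if __name__ == "__main__":
    rospy.init_node("gesture_commander")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # publish at 10 Hz
    while not rospy.is_shutdown():
        label = "hover"    # in the real system this comes from the classifier
        pub.publish(gesture_to_twist(label))
        rate.sleep()
```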
Real-Time Chinese Chess Piece Character Recognition using Edge AI
Wijaya, Ryan Satria; Anadia, Atika Yunisa; Fatekha, Rifqi Amalya; Prayoga, Senanjung
Journal of Applied Informatics and Computing Vol. 9 No. 4 (2025): August 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i4.10056

Abstract

This research develops a character recognition system for Chinese chess (xiangqi) pieces using computer vision and the PyTorch deep learning framework. The system detects and interprets the text written on the pieces in real time, making it easier for players to identify the function of each piece. The implementation uses a web camera and can be deployed on embedded devices such as the Jetson Nano. The goal is an automatic recognition system that helps players better understand the game of xiangqi by identifying the characters on the pieces in real time. In testing, the system correctly recognized 14 pieces. Running on the Jetson Nano, the system processes image data directly with an average processing time of 0.0222 seconds per frame, computed as the average over the frames captured from the web camera.
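
The reported 0.0222-second figure is an average per-frame processing time; the Python sketch below shows one common way to measure such an average with a PyTorch model over a webcam feed. The model file, input size, and preprocessing are placeholders, not the authors' actual network.

```python
# Sketch: measure average per-frame inference time for a PyTorch classifier
# on webcam frames. Model path, input size, and preprocessing are assumed.
import time
import cv2
import torch

model = torch.jit.load("xiangqi_classifier.pt")  # hypothetical exported model
model.eval()

cap = cv2.VideoCapture(0)   # default webcam
times, frames = [], 0

with torch.no_grad():
    while frames < 100:     # sample 100 frames
        ok, frame = cap.read()
        if not ok:
            break
        # Minimal preprocessing: resize, then BGR HxWxC uint8 -> 1xCxHxW float
        resized = cv2.resize(frame, (224, 224))
        tensor = torch.from_numpy(resized).permute(2, 0, 1).float().unsqueeze(0) / 255.0

        start = time.time()
        _ = model(tensor)   # forward pass only; timing excludes frame capture
        times.append(time.time() - start)
        frames += 1

cap.release()
if times:
    print(f"Average inference time: {sum(times) / len(times):.4f} s per frame")
```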