Ryan Satria Wijaya
Department Of Electrical Engineering, Politeknik Negeri Batam, Indonesia

Published: 10 Documents

Robotics training to improve STEM skills of Islamic boarding school students in Batam Jamzuri, Eko Rudiawan; Soebhakti, Hendawan; Prayoga , Senanjung; Fatekha, Rifqi Amalya; Wibisana, Anugerah; Nakul, Fitriyanti; Hasnira, H.; Analia, Riska; Susanto, S.; Wijaya, Ryan Satria; Suciningtyas, Ika Karlina Laila Nur; Puspita, Widya Rika; Lubis, Eka Mutia; Jefiza, Adlian; Budiana, B.; Firdaus, Ahmad Riyad
Journal of Community Service and Empowerment Vol. 5 No. 1 (2024): April
Publisher : Universitas Muhammadiyah Malang

DOI: 10.22219/jcse.v5i1.26895

Abstract

One potential approach to addressing the challenges posed by the advent of Industry 4.0 and Society 5.0 is to offer robotics training. This endeavor aims to enhance students' foundational understanding of STEM (Science, Technology, Engineering, and Mathematics) disciplines. The study involved collaborating with Pondok Pesantren Granada, an Islamic boarding school located in Batam, to provide robotics training as a community service activity. The study included 29 trainees: 15 from class XI and 7 from classes X and XII. The teaching combined didactic instruction, interactive discussion, and hands-on exercises. Trainees were administered a written examination to assess their proficiency before and after the training program. The training outcomes exhibited a significant improvement in the trainees' mean STEM proficiency, with an increase of 38.15%. Furthermore, the series of activities was implemented effectively, with trainee satisfaction ratings exceeding 50% concerning course materials, trainers, and teaching equipment. Only 17% of trainees expressed dissatisfaction with the allocated time, particularly the duration of the hands-on component.
Omni-directional Movement on the MRT PURVI Ship Robot Wijaya, Ryan Satria; Kaputra, Aldi; Prasetyo, Naufal Abdurrahman; Soebhakti, Hendawan; Prayoga, Senanjung; Wibisana, Anugerah; Fatekha, Rifqi Amalya; Jamzuri, Eko Rudiawan; Nugroho, Mochamad Ari Bagus
Journal of Applied Electrical Engineering Vol 7 No 2 (2023): JAEE, December 2023
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaee.v7i2.6475

Abstract

Ship transportation is the primary mode of trade and transport at sea in the maritime industry. Humans first employed boats to pursue and capture fish and other animals in aquatic environments. Modern ships are propelled by engines, a significant improvement over the original reliance on oars and sails, and propellers transform the engine's rotational motion into propulsive force in the water; aircraft also use propellers for the same purpose, positioned for operation in air. A thruster combines a motor and a propeller in a single unit; the type considered here is specifically designed for small boats and prototypes used for simulation, exhibition, or competition. An ESC (Electronic Speed Controller) converts the input signal into the intended motor velocity. Beyond their original role in meeting food requirements, ships are now employed in diverse capacities, including military vessels, tourist vessels, submarines, and passenger ships.
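The title refers to omni-directional movement, which on a surface vessel is typically achieved by an X-configured thruster layout whose motors are mixed from the desired surge, sway, and yaw commands. The abstract does not give the paper's allocation scheme, so the following is only an illustrative sketch using a standard mecanum-style mixing rule; the function name, thruster ordering, and unit gains are assumptions:

```python
def allocate_thrust(vx, vy, wz):
    """Mix desired surge (vx), sway (vy) and yaw rate (wz) into four
    thruster commands for an X-configured omnidirectional vessel.
    Returns [front-left, front-right, rear-left, rear-right].
    Illustrative only; gains and geometry are assumed, not from the paper."""
    return [
        vx - vy - wz,  # front-left
        vx + vy + wz,  # front-right
        vx + vy - wz,  # rear-left
        vx - vy + wz,  # rear-right
    ]
```

With this mixing, a pure surge command drives all four thrusters equally, while a pure yaw command drives the left and right sides in opposition.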
Comparative Study of YOLOv5, YOLOv7 and YOLOv8 for Robust Outdoor Detection Wijaya, Ryan Satria; Santonius, Santonius; Wibisana, Anugerah; Jamzuri, Eko Rudiawan; Nugroho, Mochamad Ari Bagus
Journal of Applied Electrical Engineering Vol 8 No 1 (2024): JAEE, June 2024
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaee.v8i1.7207

Abstract

Object detection has become widespread in many aspects of daily life, such as face recognition, traffic management, and autonomous vehicles. Performing object detection requires large and complex datasets, so this research examines which detection algorithms are best suited to the task. We compare the performance of three popular models: YOLOv5, YOLOv7, and YOLOv8. By running several experiments, including detection results, distance-traveled experiments, confusion matrices, and evaluation on a validation dataset, we aim to provide insight into the advantages and disadvantages of these algorithms. This comparison will help researchers choose the most suitable algorithm for their object detection task.
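The comparison leans on confusion matrices. As a reminder of how the per-class metrics behind such a comparison are derived (this is a generic pure-Python sketch, not code from the paper):

```python
def precision_recall(confusion):
    """Per-class precision and recall from a square confusion matrix,
    where confusion[i][j] counts samples of true class i predicted as j."""
    n = len(confusion)
    stats = {}
    for c in range(n):
        tp = confusion[c][c]
        fp = sum(confusion[r][c] for r in range(n)) - tp  # column minus diagonal
        fn = sum(confusion[c]) - tp                       # row minus diagonal
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        stats[c] = (precision, recall)
    return stats
```

For example, a two-class matrix [[8, 2], [1, 9]] yields precision 8/9 and recall 0.8 for class 0.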
A Visual-Based Pick and Place on 6 DoF Robot Manipulator Wijaya, Ryan Satria; Pratama, Adhitya; Fatekha, Rifqi Amalya; Soebhakti, Hendawan; Prayoga, Senanjung
Journal of Applied Electrical Engineering Vol 8 No 1 (2024): JAEE, June 2024
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaee.v8i1.7358

Abstract

This paper discusses the application of visual servoing on a 6-DoF robotic manipulator for industrial automation. With visual feedback, the manipulator can perform pick-and-place operations accurately and efficiently. We explore feature-based and model-based visual servoing methods along with object detection techniques, including deep learning algorithms. The experimental results show that integrating visual servoing with pick-and-place and object detection improves manipulator performance in industry. This research contributes to the understanding of visual servoing technology in industrial automation. The manipulator controls the X-axis shift precisely in the first two experiments but faces challenges in the third; system success is affected by environmental factors such as lighting. Future work should improve robustness to environmental variations and evaluate execution speed and object-positioning accuracy.
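At its core, image-based visual servoing drives the error between the detected feature's pixel coordinates and a target image point toward zero with a proportional law. The paper does not publish its control code, so this is a minimal illustrative sketch; the function name and gain are assumptions:

```python
def ibvs_step(feature_px, target_px, gain=0.5):
    """One proportional image-based visual servoing step: compute a
    camera-frame velocity command (vx, vy) that moves the detected
    feature's pixel coordinates toward the target image point.
    Illustrative sketch; gain value is an assumption."""
    ex = target_px[0] - feature_px[0]
    ey = target_px[1] - feature_px[1]
    return gain * ex, gain * ey
```

The command vanishes when the feature reaches the target point, which is the condition for triggering the grasp in a pick-and-place cycle.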
Analisis Kinematika dan Pola Gerakan Berjalan pada Robot Bipedal Humanoid T-FLoW 3.0 WIJAYA, RYAN SATRIA; APRIANDY, KEVIN ILHAM; AL BANNA, M. RIZQI HASAN; DEWANTO, RADEN SANGGAR; PRAMADIHANTO, DADET
ELKOMIKA: Jurnal Teknik Energi Elektrik, Teknik Telekomunikasi, & Teknik Elektronika Vol 10, No 1: Published January 2022
Publisher : Institut Teknologi Nasional, Bandung

DOI: 10.26760/elkomika.v10i1.31

Abstract

Humanoid robots are human-like robots with a high level of complexity and versatile functionality. This study analyzes the kinematic model of the T-FLoW 3.0 bipedal humanoid robot as well as its walking gait pattern. The movement pattern implemented on the T-FLoW 3.0 robot approximates human gait theory using six basic movements of human walking. The robot's motion model is analyzed using inverse kinematics with a geometric solution, whose purpose is to transform input data in the form of a Cartesian position into an angle value for each joint parameter of every Degree of Freedom (DoF). An analysis of the robot's mechanical model during walking, divided into a stance phase and a swing phase, is then carried out to evaluate the test results. Keywords: humanoid robot, gait, kinematics, T-FLoW, DoF.
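The geometric inverse-kinematics step the abstract describes, converting a Cartesian foot position into joint angles, can be sketched for a planar two-link (hip-knee) chain using the law of cosines. The T-FLoW 3.0's actual link lengths and joint conventions are not given here, so this is an illustrative reduction, not the robot's code:

```python
import math

def two_link_ik(x, y, l1, l2):
    """Geometric inverse kinematics for a planar two-link leg:
    Cartesian foot position (x, y) -> (hip, knee) joint angles in radians.
    Link lengths l1, l2 and the knee-down convention are assumptions."""
    d2 = x * x + y * y
    # Law of cosines gives the interior knee angle.
    cos_knee = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_knee <= 1.0:
        raise ValueError("target out of reach")
    knee = math.acos(cos_knee)
    hip = math.atan2(y, x) - math.atan2(l2 * math.sin(knee),
                                        l1 + l2 * math.cos(knee))
    return hip, knee
```

A fully extended leg (target at distance l1 + l2 along the x-axis) correctly yields zero for both angles, and forward kinematics reproduces the commanded foot position.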
A Design and Implementation of a 3-Axis UAV Drone Gimbal Rig for Testing Stability and Performance Parameters in the Laboratory Wijaya, Ryan Satria; Zulpriadi, Zulpriadi; Prayoga, Senanjung; Fatekha, Rifqi Amalya
Journal of Applied Informatics and Computing Vol. 9 No. 3 (2025): June 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i3.9577

Abstract

This study designs a 3-axis UAV gimbal rig for testing stability and performance before deployment in real-world flight conditions. The gimbal rig simulates the vertical, lateral, and longitudinal axes to ensure reliable operation in various scenarios. Made from lightweight aluminum alloy, the structure minimizes vibrations and maintains rigidity during testing. For precise motion tracking, each axis is equipped with an LPD3806-600BM-G5 rotary encoder, offering accurate feedback on movement. The Arduino Nano processes the encoder data, displaying real-time results on a 16x2 LCD with an I2C interface for easy monitoring. Additionally, a push-button system enables users to switch between different readings for each axis. This setup aids researchers in analyzing UAV dynamics and refining both firmware and hardware. Future enhancements may include wireless data logging and integration of machine learning techniques to predict maintenance needs, further supporting UAV stability testing in various applications, including aerospace, defense, and commercial use.
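The conversion at the heart of such a rig is turning incremental-encoder counts into an axis angle. The LPD3806-600BM-G5 is a 600 PPR encoder, so with x4 quadrature decoding one revolution is 2400 counts; the decoding factor used in the actual firmware is not stated, so x4 here is an assumption:

```python
def counts_to_degrees(counts, ppr=600, quadrature=4):
    """Convert quadrature counts from an incremental rotary encoder
    (600 PPR decoded x4 -> 2400 counts/rev) into an axis angle in degrees.
    The x4 decoding factor is an assumption about the firmware."""
    counts_per_rev = ppr * quadrature
    return counts * 360.0 / counts_per_rev
```

A full revolution (2400 counts) maps back to 360 degrees, and a quarter of the pulse count (600) to 90 degrees.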
A Real-Time Hand Gesture Control of a Quadcopter Swarm Implemented in the Gazebo Simulation Environment Wijaya, Ryan Satria; Prayoga, Senanjung; Fatekha, Rifqi Amalya; Mubarak, Muhammad Thoriq
Journal of Applied Informatics and Computing Vol. 9 No. 3 (2025): June 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i3.9578

Abstract

With the advancement of technology, human-robot interaction (HRI) is becoming more intuitive, including through hand gesture-based control. This study aims to develop a real-time hand gesture recognition system to control a quadcopter swarm within a simulated environment using ROS and Gazebo. The system utilizes Google's MediaPipe framework for detecting 21 hand landmarks, which are then processed through a custom-trained neural network to classify 13 predefined gestures. Each gesture corresponds to a specific command such as basic motion, rotation, or swarm formation, and is published to the /cmd_vel topic using the ROS communication framework. Simulation tests were performed in Gazebo and covered both individual drone maneuvers and simple swarm formations. The results demonstrated a gesture classification accuracy of 90%, low latency, and stable response across multiple drones. This approach offers a scalable and efficient solution for real-time swarm control based on hand gestures, contributing to future applications in human-drone interaction systems.
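The pipeline's final stage maps a classified gesture label to a velocity command published on /cmd_vel. The paper's 13 gesture names and velocity values are not listed in the abstract, so the table below is a hypothetical illustration of that mapping (ROS itself is omitted; only the lookup logic is shown):

```python
# Hypothetical gesture-to-command table; the paper's actual 13 gestures
# and velocity magnitudes are not published in the abstract.
GESTURE_COMMANDS = {
    "forward":   {"linear_x": 0.5, "linear_y": 0.0, "angular_z": 0.0},
    "backward":  {"linear_x": -0.5, "linear_y": 0.0, "angular_z": 0.0},
    "rotate_cw": {"linear_x": 0.0, "linear_y": 0.0, "angular_z": -0.5},
    "hover":     {"linear_x": 0.0, "linear_y": 0.0, "angular_z": 0.0},
}

def gesture_to_cmd_vel(label):
    """Map a classified gesture label to a Twist-like /cmd_vel payload.
    Unknown or misclassified labels fall back to hover for safety."""
    return GESTURE_COMMANDS.get(label, GESTURE_COMMANDS["hover"])
```

Falling back to hover on an unrecognized label is a common safety choice for gesture-driven drones, since a misclassification then leaves the swarm stationary rather than issuing a spurious motion.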
Penerapan Visual Servoing Robot Lengan dengan Metode Color Recognition sebagai Pemindah Objek Dua Warna Berbeda Wijaya, Ryan Satria; Rifqi Amalya Fatekha; Senanjung Prayoga; Dzaky Andrawan; Naurah Nazhifah; Mochamad Ari Bagus Nugroho
Journal of Applied Electrical Engineering Vol. 9 No. 1 (2025): JAEE, June 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaee.v9i1.9496

Abstract

Visual servoing with a color recognition method is a system that classifies objects based on the color and position detected through a camera in order to drive the servos of a robotic arm. The system uses a Huskylens camera to detect the color and position of an object, and a robotic arm to move the object detected by the camera. Test results show that the visual-servoing robotic arm with color recognition responds in an average of 0.9 seconds when chasing an object that is not at the pick-up position, and reliably picks and places objects of two colors (blue and red) when they are at the pick-up position, with 98% object-detection accuracy and 100% pick-and-transfer accuracy over a detection range of 18-22 cm above the object under bright lighting.
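The Huskylens performs color classification on-device; conceptually, distinguishing the two target colors amounts to a nearest-reference-color decision. The sketch below reimplements that idea in pure Python for illustration; the reference RGB values are assumptions, not calibration data from the paper:

```python
def classify_color(rgb, references=None):
    """Classify a detected blob as 'red' or 'blue' by the nearest
    reference color in RGB space (illustrative reimplementation of
    what the Huskylens does on-device; reference values are assumed)."""
    if references is None:
        references = {"red": (200, 40, 40), "blue": (40, 60, 200)}

    def dist2(a, b):
        # Squared Euclidean distance; square root is unnecessary for argmin.
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(references, key=lambda name: dist2(rgb, references[name]))
```

Bright, saturated samples near either reference classify reliably; the 18-22 cm range and bright-lighting requirement reported in the paper reflect where the camera's measured colors stay close to such references.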
Sensor Fusion – Based Localization for ASV with Linear Regression Optimization Wijaya, Ryan Satria; Jamzuri, Eko Rudiawan; Wibisana, Anugerah; Sinaga, Jepelin Amstrong; Julanba, Vafin
Journal of Applied Informatics and Computing Vol. 9 No. 4 (2025): August 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i4.10048

Abstract

ASV (Autonomous Surface Vehicle) is one of the most popular innovations in the maritime field and is widely used for various missions on the water surface. The ASV can operate automatically without human intervention, and therefore requires an accurate and reliable localization system. This research focuses on developing an ASV localization system using waterflow sensors optimized through linear regression and integrated with orientation data from an IMU sensor through sensor fusion to obtain a global coordinate position estimate. The experiments showed a significant improvement in accuracy after optimization, with the Root Mean Square Error (RMSE) of the waterflow sensor data decreasing from 161.65 meters to 0.28 meters. Moreover, the yaw reading from the IMU achieved an RMSE of 1.54 degrees. In the final test, the localization system achieved RMSE values of 0.07 meters for the X-axis, 0.14 meters for the Y-axis, and 1.9 degrees for yaw during the ASV global positioning experiment. In addition, a GUI (Graphical User Interface) was developed for visualization, with an average communication latency of 113.6 milliseconds. This localization system is a promising solution in stable water conditions.
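The two stages the abstract describes, fitting a linear-regression correction to the waterflow sensor's distance readings and then advancing the position estimate along the IMU yaw heading, can be sketched as follows. The paper's actual calibration data and fusion details are not given, so this is an illustrative reduction:

```python
import math

def fit_linear(raw, truth):
    """Least-squares fit truth ~ a*raw + b, as used to calibrate raw
    waterflow-sensor distance readings against ground truth."""
    n = len(raw)
    mx = sum(raw) / n
    my = sum(truth) / n
    sxx = sum((x - mx) ** 2 for x in raw)
    sxy = sum((x - mx) * (y - my) for x, y in zip(raw, truth))
    a = sxy / sxx
    b = my - a * mx
    return a, b

def dead_reckon(x, y, distance, yaw_deg):
    """One sensor-fusion step: advance the (x, y) position estimate by a
    calibrated travel distance along the IMU yaw heading (degrees)."""
    yaw = math.radians(yaw_deg)
    return x + distance * math.cos(yaw), y + distance * math.sin(yaw)
```

Calibrating the flow sensor first is what makes the dead-reckoning step usable: with the uncorrected scale, the per-step distance error compounds over the trajectory, which is consistent with the large pre-optimization RMSE the paper reports.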
Real-Time Chinese Chess Piece Character Recognition using Edge AI Wijaya, Ryan Satria; Anadia, Atika Yunisa; Fatekha, Rifqi Amalya; Prayoga, Senanjung
Journal of Applied Informatics and Computing Vol. 9 No. 4 (2025): August 2025
Publisher : Politeknik Negeri Batam

DOI: 10.30871/jaic.v9i4.10056

Abstract

This research focuses on developing a character analysis system for Chinese chess (xiangqi) pieces using computer vision with the deep learning framework PyTorch. The system is designed to detect and interpret the text written on chess pieces in real time, making it easier for players to identify the function of each piece. The implementation uses a web camera and can be deployed on embedded devices such as the Jetson Nano. This research aims to develop an automatic recognition system that helps players better understand the game of xiangqi by identifying characters on pieces in real time. The test results show that the system successfully recognized all 14 pieces correctly. Running on the Jetson Nano, the system processes image data directly with an average per-frame processing time of 0.0222 seconds, computed from the per-frame timings of the web camera stream.
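The reported 0.0222 s average per-frame processing time corresponds to roughly 45 FPS. The conversion is simple but easy to get wrong (the mean of per-frame times must be inverted, not the mean of per-frame FPS values); a minimal sketch:

```python
def average_fps(frame_times):
    """Mean frames-per-second from a list of per-frame processing times
    in seconds: invert the mean frame time, not the individual rates."""
    if not frame_times:
        return 0.0
    mean_t = sum(frame_times) / len(frame_times)
    return 1.0 / mean_t
```

For example, frames taking 0.02 s and 0.04 s average to 0.03 s per frame, i.e. about 33.3 FPS, not the 37.5 FPS one would get by averaging 50 and 25.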