Rusman Rusyadi
Swiss German University

Published: 3 Documents

Articles

Found 3 Documents

A MODEL VISION OF SORTING SYSTEM APPLICATION USING ROBOTIC MANIPULATOR Arko Djajadi; Fiona Laoda; Rusman Rusyadi; Tutuko Prajogo; Maralo Sinaga
TELKOMNIKA (Telecommunication Computing Electronics and Control) Vol 8, No 2: August 2010
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/telkomnika.v8i2.615

Abstract

Image processing attracts massive attention today, as it opens up a broad range of applications in many fields of high technology. The real challenge is how to improve the existing sorting system in the Modular Processing System (MPS) laboratory, which consists of four integrated stations for distribution, testing, processing, and handling, by adding a new image processing feature. The existing sorting method uses a set of inductive, capacitive, and optical sensors to differentiate object color. This paper presents a mechatronic color sorting solution based on image processing. Supported by OpenCV, the image processing procedure detects circular objects in an image captured in real time by a webcam and extracts their color and position. This information is passed as a sequence of sorting commands to the manipulator (Mitsubishi Movemaster RV-M1), which performs the pick-and-place operation. Extensive testing shows that this color-based object sorting system is 100% accurate under ideal conditions in terms of adequate illumination and the objects' circular shape and color. The circular objects tested for sorting are silver, red, and black. Under non-ideal conditions, such as unspecified colors, the accuracy drops to 80%.
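
As a rough illustration of the pipeline described in this abstract, the following Python sketch detects circular objects in a webcam frame with OpenCV and classifies each as silver, red, or black from its mean color. The camera index, Hough parameters, and color thresholds are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

def detect_and_classify(frame):
    """Find circular objects and label each as red, black, or silver."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.medianBlur(gray, 5)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=1.2, minDist=40,
                               param1=100, param2=30, minRadius=15, maxRadius=80)
    results = []
    if circles is None:
        return results
    for x, y, r in np.round(circles[0]).astype(int):
        # Mean BGR color inside the detected circle.
        mask = np.zeros(gray.shape, dtype=np.uint8)
        cv2.circle(mask, (x, y), r // 2, 255, -1)
        b, g, red, _ = cv2.mean(frame, mask=mask)
        brightness = (b + g + red) / 3.0
        if red > 120 and red > 1.5 * max(b, g):   # assumed threshold for red
            label = "red"
        elif brightness < 60:                     # assumed threshold for black
            label = "black"
        else:
            label = "silver"
        results.append((label, (x, y)))           # color + pixel position
    return results

cap = cv2.VideoCapture(0)       # webcam capturing in real time (assumed index 0)
ok, frame = cap.read()
if ok:
    for label, pos in detect_and_classify(frame):
        # In the described system this would become a pick-and-place
        # command for the Mitsubishi Movemaster RV-M1.
        print(label, pos)
cap.release()
```
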
ANALYSIS, DESIGN AND IMPLEMENTATION OF AN EMBEDDED REALTIME SOUND SOURCE LOCALIZATION SYSTEM BASED ON BEAMFORMING THEORY Arko Djajadi; Rusman Rusyadi; Tommy Handoko; Maralo Sinaga; Jürgen Grueneberg
TELKOMNIKA (Telecommunication Computing Electronics and Control) Vol 7, No 3: December 2009
Publisher : Universitas Ahmad Dahlan

DOI: 10.12928/telkomnika.v7i3.588

Abstract

This project is intended to analyze, design, and implement a real-time sound source localization system using a mobile robot as the platform. The implemented system uses two microphones as sensors, an Arduino Duemilanove microcontroller board with an ATmega328P as the processor, two permanent-magnet DC motors as actuators for the mobile robot, a servo motor as the actuator that rotates a webcam toward the location of the sound source, and a laptop/PC as the simulation and display medium. To find the position of a specific sound source, beamforming theory is applied to the system. Once the location of the sound source is detected and determined, either the mobile robot adjusts its position according to the direction of the sound source, or only the webcam rotates in the direction of the incoming sound, simulating the use of this system in a video conference. The integrated system has been tested and the results show that it can localize, in real time, a sound source placed randomly on a half-circle area (0° - 180°) with a radius of 0.3 m - 3 m, with the system at the center of the circle. Due to the low ADC and processor speed, the best achievable angular resolution is still limited to 25°.
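
A minimal sketch of two-microphone direction finding in Python is given below. It estimates the time difference of arrival (TDOA) from the cross-correlation peak, the same quantity a delay-and-sum beamformer scans over; the sample rate, microphone spacing, and sign convention are illustrative assumptions rather than values from the paper.

```python
import numpy as np

FS = 8000                # assumed ADC sample rate (Hz)
MIC_SPACING = 0.2        # assumed distance between the two microphones (m)
SPEED_OF_SOUND = 343.0   # speed of sound in air (m/s)

def estimate_angle(left, right):
    """Return the arrival angle in degrees (0-180) from two synchronized channels."""
    # The cross-correlation peak gives the inter-microphone delay in samples;
    # a positive lag means the left channel lags (sound hit the right mic first).
    corr = np.correlate(left, right, mode="full")
    lag = int(np.argmax(corr)) - (len(right) - 1)
    delay = lag / FS
    # Far-field geometry: delay = d * cos(theta) / c, clipped to the valid range.
    cos_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arccos(cos_theta)))

# Synthetic check: white noise reaching the right mic 3 samples before the left.
rng = np.random.default_rng(0)
sig = rng.standard_normal(400)
print(f"estimated angle: {estimate_angle(np.roll(sig, 3), sig):.1f} deg")
```
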
Vision-based vanishing point detection of autonomous navigation of mobile robot for outdoor applications Leonard Rusli; Brilly Nurhalim; Rusman Rusyadi
Journal of Mechatronics, Electrical Power and Vehicular Technology Vol 12, No 2 (2021)
Publisher : National Research and Innovation Agency

DOI: 10.14203/j.mev.2021.v12.117-125

Abstract

The vision-based approach to mobile robot navigation is considered superior due to its affordability. This paper aims to design and construct an autonomous mobile robot with a vision-based system for outdoor navigation. The robot receives inputs from a camera and an ultrasonic sensor. The camera is used to detect the vanishing point and obstacles on the road; the vanishing point indicates the heading of the road. Lines are extracted from the environment using the Canny edge detector and the Hough line transform from OpenCV, and the extracted lines are then processed to locate the vanishing point and the road angle. A low-pass filter is applied to stabilize the detected vanishing point. The robot was tested in several outdoor conditions, such as asphalt roads and pedestrian roads, to follow the detected vanishing point. By implementing OpenCV's Simple Blob Detector together with the ultrasonic sensor module, the position of an obstacle in front of the robot is detected. The test results show that the robot can avoid obstacles while following the heading of the road in outdoor environments. Vision-based vanishing point detection is thus successfully applied to outdoor autonomous mobile robot navigation.
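
The Python sketch below illustrates the vanishing-point step this abstract describes: Canny edges, the probabilistic Hough line transform, a least-squares intersection of the detected lines, and a simple exponential low-pass filter. All thresholds, the smoothing factor, and the camera index are illustrative assumptions, and line filtering (e.g., discarding near-horizontal lines) is omitted for brevity.

```python
import cv2
import numpy as np

def vanishing_point(frame, prev_vp=None, alpha=0.2):
    """Estimate the vanishing point (x, y) of road lines in one frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=50, maxLineGap=10)
    if lines is None:
        return prev_vp
    # Each line contributes one equation a*x + b*y = c (its normal form);
    # the least-squares solution is the point closest to all detected lines.
    A, C = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        a, b = y2 - y1, x1 - x2            # normal vector of the line segment
        norm = np.hypot(a, b)
        if norm < 1:
            continue
        A.append([a / norm, b / norm])
        C.append((a * x1 + b * y1) / norm)
    if len(A) < 2:
        return prev_vp
    vp, *_ = np.linalg.lstsq(np.array(A), np.array(C), rcond=None)
    # Low-pass filter the estimate to suppress frame-to-frame jitter.
    if prev_vp is not None:
        vp = alpha * vp + (1 - alpha) * prev_vp
    return vp

cap = cv2.VideoCapture(0)      # assumed camera index
ok, frame = cap.read()
if ok:
    print("vanishing point (x, y):", vanishing_point(frame))
cap.release()
```
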