Articles

Found 3 Documents

Eyeball Movement Detection To Control Smart Wheelchair Using Eye Aspect Ratio (EAR)
Pangestu, Gusti
MATICS Vol 13, No 2 (2021): MATICS
Publisher : Department of Informatics Engineering

DOI: 10.18860/mat.v13i2.12963

Abstract

Many technologies have been developed to help people with disabilities. One of them is the wheelchair, the most common tool used to assist mobility. There are two types of wheelchair: the manual wheelchair, operated by hand, and the electric wheelchair, operated by a joystick or another electric device. This research proposes a mechanism to control the wheelchair using eye movement. It is aimed especially at people with multiple disabilities (hand and foot impairments), so they can use their eyeballs to control wheelchair movement. There are five commands for controlling the wheelchair (leftward, rightward, upward, downward, and center). Leftward, rightward, and center control the direction of the smart wheelchair, while upward and downward eye movements control its speed: the upward command increases the speed, and the downward command decreases it (stop). The proposed method uses the EAR (Eye Aspect Ratio), divided into three regions based on sector area, to determine the direction of the eyeball movement. EAR is a value representing the ratio between the upper eyelid and the lower eyelid. The results show high accuracy.
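As an illustration of the EAR value the abstract refers to, the sketch below computes the ratio from six eye landmarks in the commonly used formulation (two vertical eyelid distances over the horizontal eye width). The landmark ordering and the exact ratio used in the paper may differ, so treat this as an assumption rather than the authors' implementation.

```python
# Minimal sketch of an Eye Aspect Ratio (EAR) computation, assuming the
# common six-landmark formulation; the paper's exact upper/lower-eyelid
# ratio and landmark indexing may differ from this.
import numpy as np

def eye_aspect_ratio(eye):
    """eye: array of shape (6, 2) with landmarks ordered p1..p6
    (eye corners at p1/p4, upper lid at p2/p3, lower lid at p5/p6)."""
    vertical_1 = np.linalg.norm(eye[1] - eye[5])   # |p2 - p6|
    vertical_2 = np.linalg.norm(eye[2] - eye[4])   # |p3 - p5|
    horizontal = np.linalg.norm(eye[0] - eye[3])   # |p1 - p4|
    return (vertical_1 + vertical_2) / (2.0 * horizontal)

# Example: a wide-open eye yields a larger EAR than a nearly closed one.
open_eye = np.array([[0, 3], [2, 5], [4, 5], [6, 3], [4, 1], [2, 1]], dtype=float)
print(eye_aspect_ratio(open_eye))  # ~0.67
```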
Real-Time Chicken Counting System using YOLO for FCR Optimization in Small and Medium Poultry Farms
Pangestu, Gusti
Journal of Development Research Vol. 9 No. 1 (2025): Volume 9, Number 1, May 2025
Publisher : Lembaga Penelitian dan Pengabdian Masyarakat Universitas Nahdlatul Ulama Blitar

DOI: 10.28926/jdr.v9i1.436

Abstract

Feed Conversion Ratio (FCR) is a widely recognized metric in the animal farming industry, especially for optimizing feed efficiency and reducing operational costs. A key aspect of managing FCR is matching the number of animals to the required feed quantity. However, accurately counting large populations of animals such as chickens remains a significant challenge, especially in large-scale farming. Computer vision technology offers a promising solution for automating this counting process and supporting FCR management. This research specifically evaluates the capability of the YOLOv11 model for real-time chicken detection. Evaluation of the model's performance indicates high efficacy, achieving accuracy, precision, and recall values of 93%, 94%, and 98%, respectively. Implementing this technology for precise chicken detection enables accurate adjustment and optimization of feed allocation, which can substantially improve the overall FCR.
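A minimal sketch of how per-frame counting from detections could look, assuming the Ultralytics YOLO Python API; the weights file chicken_yolo11.pt, the video source, and the confidence threshold are hypothetical placeholders, not the paper's actual setup.

```python
# Sketch of real-time chicken counting with a YOLO detector, assuming the
# Ultralytics Python API and a hypothetical fine-tuned weights file.
import cv2
from ultralytics import YOLO

model = YOLO("chicken_yolo11.pt")            # fine-tuned weights (assumed name)
cap = cv2.VideoCapture("poultry_house.mp4")  # or a camera index for a live feed

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    results = model(frame, conf=0.5, verbose=False)  # detect on one frame
    count = len(results[0].boxes)                    # one box per detected chicken
    annotated = results[0].plot()                    # draw boxes for visual checking
    cv2.putText(annotated, f"Chickens: {count}", (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 1.0, (0, 255, 0), 2)
    cv2.imshow("count", annotated)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```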
Klasifikasi Pola Pergerakan Bola Mata Menggunakan Metode Multilayer Backpropagation (Classification of Eyeball Movement Patterns Using the Multilayer Backpropagation Method)
Amadea, Karina; Bachtiar, Fitra A.; Pangestu, Gusti
Jurnal Teknologi Informasi dan Ilmu Komputer Vol 9 No 2: April 2022
Publisher : Fakultas Ilmu Komputer, Universitas Brawijaya

DOI: 10.25126/jtiik.2022925668

Abstract

The sense of sight is one of the most important organs of the human body: humans obtain as much as 80% of their information just by looking. In the iris of the eye there are regions that represent each part of the body, and through the nerve network in the iris it is possible to observe responses to changes in the body, including changes in vitality and even a person's character. In this study, a system to recognize eye movement patterns is built through pupil detection. The dataset consists of 65 facial images of 1280 x 720 pixels, classified into 5 labels: eyes facing up, down, front, right, and left. The images are segmented using the Deep-VOG framework, and 107 pupil feature results are obtained using the Sector Line Distance extraction method. These features are then classified with the Backpropagation method, using an architecture of 1 hidden layer with 11 neurons, a learning rate of 0.7, and 100 training iterations. The classification of eye movement patterns achieves an accuracy of 88.24% in training and 80.95% in testing.
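A minimal sketch of the classifier described in the abstract (one hidden layer of 11 neurons, learning rate 0.7, 100 iterations), using scikit-learn's MLPClassifier as a stand-in for the authors' backpropagation implementation; the feature matrix and train/test split below are synthetic placeholders, since the Sector Line Distance features and Deep-VOG segmentations are not reproduced here.

```python
# Stand-in for the described backpropagation classifier: 1 hidden layer,
# 11 neurons, learning rate 0.7, 100 iterations. Data is synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder features: in the study these would be the Sector Line Distance
# values extracted from the segmented pupil of each facial image (assumed layout).
rng = np.random.default_rng(0)
X = rng.random((65, 107))           # 65 images, 107 feature values (assumption)
y = rng.integers(0, 5, size=65)     # 5 labels: up, down, front, right, left

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(11,),  # one hidden layer with 11 neurons
                    solver="sgd",              # gradient-descent backpropagation
                    learning_rate_init=0.7,    # learning rate from the abstract
                    max_iter=100,              # 100 training iterations
                    random_state=0)
clf.fit(X_train, y_train)

print("train accuracy:", accuracy_score(y_train, clf.predict(X_train)))
print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```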