Indonesian Journal of Electrical Engineering and Computer Science
Vol 21, No 1: January 2021

Smart hand gestures recognition using K-NN based algorithm for video annotation purposes

Malek Zakarya Alksasbeh (Al Hussein Bin Talal University)
Ahmad H AL-Omari (Northern Border University)
Bassam A. Y. Alqaralleh (Al Hussein Bin Talal University)
Tamer Abukhalil (Al Hussein Bin Talal University)
Anas Abukarki (Al Hussein Bin Talal University)
Ibrahim Alkore Alshalabi (Al Hussein Bin Talal University)
Amal Alkaseasbeh (Ministry of Education and Higher Education)



Article Info

Publish Date
01 Jan 2021

Abstract

Sign languages are among the most basic and natural forms of language, used even before the evolution of spoken languages. They were developed from various sign "gestures" made with the palm of the hand, known as "hand gestures". Hand gestures are widely used as an international assistive communication method for deaf people and in many aspects of life, such as sports, traffic control, and religious acts. However, the meaning of a hand gesture varies across cultures. Because understanding the meanings of hand gestures is therefore important, this study presents a procedure that translates such gestures into an annotated explanation. The proposed system applies image and video processing, which have recently become among the most important technologies. The system first analyzes a classroom video as input and then extracts a vocabulary of twenty gestures. Several methods are applied sequentially: motion detection, RGB-to-HSV conversion, and noise removal using a labeling algorithm. The extracted hand parameters are classified by a K-NN algorithm to determine the hand gesture and thereby display its meaning. To estimate the performance of the proposed method, an experiment was performed on a hand gesture database. The results showed that the suggested method achieves an average recognition rate of 97%.
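The abstract's pipeline (HSV-based skin segmentation followed by K-NN classification of hand parameters) can be illustrated with a minimal, stdlib-only sketch. The thresholds, feature vectors, and gesture labels below are illustrative assumptions, not the paper's actual values:

```python
import colorsys
from collections import Counter
from math import dist

def skin_mask(pixels, h_range=(0.0, 0.14), s_range=(0.15, 0.9)):
    """Keep pixels whose HSV hue/saturation fall in a rough skin-tone range
    (illustrative thresholds; real systems tune these per dataset)."""
    mask = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
        mask.append(h_range[0] <= h <= h_range[1] and s_range[0] <= s <= s_range[1])
    return mask

def knn_classify(query, labeled_features, k=3):
    """K-NN: majority vote among the k nearest labeled feature vectors."""
    nearest = sorted(labeled_features, key=lambda lf: dist(query, lf[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Hypothetical training set: (hand-parameter feature vector, gesture label).
train = [
    ((0.9, 0.1), "open_palm"), ((0.85, 0.15), "open_palm"),
    ((0.2, 0.8), "fist"), ((0.25, 0.75), "fist"),
    ((0.5, 0.5), "point"),
]
print(knn_classify((0.88, 0.12), train))  # prints "open_palm"
```

In the full system, the features fed to K-NN would come from the segmented hand region (e.g., geometric hand parameters) rather than toy 2-vectors, but the voting logic is the same.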

Copyrights © 2021