Malek Zakarya Alksasbeh
Al Hussein Bin Talal University

Published: 2 Documents

Articles

Recognition of handwritten Arabic (Indian) numerals using skeleton matching
Bassam Alqaralleh; Malek Zakarya Alksasbeh; Tamer Abukhalil; Harbi Almahafzah; Tawfiq Al Rawashdeh
Indonesian Journal of Electrical Engineering and Computer Science Vol 19, No 3: September 2020
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v19.i3.pp1461-1468

Abstract

This paper discusses the problem of recognizing Arabic numerals using a monocular camera as the only sensor. When a digital image is presented in this application, optical character recognition (OCR) can be exploited to extract the numerical data. However, OCR has had limited success when applied to handwritten Arabic (Indian) numerals. This paper aims to overcome this limitation and introduces an optical character recognition system based on skeleton matching. The proposed approach targets handwritten Arabic numerals only. The experimental results indicate the effectiveness of the proposed system even for poorly written numerals. The resulting system achieves a recognition rate of 99.3%.
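The abstract does not detail the matching step, but the core idea of skeleton matching can be sketched as follows: digit skeletons are represented as sets of pixel coordinates, an input skeleton is compared against stored templates with a point-set distance, and the closest template's label wins. All function names, the distance measure, and the toy templates below are illustrative assumptions, not the paper's actual method.

```python
# Hypothetical sketch of skeleton matching for digit recognition.
# Skeletons are sets of (row, col) pixel coordinates; an input skeleton
# is matched against stored templates by a symmetric average
# nearest-point distance (one plausible choice among many).

def skeleton_distance(a, b):
    """Symmetric average nearest-neighbour distance between two point sets."""
    def one_way(src, dst):
        total = 0.0
        for (r, c) in src:
            total += min((r - r2) ** 2 + (c - c2) ** 2 for (r2, c2) in dst) ** 0.5
        return total / len(src)
    return (one_way(a, b) + one_way(b, a)) / 2.0

def classify(skeleton, templates):
    """Return the label of the template skeleton closest to the input."""
    return min(templates, key=lambda label: skeleton_distance(skeleton, templates[label]))

# Toy templates: '1' as a vertical stroke, '0' as a small ring.
templates = {
    "1": {(r, 2) for r in range(5)},
    "0": {(0, 1), (0, 2), (1, 0), (1, 3), (2, 0), (2, 3), (3, 1), (3, 2)},
}

sample = {(r, 2) for r in range(5)} | {(4, 3)}  # vertical stroke with one noisy pixel
print(classify(sample, templates))  # → 1
```

In practice the skeleton would first be extracted from the binarized image by a thinning algorithm; the snippet above only illustrates the matching stage.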
Smart hand gestures recognition using K-NN based algorithm for video annotation purposes
Malek Zakarya Alksasbeh; Ahmad H AL-Omari; Bassam A. Y. Alqaralleh; Tamer Abukhalil; Anas Abukarki; Ibrahim Alkore Alshalabi; Amal Alkaseasbeh
Indonesian Journal of Electrical Engineering and Computer Science Vol 21, No 1: January 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v21.i1.pp242-252

Abstract

Sign languages are the most basic and natural form of language, used even before the evolution of spoken languages. They were developed from various sign "gestures" made with the palm of the hand; such gestures are called "hand gestures". Hand gestures are widely used as an international assistive communication method for deaf people and in many aspects of life, such as sports, traffic control, and religious acts. However, the meaning of a hand gesture varies across cultures. Because of the importance of understanding the meanings of hand gestures, this study presents a procedure that can translate such gestures into an annotated explanation. The proposed system applies image and video processing, recently conceived as among the most important technologies. The system initially analyzes a classroom video as input and then extracts a vocabulary of twenty gestures. Various methods are applied sequentially, namely: motion detection, RGB-to-HSV conversion, and noise removal using labeling algorithms. The extracted hand parameters are fed to a K-NN algorithm to determine the hand gesture and, hence, its meaning. To estimate the performance of the proposed method, an experiment using a hand gesture database was performed. The results showed that the suggested method has an average recognition rate of 97%.
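The final classification step of the pipeline above, K-NN over extracted hand parameters, can be sketched as follows. The feature vector here (finger count and bounding-box aspect ratio), the gesture labels, and the training data are all invented for illustration; the paper's actual hand parameters are not specified in the abstract.

```python
# Hypothetical sketch of the K-NN classification step: each gesture is
# described by a feature vector of hand parameters (here, an assumed pair
# of finger count and aspect ratio) and classified by majority vote among
# the k nearest training samples under squared Euclidean distance.
from collections import Counter

def knn_classify(sample, training, k=3):
    """Majority-vote label among the k training vectors nearest to `sample`."""
    by_distance = sorted(
        training,
        key=lambda item: sum((a - b) ** 2 for a, b in zip(sample, item[0])),
    )
    votes = Counter(label for _, label in by_distance[:k])
    return votes.most_common(1)[0][0]

# Toy training data: (feature vector, gesture label).
training = [
    ((5.0, 1.0), "open palm"), ((5.0, 1.1), "open palm"), ((4.8, 0.9), "open palm"),
    ((0.0, 0.5), "fist"), ((0.2, 0.6), "fist"), ((0.1, 0.4), "fist"),
]

print(knn_classify((4.9, 1.05), training))  # → open palm
```

The earlier stages (motion detection, RGB-to-HSV conversion, and connected-component labeling for noise removal) would produce the segmented hand region from which such parameters are measured; only the classifier is sketched here.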