Tamer Abukhalil
Al Hussein Bin Talal University

Published: 3 Documents

Smart detection of offensive words in social media using the soundex algorithm and permuterm index
Malek Z. Alksasbeh; Bassam A. Y. Alqaralleh; Tamer Abukhalil; Anas Abukaraki; Tawfiq Al Rawashdeh; Moha'med Al-Jaafreh
International Journal of Electrical and Computer Engineering (IJECE) Vol 11, No 5: October 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v11i5.pp4431-4438

Abstract

Offensive social media posts that are inappropriate for a given age, level of maturity, or impression are quite often aimed more at minors than at adult participants. The growing number of masked offensive words on social media is an ethically challenging problem, so there has been increasing interest in methods that can automatically detect posts containing such words. This study aimed to develop a method for detecting masked offensive words, in which partial alteration of a word can trick conventional monitoring systems when it is posted on social media. The proposed method proceeds in three phases: a pre-processing phase comprising filtering, tokenization, and stemming; an offensive-word extraction phase based on the soundex algorithm and the permuterm index; and a post-processing phase that classifies users' posts in order to highlight offensive content. The method thereby detects masked offensive words in written text, preventing certain types of offensive words from being published. Performance evaluation of the proposed method indicates a 99% accuracy in detecting offensive words.
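The two core components named in the abstract can be illustrated with a minimal pure-Python sketch: a classic Soundex encoder (so phonetically similar spellings collapse to one code) and a permuterm rotation builder (so wildcard lookups reduce to prefix searches). The example words and the pairing of the two structures are illustrative assumptions, not details taken from the paper:

```python
def soundex(word):
    """Classic 4-character Soundex code; similar-sounding words share a code."""
    codes = {"BFPV": "1", "CGJKQSXZ": "2", "DT": "3", "L": "4", "MN": "5", "R": "6"}

    def digit(c):
        return next((v for k, v in codes.items() if c in k), "")

    word = word.upper()
    prev, tail = digit(word[0]), ""
    for c in word[1:]:
        d = digit(c)
        if d and d != prev:
            tail += d
        if c not in "HW":          # H/W do not break a run of equal codes
            prev = d
    return (word[0] + tail + "000")[:4]


def permuterm_rotations(term):
    """All rotations of term + '$'. A wildcard query q1*q2 becomes a prefix
    search for q2 + '$' + q1 over these rotations."""
    t = term + "$"
    return [t[i:] + t[:i] for i in range(len(t))]


# A masked variant with a dropped vowel still maps to the same Soundex code:
print(soundex("badword") == soundex("badwrd"))            # True (both B363)
# Wildcard "b*d" -> prefix "d$b" over the rotations of "bad":
print(any(r.startswith("d$b") for r in permuterm_rotations("bad")))  # True
```

In this sketch Soundex handles phonetic masking while the permuterm index handles wildcard-style masking; how the paper combines the two scores is not reproduced here.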
Recognition of handwritten Arabic (Indian) numerals using skeleton matching
Bassam Alqaralleh; Malek Zakarya Alksasbeh; Tamer Abukhalil; Harbi Almahafzah; Tawfiq Al Rawashdeh
Indonesian Journal of Electrical Engineering and Computer Science Vol 19, No 3: September 2020
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v19.i3.pp1461-1468

Abstract

This paper addresses the problem of recognizing Arabic (Indian) numerals using a monocular camera as the only sensor. When numerical data appears in a digital image, optical character recognition (OCR) can be used to interpret it; however, conventional OCR has had limited success on handwritten Arabic (Indian) numerals. This paper aims to overcome that limitation by introducing an optical character recognition system based on skeleton matching. The proposed approach targets handwritten Arabic numerals only. The experimental results indicate the effectiveness of the proposed system even for poorly written numerals, with a recognition rate of 99.3%.
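The abstract does not spell out its matching criterion, but the idea of skeleton matching can be sketched in a few lines: each numeral is reduced to a set of skeleton pixels, and an unknown sample is assigned to the template with the smallest symmetric average nearest-pixel distance. The toy templates, the tiny 5x5 grid, and the specific distance below are illustrative assumptions, not the paper's method:

```python
import math

def skeleton_distance(a, b):
    """Symmetric average nearest-pixel distance between two skeleton pixel sets."""
    def directed(p, q):
        return sum(min(math.dist(u, v) for v in q) for u in p) / len(p)
    return max(directed(a, b), directed(b, a))

def classify(sample, templates):
    """Assign the sample to the template skeleton it matches most closely."""
    return min(templates, key=lambda k: skeleton_distance(sample, templates[k]))

# Hypothetical 5x5 skeleton templates (sets of (row, col) pixels):
templates = {
    "stroke": {(r, 2) for r in range(5)},                       # vertical bar
    "ring":   {(0, 1), (0, 2), (0, 3), (1, 0), (1, 4),
               (2, 0), (2, 4), (3, 0), (3, 4),
               (4, 1), (4, 2), (4, 3)},                         # closed loop
}

# A slightly bent stroke still matches the "stroke" template:
sample = {(0, 2), (1, 2), (2, 2), (3, 2), (4, 3)}
print(classify(sample, templates))  # stroke
```

In a real system the skeleton sets would come from a thinning step applied to the binarized digit image, and the templates from training data rather than hand-drawn grids.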
Smart hand gestures recognition using K-NN based algorithm for video annotation purposes
Malek Zakarya Alksasbeh; Ahmad H AL-Omari; Bassam A. Y. Alqaralleh; Tamer Abukhalil; Anas Abukarki; Ibrahim Alkore Alshalabi; Amal Alkaseasbeh
Indonesian Journal of Electrical Engineering and Computer Science Vol 21, No 1: January 2021
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v21.i1.pp242-252

Abstract

Sign languages are among the most basic and natural forms of language and were in use even before spoken languages evolved. They were developed from various signs, or "gestures", made with the palm of the hand; such signs are called "hand gestures". Hand gestures are widely used as an international assistive communication method for deaf people and in many aspects of life, such as sports, traffic control, and religious acts. However, the meaning of a hand gesture varies across cultures. Given the importance of understanding these meanings, this study presents a procedure that can translate such gestures into an annotated explanation. The proposed system applies image and video processing, which have recently become some of the most important technologies. The system first analyzes a classroom video as input and then extracts a vocabulary of twenty gestures. Several methods are applied in sequence: motion detection, RGB-to-HSV conversion, and noise removal using labeling algorithms. The extracted hand parameters are passed to a K-NN algorithm, which determines the hand gesture and hence its meaning. To estimate the performance of the proposed method, an experiment was conducted on a hand gesture database; the results show an average recognition rate of 97%.
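Two steps of the pipeline above, RGB-to-HSV conversion (for skin segmentation) and K-NN classification of hand parameters, can be sketched in pure Python. The skin-tone thresholds, the toy feature vectors, and the gesture labels below are illustrative assumptions, not values from the paper:

```python
import colorsys
import math
from collections import Counter

def is_skin(r, g, b):
    """Rough HSV skin-tone test on 0-255 RGB values (thresholds are a guess)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    return h <= 0.14 and s >= 0.15 and v >= 0.35

def knn_predict(features, dataset, k=3):
    """Majority vote among the k training samples nearest to `features`."""
    nearest = sorted(dataset, key=lambda item: math.dist(features, item[0]))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]

# Hypothetical hand parameters: (finger count, hand-area ratio) -> gesture label
training = [
    ((5, 0.80), "open palm"), ((5, 0.75), "open palm"), ((4, 0.78), "open palm"),
    ((0, 0.40), "fist"), ((1, 0.45), "fist"), ((0, 0.38), "fist"),
]

print(is_skin(220, 170, 140))            # True  (a typical skin tone)
print(knn_predict((5, 0.79), training))  # open palm
```

In the actual system the feature vectors would be extracted per frame after motion detection and labeling-based noise removal; the K-NN vote then maps them to one of the twenty annotated gestures.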