Mohamed Abo Rizka
Arab Academy for Science, Technology and Maritime Transport, Cairo

Published: 1 document
ArSentBERT: fine-tuned bidirectional encoder representations from transformers model for Arabic sentiment classification
Mohamed Fawzy Abdelfattah; Mohamed Waleed Fakhr; Mohamed Abo Rizka
Bulletin of Electrical Engineering and Informatics, Vol. 12, No. 2, April 2023
Publisher: Institute of Advanced Engineering and Science

DOI: 10.11591/eei.v12i2.3914

Abstract

Sentiment analysis in the Arabic language is challenging because of its linguistic complexity: Arabic is complex at the word, sentence, and paragraph level, and most Arabic documents mix multiple dialects, scripts, and writing styles (e.g., Franco-Arabic). Nevertheless, fine-tuned bidirectional encoder representations from transformers (BERT) models can achieve reasonable prediction accuracy on Arabic sentiment classification tasks. This paper presents an approach for fine-tuning BERT models to classify Arabic sentiment. It builds on pre-trained Arabic BERT models and tokenizers and consists of three stages. The first stage performs text preprocessing and data cleaning. The second stage transfers the pre-trained models' weights and trains all encoder layers. The third stage adds a fully connected layer and a dropout layer for classification. We tested our fine-tuned models on five datasets of Arabic reviews spanning different dialects and compared the results against 11 state-of-the-art models. The experimental results show that our models achieve better prediction accuracy than the competing models. We also show that the choice of pre-trained BERT model and tokenizer type affects the accuracy of Arabic sentiment classification.
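
As a reading aid, the three-stage pipeline described in the abstract can be sketched with the Hugging Face transformers library. This is a minimal illustration, not the paper's implementation: the checkpoint name (aubmindlab/bert-base-arabertv02), the cleaning rules, the dropout rate, and the binary label count are all assumptions made for the example.

```python
# Minimal sketch of the abstract's three-stage pipeline.
# Assumptions (not from the paper): checkpoint name, cleaning rules,
# dropout rate 0.1, and two sentiment labels.
import re
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "aubmindlab/bert-base-arabertv02"  # assumed Arabic BERT checkpoint

# Stage 1: text preprocessing and data cleaning (example rules only).
def clean_text(text: str) -> str:
    text = re.sub(r"http\S+", " ", text)              # remove URLs
    text = re.sub("[\u064B-\u0652\u0640]", "", text)  # strip diacritics, tatweel
    return re.sub(r"\s+", " ", text).strip()

# Stages 2 and 3: transfer the pre-trained weights (all encoder layers stay
# trainable) and add a dropout layer plus a fully connected classifier head.
class ArSentClassifier(nn.Module):
    def __init__(self, num_labels: int = 2, dropout: float = 0.1):
        super().__init__()
        self.bert = AutoModel.from_pretrained(MODEL_NAME)
        self.dropout = nn.Dropout(dropout)
        self.classifier = nn.Linear(self.bert.config.hidden_size, num_labels)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        pooled = out.last_hidden_state[:, 0]  # [CLS] token representation
        return self.classifier(self.dropout(pooled))

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = ArSentClassifier()
batch = tokenizer([clean_text("الخدمة ممتازة جدا")], return_tensors="pt",
                  padding=True, truncation=True)
logits = model(batch["input_ids"], batch["attention_mask"])  # shape: (1, 2)
```

Training would then minimize a cross-entropy loss over these logits with no parameters frozen, matching the abstract's statement that all encoder layers are trained during transfer learning.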