Found 3 Documents
Enhancing multi-class text classification in biomedical literature by integrating sequential and contextual learning with BERT and LSTM
Ndama, Oussama; Bensassi, Ismail; Ndama, Safae; En-Naimi, El Mokhtar
International Journal of Electrical and Computer Engineering (IJECE) Vol 15, No 4: August 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v15i4.pp4202-4212

Abstract

Classification of sentences in biomedical abstracts into predefined categories is essential for enhancing readability and facilitating information retrieval in scientific literature. We propose a novel hybrid model that integrates bidirectional encoder representations from transformers (BERT) for contextual learning, long short-term memory (LSTM) for sequential processing, and sentence order information to classify sentences from biomedical abstracts. Utilizing the PubMed 200k randomized controlled trial (RCT) dataset, our model achieved an overall accuracy of 88.42%, demonstrating strong performance in identifying methods and results sections while maintaining balanced precision, recall, and F1-scores across all categories. This hybrid approach effectively captures both contextual and sequential patterns of biomedical text, offering a robust solution for improving the segmentation of scientific abstracts. The model's design promotes stability and generalization, making it an effective tool for automatic text classification and information retrieval in biomedical research. These results underscore the model's efficacy in handling overlapping categories and its significant contribution to advancing biomedical text analysis.
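The architecture the abstract describes — contextual sentence embeddings, an explicit sentence-order feature, and an LSTM run over the abstract's sentence sequence — can be sketched roughly as below. Everything here is illustrative: the dimensions, the random weights, and the NumPy stand-in for BERT are assumptions, not the authors' implementation. The five labels are the standard PubMed 200k RCT section headings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: the paper uses BERT sentence embeddings (768-d);
# here random vectors play that role so the sketch runs without the model.
EMB, HID, N_LABELS = 768, 64, 5
LABELS = ["BACKGROUND", "OBJECTIVE", "METHODS", "RESULTS", "CONCLUSIONS"]

def lstm_layer(xs, hid=HID):
    """Run a single-layer LSTM (random, untrained weights) over a sequence of vectors."""
    d = xs.shape[1]
    W = rng.standard_normal((4, hid, d)) * 0.1    # input weights for i, f, g, o gates
    U = rng.standard_normal((4, hid, hid)) * 0.1  # recurrent weights
    b = np.zeros((4, hid))
    h = c = np.zeros(hid)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    out = []
    for x in xs:
        i = sig(W[0] @ x + U[0] @ h + b[0])       # input gate
        f = sig(W[1] @ x + U[1] @ h + b[1])       # forget gate
        g = np.tanh(W[2] @ x + U[2] @ h + b[2])   # candidate cell state
        o = sig(W[3] @ x + U[3] @ h + b[3])       # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        out.append(h)
    return np.stack(out)

def classify_abstract(n_sentences):
    """Label each sentence: embedding + order feature -> LSTM -> per-sentence softmax."""
    embs = rng.standard_normal((n_sentences, EMB))  # stand-in for BERT output
    # Sentence-order information as a normalized position feature, appended per sentence.
    order = (np.arange(n_sentences) / max(n_sentences - 1, 1))[:, None]
    feats = np.hstack([embs, order])
    hs = lstm_layer(feats)
    Wout = rng.standard_normal((N_LABELS, HID)) * 0.1
    logits = hs @ Wout.T
    probs = np.exp(logits - logits.max(1, keepdims=True))
    probs /= probs.sum(1, keepdims=True)
    return [LABELS[k] for k in probs.argmax(1)]

preds = classify_abstract(7)
print(preds)  # one of the 5 section labels per sentence (random weights -> arbitrary labels)
```

With trained weights, the LSTM's hidden state lets the label of each sentence depend on the sentences before it, which is what makes the hybrid useful for section-ordered abstracts.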
Unified BERT-LSTM framework enhances machine learning in fraud detection, financial sentiment, and biomedical classification
Ndama, Oussama; Bensassi, Ismail; Ndama, Safae; En-Naimi, El Mokhtar
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 6: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i6.pp5081-5095

Abstract

This paper proposes a hybrid framework based on bidirectional encoder representations from transformers (BERT) and long short-term memory (LSTM) networks for classification tasks in three diverse domains: credit card fraud detection (CCFD), financial news sentiment analysis (FNSA), and biomedical paper abstract classification (BPAC). The model combines BERT's strength in learning contextual embeddings with LSTM's ability to capture sequential dependencies, thereby setting a new state of the art in each of the three domains. In the CCFD use case, the model achieved an accuracy of 99.11%, considerably outperforming competing systems in fraudulent transaction detection. For FNSA, the BERT-LSTM model achieved 96.74% accuracy, a significant improvement in sentiment analysis. Finally, the BPAC use case proved robust, with 88.42% accuracy in classifying biomedical abstract sections. These findings show that the framework generalizes across a wide range of tasks, making it an adaptable and strong tool for cross-domain classification challenges.
A novel BERT-long short-term memory hybrid model for effective credit card fraud detection
Ndama, Oussama; Ndama, Safae; Bensassi, Ismail; En-Naimi, El Mokhtar
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 15, No 1: February 2026
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v15.i1.pp788-797

Abstract

In the rapidly evolving landscape of financial transactions, the detection of fraudulent activities remains a critical challenge for financial institutions worldwide. This study introduces a novel bidirectional encoder representations from transformers (BERT)–long short-term memory (LSTM) hybrid model that integrates both textual and numerical data to enhance credit card fraud detection. Leveraging BERT for deep contextual embeddings and LSTM for sequence analysis, the model provides a comprehensive approach that surpasses traditional fraud detection systems based primarily on numerical analysis. On the validation set, the model achieved a recall of 100% and an accuracy of 99.11%, highlighting strong effectiveness in identifying fraudulent transactions under class imbalance. Rigorous evaluation confirmed the model's accuracy and reliability, promising improvements in fraud detection and mitigation. This paper details the development and validation of the hybrid model, emphasizing its use of mixed data types to capture complex patterns in transaction data. The results point to a new frontier in fraud detection: combining natural language processing (NLP) with sequential data analysis to create a robust solution for real-world applications, supporting the security and integrity of financial systems globally.
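The mixed-data idea — fusing a text embedding of the transaction description with its numerical fields, then scoring the transaction sequence recurrently — can be sketched as below. Everything here is assumed for illustration: the byte-sum pseudo-embedding stands in for BERT, a plain tanh RNN step stands in for the LSTM, the four numeric fields are invented, and the weights are random, so the output is not a trained fraud probability.

```python
import numpy as np

rng = np.random.default_rng(1)

TEXT_DIM, NUM_FEATS, HID = 768, 4, 32  # 768-d mimics BERT's size; 4 numeric fields assumed

def embed_text(description):
    """Stand-in for a BERT embedding: a fixed pseudo-random vector seeded by the text."""
    seed = sum(description.encode())  # crude deterministic seed, illustration only
    return np.random.default_rng(seed).standard_normal(TEXT_DIM)

def fraud_score(transactions):
    """Score a card's transaction history: fused text+numeric features -> RNN -> sigmoid."""
    Wx = rng.standard_normal((HID, TEXT_DIM + NUM_FEATS)) * 0.05
    Wh = rng.standard_normal((HID, HID)) * 0.05
    w_out = rng.standard_normal(HID) * 0.1
    h = np.zeros(HID)
    for desc, nums in transactions:
        # Fusion step: concatenate the textual embedding with the numerical features.
        x = np.concatenate([embed_text(desc), np.asarray(nums, float)])
        h = np.tanh(Wx @ x + Wh @ h)               # plain RNN step stands in for the LSTM
    return 1.0 / (1.0 + np.exp(-(w_out @ h)))      # sigmoid -> score in (0, 1)

# Hypothetical transaction history: (description, [amount, plus 3 assumed numeric flags]).
history = [
    ("grocery store purchase", [23.40, 0, 1, 0]),
    ("online electronics order", [899.99, 1, 0, 1]),
]
score = fraud_score(history)
print(round(score, 3))  # a value in (0, 1); untrained, so not a meaningful probability
```

The design point is that the classifier sees one fused vector per transaction, so a trained model can relate what was bought (text) to how it was bought (amount, timing), rather than relying on numerical fields alone.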