Found 2 Documents
Enhancing multi-class text classification in biomedical literature by integrating sequential and contextual learning with BERT and LSTM
Ndama, Oussama; Bensassi, Ismail; Ndama, Safae; En-Naimi, El Mokhtar
International Journal of Electrical and Computer Engineering (IJECE) Vol 15, No 4: August 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijece.v15i4.pp4202-4212

Abstract

Classification of sentences in biomedical abstracts into predefined categories is essential for enhancing readability and facilitating information retrieval in scientific literature. We propose a novel hybrid model that integrates bidirectional encoder representations from transformers (BERT) for contextual learning, long short-term memory (LSTM) for sequential processing, and sentence order information to classify sentences from biomedical abstracts. Utilizing the PubMed 200k randomized controlled trial (RCT) dataset, our model achieved an overall accuracy of 88.42%, demonstrating strong performance in identifying methods and results sections while maintaining balanced precision, recall, and F1-scores across all categories. This hybrid approach effectively captures both contextual and sequential patterns of biomedical text, offering a robust solution for improving the segmentation of scientific abstracts. The model's design promotes stability and generalization, making it an effective tool for automatic text classification and information retrieval in biomedical research. These results underscore the model's efficacy in handling overlapping categories and its significant contribution to advancing biomedical text analysis.
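The architecture described above (BERT for contextual learning, an LSTM for sequential processing, plus a sentence-order feature) can be sketched roughly as follows. This is a minimal, hedged illustration, not the authors' implementation: a small randomly initialized `nn.TransformerEncoder` stands in for pretrained BERT so the example stays self-contained, and all dimensions, the position-embedding feature, and the five-way output (matching the PubMed RCT section labels) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class HybridSentenceClassifier(nn.Module):
    """Sketch of a BERT+LSTM hybrid with a sentence-order feature."""
    def __init__(self, vocab_size=1000, d_model=64, lstm_hidden=32,
                 num_classes=5, max_position=20):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        # Stand-in for pretrained BERT: a tiny Transformer encoder.
        self.context_encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Sequential processing over the contextual token representations.
        self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True,
                            bidirectional=True)
        # Sentence-order information: the sentence's index in the abstract.
        self.pos_embed = nn.Embedding(max_position, 8)
        self.classifier = nn.Linear(2 * lstm_hidden + 8, num_classes)

    def forward(self, token_ids, sentence_pos):
        x = self.embed(token_ids)
        x = self.context_encoder(x)              # contextual representations
        _, (h, _) = self.lstm(x)                 # sequential summary
        h = torch.cat([h[-2], h[-1]], dim=-1)    # final fwd/bwd LSTM states
        p = self.pos_embed(sentence_pos)         # sentence-order signal
        return self.classifier(torch.cat([h, p], dim=-1))

model = HybridSentenceClassifier()
# Two sentences of 12 token ids, at positions 0 and 3 in their abstracts.
logits = model(torch.randint(0, 1000, (2, 12)), torch.tensor([0, 3]))
print(logits.shape)  # one score per section label for each sentence
```

The key design point the abstract highlights is the fusion step: contextual token embeddings are summarized sequentially by the LSTM, then concatenated with an explicit sentence-position embedding before classification.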
Unified BERT-LSTM framework enhances machine learning in fraud detection, financial sentiment, and biomedical classification
Ndama, Oussama; Bensassi, Ismail; Ndama, Safae; En-Naimi, El Mokhtar
IAES International Journal of Artificial Intelligence (IJ-AI) Vol 14, No 6: December 2025
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijai.v14.i6.pp5081-5095

Abstract

This paper proposes a hybrid framework based on bidirectional encoder representations from transformers (BERT) and long short-term memory (LSTM) networks for classification tasks in three diverse domains: credit card fraud detection (CCFD), financial news sentiment analysis (FNSA), and biomedical paper abstract classification (BPAC). The model combines BERT's strength in learning contextual embeddings with the LSTM's ability to capture sequential dependencies, achieving strong performance in all three domains. In the CCFD task, the model reached an accuracy of 99.11%, considerably outperforming competing systems in fraudulent-transaction detection. For FNSA it achieved 96.74% accuracy, a significant improvement in sentiment analysis, and on BPAC it remained robust at 88.42% accuracy, correctly classifying biomedical abstract sections. These findings show that the framework generalizes across a wide range of tasks, making it an adaptable and strong tool for cross-domain classification challenges.
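One plausible way such a unified framework handles three different tasks is a shared BERT+LSTM backbone with separate task-specific output heads; the abstract does not say whether the authors share weights or train three separate models, so the sketch below is an illustrative assumption, with an embedding layer standing in for BERT and all sizes and label counts invented for the example.

```python
import torch
import torch.nn as nn

class UnifiedBackbone(nn.Module):
    """Shared feature extractor: contextual embeddings + BiLSTM summary."""
    def __init__(self, vocab_size=1000, d_model=64, lstm_hidden=32):
        super().__init__()
        # Stand-in for BERT contextual embeddings (assumption).
        self.embed = nn.Embedding(vocab_size, d_model)
        self.lstm = nn.LSTM(d_model, lstm_hidden, batch_first=True,
                            bidirectional=True)

    def forward(self, token_ids):
        x = self.embed(token_ids)
        _, (h, _) = self.lstm(x)
        # Concatenate final forward/backward states: (batch, 2*lstm_hidden).
        return torch.cat([h[-2], h[-1]], dim=-1)

backbone = UnifiedBackbone()
# Hypothetical per-domain heads; label counts are illustrative.
heads = nn.ModuleDict({
    "ccfd": nn.Linear(64, 2),   # fraudulent vs. legitimate transaction
    "fnsa": nn.Linear(64, 3),   # negative / neutral / positive sentiment
    "bpac": nn.Linear(64, 5),   # PubMed RCT section labels
})
features = backbone(torch.randint(0, 1000, (4, 16)))
print({name: head(features).shape for name, head in heads.items()})
```

Sharing the backbone this way is what would let one framework "generalize to a wide range of tasks": only the lightweight output head changes per domain.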