A Survey on Deep Learning for Natural Language Processing: Models, Techniques, and Open Research Problems
Hào, Nguyễn Nhật; Vy, Trần Khánh; Phước, Lê Văn
International Journal of Technology and Modeling, Vol. 4 No. 2 (2025)
Publisher: Etunas Sukses Sistem

DOI: 10.63876/ijtm.v4i2.137

Abstract

In recent years, deep learning has emerged as a powerful paradigm in natural language processing (NLP), enabling significant breakthroughs in tasks such as machine translation, sentiment analysis, and question answering. This survey provides a comprehensive overview of the deep learning models and techniques that have shaped the evolution of NLP, with a particular focus on Vietnamese as a representative low-resource language. We review foundational models, including recurrent neural networks (RNNs), convolutional neural networks (CNNs), and Transformer-based architectures such as BERT and GPT, and analyze their applications in Vietnamese NLP tasks. Special attention is given to the development and adaptation of Vietnamese-specific pretrained language models such as PhoBERT and ViT5, as well as to multilingual approaches for addressing data scarcity. In addition, the paper discusses practical implementations in Vietnam, such as sentiment analysis of social media content, Vietnamese question answering systems, and machine translation, highlighting the opportunities and challenges in this context. We also identify open research problems, including limited training data, dialectal variation, code-switching, and ethical concerns, and offer insights and directions for future work. This survey aims to serve as a resource for researchers and practitioners seeking to advance NLP capabilities in low-resource languages using deep learning.
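
To make the abstract's mention of Vietnamese-specific pretrained models concrete, the sketch below shows one way such a model could be loaded for a sentiment-classification task. It is not taken from the paper: it assumes the Hugging Face transformers library and the publicly released vinai/phobert-base checkpoint, and the three-class label set is purely illustrative. The classification head is randomly initialized, so the model would need fine-tuning on labeled Vietnamese social-media data before its predictions are meaningful.

```python
# Minimal sketch (assumptions: transformers library, vinai/phobert-base checkpoint,
# illustrative 3-class sentiment labels). Not an implementation from the survey.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("vinai/phobert-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "vinai/phobert-base",
    num_labels=3,  # e.g. negative / neutral / positive; head starts untrained
)

# PhoBERT expects word-segmented Vietnamese input; this example is pre-segmented.
text = "Sản_phẩm này rất tốt ."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=256)

with torch.no_grad():
    logits = model(**inputs).logits

# Class probabilities; meaningless until the model is fine-tuned on labeled data.
print(logits.softmax(dim=-1))
```

In practice, word segmentation (e.g. with a Vietnamese segmenter) and a fine-tuning loop over a labeled corpus would precede any real use; the sketch only illustrates how a low-resource language can reuse the same pretrained-Transformer workflow the survey describes.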