Sandre Ebenezer Sibuea
Unknown Affiliation

Published : 1 Document
Articles


Beyond Transformers: Evaluating the Robustness and Efficiency of State-Space Models for Next-Generation Natural Language Processing
Saron Tua Parsaoran L Tobing; Muhammad Reza Al Thoriq; Setia Widodo; Sandre Ebenezer Sibuea; Asprina BR Surbakti; Siti Jamilah BR Tarigan
Jurnal Info Sains: Informatika dan Sains, Vol. 15 No. 01 (2025)
Publisher : SEAN Institute


Abstract

Transformer architectures have dominated recent advances in natural language processing (NLP), yet their growing computational demands and robustness challenges motivate the exploration of alternative models. This study qualitatively evaluates State-Space Models (SSMs) as a promising next-generation architecture for NLP tasks. Through a comprehensive literature analysis and a comparative examination of current research, the paper investigates SSMs' theoretical foundations, their robustness to input perturbations, their efficiency in handling long sequences, and their applicability to diverse linguistic contexts. The results show that SSMs offer compelling advantages over Transformers in memory efficiency and sequence-modeling capacity, while demonstrating competitive or superior robustness on several NLP benchmarks. These findings highlight SSMs' potential as efficient, scalable, and robust alternatives for future NLP applications.
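To make the efficiency claim concrete, the minimal Python sketch below (not taken from the paper; the matrices and dimensions are illustrative placeholders, not trained values) implements the discretized linear recurrence h_t = A h_{t-1} + B x_t, y_t = C h_t that underlies SSM layers. Because the hidden state has a fixed size, memory per step stays constant in the sequence length, in contrast to the quadratic cost of full self-attention.

    # Hedged sketch of a discretized linear state-space recurrence.
    # A, B, C are illustrative random parameters, not values from the paper.
    import numpy as np

    def ssm_scan(A, B, C, x):
        """Run h_t = A h_{t-1} + B x_t, y_t = C h_t over a scalar sequence x."""
        d_state = A.shape[0]
        h = np.zeros(d_state)      # hidden state: fixed size, independent of len(x)
        ys = []
        for x_t in x:              # single left-to-right pass: O(T) time
            h = A @ h + B * x_t    # state update
            ys.append(C @ h)       # linear readout
        return np.array(ys)

    rng = np.random.default_rng(0)
    d = 4                          # state dimension (assumed, for illustration)
    A = 0.9 * np.eye(d)            # stable transition matrix
    B = rng.normal(size=d)
    C = rng.normal(size=d)
    x = rng.normal(size=1000)      # input sequence of length 1000
    y = ssm_scan(A, B, C, x)       # memory stays O(d); attention would need O(T^2) scores

This recurrent view is what gives SSMs their constant per-step memory footprint at inference time; practical SSM architectures additionally exploit the linearity of the recurrence to train with parallel convolutional or scan formulations rather than the sequential loop shown here.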