Jurnal Info Sains : Informatika dan Sains
Vol. 15 No. 01 (2025): Informatika dan Sains, 2025

Beyond Transformers: Evaluating the Robustness and Efficiency of State-Space Models for Next-Generation Natural Language Processing

Saron Tua Parsaoran L Tobing
Muhammad Reza Al Thoriq
Setia Widodo
Sandre Ebenezer Sibuea
Asprina BR Surbakti
Siti Jamilah BR Tarigan



Article Info

Publish Date
15 Aug 2025

Abstract

Transformer architectures have dominated advances in natural language processing (NLP) in recent years, yet their growing computational demands and robustness challenges motivate the exploration of alternative models. This study qualitatively evaluates State-Space Models (SSMs) as a promising next-generation architecture for NLP tasks. Through a comprehensive literature analysis and comparative examination of current research, this paper investigates SSMs' theoretical foundations, robustness to input perturbations, efficiency in handling long sequences, and applicability to diverse linguistic contexts. The results show that SSMs offer compelling advantages over Transformers in memory efficiency and sequence-modeling capacity, while demonstrating competitive or superior robustness on several NLP benchmarks, highlighting their potential as efficient, scalable, and robust alternatives for future NLP applications.

Copyrights © 2025






Journal Info

Abbrev

InfoSains

Publisher

Subject

Computer Science & IT

Description

Jurnal Info Sains : Informatika dan Sains (JIS) covers science in the field of Informatics and Science, serving as a forum for presenting results, both conceptual and technical, related to informatics. The main topics covered include: Cryptography, Steganography, Artificial Intelligence, ...