Mathematical word problems in Indonesian are typically presented as multi-sentence paragraphs, so identifying the required arithmetic operation depends not only on recognizing numerical values but also on understanding the events and semantic relationships across sentences. This study formulates arithmetic operation identification as the classification of a single primary operation into four classes: addition, subtraction, multiplication, and division. To capture contextual relationships across sentences, an encoder-based Transformer architecture is employed, which models long-range dependencies through its self-attention mechanism. The dataset consists of 900 elementary school-level mathematical word problems constructed in accordance with the Indonesian curriculum. Experimental results show that the model achieves an accuracy of 0.98 and an F1-score of 0.98. Per-class evaluation indicates high and consistent performance, although prediction errors still occur for ambiguous narrative patterns, particularly where addition is misclassified as multiplication or subtraction, and multiplication is misclassified as division. These findings indicate that the Transformer architecture is effective at leveraging multi-sentence context to improve the accuracy of arithmetic operation identification in mathematical word problems.
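To make the task formulation concrete, the sketch below frames the problem as four-class sequence classification with a pretrained Transformer encoder. This is a minimal illustration, not the authors' exact setup: the checkpoint name "indobenchmark/indobert-base-p1" and the example word problem are assumptions introduced for demonstration only.

```python
# Minimal sketch: four-class classification of the primary arithmetic
# operation with a Transformer encoder (assumed checkpoint, untrained head).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["addition", "subtraction", "multiplication", "division"]

CHECKPOINT = "indobenchmark/indobert-base-p1"  # assumed; the paper does not name a checkpoint
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSequenceClassification.from_pretrained(
    CHECKPOINT, num_labels=len(LABELS)
)

# Hypothetical multi-sentence word problem; self-attention lets every token
# attend to context from all sentences, not only the one holding the numbers.
problem = (
    "Budi mempunyai 12 kelereng. Ia membeli lagi 8 kelereng. "
    "Berapa banyak kelereng Budi sekarang?"
)
inputs = tokenizer(problem, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
# With a freshly initialized classification head the prediction is arbitrary;
# fine-tuning on the labeled word-problem dataset is required for real use.
print(LABELS[logits.argmax(dim=-1).item()])
```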