Sanjeev Sannakki
Visvesvaraya Technological University

Published: 2 Documents

Articles

Found 1 Document
Transformer based multi-head attention network for aspect-based sentiment classification Abhinandan Shirahatti; Vijay Rajpurohit; Sanjeev Sannakki
Indonesian Journal of Electrical Engineering and Computer Science Vol 26, No 1: April 2022
Publisher : Institute of Advanced Engineering and Science

DOI: 10.11591/ijeecs.v26.i1.pp472-481

Abstract

Aspect-based sentiment classification (ABSC) is vital in helping manufacturers identify the pros and cons of their products and features. Interest in ABSC has surged in recent years because it predicts the sentiment polarity of an aspect term in a sentence rather than of the sentence as a whole. Most existing methods use recurrent neural networks and attention mechanisms, which fail to capture global dependencies of the input sequence and therefore lose information; others use sequence models, but training these models is tedious. Here, we propose the multi-head attention transformation (MHAT) network, which utilizes a transformer encoder to minimize training time for ABSC tasks. First, we use pre-trained global vectors for word representation (GloVe) for word and aspect-term embeddings. Second, part-of-speech (POS) features, which most existing methods neglect, are fused with MHAT to extract grammatical aspects of an input sentence. On the SemEval 2014 dataset, the proposed model consistently outperforms state-of-the-art methods on aspect-based sentiment classification tasks.
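The core building block the abstract refers to, multi-head scaled dot-product self-attention over token embeddings, can be sketched as follows. This is an illustrative toy in NumPy, not the authors' implementation: the random projection matrices stand in for learned weights, and the random input vectors stand in for GloVe embeddings; all dimensions and names are assumptions for demonstration only.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_attention(x, num_heads, rng):
    """Multi-head scaled dot-product self-attention.

    x: (seq_len, d_model) token embeddings.
    Returns: (seq_len, d_model) contextualized representations.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    # Random projections stand in for the learned Q/K/V/output weights.
    Wq, Wk, Wv, Wo = (rng.standard_normal((d_model, d_model)) * d_model**-0.5
                      for _ in range(4))
    Q, K, V = x @ Wq, x @ Wk, x @ Wv

    def split_heads(t):
        # (seq_len, d_model) -> (num_heads, seq_len, d_k)
        return t.reshape(seq_len, num_heads, d_k).transpose(1, 0, 2)

    Qh, Kh, Vh = split_heads(Q), split_heads(K), split_heads(V)
    scores = Qh @ Kh.transpose(0, 2, 1) / np.sqrt(d_k)  # (heads, seq, seq)
    weights = softmax(scores, axis=-1)                  # rows sum to 1
    heads = weights @ Vh                                # (heads, seq, d_k)
    # Concatenate heads back to (seq_len, d_model) and project.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

# Toy "sentence" of 4 token vectors (stand-ins for 8-dim GloVe embeddings).
rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))
out = multi_head_attention(x, num_heads=2, rng=rng)
print(out.shape)  # (4, 8)
```

In the paper's setting, such a block sits inside a transformer encoder, with GloVe word and aspect-term embeddings (plus POS features) as the input sequence.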