This research proposes mapping Indonesian sentences with single and multiple structures into emotion classes through a multi-label classification process. The results can be applied in various fields, including the generation of facial expressions in virtual character animation, as well as facial expression analysis, human-computer interaction systems, and other virtual facial character applications. In previous research, the classification process used for emotion mapping was usually based only on the frequency of occurrence of adjectives, so the resulting emotion classes did not adequately represent sentence semantics. The sequential models proposed here take the semantics of the sentence into account, so the classification results are more natural and more representative of sentence meaning. The emotion mapping method is multi-label text classification with continuous label values between 0 and 1. This research also produces a tolerant method that uses the prediction error to measure accuracy during model evaluation: a predicted label whose error is less than or equal to the error-tolerance value is converted to the actual label, yielding better accuracy. The models used in the classification process are sequential models, namely a one-dimensional Convolutional Neural Network (CNN) and a bidirectional Long Short-Term Memory (LSTM) network. The CNN generates feature maps from local segments of each input, while the bidirectional LSTM captures information from the input sequence in both directions. Experiments were performed on Indonesian test sentences. Based on the experimental results, the bidirectional LSTM achieves an accuracy of 91% with an 8:2 train-test data split and an error tolerance of 0.09.

Keywords: Sequential Model, Mapping Compound Emotions, Sentence Semantics, Indonesian Sentences
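
The tolerant method described above can be read as an evaluation metric: a predicted label value is counted as correct whenever its absolute error from the actual value is within the error tolerance. The sketch below is a minimal NumPy illustration of that reading; the function name, the use of absolute error, and the example label values are assumptions, not taken from the paper.

```python
import numpy as np

def tolerant_accuracy(y_true, y_pred, error_tolerance=0.09):
    """Accuracy under the tolerant method described in the abstract.

    Any predicted label value whose absolute error from the actual value
    is <= error_tolerance is converted to the actual value; accuracy is
    then the fraction of label values that match exactly.
    (Matching rule is an assumption for illustration.)
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)

    error = np.abs(y_pred - y_true)
    # Replace near-correct predictions with the actual label value.
    adjusted = np.where(error <= error_tolerance, y_true, y_pred)
    return float(np.mean(adjusted == y_true))

# Example: 2 sentences, 4 emotion labels with continuous values in [0, 1].
actual = np.array([[0.80, 0.00, 0.20, 0.00],
                   [0.10, 0.70, 0.00, 0.30]])
predicted = np.array([[0.75, 0.05, 0.30, 0.00],
                      [0.12, 0.65, 0.02, 0.50]])
print(tolerant_accuracy(actual, predicted, error_tolerance=0.09))  # 0.75
```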
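For the two sequential models mentioned in the abstract, the following is a minimal Keras sketch, assuming placeholder hyperparameters (vocabulary size, sequence length, embedding size, number of emotion labels, and filter/unit counts are not from the paper). Both models end in a sigmoid layer so each emotion label receives a continuous value in [0, 1].

```python
from tensorflow.keras import Sequential, layers

# Placeholder assumptions, not values reported in the paper.
VOCAB_SIZE, MAX_LEN, EMBED_DIM, NUM_EMOTIONS = 10000, 50, 128, 6

def build_cnn_model():
    # Conv1D builds feature maps over local windows of the word embeddings.
    return Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.GlobalMaxPooling1D(),
        layers.Dense(NUM_EMOTIONS, activation="sigmoid"),
    ])

def build_bilstm_model():
    # The bidirectional LSTM reads each sentence forward and backward.
    return Sequential([
        layers.Input(shape=(MAX_LEN,)),
        layers.Embedding(VOCAB_SIZE, EMBED_DIM),
        layers.Bidirectional(layers.LSTM(64)),
        layers.Dense(NUM_EMOTIONS, activation="sigmoid"),
    ])

model = build_bilstm_model()
# Continuous multi-label targets, so a regression-style loss is assumed here.
model.compile(optimizer="adam", loss="mse", metrics=["mae"])
model.summary()
```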