Social media sentiment analysis is an important area of natural language processing (NLP) for understanding public opinion about a topic, product, or policy. This study analyzes social media user sentiment by combining Bidirectional Encoder Representations from Transformers (BERT) with a Long Short-Term Memory (LSTM) network. The BERT model extracts contextual features from text, while the LSTM captures long-term dependencies in sequential data. The dataset consists of Indonesian-language social media posts labeled with three sentiment categories: positive, negative, and neutral. The research process includes text preprocessing, tokenization, weighting, model training, and performance evaluation using accuracy, precision, recall, and F1-score metrics. Test results show that the combined BERT-LSTM model outperforms either model used alone, achieving an accuracy above 90%. The study demonstrates that the BERT-LSTM hybrid approach is effective for capturing semantic context in complex social media texts. These findings are expected to contribute to the development of data-driven opinion analysis and decision-support systems in the digital era.
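The abstract does not specify the implementation details of the hybrid model, so the following is only a minimal sketch of how a BERT encoder can be stacked with an LSTM for three-class sentiment classification. The pretrained checkpoint name (indobenchmark/indobert-base-p1), the LSTM hidden size, and the use of the final LSTM step as the sequence summary are assumptions for illustration, not choices taken from the paper.

```python
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class BertLstmClassifier(nn.Module):
    """BERT encoder followed by a bidirectional LSTM and a linear head (3 classes)."""

    def __init__(self, pretrained="indobenchmark/indobert-base-p1",
                 lstm_hidden=256, num_classes=3):
        super().__init__()
        self.bert = AutoModel.from_pretrained(pretrained)
        self.lstm = nn.LSTM(input_size=self.bert.config.hidden_size,
                            hidden_size=lstm_hidden,
                            batch_first=True,
                            bidirectional=True)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token embeddings from BERT: (batch, seq_len, hidden_size)
        outputs = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        token_embeddings = outputs.last_hidden_state
        # LSTM over the token sequence models longer-range dependencies
        lstm_out, _ = self.lstm(token_embeddings)
        # Summarize the sequence with the last LSTM step, then project to 3 classes
        logits = self.classifier(lstm_out[:, -1, :])
        return logits

# Example usage (hypothetical input sentence)
tokenizer = AutoTokenizer.from_pretrained("indobenchmark/indobert-base-p1")
model = BertLstmClassifier()
batch = tokenizer(["pelayanannya sangat memuaskan"],  # "the service was very satisfying"
                  padding=True, truncation=True, return_tensors="pt")
logits = model(batch["input_ids"], batch["attention_mask"])  # shape: (1, 3)
```

In this kind of setup the model is typically trained with cross-entropy loss on the positive/negative/neutral labels and evaluated with accuracy, precision, recall, and F1-score, matching the metrics listed in the abstract.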