The evolution of sentiment analysis toward Aspect-Based Sentiment Analysis (ABSA) has made significant progress thanks to deep learning, particularly the Bidirectional Encoder Representations from Transformers (BERT) architecture. Despite its growing popularity, a comprehensive synthesis of global research patterns and optimal model configurations is still lacking. This study presents a Systematic Literature Review (SLR) combined with bibliometric analysis to examine BERT-based ABSA research indexed in Scopus. Using the PRISMA protocol for article selection and VOSviewer for visualization, a total of 62 eligible articles published up to mid-2025 were analyzed. The results show a strong upward publication trend peaking in 2024, with China, India, and Indonesia emerging as the major contributors in this domain. Furthermore, the review identifies a de facto technical standard for effective model training: the Adam optimizer is the dominant choice, typically paired with a learning rate between 1e-5 and 2e-5 and a batch size of 16. For performance evaluation, Accuracy and F1-score serve as the standard metrics. These findings offer strategic guidance for researchers optimizing BERT implementations and identify future directions for more in-depth sentiment analysis tasks.
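The consensus training configuration reported above can be sketched in code. The following Python snippet is an illustrative sketch only: the `build_training_config` helper and its field names are hypothetical and not drawn from any surveyed paper; only the hyperparameter values (Adam, learning rate 1e-5 to 2e-5, batch size 16, Accuracy/F1 metrics) come from the review's findings.

```python
# Illustrative sketch of the consensus fine-tuning configuration reported
# across the surveyed BERT-based ABSA studies. The helper name and dict
# structure are hypothetical; only the hyperparameter values are sourced
# from the review.

def build_training_config(learning_rate: float = 2e-5,
                          batch_size: int = 16,
                          optimizer: str = "Adam") -> dict:
    """Return a training configuration, validating that the learning
    rate falls in the 1e-5 to 2e-5 range most studies report."""
    if not (1e-5 <= learning_rate <= 2e-5):
        raise ValueError("learning rate outside the commonly reported range")
    return {
        "optimizer": optimizer,          # Adam dominates the surveyed papers
        "learning_rate": learning_rate,
        "batch_size": batch_size,        # 16 is the most frequent choice
        "metrics": ["accuracy", "f1"],   # de facto standard evaluation metrics
    }

config = build_training_config()
print(config["optimizer"], config["learning_rate"], config["batch_size"])
```

In practice these values would be passed to a training loop or trainer API; the point here is only to make the reported configuration concrete.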