Outlier detection is a critical task in informatics, particularly for analyzing large, complex datasets such as electrical energy consumption records. Identifying anomalies enables the recognition of abnormal usage patterns and potential non-technical losses, both of which are essential for ensuring reliability and efficiency in smart grid systems. However, conventional supervised learning approaches are often unsuitable because real-world consumption data are unlabeled and highly imbalanced. To address the challenge of validating unsupervised models without ground truth, this study uses a controlled synthetic dataset with precisely injected anomalies. This design enables a rigorous comparative evaluation of two widely adopted algorithms, Isolation Forest and One-Class Support Vector Machine (OC-SVM). The analysis examines detection accuracy, F1-score, and computational efficiency under identical experimental conditions. Results demonstrate that Isolation Forest consistently achieves superior performance, attaining a detection accuracy of 0.9948 and an F1-score of 0.9478, significantly outperforming OC-SVM, which yielded an accuracy of 0.9521 and an F1-score of only 0.5108. Furthermore, Isolation Forest proved exceptionally efficient, requiring only 0.9207 seconds of computation, approximately 21 times faster than OC-SVM (19.9460 seconds). These advantages highlight its scalability and suitability for large-scale, near-real-time monitoring applications. Overall, the findings provide empirical evidence of Isolation Forest's effectiveness and offer practical guidance on algorithm selection for smart grid analytics.
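The evaluation protocol described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' actual pipeline: the synthetic data generation, the ~5% contamination rate, and the hyperparameters (`contamination=0.05`, `nu=0.05`, RBF kernel) are assumptions for demonstration only, and the resulting scores will not match the paper's reported figures.

```python
# Illustrative sketch (assumed setup, not the study's pipeline): fit
# Isolation Forest and OC-SVM on synthetic consumption-like data with
# injected anomalies, then compare accuracy, F1-score, and runtime.
import time
import numpy as np
from sklearn.ensemble import IsolationForest
from sklearn.svm import OneClassSVM
from sklearn.metrics import accuracy_score, f1_score

rng = np.random.default_rng(42)
n_normal, n_anom = 9500, 500  # ~5% injected anomalies (assumed rate)

# Normal readings clustered around a typical load level; anomalies
# injected as abnormally high readings (hypothetical generation scheme).
normal = rng.normal(loc=10.0, scale=2.0, size=(n_normal, 2))
anomalies = rng.uniform(low=25.0, high=40.0, size=(n_anom, 2))
X = np.vstack([normal, anomalies])
y_true = np.r_[np.zeros(n_normal), np.ones(n_anom)]  # 1 = anomaly

for name, model in [
    ("Isolation Forest", IsolationForest(contamination=0.05, random_state=0)),
    ("OC-SVM", OneClassSVM(nu=0.05, kernel="rbf", gamma="scale")),
]:
    start = time.perf_counter()
    pred = model.fit_predict(X)        # +1 = inlier, -1 = outlier
    elapsed = time.perf_counter() - start
    y_pred = (pred == -1).astype(int)  # map to 1 = anomaly
    print(f"{name}: accuracy={accuracy_score(y_true, y_pred):.4f} "
          f"F1={f1_score(y_true, y_pred):.4f} time={elapsed:.2f}s")
```

Because both models are fit on the same data under identical conditions, the printed accuracy, F1-score, and wall-clock time are directly comparable, mirroring the comparison design used in the study.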
Copyright © 2026