Identifying epileptogenic zones (EZs) is a crucial step in the pre-surgical evaluation of patients with drug-resistant epilepsy. Conventional methods, including visual inspection of EEG/SEEG recordings and neurofunctional imaging, are often limited by subjectivity and by challenges in accuracy and reproducibility. The rapid development of artificial intelligence (AI) techniques for signal processing and neuroscience has driven their growing use in EZ detection. This systematic review explores recent developments in AI applications for localizing epileptogenic zones, focusing on algorithm types, dataset characteristics, and performance outcomes. A comprehensive literature search was conducted in 2025 across databases including ScienceDirect, Springer Nature, and IEEE Xplore using relevant keyword combinations. Study selection followed PRISMA guidelines and yielded 34 scientific articles published between 2020 and 2024. Extracted data included AI methods, algorithm types, dataset modalities, and performance metrics (accuracy, AUC, sensitivity, and F1-score). Results showed that deep learning was the most widely used approach (44%), followed by machine learning (35%), multi-method approaches (18%), and knowledge-based systems (3%). Convolutional neural networks (CNNs) and artificial neural networks (ANNs) were the most commonly applied algorithms, particularly in studies based on scalp EEG and SEEG. Datasets ranged from public sources (Bonn, CHB-MIT) to high-resolution clinical SEEG recordings. Multimodal and hybrid models demonstrated superior performance, with several studies reporting accuracy above 98%. This review confirms that AI, especially deep learning combined with SEEG and multimodal integration, has strong potential to improve the precision, efficiency, and scalability of EZ detection. To facilitate clinical adoption, future research should focus on standardizing data pipelines, validating AI models in real-world clinical settings, and developing explainable, ethically responsible AI systems.
Copyright © 2025