The convergence of edge and cloud computing paradigms has emerged as a critical architectural approach to real-time system optimization. This survey synthesizes recent developments in edge–cloud synergy, examining how combining edge computing's ultra-low latency with cloud computing's massive computational resources addresses the growing demands of real-time applications in industrial systems, smart cities, and Internet of Things (IoT) environments. Through a comprehensive analysis of research published between 2021 and 2025, this study identifies four primary research trajectories: architecture and orchestration patterns, AI-based optimization and predictive maintenance, resource scheduling mechanisms, and vertical domain applications. Quantitative evidence demonstrates that hybrid edge–cloud architectures achieve a 10–15× latency reduction over cloud-only approaches, bandwidth savings exceeding 90%, energy efficiency improvements of 22–42%, and accuracy approaching 90% in anomaly detection scenarios. However, significant challenges persist in resource management, security frameworks, and standardization. This review characterizes the current state of edge–cloud synergy and identifies critical research directions for advancing real-time system optimization in next-generation networks.