This paper explores the application of quantum-inspired optimization algorithms to the training of large-scale Graph Neural Networks (GNNs) in distributed cloud-edge environments. GNNs have attracted significant attention for their ability to model complex relationships in graph-structured data, yet training them is challenging: computational demands are high, resource allocation is often inefficient, and convergence is slow, especially on large datasets. Traditional meta-heuristic algorithms, while useful, often suffer scalability and performance limitations at this scale. To address these challenges, we propose a quantum-inspired meta-heuristic that leverages quantum principles, such as superposition and entanglement, to enhance the optimization process. The algorithm is integrated into a hybrid cloud-edge system in which computational tasks are dynamically distributed between edge nodes and the cloud, improving resource utilization and reducing latency. Our experimental results show marked improvements in training speed, resource efficiency, and convergence rate over traditional optimization methods such as Genetic Algorithms and Simulated Annealing. The quantum-inspired algorithm not only accelerates training but also reduces memory usage, making it well suited to large-scale GNN applications. The hybrid cloud-edge architecture further improves scalability by balancing computational load and enabling real-time data processing. These findings suggest that quantum-inspired optimization can substantially improve GNN training in distributed systems, opening new avenues for real-time applications such as social network analysis, anomaly detection, and recommendation systems.
Future work will focus on refining these algorithms to handle even larger datasets and more complex GNN architectures, with potential integration into edge devices for enhanced real-time decision-making.
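The abstract does not specify which quantum-inspired meta-heuristic is used, but the superposition principle it mentions is commonly realized through a Q-bit representation, as in quantum-inspired evolutionary algorithms (Han and Kim). The sketch below is purely illustrative and not the authors' method: it runs such an algorithm on a toy OneMax objective (a stand-in for, e.g., a GNN hyperparameter search), and all names and constants here (`qiea_onemax`, the rotation step of 0.05π) are assumptions for the example.

```python
import math
import random


def qiea_onemax(n_bits=20, pop_size=8, generations=100, seed=0):
    """Illustrative quantum-inspired evolutionary algorithm on OneMax.

    Each individual is a vector of Q-bits (alpha, beta) with
    alpha^2 + beta^2 = 1; "observing" a Q-bit collapses it to 0 or 1
    with probability beta^2 of yielding 1, so each individual encodes
    a superposition of many candidate bitstrings at once.
    """
    rng = random.Random(seed)
    amp = 1 / math.sqrt(2)  # start every Q-bit in equal superposition
    pop = [[[amp, amp] for _ in range(n_bits)] for _ in range(pop_size)]
    best_bits, best_fit = None, -1

    def observe(individual):
        # Collapse each Q-bit to a classical bit.
        return [1 if rng.random() < b * b else 0 for (a, b) in individual]

    for _ in range(generations):
        for ind in pop:
            bits = observe(ind)
            fit = sum(bits)  # OneMax: fitness = number of 1-bits
            if fit > best_fit:
                best_fit, best_bits = fit, bits[:]
            # Rotation gate: nudge each Q-bit's amplitudes toward the
            # best solution found so far (illustrative fixed angle).
            for j, (a, b) in enumerate(ind):
                if bits[j] != best_bits[j]:
                    theta = 0.05 * math.pi if best_bits[j] == 1 else -0.05 * math.pi
                    ind[j] = [a * math.cos(theta) - b * math.sin(theta),
                              a * math.sin(theta) + b * math.cos(theta)]
    return best_bits, best_fit
```

In this scheme the population explores many bitstrings per generation through probabilistic collapse, while the rotation gate gradually biases the amplitudes toward the best observed solution; replacing OneMax with a training-cost objective over resource-allocation decisions would be the analogous (hypothetical) use in a cloud-edge setting.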