The rapid growth of graph-structured data in domains such as transportation, social networks, and biological systems has increased the demand for more adaptive and efficient Graph Neural Network (GNN) architectures. However, GNN performance remains highly sensitive to hyperparameter configurations, which are often tuned through computationally expensive manual or heuristic methods. This study proposes a novel Bayesian Meta-Learning (BML) framework for hyperparameter optimization of GNNs, aimed at improving predictive accuracy for complex network dynamics. The framework integrates Bayesian optimization with a meta-learning prior-adaptation mechanism, enabling the model to learn optimal hyperparameter distributions across multiple graph tasks. Experimental evaluations on three benchmark datasets (Cora, Citeseer, and PubMed), comprising up to 20,000 nodes with diverse structural complexities, demonstrate that the proposed BML-GNN framework achieves faster convergence, lower validation loss, and higher predictive accuracy than both a baseline GNN and conventional Bayesian optimization. Quantitatively, the BML-GNN model attains an R² score exceeding 0.97 together with a substantial reduction in RMSE, confirming its strong generalization capability. Despite these gains, the computational overhead of meta-training and the reliance on well-defined prior distributions remain potential limitations. Overall, the integration of Bayesian Meta-Learning provides a robust, scalable, and uncertainty-aware optimization strategy that advances the development of reliable GNN models for complex network modeling and intelligent system design.
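The sketch below is a minimal illustration of the kind of workflow the abstract describes: Bayesian optimization with a Gaussian-process surrogate and expected-improvement acquisition, warm-started by a prior fitted to hyperparameter configurations that performed well on earlier graph tasks. The two hyperparameters (log learning rate, log hidden width), the prior values, and the synthetic `validation_loss` stand-in are assumptions made for illustration, not the paper's actual pipeline or code.

```python
# Illustrative sketch (assumed, not the authors' implementation) of
# Bayesian optimization of GNN hyperparameters seeded by a meta-learned prior.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

# Meta-learning step (assumed): summarize good configs from previous tasks.
# Each row: [log10(learning_rate), log2(hidden_units)].
past_best_configs = np.array([[-2.1, 6.0], [-2.4, 7.0], [-1.9, 6.0]])
prior_mean = past_best_configs.mean(axis=0)
prior_std = past_best_configs.std(axis=0) + 0.1  # keep some exploration

def sample_from_prior(n):
    """Draw candidate hyperparameters from the meta-learned Gaussian prior."""
    return rng.normal(prior_mean, prior_std, size=(n, 2))

def validation_loss(x):
    """Stand-in objective: in the real framework this would train a GNN on,
    e.g., Cora and return its validation loss."""
    lr_term = (x[0] + 2.0) ** 2            # minimum near lr = 1e-2
    width_term = 0.1 * (x[1] - 6.0) ** 2   # minimum near 64 hidden units
    return lr_term + width_term + 0.05 * rng.normal()

def expected_improvement(gp, X_cand, best_y):
    """Standard EI acquisition for minimization."""
    mu, sigma = gp.predict(X_cand, return_std=True)
    sigma = np.maximum(sigma, 1e-9)
    z = (best_y - mu) / sigma
    return (best_y - mu) * norm.cdf(z) + sigma * norm.pdf(z)

# Bayesian optimization loop, warm-started and biased by the prior.
X = sample_from_prior(5)
y = np.array([validation_loss(x) for x in X])

for _ in range(20):
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)
    candidates = sample_from_prior(256)
    ei = expected_improvement(gp, candidates, y.min())
    x_next = candidates[np.argmax(ei)]
    X = np.vstack([X, x_next])
    y = np.append(y, validation_loss(x_next))

best = X[np.argmin(y)]
print(f"best log10(lr) = {best[0]:.2f}, best log2(hidden) = {best[1]:.2f}")
```

In a full realization of the framework, the synthetic objective would be replaced by actual GNN training runs on the target graph, and the prior statistics would be updated as each new task contributes its own best-found configuration.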