Temporal graph modeling has become increasingly important for understanding and forecasting the dynamics of complex systems that evolve over time. A central challenge in temporal graph learning lies in identifying graph neural network (GNN) architectures that can effectively capture both spatial dependencies and temporal dynamics. This study presents a comprehensive benchmarking analysis of widely used GNN architectures, namely the Graph Convolutional Network (GCN), GraphSAGE, the Graph Attention Network (GAT), the Chebyshev Network (ChebNet), and Simple Graph Convolution (SGC), each integrated with recurrent mechanisms for temporal modeling. The evaluation is conducted on the WikiMaths dataset, a temporal graph dataset representing daily user visits to mathematics-related Wikipedia articles. Experimental results demonstrate that the choice of graph convolution operator significantly impacts temporal forecasting performance, with GraphSAGE and ChebNet consistently outperforming the other architectures. This work provides empirical insights into the strengths and limitations of established temporal GNN models, contributing to a clearer understanding of their applicability in dynamic graph forecasting tasks.
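The abstract does not name an implementation stack; as a point of reference, a minimal sketch of the kind of recurrent-GNN forecasting pipeline it describes, written against the PyTorch Geometric Temporal library (which distributes WikiMaths via `WikiMathsDatasetLoader`), might look like the following. The hidden size, lag window, learning rate, and epoch count are illustrative placeholders, not the paper's settings, and `GConvGRU` (a Chebyshev-filtered recurrent cell) stands in for whichever graph operator is being benchmarked.

```python
import torch
import torch.nn.functional as F
from torch_geometric_temporal.dataset import WikiMathsDatasetLoader
from torch_geometric_temporal.signal import temporal_signal_split
from torch_geometric_temporal.nn.recurrent import GConvGRU

# Load WikiMaths visit-count snapshots; `lags` sets how many past
# observations form each node's feature vector (placeholder value).
loader = WikiMathsDatasetLoader()
dataset = loader.get_dataset(lags=8)
train_dataset, test_dataset = temporal_signal_split(dataset, train_ratio=0.9)


class RecurrentGNN(torch.nn.Module):
    """A graph-convolutional GRU followed by a linear readout.

    Replacing GConvGRU's internal graph operator (here Chebyshev, K=2)
    with GCN-, GraphSAGE-, GAT-, or SGC-style convolutions is the kind
    of architectural variation the study compares.
    """

    def __init__(self, node_features: int, hidden: int = 32):
        super().__init__()
        self.recurrent = GConvGRU(node_features, hidden, K=2)
        self.readout = torch.nn.Linear(hidden, 1)

    def forward(self, x, edge_index, edge_weight):
        h = F.relu(self.recurrent(x, edge_index, edge_weight))
        return self.readout(h)


model = RecurrentGNN(node_features=8)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

# Train by regressing next-step visit counts over the snapshot sequence,
# accumulating mean squared error across all time steps per epoch.
model.train()
for epoch in range(5):
    cost = 0.0
    for t, snapshot in enumerate(train_dataset):
        y_hat = model(snapshot.x, snapshot.edge_index, snapshot.edge_attr)
        cost = cost + torch.mean((y_hat.squeeze() - snapshot.y) ** 2)
    cost = cost / (t + 1)
    cost.backward()
    optimizer.step()
    optimizer.zero_grad()
```

Under these assumptions, swapping the recurrent cell's graph operator while holding the readout, loss, and training loop fixed would correspond to the controlled comparison of GCN, GraphSAGE, GAT, ChebNet, and SGC that the study reports.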