Graph Recurrent Neural Networks (GRNN) excel at time-series prediction by modeling complicated non-linear relationships among time-series. However, most GRNN models target small datasets with only tens or hundreds of time-series, and therefore fail to handle the large-scale datasets of tens of thousands of time-series that arise in many real-world scenarios. To address this scalability issue, we propose Evolving Super Graph Neural Networks (ESGNN), which target large-scale datasets and significantly accelerate model training. ESGNN models multivariate time-series with super graphs, where each super node is associated with a set of time-series that are highly correlated with one another. To precisely model dynamic relationships between time-series, ESGNN quickly updates the super graphs on the fly, using locality-sensitive hashing (LSH) to construct the super edges. The embeddings of the super nodes are learned end to end and are then combined with each target time-series for forecasting. Experimental results show that ESGNN outperforms previous state-of-the-art methods with a significant runtime speedup (3X-40X faster) and space savings (5X-4600X less), while sacrificing little or no prediction accuracy. An ablation study is also conducted to investigate the effect of the number of super nodes and the graph update interval.
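To make the LSH-based super-node construction concrete, below is a minimal Python sketch of one plausible scheme: random-hyperplane signatures over correlation-normalized series, with series that share a signature grouped into a super node. The function name, the hashing scheme, and the normalization step are illustrative assumptions, not the paper's exact construction.

```python
# A minimal sketch (not the authors' implementation) of grouping correlated
# time-series into super nodes via locality-sensitive hashing.
import numpy as np
from collections import defaultdict

def lsh_super_nodes(series: np.ndarray, num_bits: int = 8, seed: int = 0):
    """Bucket time-series by the signs of their projections onto random
    hyperplanes; series sharing a signature form one super node."""
    rng = np.random.default_rng(seed)
    n, length = series.shape
    # Normalize each series so the hash reflects shape (correlation) rather than scale.
    centered = series - series.mean(axis=1, keepdims=True)
    normalized = centered / (np.linalg.norm(centered, axis=1, keepdims=True) + 1e-8)
    # One sign bit per random hyperplane forms the LSH signature.
    hyperplanes = rng.standard_normal((num_bits, length))
    signatures = (normalized @ hyperplanes.T) > 0  # shape (n, num_bits), boolean
    buckets = defaultdict(list)
    for idx, sig in enumerate(signatures):
        buckets[sig.tobytes()].append(idx)
    # Each bucket becomes a super node holding the indices of its member series.
    return list(buckets.values())

# Example: many series collapse into far fewer super nodes, which is where
# the runtime and memory savings would come from.
demo = np.random.default_rng(1).standard_normal((10_000, 96))
super_nodes = lsh_super_nodes(demo)
print(f"{demo.shape[0]} series -> {len(super_nodes)} super nodes")
```

Because hashing is cheap relative to learning a full pairwise graph, a scheme like this can be rerun periodically to update the super graph on the fly as relationships between time-series drift.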