Publications

Hypergraph Neural Networks for Time-series Forecasting

IEEE International Conference on Big Data (BigData)

Publication date: December 15, 2023

Hongjie Chen, Ryan A. Rossi, Kanak Mahadik, Sungchul Kim, Hoda Eldardiry

Many existing deep graph models have shown that forecasting time-series values benefits from modeling the mutual relations between time-series. For example, graph neural networks can exploit the correlations between two CPU utilization time-series, facilitating more accurate predictions. However, the implied pairwise interactions between entities in the graph structure do not always reflect the actual interactions. In a cloud system, for instance, computing tasks are assigned to groups of machines, and CPU utilization time-series within the same group simultaneously interact with one another. Hence, such interactions are beyond-pairwise. In this paper, we propose a novel model called Hypergraph Recurrent Neural Networks (HGRNN) for time-series forecasting. Our model employs a hypergraph to model beyond-pairwise relations, which naturally reflects the actual interactions among entities. We also introduce a novel semi-principled hypergraph construction method to address the challenge of missing hypergraph information. Our model adopts the encoder-decoder framework, in which historical time-series are digested into an encoded state that is then decoded to yield predictions. We further integrate a temporal component to enhance learning from temporal locality. Extensive experiments on large-scale datasets show that our model achieves better forecasting performance than state-of-the-art baselines.
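As an illustration only (not the authors' released code), the minimal NumPy sketch below shows the general idea the abstract describes: a hypergraph incidence matrix carries beyond-pairwise group interactions, a recurrent cell encodes the historical series into a hidden state, and a decoder rolls that state forward to produce forecasts. All names and dimensions here (hypergraph_propagate, HGRNNCell, the incidence matrix H) are assumptions made for the sketch.

```python
# Hedged sketch of hypergraph message passing inside a recurrent forecasting step.
# This is NOT the paper's implementation; names and shapes are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def hypergraph_propagate(X, H):
    """Beyond-pairwise propagation: nodes -> hyperedges -> nodes.

    X: (n_nodes, d) per-node features at one time step
    H: (n_nodes, n_edges) incidence matrix; H[i, e] = 1 if node i belongs to hyperedge e
    """
    De = H.sum(axis=0, keepdims=True)               # hyperedge degrees, shape (1, n_edges)
    Dv = H.sum(axis=1, keepdims=True)               # node degrees, shape (n_nodes, 1)
    edge_msg = (H.T @ X) / np.maximum(De.T, 1)      # average the members of each hyperedge
    node_msg = (H @ edge_msg) / np.maximum(Dv, 1)   # gather hyperedge messages back to nodes
    return node_msg

class HGRNNCell:
    """Toy recurrent cell mixing each node's hidden state with its hypergraph context."""
    def __init__(self, d_in, d_hidden):
        self.Wx = rng.normal(scale=0.1, size=(d_in, d_hidden))
        self.Wh = rng.normal(scale=0.1, size=(d_hidden, d_hidden))
        self.Wg = rng.normal(scale=0.1, size=(d_hidden, d_hidden))

    def step(self, x_t, h, H):
        msg = hypergraph_propagate(h, H)            # beyond-pairwise group context
        return np.tanh(x_t @ self.Wx + h @ self.Wh + msg @ self.Wg)

# Usage example: 5 CPU-utilization series assigned to 2 machine groups (hyperedges).
n, d_hidden, T_hist, T_pred = 5, 8, 12, 3
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1],
              [0, 1]], dtype=float)                 # group membership as hyperedges
series = rng.random((T_hist, n, 1))                 # historical values, one feature per node

cell = HGRNNCell(d_in=1, d_hidden=d_hidden)
readout = rng.normal(scale=0.1, size=(d_hidden, 1))

# Encoder: digest the history into a hidden state per node.
h = np.zeros((n, d_hidden))
for t in range(T_hist):
    h = cell.step(series[t], h, H)

# Decoder: roll the encoded state forward, feeding each prediction back in.
x_t = series[-1]
forecasts = []
for _ in range(T_pred):
    h = cell.step(x_t, h, H)
    x_t = h @ readout
    forecasts.append(x_t.squeeze(-1))

print(np.stack(forecasts).shape)                    # (T_pred, n) predicted values
```

In a trained model the random weights above would be learned, and the hypergraph construction itself would come from the paper's semi-principled method rather than a hand-written incidence matrix.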


Research Area: AI & Machine Learning