Exploring Time Granularity on Temporal Graphs for Dynamic Link Prediction in Real-world Networks

Published: 20 Oct 2023, Last Modified: 22 Nov 2023 · TGL Workshop 2023 Long Paper
Keywords: Graph Representation Learning, Temporal Graphs, Dynamic Graph Neural Networks
TL;DR: We explore the optimal choice of time granularity for training DGNNs on dynamic graphs through extensive experimentation with real-world datasets.
Abstract: Dynamic Graph Neural Networks (DGNNs) have emerged as the predominant approach for processing dynamic graph-structured data. However, the influence of temporal information on model performance and robustness remains insufficiently explored, particularly regarding how models handle prediction tasks at different time granularities. In this paper, we investigate the impact of time granularity when training DGNNs on dynamic graphs through extensive experiments. We examine graphs drawn from various domains and compare three DGNNs against a baseline model across four time granularities. We focus on the interplay among time granularity, model architecture, and negative sampling strategy in order to draw general conclusions. Our results reveal that a sophisticated memory mechanism and a properly chosen time granularity are crucial for a DGNN to deliver competitive and robust performance on the dynamic link prediction task. We also discuss drawbacks of the considered models and datasets and propose promising directions for future research on the time granularity of temporal graphs.
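To make the notion of time granularity concrete, here is a minimal sketch (not the paper's actual preprocessing) of how timestamps on a continuous-time temporal edge list might be coarsened before training. It assumes Unix timestamps; the function name coarsen_timestamps and the example bucket sizes are illustrative assumptions.

import numpy as np

def coarsen_timestamps(timestamps: np.ndarray, granularity_s: int) -> np.ndarray:
    # Snap each event time to the start of its bucket; all events falling
    # into the same bucket become simultaneous at the coarser granularity.
    return (timestamps // granularity_s) * granularity_s

# Toy temporal edge list (src, dst, t), coarsened to hourly vs. daily buckets.
src = np.array([0, 1, 2, 0])
dst = np.array([1, 2, 0, 2])
t = np.array([1_697_000_000, 1_697_003_600, 1_697_090_000, 1_697_100_000])

hourly = coarsen_timestamps(t, 3_600)   # hour-level granularity
daily = coarsen_timestamps(t, 86_400)   # day-level granularity
print(hourly, daily)  # coarser buckets merge more events into one timestamp

Coarser buckets trade temporal resolution for larger groups of simultaneous events, which is the kind of trade-off the paper's experiments probe.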
Supplementary Material: zip
Format: Long paper, up to 8 pages. If the reviewers recommend it to be changed to a short paper, I would be willing to revise my paper to fit within 4 pages.
Submission Number: 25