
This research introduces Variational Graph Attention Dynamics (VarGATDyn), a model for dynamic graph representation learning, a setting where existing models tailored for static graphs prove inadequate. VarGATDyn combines attention mechanisms with a Markovian assumption to overcome the challenges of maintaining temporal consistency and the large dataset requirements typical of RNN-based frameworks. It builds on the strengths of the Variational Graph Auto-Encoder (VGAE) framework, Graph Attention Networks (GAT), and Gaussian Mixture Models (GMM) to capture the temporal and structural intricacies of dynamic graphs. Through GMMs, the model handles multimodal patterns and corrects misalignments between the prior and the estimated posterior distributions. A multiple-learning methodology further improves the model's adaptability, yielding a comprehensive and effective learning process. Empirical tests show that VarGATDyn outperforms competing methods in dynamic link prediction across various datasets, highlighting its ability to capture multimodal distributions and temporal dynamics.
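The abstract's core ingredients, attention-based neighbour aggregation, a variational encoder, and a Gaussian-mixture prior over latent node embeddings, can be illustrated with a minimal numpy sketch. This is not the paper's implementation: the graph, weights, mixture components, and all function names here are illustrative assumptions, and the attention layer is a simplified single-head GAT.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph snapshot: 4 nodes with 2 features each (hypothetical data).
X = rng.normal(size=(4, 2))
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

def gat_layer(X, A, W, a):
    """Simplified single-head graph attention aggregation."""
    H = X @ W                                   # linear transform of features
    n = H.shape[0]
    scores = np.full((n, n), -np.inf)           # -inf masks non-neighbours
    for i in range(n):
        for j in range(n):
            if A[i, j] or i == j:               # attend over neighbours + self
                scores[i, j] = np.tanh(a @ np.concatenate([H[i], H[j]]))
    alpha = np.exp(scores - scores.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)   # row-wise softmax attention
    return alpha @ H

def gmm_logpdf(z, weights, means, var=1.0):
    """Log-density of z under an isotropic Gaussian mixture prior."""
    d = z.shape[-1]
    comps = []
    for w, m in zip(weights, means):
        ll = -0.5 * (np.sum((z - m) ** 2, axis=-1) / var
                     + d * np.log(2 * np.pi * var))
        comps.append(np.log(w) + ll)
    return np.logaddexp.reduce(comps, axis=0)   # log-sum-exp over components

# Encoder: GAT aggregation, then per-node mean / log-variance heads
# (here the heads are trivial placeholders rather than learned layers).
W = rng.normal(size=(2, 2))
a = rng.normal(size=4)
H = gat_layer(X, A, W, a)
mu, logvar = H, np.zeros_like(H)

# Reparameterised latent sample and its log-density under a
# two-component GMM prior, which can represent multimodal structure
# that a single Gaussian prior cannot.
z = mu + np.exp(0.5 * logvar) * rng.normal(size=mu.shape)
prior_ll = gmm_logpdf(z, weights=[0.5, 0.5],
                      means=[np.zeros(2), np.ones(2)])
print(prior_ll.shape)  # one log-density per node
```

In a full VGAE-style model, `prior_ll` would enter the KL term of the evidence lower bound, and a decoder would reconstruct edges from inner products of the latent vectors `z`; the Markovian variant conditions each snapshot's prior on the previous snapshot's embeddings.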

Original publication

DOI: 10.1016/j.knosys.2024.112110

Type: Journal article

Publication Date: 2024-09-05

Volume: 299

Keywords: Deep generative models, Dynamic graph embedding, Graph attention network, Graph variational neural networks, Markovian assumptions