
Although graph representation learning has been studied extensively in static graph settings, dynamic graphs are far less investigated in this context. This paper proposes a novel integrated variational framework called DYnamic mixture Variational Graph Recurrent Neural Networks (DyVGRNN), which incorporates additional latent random variables into both structural and temporal modelling. The framework integrates the Variational Graph Auto-Encoder (VGAE) and the Graph Recurrent Neural Network (GRNN) through a novel attention mechanism. The Gaussian Mixture Model (GMM) and the VGAE framework are combined in DyVGRNN to capture the multimodal nature of the data, which enhances performance. To account for the varying importance of time steps, the proposed method incorporates an attention-based module. The experimental results demonstrate that the method considerably outperforms state-of-the-art dynamic graph representation learning methods on link prediction and clustering tasks.
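The sketch below is not the authors' code; it is a minimal, illustrative Python/PyTorch example of the ingredients named in the abstract: a variational graph encoder with a Gaussian-mixture prior, a GRU carrying node states across graph snapshots, an attention weighting over past time steps, and an inner-product decoder for link prediction. All layer sizes, the class name DynamicMixtureVGAECell, and the simplified single-sample mixture KL term are assumptions made for illustration.

```python
# Illustrative sketch only (assumed architecture, not the published DyVGRNN code).
import torch
import torch.nn as nn
import torch.nn.functional as F


def gcn_propagate(adj, x):
    """Symmetric-normalised graph convolution: D^-1/2 (A + I) D^-1/2 X."""
    a_hat = adj + torch.eye(adj.size(0))
    deg_inv_sqrt = a_hat.sum(-1).clamp(min=1e-6).pow(-0.5)
    a_norm = deg_inv_sqrt.unsqueeze(1) * a_hat * deg_inv_sqrt.unsqueeze(0)
    return a_norm @ x


class DynamicMixtureVGAECell(nn.Module):
    """One time step: GCN encoder -> Gaussian posterior, GMM prior, GRU update."""

    def __init__(self, in_dim, hid_dim, z_dim, n_components=3):
        super().__init__()
        self.enc = nn.Linear(in_dim + hid_dim, hid_dim)
        self.mu = nn.Linear(hid_dim, z_dim)
        self.logvar = nn.Linear(hid_dim, z_dim)
        # Learnable mixture prior over the latent space (shared across nodes).
        self.prior_logits = nn.Parameter(torch.zeros(n_components))
        self.prior_mu = nn.Parameter(torch.randn(n_components, z_dim))
        self.gru = nn.GRUCell(z_dim, hid_dim)
        # Attention scores over the history of hidden states.
        self.attn = nn.Linear(hid_dim, 1)

    def forward(self, adj, x, h, history):
        # Encode the current snapshot together with the recurrent state.
        enc = torch.relu(gcn_propagate(adj, self.enc(torch.cat([x, h], dim=-1))))
        mu, logvar = self.mu(enc), self.logvar(enc)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        # Single-sample KL surrogate against the mixture prior (a crude
        # stand-in for the full mixture-of-Gaussians KL used in the paper).
        diff = z.unsqueeze(1) - self.prior_mu.unsqueeze(0)            # (N, K, z)
        comp_ll = -0.5 * diff.pow(2).sum(-1) + F.log_softmax(self.prior_logits, -1)
        kl = -torch.logsumexp(comp_ll, dim=1).mean()
        # Recurrent update, then an attention-weighted summary of all past
        # states, returned as the hidden state for the next snapshot.
        h_new = self.gru(z, h)
        history = history + [h_new]
        stack = torch.stack(history)                                  # (T, N, hid)
        weights = torch.softmax(self.attn(stack), dim=0)
        context = (weights * stack).sum(0)
        # Inner-product decoder for link prediction.
        adj_logits = z @ z.t()
        return adj_logits, kl, context, history


# Toy usage on two random snapshots of a 5-node graph.
if __name__ == "__main__":
    torch.manual_seed(0)
    cell = DynamicMixtureVGAECell(in_dim=4, hid_dim=8, z_dim=6)
    h, history = torch.zeros(5, 8), []
    for _ in range(2):
        adj = ((torch.rand(5, 5) > 0.5).float() + (torch.rand(5, 5) > 0.5).float().t() > 0).float()
        x = torch.randn(5, 4)
        logits, kl, h, history = cell(adj, x, h, history)
        loss = F.binary_cross_entropy_with_logits(logits, adj) + kl
        print(f"loss = {loss.item():.3f}")
```

In this sketch the reconstruction loss plus the KL term form a per-snapshot ELBO surrogate; the published method's exact objective, attention formulation, and mixture KL differ in detail.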

Original publication

DOI: 10.1016/j.neunet.2023.05.048

Type: Journal article

Journal: Neural Networks

Publication Date: 08/2023

Volume: 165

Pages: 596-610

Keywords: Attention mechanism, Dynamic graph representation learning, Dynamic node embedding, Graph recurrent neural network, Variational graph auto-encoder, Cluster Analysis, Learning, Neural Networks, Computer, Normal Distribution