ArmGAN: Adversarial Representation Learning for Network Embedding

Network embedding aims to learn low-dimensional representations of nodes in a network, which can be used for many downstream network analysis tasks. Recently, many network embedding methods based on Generative Adversarial Networks (GANs) have been proposed. However, GAN-based methods mainly face two challenges: (1) Existing GAN-based methods often use GANs to learn Gaussian distributions as … Read more

Introduction to GAN: Understanding Generative Adversarial Networks

Table of Contents: What is GAN? · What Can GAN Do? · Framework and Training of GAN · Similarities and Differences Between GAN and Other Generative Models · Existing Issues with GAN Models. Introduction: GANs have gained significant popularity in the image domain over the past year, and there is a recent trend of them making inroads into … Read more

Overview of Graph Neural Networks: Dynamic Graphs

Introduction: Graph neural networks (GNNs) have been widely applied to the modeling and representation learning of graph-structured data. However, mainstream research has been limited to static network data, while real-world complex networks often evolve in both structure and properties over time. The team led by Katarzyna at the University of Technology Sydney recently published a … Read more

RNN Learns Suitable Hidden Dimensions with White Noise

Abstract: Neural networks need the right representations of input data in order to learn. A new study, recently published in Nature Machine Intelligence, examines how gradient learning shapes a fundamental property of representations in recurrent neural networks (RNNs): their dimensionality. Through simulations and mathematical analysis, the study demonstrates how gradient descent guides RNNs to compress the dimensionality of … Read more

Understanding Transformers in Graph Neural Networks

Compiled by: ronghuaiyang. Introduction: The aim of this article is to build intuition behind the Transformer architecture in NLP and its connection to Graph Neural Networks. Engineer friends often ask me: "graph deep learning" sounds … Read more

A Review of Transformers at the Forefront of GNN

This article is about 4,500 words long, with a recommended reading time of over 10 minutes. It introduces Graphormer, a graph representation learning method based on the standard Transformer architecture. Introduction: The Transformer architecture has shown excellent performance in fields such as natural language processing and computer vision, but it performs … Read more

Understanding GAN Applications in Network Feature Learning

This article is a transcript of the live sharing session given by Wang Hongwei, a PhD student at Shanghai Jiao Tong University and an intern at Microsoft Research Asia, on January 10 during the 23rd PhD Talk. Network representation learning (network embedding) has emerged in recent years as a branch of feature learning research. As a dimensionality … Read more