
Dual Path Graph Convolutional Networks Overview
"Learn about Dual Path Graph Convolutional Networks (DPGCNs) and their advancements in capturing long-range information and improving graph convolutional network architectures for various applications. Explore the methods, including ResGCN, DenseGCN, and Higher Order Graph Recurrent Networks, to enhance performance and overcome common challenges. Dive into the potential of DPGCNs in revolutionizing graph-based machine learning models." (313 characters)
Presentation Transcript
Dual Path Graph Convolutional Networks
Yunhe Li, University of Montreal, Montreal, Quebec, Canada
Yaochen Hu, Yingxue Zhang, Huawei Noah's Ark Lab, Huawei Technologies Canada, Montreal, Quebec, Canada
Introduction
- Despite the huge success of GCNs, most of them are limited to shallow architectures and lack the ability to extract long-range information from high-order neighbours.
- Stacking more layers into a GCN may lead to several problems: vanishing gradients, over-smoothing, and over-squashing.
- Residual connections:
  - ResGCN: residual connections + dilated convolutions
  - DeeperGCN: novel generalized aggregation functions + a pre-activation version of residual connections
  - GCNII: initial residual connections + identity mapping; a deep GCN achieving at least the same performance as its shallow counterpart
Introduction
- Dense connections:
  - JKNet: combines all node feature vectors from the previous layers
  - DenseGCN: exploits dense connectivity among different GCN layers to flexibly use different neighborhood ranges and learn structure-aware representations while preserving node locality
  - Dense connections relieve the problem of over-smoothing and alleviate the over-squashing problem, but introduce high feature redundancy.
- Higher Order Graph Recurrent Networks (HOGRNs): unify most existing architectures of GCNs
- Dual Path Graph Convolutional Networks (DPGCNs): the architecture proposed in this work
Method
ResGCN:
$X_{l+1} = F_l(X_l) + X_l = \sum_{k=0}^{l} F_k(X_k) + X_0$
DenseGCN:
$X_{l+1} = \mathcal{H}\big(X_l, F_l(X_l)\big) = \mathcal{H}\big(X_0, F_0(X_0), F_1(X_1), \ldots, F_l(X_l)\big)$
where $\mathcal{H}$ denotes concatenation of the layer outputs.
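To make the two update rules concrete, here is a minimal PyTorch sketch, not the authors' implementation: GCNLayer (a plain normalized-adjacency convolution), the dense adjacency A_hat, and the fixed layer width are assumptions introduced for illustration.

```python
# Minimal sketch (not the paper's code) contrasting the ResGCN and DenseGCN
# update rules above. GCNLayer, the dense normalized adjacency A_hat, and all
# dimension choices are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One graph convolution: X' = ReLU(A_hat @ X @ W)."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, A_hat, X):
        return F.relu(A_hat @ self.linear(X))


class ResGCN(nn.Module):
    """X_{l+1} = F_l(X_l) + X_l  (all layers keep the same width)."""
    def __init__(self, dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList([GCNLayer(dim, dim) for _ in range(num_layers)])

    def forward(self, A_hat, X):
        for layer in self.layers:
            X = layer(A_hat, X) + X            # residual (summation) path
        return X


class DenseGCN(nn.Module):
    """X_{l+1} = H(X_0, F_0(X_0), ..., F_l(X_l))  (width grows by `dim` per layer)."""
    def __init__(self, dim, num_layers):
        super().__init__()
        self.layers = nn.ModuleList(
            [GCNLayer(dim * (l + 1), dim) for l in range(num_layers)]
        )

    def forward(self, A_hat, X):
        for layer in self.layers:
            X = torch.cat([X, layer(A_hat, X)], dim=-1)   # dense (concatenation) path
        return X
```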
Method - Higher Order Graph Recurrent Networks
$X_l = \Phi_l\big(F_0(X_0), F_1(X_1), \ldots, F_{l-1}(X_{l-1})\big)$
where $\Phi_l$ is an aggregation operator (e.g., a summation operator, concatenation operator, weighted summation operator, attention operator, or pooling operator) and each $F_k$ is a differentiable graph mapping.
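A minimal sketch of this HOGRN template, assuming the hypothetical GCNLayer from the previous sketch (or any other differentiable graph mapping) for the $F_k$ and two example choices for the aggregation operator $\Phi$; the function and variable names are illustrative, not the paper's code.

```python
# Hedged sketch of the HOGRN template: layer l aggregates graph mappings of
# *all* previous representations, X_l = Phi(F_0(X_0), ..., F_{l-1}(X_{l-1})).
# `mappings`, `aggregate`, and the example operators below are assumptions.
import torch


def hogrn_forward(A_hat, X0, mappings, aggregate):
    """mappings[k] plays the role of the differentiable graph mapping F_k;
    aggregate plays the role of the aggregation operator Phi."""
    history = [X0]                                   # X_0, X_1, ..., X_{l-1}
    for l in range(1, len(mappings) + 1):
        transformed = [mappings[k](A_hat, history[k]) for k in range(l)]
        history.append(aggregate(transformed))       # X_l = Phi(F_0(X_0), ..., F_{l-1}(X_{l-1}))
    return history[-1]


# Example aggregation operators Phi (summation requires equal widths; weighted
# summation, attention, or pooling can be plugged in the same way):
sum_aggregate = lambda tensors: torch.stack(tensors, dim=0).sum(dim=0)
concat_aggregate = lambda tensors: torch.cat(tensors, dim=-1)
```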
Method - Dual Path Graph Convolutional Networks
$X_l^{res} = \Phi^{res}\big(F_0^{res}(X_0), F_1^{res}(X_1), \ldots, F_{l-1}^{res}(X_{l-1})\big)$
$X_l^{den} = \Phi^{den}\big(F_0^{den}(X_0), F_1^{den}(X_1), \ldots, F_{l-1}^{den}(X_{l-1})\big)$
$X_l = \mathcal{H}\big(X_l^{res}, X_l^{den}\big)$
where the residual path aggregates with summation, the densely connected path aggregates with concatenation, and $\mathcal{H}$ fuses the two paths into the layer output $X_l$ of the graph $\mathcal{G}_l$.
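The dual-path update can be sketched as follows, reusing the hypothetical GCNLayer from the first sketch; the fusion operator (concatenation followed by a linear layer) and the way the two paths are carried between blocks are assumptions made for illustration, not the authors' implementation.

```python
# Hedged sketch of one dual-path block: a summation-aggregated residual path
# and a concatenation-aggregated dense path run in parallel and are fused by H.
# `GCNLayer` is the hypothetical layer from the ResGCN/DenseGCN sketch above.
import torch
import torch.nn as nn


class DualPathBlock(nn.Module):
    def __init__(self, res_dim, den_dim, growth_dim):
        super().__init__()
        self.res_map = GCNLayer(res_dim, res_dim)      # F_l^res
        self.den_map = GCNLayer(den_dim, growth_dim)   # F_l^den
        self.fuse = nn.Linear(res_dim + den_dim + growth_dim, res_dim)  # H (assumed fusion)

    def forward(self, A_hat, X_res, X_den):
        X_res = self.res_map(A_hat, X_res) + X_res                   # residual path: summation
        X_den = torch.cat([X_den, self.den_map(A_hat, X_den)], -1)   # dense path: concatenation
        X_l = self.fuse(torch.cat([X_res, X_den], dim=-1))           # X_l = H(X_l^res, X_l^den)
        return X_res, X_den, X_l   # the next block's den_dim is den_dim + growth_dim
```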
Datasets: Open Graph Benchmark (OGB)
Node property prediction:
- ogbn-proteins: protein-protein association networks; undirected, weighted, typed edges; 132,534 nodes and 39,561,252 edges; multi-label binary classification to predict the presence of protein functions; evaluated with ROC-AUC
- ogbn-arxiv: paper citation network; 169,343 nodes and 1,166,243 directed edges; predict the primary categories of the arXiv papers; evaluated with accuracy
Graph property prediction:
- ogbg-molhiv: molecular graphs; 41,127 graphs; predict the target molecular properties; evaluated with ROC-AUC
- ogbg-ppa: protein-protein association networks; 158,100 protein association graphs; predict which taxonomic group a graph originates from; evaluated with accuracy
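These benchmarks are distributed through the official ogb Python package; below is a minimal loading sketch, assuming the PyTorch Geometric backed dataset classes of the public OGB API (verify the exact API against the OGB documentation for your installed version).

```python
# Hedged sketch: loading two of the OGB datasets listed above with the official
# `ogb` package. Class and method names follow the public OGB API as understood
# here (PygNodePropPredDataset, get_idx_split, Evaluator); verify against docs.
from ogb.nodeproppred import PygNodePropPredDataset, Evaluator as NodeEvaluator
from ogb.graphproppred import PygGraphPropPredDataset, Evaluator as GraphEvaluator

# ogbn-arxiv: node property prediction, evaluated with accuracy
arxiv = PygNodePropPredDataset(name="ogbn-arxiv")
split_idx = arxiv.get_idx_split()        # dict with 'train' / 'valid' / 'test' node indices
citation_graph = arxiv[0]                # the single citation graph with features and labels
node_evaluator = NodeEvaluator(name="ogbn-arxiv")    # eval({'y_true': ..., 'y_pred': ...})

# ogbg-molhiv: graph property prediction, evaluated with ROC-AUC
molhiv = PygGraphPropPredDataset(name="ogbg-molhiv")
graph_split = molhiv.get_idx_split()     # dict of graph indices per split
graph_evaluator = GraphEvaluator(name="ogbg-molhiv")
```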