Node Embeddings in Graph Machine Learning


Explore the applications and limitations of node embeddings in graph machine learning, including clustering, node classification, link prediction, and graph classification. Learn how to utilize node embeddings effectively and understand the drawbacks of shallow encoding methods in graph data analysis.

  • Node Embeddings
  • Graph Machine Learning
  • Limitations
  • Transductive Learning
  • Inductive Learning


Presentation Transcript


  1. Machine Learning with Graphs 3-3: Applications and Limitations of Shallow Encoding. Outline: (1) Applications of Node Embeddings, (2) Limitations of Shallow Encoding, (3) Transductive vs. Inductive Learning. Reference: Jure Leskovec, Stanford CS224W: Machine Learning with Graphs. Presented by Chou Yuan-Tung (ton731@gmail.com).

  2. What After Node Embeddings? Once we have node embeddings (which are independent of any task), we can feed them into a downstream prediction pipeline: Input Graph → Node Embeddings (shallow encoding) → Learning Algorithm → Prediction. The prediction can be node-level, edge-level, or graph-level, and the learning algorithm can be any standard model such as an SVM, Random Forest, XGBoost, or a DNN.
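The pipeline above can be sketched end to end. This is a minimal illustration, not the lecture's code: the embeddings are synthetic stand-ins for a real method's output, and the downstream "learning algorithm" is a deliberately simple nearest-centroid classifier rather than the SVM/XGBoost models named on the slide.

```python
import numpy as np

# Hypothetical setup: precomputed node embeddings (as a shallow encoder such as
# DeepWalk/node2vec would produce), one row per node; two well-separated classes.
rng = np.random.default_rng(0)
Z = np.vstack([rng.normal(0.0, 0.1, size=(5, 4)),   # nodes of class 0
               rng.normal(1.0, 0.1, size=(5, 4))])  # nodes of class 1
labels = np.array([0] * 5 + [1] * 5)

train_idx = np.array([0, 1, 2, 5, 6, 7])  # nodes with known labels
test_idx = np.array([3, 4, 8, 9])         # nodes to predict

# Simple downstream model: assign each test node to the nearest class centroid
# in embedding space (stand-in for any classifier trained on the embeddings).
centroids = np.stack([Z[train_idx][labels[train_idx] == c].mean(axis=0)
                      for c in (0, 1)])
dists = np.linalg.norm(Z[test_idx, None, :] - centroids[None, :, :], axis=-1)
pred = np.argmin(dists, axis=1)
print(pred)  # [0 0 1 1]: the clusters are well separated, so all four are correct
```

The key point is the separation of concerns: the embedding step never sees the labels, and the downstream model never sees the graph.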

  3. How to Use these Node Embeddings? Given a node i in a graph with embedding Zi, we can do:
  • Clustering/community detection: cluster the embeddings Zi.
  • Node classification: predict the label of node i based on Zi.
  • Link prediction: predict the edge (i, j) based on (Zi, Zj), combining the pair into an edge feature f(Zi, Zj), e.g.:
    – Concatenate: f(Zi, Zj) = g([Zi, Zj])
    – Hadamard: f(Zi, Zj) = g(Zi * Zj) (element-wise product)
    – Sum/Average: f(Zi, Zj) = g(Zi + Zj)
    – Distance: f(Zi, Zj) = g(||Zi − Zj||₂)
  • Graph classification: aggregate the node embeddings into a graph embedding Zg, then predict the graph label from Zg.
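The four edge-feature operators for link prediction are easy to state in code. A minimal sketch follows; the function names are illustrative (not from any library), and g(·) would be whatever downstream scorer you attach, e.g. a logistic regression over the resulting feature vector.

```python
import numpy as np

# Combine a node pair's embeddings (zi, zj) into a single edge feature,
# matching the four operators listed on the slide.
def concat(zi, zj):
    return np.concatenate([zi, zj])          # [Zi, Zj]

def hadamard(zi, zj):
    return zi * zj                           # element-wise product Zi * Zj

def average(zi, zj):
    return (zi + zj) / 2                     # sum/average variant

def l2_distance(zi, zj):
    return np.array([np.linalg.norm(zi - zj)])  # scalar feature ||Zi - Zj||_2

zi = np.array([1.0, 2.0])
zj = np.array([3.0, 4.0])
print(concat(zi, zj))       # [1. 2. 3. 4.]
print(hadamard(zi, zj))     # [3. 8.]
print(average(zi, zj))      # [2. 3.]
print(l2_distance(zi, zj))  # [2.8284...] = sqrt((1-3)^2 + (2-4)^2)
```

Note that concatenation doubles the feature dimension while the other three keep (or reduce) it, and that Hadamard, average, and distance are symmetric in (i, j), which is often desirable for undirected edges.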

  4. Limitations of Shallow Encoders. Limitations of shallow embedding methods:
  • O(|V|) parameters are needed: every node has its own unique embedding, and no parameters are shared between nodes.
  • Inherently transductive: cannot generate embeddings for nodes that are not seen during training.
  • Do not incorporate node features: nodes in many graphs have node features that we can and should leverage.
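All three limitations follow from the fact that a shallow encoder is just a lookup table. A minimal sketch, with assumed variable names:

```python
import numpy as np

# Shallow encoding: one learnable d-dimensional row per node, so the parameter
# count is |V| * d = O(|V|). Nothing is shared between nodes, and node features
# play no role at all.
num_nodes, dim = 4, 3
Z = np.random.default_rng(0).normal(size=(num_nodes, dim))  # the entire "model"

def encode(node_id):
    # ENC(v) = Z[v]: a pure table lookup, no computation over features
    return Z[node_id]

print(Z.size)      # 12 parameters for just 4 nodes; grows linearly with |V|
print(encode(2))   # works for any node seen at training time
try:
    encode(7)      # a node absent from training has no row in the table
except IndexError:
    print("no embedding for an unseen node (inherently transductive)")
```

Adding a single new node means growing the table and retraining, which is exactly what the transductive/inductive distinction on the next slide addresses.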

  5. Transductive vs. Inductive Learning. In transductive learning, embeddings are learned only for the nodes present at training time, so for a newly arriving graph we have to train from scratch to get its embeddings. In inductive learning, by contrast, the way we create embeddings generalizes to unseen nodes and graphs. Inductive learning on graphs is achieved with deep encoders, i.e. Graph Neural Networks (GNNs).
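What makes a deep encoder inductive is that its parameters are shared functions of node features rather than per-node rows. A toy sketch of this idea (a single mean-aggregation layer, far simpler than a real GNN; all names here are illustrative):

```python
import numpy as np

# A shared weight matrix W maps the mean of a node's neighborhood features to
# an embedding. Because W does not depend on |V|, the same trained W can embed
# the nodes of a graph it has never seen.
rng = np.random.default_rng(1)
W = rng.normal(size=(2, 3))  # shared parameters: feature dim 2 -> embedding dim 3

def embed(features, adj):
    """features: (n, 2) node features; adj: (n, n) adjacency incl. self-loops."""
    deg = adj.sum(axis=1, keepdims=True)
    return (adj @ features / deg) @ W  # mean-aggregate neighbors, then project

# "Training" graph: a triangle with self-loops (3 nodes).
A1 = np.ones((3, 3))
X1 = rng.normal(size=(3, 2))
Z1 = embed(X1, A1)

# A brand-new graph of a different size: the SAME W still yields embeddings.
A2 = np.eye(5) + np.eye(5, k=1) + np.eye(5, k=-1)  # path graph + self-loops
X2 = rng.normal(size=(5, 2))
Z2 = embed(X2, A2)
print(Z1.shape, Z2.shape)  # (3, 3) (5, 3)
```

Contrast this with the lookup table of the previous slide: here the parameter count is fixed by the feature and embedding dimensions, not by the number of nodes.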
