Dynamic Graph Neural Networks for Anomaly Detection in Time Series


Explore how DGNN uses dynamic graph neural networks to detect anomalies in multivariate time series data efficiently. The approach combines Dynamic Subgraph Generation (DSG) with an Adaptive Graph Attention Network (AGAT) for precise learning of temporal relationships, outperforming traditional methods such as PCA and IForest.

  • Dynamic Graph
  • Neural Network
  • Anomaly Detection
  • Time Series
  • DGNN




Presentation Transcript


  1. DGNN: Dynamic Graph Neural Networks for Anomaly Detection in Multivariate Time Series. Bowen Chen, Hancheng Lu, Yuang Chen, Haoyue Yuan, and Minghui Wang, University of Science and Technology of China

  2. Contents 1. Background 2. Approach 3. Evaluation 4. Conclusion

  3. 1. Background Motivation. PCA (Principal Component Analysis) and IForest fail to adapt to complex scenarios. USAD and DAGMM cannot utilize contextual information. TranAD and Anomaly Transformer fail to utilize neighboring sequence information.

  4. 1. Background Contribution We propose DGNN, a dynamic graph neural network for efficient and precise learning of temporal relationships in time series data. We introduce DSG, a real-time method for generating graphs between sequences, adapting to changing data. We propose AGAT, an Adaptive Graph Attention Network that effectively captures features from multiple time series using correlation-based attention coefficients. Extensive experiments on various real-world datasets validate the superior performance of DGNN compared to state-of-the-art methods.

  5. 2. Approach Overview of DGNN

  6. 2. Approach Dynamic Subgraph Generation (DSG) Algorithm Workflow: 1. Input the time series. 2. Calculate the correlation matrix. 3. Traverse the matrix to obtain pairs of sequences whose correlation exceeds the threshold. 4. Create edges from the pair indices. 5. Add the edges to a collection. 6. Build subgraphs from the edges. DSG converts a fully connected graph into multiple strongly connected subgraphs.
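The six-step workflow above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' code: the function name, the threshold value, and the use of connected components to group thresholded edges into subgraphs are my assumptions.

```python
import numpy as np

def dynamic_subgraph_generation(x, threshold=0.7):
    """Sketch of the DSG workflow; x has shape (num_series, window_len)."""
    corr = np.corrcoef(x)                        # step 2: correlation matrix
    n = corr.shape[0]
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)
             if corr[i, j] > threshold]          # steps 3-5: thresholded edges
    # step 6: group nodes into subgraphs (connected components via union-find)
    parent = list(range(n))
    def find(a):
        while parent[a] != a:
            parent[a] = parent[parent[a]]        # path compression
            a = parent[a]
        return a
    for i, j in edges:
        parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return edges, list(groups.values())
```

For example, two linearly related series end up in one subgraph while an uncorrelated series becomes its own singleton subgraph, which is exactly the "fully connected graph into multiple strongly connected subgraphs" conversion the slide describes.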

  7. 2. Approach Dynamic Subgraph Generation (DSG) The Pearson product-moment correlation coefficient (PPMCC) is a measure between -1 and 1. It is defined as the covariance divided by the product of the standard deviations of the two variables:

  PPMCC(X, Y) = Cov(X, Y) / sqrt(Var[X] · Var[Y])    (3)

  where Cov(X, Y) represents the covariance between X and Y, and Var[X] and Var[Y] represent the variances of X and Y, respectively. An edge e_{i,j} is created whenever PPMCC(x_i, x_j) > θ, with i ≠ j.
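Formula (3) translates directly into code. A minimal sketch, assuming plain NumPy arrays (the function name `ppmcc` is illustrative):

```python
import numpy as np

def ppmcc(x, y):
    """PPMCC(X, Y) = Cov(X, Y) / sqrt(Var[X] * Var[Y])."""
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc * yc).mean() / np.sqrt((xc ** 2).mean() * (yc ** 2).mean()))
```

The value is +1 for a perfect positive linear relationship and -1 for a perfect negative one, matching `np.corrcoef` for general inputs.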

  8. 2. Approach Adaptive Graph Attention Network (AGAT) Fusion of neighbor features. We propose an Adaptive Graph Attention Network (AGAT) for feature extraction based on correlation coefficients. AGAT captures the relationships between neighboring sensors by aggregating node information based on subgraphs. Unlike existing GAT methods, our feature extractor utilizes correlation coefficients to compute the weights. Computation of neighbor weights:

  e_{i,j} = corr(x_i, x_j)

  α_{i,j} = softmax(e_{i,j}) = exp(LeakyReLU(corr(x_i, x_j))) / Σ_{k∈N(i)} exp(LeakyReLU(corr(x_i, x_k)))
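The correlation-based attention weights described above can be sketched as follows. This is an illustration under my own assumptions (function name, a precomputed correlation row for node i, and the LeakyReLU slope of 0.2), not the authors' implementation:

```python
import numpy as np

def agat_attention(corr_i, neighbors, slope=0.2):
    """Attention coefficients alpha_{i,j} over the neighbors of node i,
    computed from correlation coefficients rather than learned scores."""
    e = corr_i[neighbors]                 # e_{i,j} = corr(x_i, x_j)
    e = np.where(e > 0, e, slope * e)     # LeakyReLU
    w = np.exp(e - e.max())               # numerically stable softmax
    return w / w.sum()
```

By construction the coefficients sum to 1 over the subgraph neighborhood, and more strongly correlated neighbors receive larger weights in the feature aggregation.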

  9. 3. Evaluation Datasets The table summarizes the features of the datasets. For example, the SMAP dataset consists of 25 sequences, with a training set length of 135,183, a test set length of 427,616, and an anomaly ratio of 13.13%.

  10. 3. Evaluation Number of edges The graph compares edge counts between a fully connected GNN and DGNN across datasets. In the ASD, MSL, SMAP, SWaT, and WADI datasets, DGNN has only 30%, 2%, 6%, 4%, and 4% of the edges of the fully connected GNN, respectively. This significantly reduces the computational cost of the GNN.

  11. 3. Evaluation Results of anomaly detection The F1-score (%) results for anomaly detection on five publicly available datasets show that DGNN outperforms existing state-of-the-art (SOTA) algorithms on all five.

  12. 3. Evaluation Ablation F1-score (%) results of DGNN variants for anomaly detection on five publicly available datasets. Replacing the cluster-based graph structure with a fully connected graph decreases performance across all datasets. Removing the attention mechanism significantly reduces model performance. Replacing the LSTM with fully connected layers for prediction decreases model performance across all datasets.

  13. 3. Evaluation Interpretability of DSG The structures of 1_FIT_001_PV, 1_MV_001_STATUS, 1_P_001_STATUS, and 1_P_003_STATUS in the WADI dataset are related. The red region in the graph represents a group of nodes with similar characteristics, demonstrating the rationale for constructing subgraphs based on correlations. The pink region is another instance of clustering.

  14. 3. Evaluation Interpretability of DGNN The figure shows an example of the predicted results of the proposed DGNN and the comparative algorithm GDN on the SMAP dataset. The top figure displays the results predicted by GDN, while the bottom figure shows the results predicted by DGNN. The orange line represents the predicted values, and the blue line represents the actual values. From the figure, it can be observed that when node 1 exhibits anomalies around t = 1500, GDN generates false alarms, while DGNN does not.

  15. 4. Conclusion DGNN outperforms state-of-the-art algorithms on five public datasets. DSG reduces edges, brings similar time series closer, and accelerates GNN convergence. AGAT effectively extracts and integrates neighboring node features, surpassing existing GAT methods. In the future, we plan to expand DSG's application to other GNN models, replacing fully connected graphs with dynamic subgraphs to enhance performance.

  16. Thanks! Q&A
