Relation Inference among Sensor Time-Series with Neural Networks


Explore how neural networks in the frequency domain can be used for relation inference among sensor time-series in smart buildings, focusing on metadata representation, fundamental sensor relations, and literature surveys on similarity metric learning for time-series analysis.

  • Sensor Relations
  • Neural Networks
  • Time-Series Analysis
  • Smart Buildings
  • Metadata


Presentation Transcript


  1. ECE228 Course Project: Relation Inference among Sensor Time-Series with Neural Networks in the Frequency Domain. Byungjun Kim, Ziwei Liu, Steven Wong. UCSD - Electrical Engineering - Spring 2020 - ECE228 - Relation Inference Presentation

  2. Background - Applications for Smart Buildings. Smart buildings hold great potential for promoting users' comfort, for example by reducing energy consumption and enabling fault detection and diagnosis. Metadata (point name): a representation of each sensor's context. Naming conventions are vendor-specific, so metadata is hard to deploy for inferring context. (Figure: example of building metadata and the corresponding encoded contextual information.)

  3. Background - Relation Inference. Relation inference: finding the relations among sensors. Representative methods use either metadata or the time-series observed at each sensor. Fundamental relations among sensors: 1. Functional relationship: which Air Handling Unit (AHU) a Variable Air Volume (VAV) box is connected to. 2. Spatial relationship: which sensors are co-located in the same physical space. A typical building contains multiple AHUs, each heating/cooling the air and circulating it to dozens of VAV boxes.

  4. Literature Survey - Sensor Relation Inference. [1] J. Y. Park, B. Lasternas, and A. Aziz. Data-Driven Framework to Find the Physical Association between AHU and VAV Terminal Unit. 2018. Deploys cross-correlation for characterization and random forest for classification. (Figure: model used in Park's paper, 2018.) [2] Koh et al. Plaster: An Integration, Benchmark, and Development Framework for Metadata Normalization Methods. BuildSys 2016. Requires controlled perturbation to identify related sensors. (Figure: workflow of Plaster, BuildSys 2016.)

  5. Literature Survey - Similarity Metric Learning for Time-Series. [3] Gharghabi et al. MPdist: A Novel Time-Series Distance Measure to Allow Data Mining in More Challenging Scenarios. ICDM 2018. Dynamic Time Warping (DTW) is used for leveraging similarities among time-series. (Figure: result of using MPdist for clustering American girls' names, ICDM 2018.) [4] Hong et al. Relation Inference among Sensor Time Series in Smart Buildings with Metric Learning. AAAI 2019. Deploys neural networks for inferring relations among smart-building sensors; this work is the basis of ours.

  6. Dataset. The dataset consists of instances of rooms, each equipped with the same set of sensors.
  - Number of rooms: 52
  - Duration of time-series: one week
  - Time interval of sensor readings: 15 minutes
  - Types of sensors in one room: CO2, humidity, light, temperature

  7. Feature Extraction. System objective: grouping sensors located in the same room. Loss function: pushes uncorrelated sensors apart, using either the difference of distances between correlated and uncorrelated pairs (Triplet) or the ratio of those distances (Angular). Objective feature of the model: similarity between two time-series, obtained as the distance between sensor embeddings and evaluated as ||ya - yb||2.

  8. Models - Data Preprocessing - STFT. Short-Time Fourier Transform (STFT): different sensors observe a single event with different delays, and converting the time-domain series into a frequency-domain representation makes the features robust to these time delays. Input format for the network model: the Fourier-transformed coefficients are complex-valued, Cn = An + jBn. In the model of [4], the naive Cartesian-coordinate representation (An, Bn) is taken as input.
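The windowed-FFT preprocessing described above can be sketched in a few lines of NumPy. This is an illustrative sketch, not the project's actual code; the window_size=200 and stride=20 values are borrowed from the stn.yaml slide later in the deck, and the toy input signal is invented for the example.

```python
import numpy as np

def stft_features(x, window_size=200, stride=20):
    """Slide the window over the series, take the real FFT of each window,
    and stack the Cartesian coefficients (A_n, B_n) as two input channels."""
    windows = [x[s:s + window_size]
               for s in range(0, len(x) - window_size + 1, stride)]
    coeffs = np.fft.rfft(np.asarray(windows), axis=1)  # complex C_n per window
    # Real part A_n and imaginary part B_n become the network's input channels.
    return np.stack([coeffs.real, coeffs.imag], axis=1)

x = np.sin(2 * np.pi * 0.05 * np.arange(1000))  # toy sensor reading
feats = stft_features(x)
print(feats.shape)  # (41, 2, 101): windows x (A_n, B_n) x rFFT bins
```

A real FFT of a length-200 window yields 101 coefficients, so each window contributes a 2 x 101 Cartesian feature map.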

  9. Models - Neural Network Model. Triplet Network: input formatted as a triplet (Xa, Xp, Xn), sampling correlated and uncorrelated sensors in balance. Convolutional Neural Network for feature extraction: 4 convolutional layers with ReLU activation. Loss function: a combination of the Triplet and Angular losses, with an option to use only the Triplet loss.
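The triplet term of the loss can be sketched as follows. This is a minimal NumPy illustration of the standard triplet loss only (the ratio-based Angular term is omitted); the margin value and the toy embeddings are assumptions, not the project's settings.

```python
import numpy as np

def triplet_loss(ya, yp, yn, margin=1.0):
    """Pull the anchor embedding toward the positive (correlated sensor)
    and push it away from the negative (uncorrelated sensor)."""
    d_pos = np.linalg.norm(ya - yp)  # ||ya - yp||2
    d_neg = np.linalg.norm(ya - yn)  # ||ya - yn||2
    return max(0.0, d_pos - d_neg + margin)

ya = np.array([1.0, 0.0])
yp = np.array([0.9, 0.1])   # correlated sensor: close to anchor
yn = np.array([-1.0, 0.0])  # uncorrelated sensor: far from anchor
print(triplet_loss(ya, yp, yn))  # 0.0: negative is already beyond the margin
```

Swapping the positive and negative in this example makes the loss positive, which is what drives the embedding apart during training.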

  10. Models - Relation Inference. Minimum K-cut problem: finding the set of edges that partitions the graph into k separately connected components, using the values on the edges. In our model: ||ya - yp||2, which evaluates the similarity of two vertices, is mainly used for the loss function. The K-cut problem is NP-hard, so we deploy an approximation algorithm [5]. (Figure: minimum K-cut problem example.)
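To illustrate the grouping objective (not the genetic algorithm of [5]), a toy stand-in can simply merge the closest groups of embeddings until k groups remain. This is plain single-linkage agglomeration on made-up embeddings, shown only to make the "partition vertices by edge weights" idea concrete.

```python
import numpy as np

def greedy_k_groups(embeddings, k):
    """Toy stand-in for approximate minimum K-cut: repeatedly merge the two
    groups whose closest members are nearest, until k groups remain.
    NOT the NSGA-II-based method of reference [5]."""
    groups = [[i] for i in range(len(embeddings))]
    while len(groups) > k:
        best = None
        for a in range(len(groups)):
            for b in range(a + 1, len(groups)):
                d = min(np.linalg.norm(embeddings[i] - embeddings[j])
                        for i in groups[a] for j in groups[b])
                if best is None or d < best[0]:
                    best = (d, a, b)
        _, a, b = best
        groups[a] += groups.pop(b)
    return groups

# Two well-separated clusters of hypothetical sensor embeddings.
emb = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
print(greedy_k_groups(emb, 2))  # [[0, 1], [2, 3]]
```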

  11. Results - Simulation. Simulation environment: free GPU provided by Google Colab, used in the browser. Training procedure: 12 hours available per runtime; a Jupyter notebook was run each time, while the other code in use was in .py format and uploaded by mounting Google Drive. Two significant advantages: no need to mess with the local environment, and more straightforward paths and import mechanisms.

  12. Results - Performance Metrics. I. Triplet accuracy on the test set: the fraction of output triplets (ya, yc, yn) that are correct, i.e., satisfy ||ya - yc||2 < ||ya - yn||2. II. Recall. III. Room-wise accuracy on the test set: each room has 4 sensor readings; the percentage of rooms for which all 4 sensor readings are grouped together correctly.
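The triplet-accuracy metric above can be sketched directly from its definition. This is an illustrative helper on invented embedding triplets, not the project's evaluation code.

```python
import numpy as np

def triplet_accuracy(triplets):
    """Fraction of (anchor, correlated, uncorrelated) embedding triplets
    where the correlated sensor is closer: ||ya - yc||2 < ||ya - yn||2."""
    correct = sum(
        np.linalg.norm(ya - yc) < np.linalg.norm(ya - yn)
        for ya, yc, yn in triplets)
    return correct / len(triplets)

triplets = [
    (np.zeros(2), np.array([0.1, 0.0]), np.array([2.0, 0.0])),  # correct
    (np.zeros(2), np.array([3.0, 0.0]), np.array([0.2, 0.0])),  # wrong
]
print(triplet_accuracy(triplets))  # 0.5
```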

  13. Results - Hyperparameter Tuning. Best accuracies on the test set (%): Triplet Accuracy 86, Recall 90, Room-wise Accuracy 100. Gain obtained by selecting the best hyperparameter:
  - Adam optimizer: Triplet Accuracy +3%, Recall no gain, Room-wise Accuracy +6%
  - Batch size (32 vs 16): Triplet Accuracy +11%, Recall +40%, Room-wise Accuracy +50%
  - Combined loss: no gain on any metric
  (Figure: triplet accuracy with different hyperparameters.)

  14. Future Work to be Completed in the Final Report. Model improvements: input with polar-coordinate coefficients. Comparison schemes for evaluating similarity: Dynamic Time Warping (DTW).

  15. References
  [1] J. Y. Park, B. Lasternas, and A. Aziz. Data-Driven Framework to Find the Physical Association between AHU and VAV Terminal Unit. 2018.
  [2] Koh et al. Plaster: An Integration, Benchmark, and Development Framework for Metadata Normalization Methods. BuildSys 2016.
  [3] Gharghabi et al. MPdist: A Novel Time-Series Distance Measure to Allow Data Mining in More Challenging Scenarios. ICDM 2018.
  [4] Hong et al. Relation Inference among Sensor Time Series in Smart Buildings with Metric Learning. AAAI 2019.
  [5] Deb et al. A Fast and Elitist Multiobjective Genetic Algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 2002.

  16. Code Explanation & Demo

  17. The Code - stn.yaml
  FFT config: window_size: 200, k_coefficient: 32, stride: 20, interval: 5, max_length: 130000
  CNN config: epoch: 1, dropout: 0.2, batch_size: 8 / 16 / 32, grad_norm: 0, optim: 'Adam' / SGD, weight_decay: 0.0001, learning_rate: 0.0001, loss: 'comb' / triplet

  18. The Code - Reading the Data. Reading in the dataset is done by the following functions:
  def read_ahu_csv(path, column)
  def read_colocation_data(config)
  def read_csv(path, config)
  def read_ground_truth(path)
  def read_facility_vav(facility_id, mapping)
  def read_facility_ahu(facility_id, ahu_list)
  def read_vav_csv(path, column)

  19. Machine Learning Neural Network Architecture - 1. self.conv1: 1D convolution (1D because the kernel sweeps along the frequency spectrum); 256 output channels, kernel size 8, stride 1. Activation function: ReLU (standard). 1D max-pool: kernel size 8, stride 2.

  20. Machine Learning Neural Network Architecture - 2. self.conv2: 1D convolution (384 output channels, kernel size 7, stride 1), ReLU activation, 1D max-pool (kernel size 3, stride 2). self.conv3: 1D convolution (128 output channels, kernel size 6, stride 1), ReLU activation, 1D max-pool (kernel size 3, stride 2).

  21. Machine Learning Neural Network Architecture - 3. self.conv4: 1D convolution with 1 output channel, kernel size 1. This is a 1x1 convolution over the 128 input channels: it takes all the input channels and computes a dot product across the stacked channels. self.dropout1: applies a dropout rate specified by the input parameter dropout_rate.
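The layer stack described in slides 19-21 can be sanity-checked with the standard no-padding output-length formula for 1-D convolutions and max-pools. The starting length of 101 is a hypothetical STFT coefficient count chosen for the example, not a value stated in the deck.

```python
def out_len(n, kernel, stride):
    """Output length of a 1-D conv or max-pool with no padding."""
    return (n - kernel) // stride + 1

n = 101                # hypothetical input length (e.g., rFFT bins of a window)
n = out_len(n, 8, 1)   # conv1: kernel 8, stride 1 -> 94
n = out_len(n, 8, 2)   # maxpool: kernel 8, stride 2 -> 44
n = out_len(n, 7, 1)   # conv2 -> 38
n = out_len(n, 3, 2)   # maxpool -> 18
n = out_len(n, 6, 1)   # conv3 -> 13
n = out_len(n, 3, 2)   # maxpool -> 6
n = out_len(n, 1, 1)   # conv4: 1x1 conv keeps the length -> 6
print(n)  # 6
```

Tracing shapes this way is a quick check that the final 1-channel feature map has a sensible length before it is flattened into the embedding.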

  22. Machine Learning Neural Network Architecture - 4
  def forward(self, x):
      x = self.conv1(x)
      x = self.conv2(x)
      x = self.conv3(x)
      x = self.conv4(x)
      x = x.view(x.size(0), -1)      # flatten to one 1-D embedding per sample
      x = self.dropout1(x)
      norm = x.norm(dim=1, p=2, keepdim=True)
      x = x.div(norm.expand_as(x))   # L2-normalize the embedding
      return x
