Human Activity Recognition Using Multi-Sensor Data Fusion and CNNs

Explore how Human Activity Recognition (HAR) is improved by fusing data from multiple sensors and classifying it with Convolutional Neural Networks (CNNs). This study details the methodology, experiments, results, and related works in HAR, emphasizing the importance of sensor fusion for accurate activity identification in applications such as health monitoring, assisted living, sports, and surveillance.

  • Activity Recognition
  • Sensor Fusion
  • CNNs
  • Multi-Sensor Data
  • HAR


Presentation Transcript


  1. Human Activity Recognition from Multiple Sensors Data Using Multi-fusion Representations and CNNs
     FARZAN MAJEED NOORI, Department of Informatics, University of Oslo, Norway
     MICHAEL RIEGLER, Simula Metropolitan Center for Digitalization and Kristiania University College, Norway
     MD ZIA UDDIN, Department of Informatics, University of Oslo, Norway
     JIM TORRESEN, Department of Informatics and RITMO, University of Oslo, Norway

  2. Index
     • Introduction
     • Overview
     • Methodology
     • Experiments
     • Results
     • Related Works
     • Conclusions
     (Presented 23 June 2021. ACM Transactions on Multimedia Computing, Communications, and Applications, TOMM 2020.)

  3. Human Activity Recognition (HAR)
     The goal is to identify daily-life activities such as sitting, standing, running, and walking.
     Scopes:
     • HAR in health monitoring systems
     • Ambient assisted living
     • Detecting activities during sports
     • Surveillance

  4. What Is Data Fusion?
     Data fusion is the combining of sensory data, or data derived from sensory data, such that the resulting information is in some sense better than when these sources are used individually.
     Elmenreich, Wilfried. "An introduction to sensor fusion." Vienna University of Technology, Austria 502 (2002): 1-28.

  5. Why Sensor Data Fusion?
     • A single sensor cannot measure all relevant attributes.
     • A single sensor modality is not very robust for detecting human actions.
     • Health or activity monitoring systems with several sensors are better able to discriminate complex activities.

  6. Multi-modal Sensor Fusion (1/2)
     • How to combine data from multiple (and possibly diverse) sensors in order to make inferences about a physical event, activity, or situation, OR
     • Multiple sensors measuring different aspects to increase reliability and decrease vulnerability.

  7. Multi-modal Sensor Fusion (2/2)
     • Data-level fusion: raw data from several modalities are fused before being fed into the classifier.
     • Feature-level fusion: features from the gyroscope and accelerometer are combined to make the system more robust in recognizing human activities.
     • Decision-level fusion: a Random Forest classifier is applied after extracting classification results from each representation of the data (see the sketch below).
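
     To make the decision-level option concrete, here is a minimal sketch assuming one initial classifier per data representation and a Random Forest as the final (meta) classifier, as named on the slide. The data shapes, the choice of logistic regression as the per-representation classifier, and the train/test split are illustrative assumptions, not the paper's configuration.

```python
# Minimal sketch of decision-level fusion: one classifier per representation,
# a Random Forest fuses their predicted probabilities. All data are placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n_samples, n_classes = 600, 6

# Placeholder feature sets for two representations (e.g., distance-matrix
# features and xyz-image features), plus activity labels.
X_rep1 = rng.normal(size=(n_samples, 64))
X_rep2 = rng.normal(size=(n_samples, 64))
y = rng.integers(0, n_classes, size=n_samples)

train, test = slice(0, 500), slice(500, 600)

# Step 1: an initial classifier per representation.
clf1 = LogisticRegression(max_iter=1000).fit(X_rep1[train], y[train])
clf2 = LogisticRegression(max_iter=1000).fit(X_rep2[train], y[train])

# Step 2: fuse the per-representation class probabilities and let a
# Random Forest make the final decision.
def stacked_scores(a, b):
    return np.hstack([clf1.predict_proba(a), clf2.predict_proba(b)])

meta = RandomForestClassifier(n_estimators=100, random_state=0)
meta.fit(stacked_scores(X_rep1[train], X_rep2[train]), y[train])
print("decision-level fusion accuracy:",
      meta.score(stacked_scores(X_rep1[test], X_rep2[test]), y[test]))
```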

  8. Single-Sensor Baseline and Data-Level Fusion
     • Baseline pipeline: single sensor data → pre-processing → CNN → output.
     • Data-level fusion pipeline: multi-sensor data → pre-processing → data-level fusion → CNN → output.
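
     Below is a minimal sketch of the two pipelines above, assuming the CNN consumes 2D image-like representations of the sensor windows; data-level fusion is shown by stacking the per-sensor representations as input channels before one shared CNN. The layer sizes, input shape, and class count are illustrative assumptions, not the architecture reported in the paper.

```python
# Sketch: single-sensor baseline CNN vs. data-level fusion via channel stacking.
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

IMG = 64          # side length of the image representation (assumption)
N_CLASSES = 6     # number of activity classes (assumption)

def build_cnn(n_channels):
    """Small CNN classifier over an image-like sensor representation."""
    return keras.Sequential([
        keras.Input(shape=(IMG, IMG, n_channels)),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])

# Single-sensor baseline: one channel (e.g., accelerometer only).
baseline = build_cnn(n_channels=1)

# Data-level fusion: raw representations from several sensors are stacked
# before the classifier, here as extra image channels (accelerometer +
# magnetometer -> 2 channels).
acc_imgs = np.random.rand(32, IMG, IMG, 1).astype("float32")
mag_imgs = np.random.rand(32, IMG, IMG, 1).astype("float32")
fused_input = np.concatenate([acc_imgs, mag_imgs], axis=-1)

fusion_model = build_cnn(n_channels=2)
fusion_model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
                     metrics=["accuracy"])
print(fusion_model(fused_input).shape)   # (32, N_CLASSES)
```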

  9. Feature-Level Fusion and Decision-Level Fusion
     • Feature-level fusion pipeline: multi-sensor / multi-representation data → pre-processing → one CNN per representation (1st … nth) → feature-level fusion → hidden layers → output.
     • Decision-level fusion pipeline: multi-sensor / multi-representation data → pre-processing → one CNN per representation (1st … nth) → initial classifier per representation → decision-level fusion → output.
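
     A minimal Keras sketch of the feature-level variant follows, assuming one small CNN branch per representation whose feature vectors are concatenated and passed through shared hidden layers before the output; the decision-level variant was sketched with a Random Forest after slide 7. Branch depth, layer sizes, and input shapes are illustrative assumptions.

```python
# Sketch: feature-level fusion with one CNN branch per representation.
from tensorflow import keras
from tensorflow.keras import layers

IMG, N_CLASSES = 64, 6

def cnn_branch(name):
    """One CNN branch that turns an image representation into a feature vector."""
    inp = keras.Input(shape=(IMG, IMG, 1), name=name)
    x = layers.Conv2D(16, 3, activation="relu")(inp)
    x = layers.MaxPooling2D()(x)
    x = layers.Conv2D(32, 3, activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)
    return inp, x

in1, feat1 = cnn_branch("distance_matrix")   # 1st representation
in2, feat2 = cnn_branch("xyz_image")         # nth representation

# Feature-level fusion: concatenate the branch features, then hidden layers.
fused = layers.Concatenate()([feat1, feat2])
hidden = layers.Dense(64, activation="relu")(fused)
output = layers.Dense(N_CLASSES, activation="softmax")(hidden)

model = keras.Model(inputs=[in1, in2], outputs=output)
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```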

  10. Methodology (Distance Matrices and Image Representations)
     • Recurrence plots (RPs) as a tool: a recurrence of a state at time i at a different time j is marked within a two-dimensional square matrix of ones and zeros, where both axes represent time: R(i, j) = 1 if ||x_i - x_j|| <= ε, and 0 otherwise.
     • A distance matrix D contains the distance between each pair of points and is used as the first image-type representation.
     • For the second image representation, the data extracted from the x, y, and z axes of the accelerometer are stacked to generate the images.
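
     The NumPy sketch below illustrates the two image-type representations, assuming a single windowed tri-axial accelerometer signal: a pairwise distance matrix (thresholding it gives the binary recurrence plot) and an image built by stacking the raw x, y, and z axis values. The window length, distance metric, and threshold are illustrative assumptions.

```python
# Sketch: distance-matrix / recurrence-plot and stacked-xyz representations
# for one window of tri-axial accelerometer data (placeholder values).
import numpy as np

window = np.random.randn(128, 3)          # one window: 128 samples, x/y/z axes

# (1) Distance matrix over time: D[i, j] = ||s_i - s_j|| for samples i, j.
diffs = window[:, None, :] - window[None, :, :]
D = np.linalg.norm(diffs, axis=-1)        # shape (128, 128), first image type

# Binary recurrence plot: 1 where the states at times i and j are closer
# than a threshold eps, 0 otherwise.
eps = np.percentile(D, 10)
R = (D <= eps).astype(np.uint8)

# (2) xyz image: the raw x, y, and z axis values stacked to form the
# second image-type representation.
xyz_image = np.stack([window[:, 0], window[:, 1], window[:, 2]], axis=0)  # (3, 128)

print(D.shape, R.shape, xyz_image.shape)
```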

  11. Datasets and Metrics
     Three datasets:
     • Wireless Sensor Data Mining (WISDM), versions 1.1 and 2.0
     • Context-Awareness via Wrist-Worn Motion Sensors (HANDY)

  12. Datasets and Metrics (Data-Level Fusion)

  13. Datasets and Metrics
     Measures of performance:
     • Accuracy = (TP + TN) / (TP + TN + FP + FN)
     • Precision = TP / (TP + FP)
     • Recall = TP / (TP + FN)
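
     As a quick sanity check of these definitions, the sketch below computes the same metrics with scikit-learn on a small made-up multi-class example (macro averaging is an assumption; the labels are placeholders, not results from the paper).

```python
# Sketch: accuracy, precision, and recall on placeholder predictions.
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

y_true = np.array([0, 0, 1, 1, 2, 2, 2, 1])
y_pred = np.array([0, 1, 1, 1, 2, 0, 2, 1])

print("accuracy :", accuracy_score(y_true, y_pred))
print("precision:", precision_score(y_true, y_pred, average="macro"))
print("recall   :", recall_score(y_true, y_pred, average="macro"))
```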

  14. Datasets and Metrics (HANDY Dataset Analysis)
     Experiments using accelerometer and magnetometer data with different representations:
     • xyz images of accelerometer and magnetometer data
     • distance matrices of accelerometer and magnetometer data
     • xyz images and distance matrices of accelerometer data
     • xyz images and distance matrices of magnetometer data
     • combination of all four representations

  15. Results (Comparison with Previous Studies)

  16. Results (Precision and Recall of All Fusion Approaches for Individual Activities of WISDM Version 1.1)

  17. Results (Precision and Recall of All Fusion Approaches for Individual Activities of WISDM Version 1.1)

  18. Results

  19. Results

  20. Results

  21. Conclusions
     • Two matrix representations were presented: the distance matrix representation and the image representation (the raw x, y, and z axis values).
     • The paper examined three data fusion approaches: data-level fusion, feature-level fusion, and decision-level fusion.
     • Three different datasets (collected using smartphones and wristwatches) were used to validate the novelty and performance.
     • The two representations were fused using data-level, feature-level, and decision-level fusion.
     • The approach also showed significant results in an uncontrolled environment (the WISDM version 2.0 dataset).

  22. Motivation and steps (figure: Step 1, Step 2, Step 3)
