A Novel IoT-Perceptive Human Activity Recognition (HAR) Approach Using Multi-Head Convolutional Attention

Publication record, MOST Wiedzy

Abstract

Together with the fast advancement of the Internet of Things (IoT), smart healthcare applications and systems are equipped with more and more wearable sensors and mobile devices. These sensors are used not only to collect data but also, and more importantly, to assist in tracking and analyzing the daily activities of their users. Various human activity recognition (HAR) approaches are used to enhance such tracking. Most of the existing HAR methods depend on exploratory, case-based, shallow feature-learning architectures, which struggle with correct activity recognition when put into real-life practice. To tackle this problem, we propose a novel approach that utilizes convolutional neural networks (CNNs) and the attention mechanism for HAR. In the presented method, activity recognition accuracy is improved by incorporating attention into multi-head convolutional neural networks for better feature extraction and selection. Proof-of-concept experiments are conducted on a publicly available dataset from the Wireless Sensor Data Mining (WISDM) laboratory. The results demonstrate the higher accuracy of our proposed approach in comparison with current methods.
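The core idea described in the abstract, incorporating attention into a multi-head CNN over windows of wearable-sensor data, can be illustrated with a minimal NumPy sketch. This is not the authors' exact architecture: the kernel sizes, filter counts, random weights, and the per-head temporal-attention pooling below are illustrative assumptions only.

```python
import numpy as np

def conv1d(x, w):
    """Valid 1-D convolution. x: (T, C_in), w: (k, C_in, C_out) -> (T-k+1, C_out)."""
    k = w.shape[0]
    T_out = x.shape[0] - k + 1
    out = np.zeros((T_out, w.shape[2]))
    for t in range(T_out):
        # Contract the kernel window against the filter bank.
        out[t] = np.einsum('kc,kcf->f', x[t:t + k], w)
    return out

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_conv_attention(x, kernel_sizes=(3, 5, 7), filters=8, seed=0):
    """x: (T, C) window of accelerometer samples (e.g. T=128 steps, C=3 axes).

    Each head convolves the window with a different kernel size (a different
    temporal receptive field), then attention weights over time steps pool the
    head's feature map into one vector; the heads are concatenated at the end.
    """
    rng = np.random.default_rng(seed)
    head_feats = []
    for k in kernel_sizes:
        w = rng.standard_normal((k, x.shape[1], filters)) * 0.1
        h = np.maximum(conv1d(x, w), 0.0)      # ReLU feature map, (T-k+1, filters)
        score = rng.standard_normal(filters)   # hypothetical attention scoring vector
        alpha = softmax(h @ score)             # temporal attention weights, (T-k+1,)
        head_feats.append(alpha @ h)           # attention-weighted sum -> (filters,)
    return np.concatenate(head_feats)          # fused feature vector for a classifier

window = np.random.default_rng(1).standard_normal((128, 3))
feat = multi_head_conv_attention(window)
print(feat.shape)  # (24,) = 3 heads x 8 filters
```

In a trained model the convolution kernels and scoring vectors would be learned, and the fused vector would feed a softmax classifier over activity labels; the sketch only shows how attention lets each head emphasize informative time steps before the heads are combined.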

Citations

  • 118

    CrossRef

  • 0

    Web of Science

  • 116

    Scopus

Authors (5)

Cite as

Full text

download publication
downloaded 197 times
Publication version
Accepted or Published Version
License
Copyright (2020 IEEE)

Keywords

Detailed information

Category:
Journal publication
Type:
journal articles
Published in:
IEEE Internet of Things Journal no. 7, pages 1072 - 1080,
ISSN: 2327-4662
Language:
English
Year of publication:
2019
Bibliographic description:
Zhang H., Xiao Z., Wang J., Li F., Szczerbicki E.: A Novel IoT-Perceptive Human Activity Recognition (HAR) Approach Using Multi-Head Convolutional Attention // IEEE Internet of Things Journal, vol. 7, iss. 2 (2019), pp. 1072-1080
DOI:
10.1109/jiot.2019.2949715
Bibliography:
  1. Atzori L, Iera A, Morabito G. The internet of things: A survey. Computer Networks. 2010 Oct 28;54(15):2787-2805.
  2. Ashton K. That 'internet of things' thing. RFID Journal. 2009 Jun 22;22(7):97-114.
  3. Kortuem G, Kawsar F, Sundramoorthy V, Fitton D. Smart objects as building blocks for the internet of things. IEEE Internet Computing. 2009 Dec 1;14(1):44-51.
  4. Perera C, Zaslavsky A, Christen P, Georgakopoulos D. Context aware computing for the internet of things: A survey. IEEE Communications Surveys & Tutorials. 2014 Jan 1;16(1):414-454.
  5. Lu W, Fan F, Chu J, Jing P, Su Y. Wearable computing for Internet of Things: A discriminant approach for human activity recognition. IEEE Internet of Things Journal. 2018.
  6. Wang S, Zhou G. A review on radio based activity recognition. Digital Communications and Networks. 2015 Feb 1;1(1):20-29.
  7. Kianoush S, Savazzi S, Vicentini F, Rampa V, Giussani M. Device-free RF human body fall detection and localization in industrial workplaces. IEEE Internet of Things Journal. 2017 Apr;4(2):351-362.
  8. Liu J, Wang G, Duan LY, Abdiyeva K, Kot AC. Skeleton-based human action recognition with global context-aware attention LSTM networks. IEEE Transactions on Image Processing. 2018 Apr;27(4):1586-1599.
  9. Gu F, Khoshelham K, Valaee S, Shang J, Zhang R. Locomotion activity recognition using stacked denoising autoencoders. IEEE Internet of Things Journal. 2018 Jun;5(3):2085-2093.
  10. Lin J, Keogh E, Lonardi S, Chiu B. A symbolic representation of time series, with implications for streaming algorithms. In Proceedings of the 8th ACM SIGMOD Workshop on Research Issues in Data Mining and Knowledge Discovery. 2003 Jun 13 (pp. 2-11). ACM.
  11. Huynh T, Schiele B. Analyzing features for activity recognition. In Proceedings of the 2005 Joint Conference on Smart Objects and Ambient Intelligence: Innovative Context-Aware Services: Usages and Technologies. 2005 Oct 12 (pp. 159-163). ACM.
  12. Bulling A, Blanke U, Schiele B. A tutorial on human activity recognition using body-worn inertial sensors. ACM Computing Surveys (CSUR).
  13. Yang JB, Nguyen MN, San PP, Li XL, Krishnaswamy S. Deep convolutional neural networks on multichannel time series for human activity recognition. In Proceedings of the 24th International Joint Conference on Artificial Intelligence (IJCAI), Buenos Aires, Argentina, 25-31 July 2015; pp. 3995-4001.
  14. LeCun Y, et al. Handwritten digit recognition with a back-propagation network. In Advances in Neural Information Processing Systems, pp. 396-404, 1990.
  15. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser Ł, Polosukhin I. Attention is all you need. In Advances in Neural Information Processing Systems, 2017, pp. 5998-6008.
  16. LeCun Y, Bengio Y, Hinton G. "Deep learning," Nature, vol. 521, no. 7553, pp. 436-444, 2015.
  17. Arık SÖ, Jun H, Diamos G. "Fast spectrogram inversion using multi-head convolutional neural networks," IEEE Signal Processing Letters, vol. 26, no. 1, pp. 94-98, Jan. 2019.
  18. Larochelle H, Hinton GE. Learning to combine foveal glimpses with a third-order Boltzmann machine. In NIPS, pp. 1243-1251, 2010.
  19. Du H, Qian J. "Hierarchical gated convolutional networks with multi-head attention for text classification," 2018 5th International Conference on Systems and Informatics (ICSAI), Nanjing, 2018, pp. 1170-1175.
  20. Zhu Y, Zhao C, Guo H, Wang J, Xu Z, Lu H. Attention CoupleNet: Fully convolutional attention coupling network for object detection. IEEE Transactions on Image Processing, vol. 28, no. 1, pp. 113-126, Jan. 2019.
  21. Xu K, Ba J, Kiros R, Cho K, Courville A, Salakhutdinov R, Zemel R, Bengio Y. Show, attend and tell: Neural image caption generation with visual attention. In International Conference on Machine Learning, pp. 2048-2057, 2015.
  22. Chorowski JK, Bahdanau D, Serdyuk D, Cho K, Bengio Y. "Attention-based models for speech recognition," in Proc. Adv. Neural Inf. Process. Syst., pp. 577-585, 2015.
  23. Bahdanau D, Cho K, Bengio Y. Neural machine translation by jointly learning to align and translate. arXiv preprint arXiv:1409.0473, 2014.
  24. Yang Z, He X, Gao J, Deng L, Smola A. Stacked attention networks for image question answering. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, pp. 21-29, 2016.
  25. Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser Ł, Polosukhin I. "Attention is all you need," in Advances in Neural Information Processing Systems, 2017, pp. 5998-6008.
  26. Incel OD, Kose M, Ersoy C. "A review and taxonomy of activity recognition on mobile phones," J. Bionanosci., vol. 3, no. 2, pp. 145-171, 2013.
  27. Kwapisz JR, Weiss GM, Moore SA. "Activity recognition using cell phone accelerometers," ACM SIGKDD Explorations Newsletter, pp. 74-82, 2011.
  28. Zdravevski E, Lameski P, Trajkovik V, Kulakov A, Chorbev I, Goleva R, Pombo N, Garcia N. "Improving activity recognition accuracy in ambient assisted living systems by automated feature engineering," IEEE Access, pp. 1-17, 2017.
Verification:
Gdańsk University of Technology

viewed 181 times
