Open Access Article

Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition

Saurav Singla1, Anjali Patel2

Section: Research Paper, Product Type: Journal Paper
Volume-8, Issue-11, Page no. 27-32, Nov-2020

CrossRef-DOI: https://doi.org/10.26438/ijcse/v8i11.2732

Online published on Nov 30, 2020

Copyright © Saurav Singla, Anjali Patel. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper


IEEE Style Citation: Saurav Singla, Anjali Patel, “Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition,” International Journal of Computer Sciences and Engineering, Vol.8, Issue.11, pp.27-32, 2020.

MLA Style Citation: Saurav Singla, Anjali Patel. "Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition." International Journal of Computer Sciences and Engineering 8.11 (2020): 27-32.

APA Style Citation: Saurav Singla, Anjali Patel (2020). Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition. International Journal of Computer Sciences and Engineering, 8(11), 27-32.

BibTex Style Citation:
@article{Singla_2020,
author = {Saurav Singla and Anjali Patel},
title = {Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition},
journal = {International Journal of Computer Sciences and Engineering},
issue_date = {November 2020},
volume = {8},
number = {11},
month = {November},
year = {2020},
issn = {2347-2693},
pages = {27-32},
url = {https://www.ijcseonline.org/full_paper_view.php?paper_id=5257},
doi = {10.26438/ijcse/v8i11.2732},
publisher = {IJCSE, Indore, INDIA},
}

RIS Style Citation:
TY  - JOUR
DO  - 10.26438/ijcse/v8i11.2732
UR  - https://www.ijcseonline.org/full_paper_view.php?paper_id=5257
TI  - Comparative Study of the Deep Learning Neural Networks on the basis of the Human Activity Recognition
T2  - International Journal of Computer Sciences and Engineering
AU  - Singla, Saurav
AU  - Patel, Anjali
PY  - 2020
DA  - 2020/11/30
PB  - IJCSE, Indore, INDIA
SP  - 27
EP  - 32
IS  - 11
VL  - 8
SN  - 2347-2693
ER  -


Abstract

Human Activity Recognition (HAR) from signals produced by sensors has a number of applications in the fields of fitness and health. Human activities are recorded with the help of various types of sensors embedded in a wearable device or a smartphone. Many research works have addressed Human Activity Recognition using machine-learning as well as deep-learning models, but it remains necessary to determine which model is more efficient for a specific dataset, which motivates a comparative study of the models. In this research paper, a comparative study of three widely used deep-learning models, LSTM-RNN, GRU-RNN and CNN, has been performed on the well-known ‘Human Activity Recognition Using Smartphones Data Set’ from the UCI machine-learning repository. LSTM-RNN, short for Long Short-Term Memory Recurrent Neural Network, is an extension of the recurrent neural network trained with back-propagation and is capable of remembering dependencies over a comparatively longer time span. GRU-RNN, short for Gated Recurrent Unit Recurrent Neural Network, is another extension of the recurrent neural network trained with back-propagation, with fewer parameters than LSTM-RNN. CNN, short for Convolutional Neural Network, is a feed-forward neural network that uses convolutional layers for feature extraction and fully connected layers for classification.
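For concreteness, the sketch below shows how the three compared architectures might be defined in Keras for the raw inertial signals of the UCI HAR dataset (128-timestep windows with 9 channels and 6 activity classes). The specific layer widths, dropout rate, and optimizer are illustrative assumptions, not the exact configurations evaluated in the paper.

# Illustrative Keras definitions of the three compared architectures.
# Window shape and class count follow the UCI HAR raw-signal format
# (128 timesteps x 9 inertial channels, 6 activities); all layer sizes
# and hyper-parameters are assumptions for illustration only.
from tensorflow.keras import layers, models

N_TIMESTEPS, N_CHANNELS, N_CLASSES = 128, 9, 6


def build_lstm_rnn():
    # LSTM-RNN: recurrent model able to keep longer-range dependencies.
    return models.Sequential([
        layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
        layers.LSTM(64),
        layers.Dropout(0.5),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])


def build_gru_rnn():
    # GRU-RNN: similar recurrent model with fewer parameters than LSTM.
    return models.Sequential([
        layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
        layers.GRU(64),
        layers.Dropout(0.5),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])


def build_cnn():
    # 1D CNN: convolutional layers extract features, dense layers classify.
    return models.Sequential([
        layers.Input(shape=(N_TIMESTEPS, N_CHANNELS)),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.Conv1D(64, kernel_size=3, activation="relu"),
        layers.MaxPooling1D(pool_size=2),
        layers.Flatten(),
        layers.Dense(100, activation="relu"),
        layers.Dense(N_CLASSES, activation="softmax"),
    ])


# Compile each model and report its parameter count, which highlights
# the parameter difference between LSTM-RNN and GRU-RNN noted above.
for name, build in [("LSTM-RNN", build_lstm_rnn),
                    ("GRU-RNN", build_gru_rnn),
                    ("CNN", build_cnn)]:
    model = build()
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    print(name, "parameters:", model.count_params())

Training each model would then follow the usual Keras fit/evaluate loop on the windowed training and test splits of the dataset.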

Key-Words / Index Term

Human Activity Recognition (HAR), LSTM-RNN, GRU-RNN, CNN
