
Indian Sign Language Recognition for Static and Dynamic Hand Gestures

Manav Prajapati, Mitesh Makawana, Sahil Hada

Section: Research Paper, Product Type: Journal Paper
Volume-8, Issue-9, Page no. 54-58, Sep-2020

CrossRef DOI: https://doi.org/10.26438/ijcse/v8i9.5458

Online published on Sep 30, 2020

Copyright © Manav Prajapati, Mitesh Makawana, Sahil Hada. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


How to Cite this Paper

IEEE Style Citation: Manav Prajapati, Mitesh Makawana, Sahil Hada, “Indian Sign Language Recognition for Static and Dynamic Hand Gestures,” International Journal of Computer Sciences and Engineering, Vol.8, Issue.9, pp.54-58, 2020.


Abstract

Humans are social animals, so communication is an integral part of human life. Humans communicate through verbal and non-verbal speech, but not everyone is capable of verbal speech, e.g. deaf and mute people. Sign languages were developed for them, yet a communication barrier still remains. Using hand gestures, this paper presents a system in which a CNN is used to classify alphabets and numbers. A CNN is chosen because alphabet and number gestures are static gestures in Indian Sign Language, and CNNs give very good results for image classification. The model is trained on hand-masked (skin-segmented) images. For dynamic hand gestures, the system uses an LSTM network for the classification task. LSTMs are well known for accurately modelling data distributed over time. This paper thus presents two models, a CNN and an LSTM, for predicting the two types of hand gestures, static and dynamic.
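The hand-masking step the abstract describes can be illustrated with a minimal sketch. The paper does not state which skin-segmentation rule it uses, so the classic RGB skin-colour thresholds stand in here purely as an assumption, and `skin_mask` / `hand_masked` are hypothetical helper names, not the authors' code:

```python
import numpy as np

def skin_mask(rgb: np.ndarray) -> np.ndarray:
    """Binary skin mask for an H x W x 3 uint8 RGB image.

    Uses the widely cited rule-based RGB thresholds (illustrative only;
    the paper's actual segmentation method may differ).
    """
    r = rgb[..., 0].astype(np.int16)
    g = rgb[..., 1].astype(np.int16)
    b = rgb[..., 2].astype(np.int16)
    mask = (
        (r > 95) & (g > 40) & (b > 20)                                  # bright enough
        & ((rgb.max(axis=-1).astype(np.int16) - rgb.min(axis=-1)) > 15)  # enough colour spread
        & (np.abs(r - g) > 15) & (r > g) & (r > b)                       # red-dominant, skin-like
    )
    return mask.astype(np.uint8) * 255

def hand_masked(rgb: np.ndarray) -> np.ndarray:
    """Zero out non-skin pixels so the classifier sees only the hand region."""
    return rgb * (skin_mask(rgb)[..., None] // 255)
```

Such masked images would then be fed to the CNN for static-gesture classification, which is the training input the abstract refers to.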

Key-Words / Index Term

Indian Sign Language, CNN, Skin-segmentation, LSTM
