Volume 20 No 22 (2022)
Modelling and Optimization of Hand Gesture Recognition using a Transfer-Learning-based Convolutional Neural Network
Sunil G. Deshmukh, Shekhar M. Jagade
Abstract
The ability to communicate effectively is an essential component of daily life. Because they cannot speak or hear, people who are deaf or mute have a difficult time interacting with others. The use of hand gestures, commonly known as sign language, is one of the best-known and most widely recognized methods of communication for them. It is therefore crucial to develop programs that can recognize sign language movements and gestures, so that deaf and mute persons can communicate even with those who do not understand sign language. The goal of this research is to use sign language recognition to take a preliminary step toward removing the barrier to interaction between hearing-impaired and deaf persons and the rest of society. Modern Convolutional Neural Networks (CNNs) provide outstanding results when applied to image recognition problems in computer vision, and transfer learning makes it possible to build a comprehensive CNN in a highly efficient manner. In this study, a CNN model based on transfer learning was created. The datasets comprise a total of 48281 images for training and 12071 for validation. To compare pre-trained models with the proposed model, the work concentrated primarily on the VGG16 and ResNet50 networks, two well-known CNN architectures. The proposed VGG16-based model showed the best performance, with a training accuracy of 99.94%.
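The transfer-learning approach described in the abstract (a pre-trained VGG16 base with a new classification head trained on gesture images) can be sketched as follows. This is a minimal illustration, not the authors' exact implementation: the number of gesture classes, input size, head architecture, and training hyperparameters are assumptions.

```python
import tensorflow as tf
from tensorflow.keras import layers, models
from tensorflow.keras.applications import VGG16

NUM_CLASSES = 26  # assumption: one class per sign-language letter


def build_gesture_model(num_classes=NUM_CLASSES, weights="imagenet"):
    """Build a VGG16-based transfer-learning classifier.

    The convolutional base is frozen so that only the new
    classification head is trained on the gesture images.
    """
    base = VGG16(weights=weights, include_top=False,
                 input_shape=(224, 224, 3))
    base.trainable = False  # freeze the pre-trained convolutional layers

    model = models.Sequential([
        base,
        layers.Flatten(),
        layers.Dense(256, activation="relu"),
        layers.Dropout(0.5),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

A comparable ResNet50-based model for the paper's second comparison can be obtained by swapping `VGG16` for `tf.keras.applications.ResNet50` with the same frozen-base pattern.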
Keywords
Convolutional neural network, VGG16, ResNet50, Transfer learning, Training of images, Accuracy, Hand gesture, Architectural model.
Copyright
Copyright © Neuroquantology

Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.

Articles published in the Neuroquantology are available under the Creative Commons Attribution Non-Commercial No Derivatives License (CC BY-NC-ND 4.0). Authors retain copyright in their work and grant IJECSE right of first publication under CC BY-NC-ND 4.0. Users have the right to read, download, copy, distribute, print, search, or link to the full texts of articles in this journal, and to use them for any other lawful purpose.