In recent years, hand gesture recognition has been applied in a variety of fields, especially in human-computer interaction (HCI), where it is regarded as a more natural and versatile input method than conventional devices such as mice and keyboards. When the user is at a considerable distance from the machine, relying on a physical controller such as a keyboard or mouse hinders natural interaction with the computer. Our goal is to address this problem by developing an application that controls basic computer functions through hand movements captured by an integrated webcam. A hand gesture recognition module senses each gesture and maps it to a specific action. Using the Jester dataset, a deep learning model based on a 3D convolutional neural network is trained and exposed to the user through an interface built with Django, React.JS, and Electron. The expected outcome is that users can control the system's basic functions with hand gestures alone, offering a convenient, touchless mode of interaction.
Keywords: Human-Computer Interaction, Jester Dataset, 3D Convolutional Neural Network, Deep Learning
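
To make the recognition component concrete, the following is a minimal sketch, written in PyTorch (the abstract does not specify a framework), of a 3D convolutional network that maps a short clip of webcam frames to scores over the 27 gesture classes defined by the Jester dataset. The layer widths, clip length, and frame size are illustrative assumptions, not the authors' exact architecture.

# Minimal sketch (assumed architecture, not the authors' exact model): a small
# 3D CNN that classifies a clip of RGB frames into Jester-style gesture classes.
import torch
import torch.nn as nn

class Simple3DCNN(nn.Module):
    def __init__(self, num_classes=27):  # Jester defines 27 gesture classes
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 32, kernel_size=3, padding=1),   # RGB clip -> 32 spatiotemporal feature maps
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=(1, 2, 2)),          # pool spatially, keep temporal length
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(kernel_size=2),                  # pool over time and space
            nn.AdaptiveAvgPool3d(1),                      # global average pool to (64, 1, 1, 1)
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, clip):
        # clip shape: (batch, channels, frames, height, width), e.g. (8, 3, 16, 112, 112)
        x = self.features(clip)
        return self.classifier(x.flatten(1))

if __name__ == "__main__":
    model = Simple3DCNN()
    dummy_clip = torch.randn(1, 3, 16, 112, 112)  # one 16-frame RGB clip
    print(model(dummy_clip).shape)                # torch.Size([1, 27]) -> per-class gesture scores

In the described system, the predicted class from such a network would be forwarded by the Django backend to the React.JS/Electron front end, which triggers the corresponding system action.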