Project Description:
Human beings communicate through language, whether spoken or signed through body motion. People with hearing and speech impairments, having no way to communicate verbally, use sign language: they perform gestures to convey their message and communicate effectively with each other. Since not everyone knows Indian Sign Language (ISL), it is difficult for hearing people to communicate fluently with the hearing- and speech-impaired community. This project proposes an ISL gesture recognition system to narrow this communication gap.

The dataset consists of videos of ISL gestures performed by different subjects. The proposed system uses the OpenPose library, which estimates a skeleton of the human body and provides keypoints for the whole body, frame by frame. Working from these keypoints removes the dependency on lighting conditions and background, so the system can focus purely on the gesture movements. After the keypoints are extracted, a Long Short-Term Memory (LSTM) network is used for classification: given the keypoint sequence of a video, the LSTM model predicts which ISL gesture the video contains.
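To make the keypoints-then-LSTM pipeline concrete, the following is a minimal numpy sketch of an LSTM processing one video as a sequence of per-frame keypoint vectors and producing class probabilities. All dimensions are illustrative assumptions (25 BODY_25 keypoints with x, y, confidence giving 75 features per frame; hidden size 32; 10 gesture classes), and the random weights stand in for a trained model; a real system would train these parameters (e.g. with Keras or PyTorch) on the ISL dataset.

```python
import numpy as np

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def lstm_step(x, h, c, W, U, b):
    # One LSTM step over a single frame's keypoint vector x.
    # h and c are the hidden and cell states carried across frames.
    H = h.size
    z = W @ x + U @ h + b              # pre-activations for all four gates
    i = sigmoid(z[:H])                 # input gate
    f = sigmoid(z[H:2*H])              # forget gate
    o = sigmoid(z[2*H:3*H])            # output gate
    g = np.tanh(z[3*H:])               # candidate cell content
    c = f * c + i * g                  # update cell state
    h = o * np.tanh(c)                 # new hidden state
    return h, c

def classify_gesture(frames, params):
    # frames: (T, D) array, one OpenPose keypoint vector per video frame.
    W, U, b, W_out = params
    H = U.shape[1]
    h = np.zeros(H)
    c = np.zeros(H)
    for x in frames:                   # feed the sequence frame by frame
        h, c = lstm_step(x, h, c, W, U, b)
    logits = W_out @ h                 # classify from the final hidden state
    e = np.exp(logits - logits.max())
    return e / e.sum()                 # softmax over gesture classes

# Assumed toy dimensions: 75 keypoint features, hidden size 32,
# 10 gesture classes, a 40-frame clip; weights are random placeholders.
rng = np.random.default_rng(0)
D, H, C, T = 75, 32, 10, 40
params = (rng.standard_normal((4 * H, D)) * 0.1,
          rng.standard_normal((4 * H, H)) * 0.1,
          np.zeros(4 * H),
          rng.standard_normal((C, H)) * 0.1)
probs = classify_gesture(rng.standard_normal((T, D)), params)
print(probs.shape)                     # one probability per gesture class
```

The design point this illustrates is that the classifier never sees raw pixels: each frame is reduced to a fixed-length keypoint vector, which is why lighting and background stop mattering, and the LSTM's recurrent state is what captures the motion across frames.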