Malaysian Sign Language Detection with Convolutional Neural Network for Disabilities
DOI: https://doi.org/10.37934/araset.58.1.294302

Keywords: Malaysia sign language, Convolutional Neural Network, web-based application, TensorFlow

Abstract
Sign language is the main form of communication used by deaf people. Most of their daily activities, such as conversing and learning, involve sign language. In Malaysia, apart from the deaf community itself, only a few people know sign language because it is not part of the school curriculum. Hence, hearing people need to learn sign language so that they can communicate with deaf people effectively. However, learning sign language requires considerable effort and takes years of study. Therefore, this research initiates a sign language image detection system that utilizes a deep learning model. The initial effort involves the development of an image detection system that can identify the sign language alphabet from A to Z, where the 26 letters correspond to 26 classes detected using a Convolutional Neural Network (CNN). A total of 85 images were collected for each letter, and the dataset was split into 80% and 20% for training and testing the CNN model, respectively. The results showed that the proposed system achieved more than 95% detection accuracy for the alphabet based on finger sign gestures. In future work, the proposed system will be evaluated on a more complex dataset consisting of words.
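For illustration, the sketch below shows how a 26-class alphabet classifier with an 80/20 train-test split could be set up in TensorFlow/Keras, matching the setup described in the abstract. The image size (64x64), the directory layout ("dataset/A" through "dataset/Z"), and the specific layer configuration are assumptions for this sketch and are not taken from the paper.

# Minimal sketch of a 26-class sign-language CNN in TensorFlow/Keras.
# Image size, directory layout, and architecture are illustrative
# assumptions, not the paper's reported configuration.
import tensorflow as tf

IMG_SIZE = (64, 64)
BATCH_SIZE = 32
NUM_CLASSES = 26  # letters A-Z

# 80/20 split for training and testing, as described in the abstract.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "dataset", validation_split=0.2, subset="training", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH_SIZE)
test_ds = tf.keras.utils.image_dataset_from_directory(
    "dataset", validation_split=0.2, subset="validation", seed=42,
    image_size=IMG_SIZE, batch_size=BATCH_SIZE)

model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=IMG_SIZE + (3,)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=test_ds, epochs=10)
model.evaluate(test_ds)

A trained model of this kind can then be exported (for example with TensorFlow.js) to serve the web-based application mentioned in the keywords; that deployment path is likewise an assumption rather than a detail stated in the abstract.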