Malaysian Sign Language Detection with Convolutional Neural Network for Disabilities

Authors

  • Iffat Adib Kurnia Mohd Ramzi, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia
  • Mohd Syahril Ramadhan Mohd Saufi, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia
  • Mohd Zarhamdy Md. Zain, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia
  • Mastura Ab Wahid, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia
  • Mat Hussin Ab Talib, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia
  • Nur Fathiah Waziralilah, Faculty of Mechanical Engineering, Alfaisal University, Riyadh 11533, Saudi Arabia
  • Kamarul Arifin Hassan, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

DOI:

https://doi.org/10.37934/araset.58.1.294302

Keywords:

Malaysian Sign Language, Convolutional Neural Network, web-based application, TensorFlow

Abstract

Sign language is the primary form of communication used by deaf people; most of their daily activities, such as conversing and learning, involve it. In Malaysia, apart from the deaf community, only a few people know sign language because it is not part of the co-curriculum. Hence, hearing people need to learn sign language to communicate effectively with deaf people. However, learning sign language requires considerable effort and can take years of study. Therefore, this research aims to develop a sign language image detection system based on a deep learning model. The initial effort involves the development of an image detection system that can identify the sign language alphabet from A to Z, with the 26 letters forming 26 classes for image detection using a Convolutional Neural Network (CNN). A total of 85 images are collected for each letter, and the data are split 80% for training and 20% for testing the CNN model. The results show that the proposed system achieves more than 95% detection accuracy on alphabet finger-sign gestures. For future work, the proposed system will be evaluated with a more complex, word-level dataset.
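Since the abstract describes the approach only at a high level (a 26-class CNN trained on 85 images per letter with an 80/20 train/test split, using TensorFlow), the following is a minimal sketch of how such a classifier could be set up in TensorFlow/Keras. The dataset directory "msl_dataset", the 64x64 input size, and the specific layer configuration are illustrative assumptions, not the authors' published implementation.

```python
import tensorflow as tf

# Assumptions (not specified in the abstract): 64x64 RGB inputs and images
# stored in per-letter folders under "msl_dataset/" (one folder per class A-Z).
IMG_SIZE = (64, 64)
NUM_CLASSES = 26  # letters A-Z

# 80/20 split of the collected images, as described in the abstract.
train_ds = tf.keras.utils.image_dataset_from_directory(
    "msl_dataset", validation_split=0.2, subset="training",
    seed=42, image_size=IMG_SIZE, batch_size=32)
val_ds = tf.keras.utils.image_dataset_from_directory(
    "msl_dataset", validation_split=0.2, subset="validation",
    seed=42, image_size=IMG_SIZE, batch_size=32)

# A generic small CNN; the paper's exact architecture is not given here.
model = tf.keras.Sequential([
    tf.keras.layers.Rescaling(1.0 / 255, input_shape=(*IMG_SIZE, 3)),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(train_ds, validation_data=val_ds, epochs=20)
```

A trained model of this kind could then be exported (e.g., via TensorFlow.js conversion) to serve the web-based application mentioned in the keywords; the exact deployment route is not detailed in the abstract.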

Author Biographies

Iffat Adib Kurnia Mohd Ramzi, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

muhammadiffat@graduate.utm.my

Mohd Syahril Ramadhan Mohd Saufi, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

syahrilramadhan@mail.fkm.utm.my

Mohd Zarhamdy Md. Zain, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

zarhamdy@utm.my

Mastura Ab Wahid, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

masturawahid@utm.my

Mat Hussin Ab Talib, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

mathussin@utm.my

Nur Fathiah Waziralilah, Faculty of Mechanical Engineering, Alfaisal University, Riyadh 11533, Saudi Arabia

tiah14@gmail.com

Kamarul Arifin Hassan, Faculty of Mechanical Engineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor, Malaysia

kamarularifin@utm.my

Published

2024-10-18

How to Cite

Mohd Ramzi, I. A. K., Mohd Saufi, M. S. R., Md. Zain, M. Z., Ab Wahid, M., Ab Talib, M. H., Waziralilah, N. F., & Hassan, K. A. (2024). Malaysian Sign Language Detection with Convolutional Neural Network for Disabilities. Journal of Advanced Research in Applied Sciences and Engineering Technology, 58(1), 294–302. https://doi.org/10.37934/araset.58.1.294302

Issue

Section

Articles