Sign language, a system of hand gestures, is one of the primary means of communication both among Deaf people and between Deaf and hearing people. In Arab society, only deaf people and specialists typically use Arabic sign language, which isolates the deaf community and makes communication with hearing people difficult. Moreover, the problem of Arabic sign language recognition (ArSLR) has only recently begun to receive attention, which underscores the need to investigate new approaches to it. This paper proposes a novel ArSLR scheme based on an unsupervised deep learning algorithm, specifically a deep belief network (DBN) applied directly to tiny images, to recognize and classify Arabic alphabet letters. Deep learning extracts the most important, sparsely represented features and thereby plays an important role in simplifying the overall recognition task. In total, around 6,000 samples of the 28 Arabic alphabet signs were resized and normalized before feature extraction. Classification was performed with a simple classifier, softmax regression, and achieved an overall accuracy of 95.6%, demonstrating the high reliability of DBN-based recognition of Arabic alphabet characters.
Key words: Arabic sign language, sign language recognition, deep belief network, softmax regression, classification.
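The pipeline summarized above (unsupervised DBN-style feature learning on resized, normalized tiny images, followed by a softmax classifier over the 28 alphabet classes) can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the 32x32 image size, layer widths, and training hyperparameters are assumptions, and scikit-learn's stacked BernoulliRBM layers are used as a stand-in for a full DBN with generative pretraining.

```python
# Minimal sketch: DBN-style feature extractor + softmax classifier.
# Layer sizes, image dimensions, and hyperparameters are illustrative assumptions.
import numpy as np
from sklearn.neural_network import BernoulliRBM
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline
from sklearn.model_selection import train_test_split

# Placeholder data: ~6,000 tiny (32x32) grayscale sign images, 28 classes.
# In practice these would be the resized, normalized ArSL alphabet images.
rng = np.random.default_rng(0)
X = rng.random((6000, 32 * 32))        # pixel values scaled to [0, 1]
y = rng.integers(0, 28, size=6000)     # 28 Arabic alphabet classes

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0, stratify=y)

# Stacked RBMs stand in for the DBN's unsupervised feature-learning layers;
# softmax (multinomial logistic regression) performs the final classification.
model = Pipeline([
    ("rbm1", BernoulliRBM(n_components=500, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=200, learning_rate=0.05, n_iter=20, random_state=0)),
    ("softmax", LogisticRegression(max_iter=1000)),
])

model.fit(X_train, y_train)
print("test accuracy:", model.score(X_test, y_test))
```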