Authors
Abu Saleh Musa Miah, Md Al Mehedi Hasan, Yuichi Okuyama, Yoichi Tomioka, Jungpil Shin
Publication date
2024/6
Journal
Pattern Analysis and Applications
Volume
27
Issue
2
Pages
37
Publisher
Springer London
Description
Automatic sign language recognition (SLR) is a vital aspect of human–computer interaction and computer vision, converting the hand signs used by individuals with severe hearing and speech impairments into equivalent text or voice. Researchers have recently used hand skeleton joint information instead of raw image pixels because of problems with lighting variation and complex backgrounds. However, besides hand information, body motion and facial gestures play an essential role in expressing sign language emotion. A few researchers have also worked on SLR systems built on multi-gesture datasets, but their recognition accuracy and time complexity are not sufficient. In light of these limitations, we introduce a spatial and temporal attention model combined with a general neural network designed for the SLR system. The main idea of our …
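To make the abstract's notion of spatial and temporal attention over skeleton joints concrete, the sketch below shows one plausible way such a model could be wired up in PyTorch. This is not the authors' published architecture; the class name `SpatialTemporalAttentionSLR`, the layer sizes, the multi-head attention formulation, and the pooling/classifier head are all illustrative assumptions.

```python
# Hypothetical sketch: spatial and temporal attention over a sequence of
# skeleton joints shaped (batch, time, joints, channels). Not the paper's
# actual model; all dimensions and layer choices are assumptions.
import torch
import torch.nn as nn


class SpatialTemporalAttentionSLR(nn.Module):
    def __init__(self, num_joints=42, in_channels=3, embed_dim=64, num_classes=100):
        super().__init__()
        self.embed = nn.Linear(in_channels, embed_dim)  # per-joint feature embedding
        self.spatial_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.temporal_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.classifier = nn.Sequential(
            nn.LayerNorm(embed_dim),
            nn.Linear(embed_dim, num_classes),  # sign-class logits
        )

    def forward(self, x):
        # x: (batch, time, joints, channels) skeleton coordinates
        b, t, j, _ = x.shape
        h = self.embed(x)  # (b, t, j, d)

        # Spatial attention: joints attend to each other within each frame.
        h = h.reshape(b * t, j, -1)
        h, _ = self.spatial_attn(h, h, h)
        h = h.reshape(b, t, j, -1).mean(dim=2)  # pool over joints -> (b, t, d)

        # Temporal attention: frames attend to each other across the sequence.
        h, _ = self.temporal_attn(h, h, h)
        h = h.mean(dim=1)  # pool over time -> (b, d)
        return self.classifier(h)


# Usage with dummy data: 8 clips, 30 frames, 42 joints, (x, y, z) per joint.
if __name__ == "__main__":
    model = SpatialTemporalAttentionSLR()
    clips = torch.randn(8, 30, 42, 3)
    print(model(clips).shape)  # torch.Size([8, 100])
```

The separation into a per-frame (spatial) attention stage followed by a cross-frame (temporal) stage mirrors the abstract's description of attending to joint relationships and motion dynamics separately, but the exact composition used in the paper may differ.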