AUTHOR=Li Jie, Zhong Junpei, Wang Ning
TITLE=A multimodal human-robot sign language interaction framework applied in social robots
JOURNAL=Frontiers in Neuroscience
VOLUME=17
YEAR=2023
URL=https://www.frontiersin.org/journals/neuroscience/articles/10.3389/fnins.2023.1168888
DOI=10.3389/fnins.2023.1168888
ISSN=1662-453X
ABSTRACT=This paper presents a multimodal Chinese sign language (CSL) gesture recognition framework for human-robot interaction (HRI) between social robots and deaf-mute users. CSL gesture information, covering both static and dynamic gestures, is captured by sensors of two different modalities: a wearable Myo armband collects surface electromyography (sEMG) signals from the human arm, and a Leap Motion sensor collects 3D hand vectors. The two modalities of gesture data are preprocessed and fused before being sent to the classifier, which improves recognition accuracy and reduces the network's processing time. Since the framework's inputs are temporal gesture sequences, a long short-term memory (LSTM) recurrent neural network is used to classify them. Comparative experiments on a NAO robot validate the method. The method effectively improves CSL gesture recognition accuracy and has potential applications in a variety of gesture interaction scenarios beyond social robots.
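
NOTE=The abstract describes feature-level fusion of two sensor streams (Myo sEMG and Leap Motion hand vectors) followed by an LSTM classifier. The Python sketch below illustrates that general kind of pipeline; it is not the authors' implementation, and all dimensions (8 sEMG channels, a 63-dimensional Leap feature vector, 30 gesture classes, 128 hidden units) are illustrative assumptions, not values from the paper.

# Minimal sketch of an early-fusion LSTM gesture classifier, assuming
# per-frame sEMG features and Leap Motion hand vectors are already
# time-aligned. Not the paper's implementation; dimensions are hypothetical.
import torch
import torch.nn as nn

class FusionLSTMClassifier(nn.Module):
    def __init__(self, semg_dim=8, leap_dim=63, hidden_dim=128, num_classes=30):
        super().__init__()
        # Single-layer LSTM over the concatenated (fused) per-frame features.
        self.lstm = nn.LSTM(semg_dim + leap_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, semg_seq, leap_seq):
        # semg_seq: (batch, time, semg_dim), e.g. 8 Myo EMG channels per frame
        # leap_seq: (batch, time, leap_dim), e.g. flattened 3D hand vectors
        fused = torch.cat([semg_seq, leap_seq], dim=-1)  # feature-level fusion
        _, (h_n, _) = self.lstm(fused)
        # Classify from the final hidden state of the sequence.
        return self.fc(h_n[-1])  # logits over gesture classes

# Usage with stand-in random data: a batch of 4 sequences, 100 frames each.
model = FusionLSTMClassifier()
semg = torch.randn(4, 100, 8)
leap = torch.randn(4, 100, 63)
logits = model(semg, leap)  # shape: (4, 30)

Concatenating the per-frame features before the recurrent layer is one common way to realize the "fuse, then classify" step the abstract mentions; the paper may use a different fusion scheme.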