
Methods ARTICLE, provisionally accepted. The full text will be published soon.

Front. Neurorobot. | doi: 10.3389/fnbot.2019.00007

An EEG/EMG/EOG-Based Multimodal Human-Machine Interface to Real-time Control of a Soft Robot Hand

Jinhua Zhang1, Baozeng Wang1, Cheng Zhang1, Yanqing Xiao2* and Michael Y. Wang1,3
  • 1Key Laboratory of Education Ministry for Modern Design and Rotor-Bearing System, School of Mechanical Engineering, Xi'an Jiaotong University, China
  • 2School of Biological Science and Medical Engineering, Beihang University, China
  • 3HKUST Robotics Institute, Hong Kong University of Science and Technology, Hong Kong

Brain-computer interface (BCI) technology shows potential for application to motor rehabilitation therapies that use neural plasticity to restore motor function and improve the quality of life of stroke survivors. However, it is often difficult for BCI systems to provide the variety of control commands necessary for natural, multi-task, real-time control of a soft robot. In this study, a novel multimodal human-machine interface (mHMI) system is developed using combinations of electrooculography (EOG), electroencephalography (EEG), and electromyography (EMG) to generate numerous control instructions. Moreover, we also explore subject acceptance of an affordable wearable soft robot that performs basic hand actions during robot-assisted movement. Six healthy subjects separately perform left- and right-hand motor imagery, looking-left and looking-right eye movements, and different hand gestures in different modes to control a soft robot in a variety of actions. The results indicate that the number of mHMI control instructions is significantly greater than achievable with any individual mode. Furthermore, the mHMI achieves an average classification accuracy of 93.83% with an average information transfer rate of 47.41 bits/min, which corresponds to a control speed of 17 actions per minute. This study is expected to yield a more user-friendly mHMI for real-time control of a soft robot, helping healthy or disabled persons perform basic hand movements in a friendly and convenient way.
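The reported information transfer rate can be sanity-checked against the standard Wolpaw ITR formula commonly used in BCI studies. The sketch below is illustrative only: it assumes roughly 10 distinct control commands (the exact command count is not stated in this abstract) together with the reported 93.83% accuracy and 17 selections per minute.

```python
import math

def wolpaw_itr_bits_per_trial(n_classes: int, accuracy: float) -> float:
    """Bits conveyed per selection under the standard Wolpaw ITR model:
    B = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))."""
    p, n = accuracy, n_classes
    bits = math.log2(n) + p * math.log2(p)
    if p < 1.0:
        bits += (1 - p) * math.log2((1 - p) / (n - 1))
    return bits

# Assumed values: ~10 command classes is a hypothetical choice, not from the paper.
bits = wolpaw_itr_bits_per_trial(10, 0.9383)
print(f"{bits * 17:.2f} bits/min")  # close to the reported 47.41 bits/min
```

Under these assumed parameters the formula gives roughly 47 bits/min at 17 selections per minute, consistent with the figures quoted in the abstract.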

Keywords: electroencephalogram (EEG), electromyogram (EMG), electrooculogram (EOG), multimodal human-machine interface (mHMI), soft robot hand

Received: 18 Aug 2018; Accepted: 28 Feb 2019.

Edited by:

Feihu Zhang, Northwestern Polytechnical University, China

Reviewed by:

Jing Jin, East China University of Science and Technology, China
Rong Song, Sun Yat-sen University, China  

Copyright: © 2019 Zhang, Wang, Zhang, Xiao and Wang. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

* Correspondence: Ms. Yanqing Xiao, School of Biological Science and Medical Engineering, Beihang University, Beijing, China, xioyanjingx@126.com