Real-time closed-eye-based communication system using EEG sensors and auditory feedback
1 Tokyo Institute of Technology, Department of Information Processing, Japan
2 Tokyo Institute of Technology, Japan
Brain-computer interfaces that rely on visual feedback can cause eye fatigue in users. To avoid exhausting the user, we present an asynchronous, non-invasive, closed-eye-based communication system that provides real-time control with five commands using two temporal EEG sensors.
Five participants were asked to move a cursor on a screen to five positions (up, down, left, right, and center) with their eyes closed, to demonstrate the feasibility of sending commands in real time with auditory feedback. For data analysis, we used the Haar wavelet transform to detect the instant of eye movement and four time-series features to distinguish among the five classes of eye movement. The four features were the maximum wavelet coefficient, the area under the curve, the amplitude, and the velocity of the time series, computed from samples around the minimum and maximum values of the negative and positive peak waves, respectively. The classification results of the proposed algorithm were translated into real-time commands to control a cursor on the screen with closed eyes. With auditory feedback, a mean classification accuracy of 80.2% was obtained for five-command control.
Through this work, we can help not only people with motor impairments but also blind users to control smart-home applications with their eye movements and auditory feedback. For able-bodied users, sending commands with closed eyes can reduce the fatigue associated with rich, visually detailed environments. Visual feedback can be replaced by tactile, olfactory, or auditory information in applications such as adjusting room temperature or music volume.
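The detection and feature-extraction pipeline described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' actual implementation: the single-level Haar decomposition, window size, sampling rate, and synthetic test signal are all assumptions for demonstration.

```python
# Sketch of Haar-wavelet transient detection plus the four time-series
# features named in the abstract (max wavelet coefficient, area under the
# curve, amplitude, velocity). Parameters are illustrative assumptions.
import numpy as np

def haar_detail(x):
    """Single-level Haar wavelet detail coefficients of a 1-D signal."""
    x = np.asarray(x, dtype=float)
    n = len(x) - len(x) % 2              # truncate to an even length
    pairs = x[:n].reshape(-1, 2)
    return (pairs[:, 0] - pairs[:, 1]) / np.sqrt(2.0)

def extract_features(signal, fs=256, half_window=32):
    """Locate the strongest transient and compute the four features."""
    d = haar_detail(signal)
    k = int(np.argmax(np.abs(d)))        # largest detail coefficient
    peak = 2 * k                         # map back to a signal sample index
    lo = max(0, peak - half_window)
    hi = min(len(signal), peak + half_window)
    seg = np.asarray(signal[lo:hi], dtype=float)
    return {
        "max_wavelet_coeff": float(np.abs(d[k])),
        "area_under_curve": float(np.sum(np.abs(seg)) / fs),
        "amplitude": float(seg.max() - seg.min()),
        "velocity": float(np.max(np.abs(np.diff(seg))) * fs),
    }

# Synthetic example: a flat trace with one step-like "eye movement" transient.
t = np.arange(512)
sig = np.where((t > 200) & (t < 260), 50.0, 0.0)
feats = extract_features(sig)
```

In a real-time system, a feature vector like this would be computed for each detected transient and passed to a five-class classifier whose output is mapped to the cursor commands.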
Conference:
2015 International Workshop on Clinical Brain-Machine Interfaces (CBMI2015), Tokyo, Japan, 13 Mar - 15 Mar, 2015.
Presentation Type:
Poster 2-1
Topic:
Clinical Brain-Machine Interfaces
Citation:
Belkacem AN, Yoshimura N, Shin D, Kambara H and Koike Y (2015). Real-time closed-eye-based communication system using EEG sensors and auditory feedback. Conference Abstract: 2015 International Workshop on Clinical Brain-Machine Interfaces (CBMI2015). doi: 10.3389/conf.fnhum.2015.218.00004
Copyright:
The abstracts in this collection have not been subject to any Frontiers peer review or checks, and are not endorsed by Frontiers.
They are made available through the Frontiers publishing platform as a service to conference organizers and presenters.
The copyright in the individual abstracts is owned by the author of each abstract or his/her employer unless otherwise stated.
Each abstract, as well as the collection of abstracts, are published under a Creative Commons CC-BY 4.0 (attribution) licence (https://creativecommons.org/licenses/by/4.0/) and may thus be reproduced, translated, adapted and be the subject of derivative works provided the authors and Frontiers are attributed.
For Frontiers’ terms and conditions please see https://www.frontiersin.org/legal/terms-and-conditions.
Received:
23 Apr 2015;
Published Online:
29 Apr 2015.
* Correspondence:
Dr. Abdelkader N Belkacem, Tokyo Institute of Technology, Department of Information Processing, Yokohama, Kanagawa, Japan, belkacem011@hotmail.com
Dr. Natsue Yoshimura, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan, yoshimura@cns.pi.titech.ac.jp
Dr. Duk Shin, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan, shinduk@cns.pi.titech.ac.jp
Dr. Hiroyuki Kambara, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan, hkambara@hi.pi.titech.ac.jp
Dr. Yasuharu Koike, Tokyo Institute of Technology, Yokohama, Kanagawa, Japan, koike@pi.titech.ac.jp