Event Abstract

Brain–Robot and Speller Interfaces Using Spatial Multisensory Brain-computer Interface Paradigms

  • 1 University of Tsukuba, Life Science of TARA, Japan

We present several novel approaches to adaptive and multisensory brain–computer interface (BCI) technologies. We review our successful brain–robot and Japanese kana-character speller paradigms, which use exogenous (stimulus-driven) BCI designs based on spatial auditory, tactile (somatosensory), and visual (code-modulated chromatic visual evoked potential, cVEP) modalities. User intentions are decoded from brainwaves in real time and translated into navigation commands for a symbiotic humanoid robot, NAO, or into selections in a 48-character Japanese kana speller. In the brain–robot paradigm, the communication protocol between the BCI output and the robot is realized in an Internet-of-Things (IoT) scenario using the User Datagram Protocol (UDP), constituting a direct brain-to-IoT interfacing paradigm. We also present a novel adaptive EEG preprocessing technique based on the synchrosqueezing transform (SST), applied to artifact removal. SST outperforms classical time–frequency analysis methods for non-linear and non-stationary signals such as EEG, and it is computationally more efficient than the data-driven empirical mode decomposition (EMD) class of methods. SST-based filtering therefore allows for online EEG preprocessing, which is essential for BCI applications. Finally, we discuss new machine learning techniques that apply a Riemannian-geometry-based information geometry framework to the classification of intentional user responses elucidated from EEG. Results obtained from healthy users performing simple brain–robot control and Japanese kana spelling tasks support the research hypothesis that spatial multisensory BCIs are feasible when supported by data-driven, adaptive online signal processing methods.
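The BCI-to-robot link described above can be illustrated with a minimal sketch: a decoded class index is mapped to a text command and sent as a single UDP datagram. The command names, robot IP address, and port below are hypothetical placeholders; the abstract does not specify the actual NAO command protocol.

```python
import socket

# Hypothetical command set and robot address; the actual NAO control
# vocabulary used in the study is not specified in the abstract.
NAO_ADDR = ("192.168.1.10", 9999)
COMMANDS = {0: "FORWARD", 1: "BACKWARD", 2: "TURN_LEFT", 3: "TURN_RIGHT"}

def send_bci_command(class_id, addr=NAO_ADDR):
    """Translate a decoded BCI class index into a UDP datagram for the robot.

    UDP is connectionless, so each decoded intention is sent as an
    independent, fire-and-forget datagram -- a natural fit for the
    IoT-style brain-to-robot link described in the abstract.
    """
    command = COMMANDS[class_id]
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(command.encode("ascii"), addr)
    finally:
        sock.close()
    return command
```

Because UDP offers no delivery guarantee, a production system would likely add acknowledgements or command sequence numbers on the robot side; the sketch omits these for brevity.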

Figure 1
Figure 2
Figure 3
Figure 4
Figure 5
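The Riemannian-geometry classification mentioned in the abstract typically measures affine-invariant geodesic distances between EEG covariance matrices, assigning each trial to the class with the nearest mean covariance. A minimal stdlib-only sketch of that distance for the 2x2 case follows; it is illustrative only, since the abstract does not specify the feature dimensionality or classifier actually used.

```python
import math

def spd_geodesic_distance_2x2(A, B):
    """Affine-invariant Riemannian distance between 2x2 SPD matrices.

    delta(A, B) = sqrt(sum_i log^2(lambda_i)), where the lambda_i are the
    generalized eigenvalues solving det(B - lambda * A) = 0.  For 2x2
    matrices the eigenvalues come from a closed-form quadratic.
    """
    (a11, a12), (a21, a22) = A
    (b11, b12), (b21, b22) = B
    # det(B - lambda*A) = p*lambda^2 - q*lambda + r = 0
    p = a11 * a22 - a12 * a21                       # det(A)
    q = b11 * a22 + a11 * b22 - b12 * a21 - a12 * b21
    r = b11 * b22 - b12 * b21                       # det(B)
    disc = math.sqrt(q * q - 4.0 * p * r)
    lam1 = (q + disc) / (2.0 * p)
    lam2 = (q - disc) / (2.0 * p)
    return math.sqrt(math.log(lam1) ** 2 + math.log(lam2) ** 2)
```

In a minimum-distance-to-mean (MDM) style classifier, this distance would be computed from a trial's covariance matrix to each class's Riemannian mean, and the nearest class wins; real EEG covariance matrices are of course larger, requiring a general eigenvalue solver.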

References

1. Chang, M., Nishikawa, N., Struzik, Z.R., Mori, K., Makino, S., Mandic, D., Rutkowski, T.M.: Comparison of P300 responses in auditory, visual and audio-visual spatial speller BCI paradigms. In: Proceedings of the Fifth International Brain-Computer Interface Meeting 2013, p. Article ID: 156. Graz University of Technology Publishing House, Austria, Asilomar Conference Center, Pacific Grove, CA USA, June 3–7, 2013. http://castor.tugraz.at/doku/BCIMeeting2013/156.pdf
2. Chang, M., Mori, K., Makino, S., Rutkowski, T.M.: Spatial auditory two-step input Japanese syllabary brain-computer interface speller. Procedia Technology 18, 25–31 (2014). http://www.sciencedirect.com/science/article/pii/S2212017314005283
3. Kodama, T., Makino, S., Rutkowski, T.M.: Spatial tactile brain-computer interface paradigm applying vibration stimuli to large areas of user’s back. In: Mueller-Putz, G., Bauernfeind, G., Brunner, C., Steyrl, D., Wriessnegger, S., Scherer, R. (eds.) Proceedings of the 6th International Brain-Computer Interface Conference 2014, pp. Article ID 032-1-4. Graz University of Technology Publishing House (2014)
4. Krusienski, D.J., Sellers, E.W., Cabestaing, F., Bayoudh, S., McFarland, D.J., Vaughan, T.M., Wolpaw, J.R.: A comparison of classification techniques for the P300 speller. Journal of Neural Engineering 3(4), 299 (2006). http://stacks.iop.org/1741-2552/3/i=4/a=007
5. Patterson, J.R., Grabois, M.: Locked-in syndrome: a review of 139 cases. Stroke 17(4), 758–764 (1986). http://stroke.ahajournals.org/content/17/4/758.abstract
6. Rutkowski, T.M.: Tactile-body BCI-based NAO robot control. http://youtu.be/sn6OEBBKsPQ
7. Rutkowski, T.M.: Tactile-pressure BCI-based NAO robot control. http://youtu.be/dSrZ5O59vhI
8. Rutkowski, T.M., Mori, H.: Tactile and bone-conduction auditory brain computer interface for vision and hearing impaired users. Journal of Neuroscience Methods 244, 45–51 (2015). http://www.sciencedirect.com/science/article/pii/S0165027014001265, Brain Computer Interfaces; Tribute to Greg A. Gerhardt
9. Schalk, G., Mellinger, J.: A Practical Guide to Brain-Computer Interfacing with BCI2000. Springer-Verlag, London (2010)
10. Schreuder, M., Blankertz, B., Tangermann, M.: A new auditory multi-class brain- computer interface paradigm: Spatial hearing as an informative cue. PLoS ONE 5(4), e9813 (2010)
11. Shimizu, K., Makino, S., Rutkowski, T.M.: Inter-stimulus interval study for the tactile point-pressure brain-computer interface. In: 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), (accepted, in press). IEEE Engineering in Medicine and Biology Society. IEEE Press, August 25–29, 2015. http://arxiv.org/abs/1506.04458
12. Shimizu, K., Mori, H., Makino, S., Rutkowski, T.M.: Tactile pressure brain-computer interface using point matrix pattern paradigm. In: 15th International Symposium on Soft Computing and Intelligent Systems (SCIS), 2014 Joint 7th International Conference on Advanced Intelligent Systems (ISIS), pp. 473–477, December 2014. doi:10.1109/SCIS-ISIS.2014.7044756
13. Wolpaw, J., Wolpaw, E.W. (eds.): Brain-Computer Interfaces: Principles and Practice. Oxford University Press (2012)

Keywords: multisensory perception, BCI, BMI (Brain Machine Interface), tactile sensing, Auditory Perception, Visual Perception, visual evoked potential (VEP), P300 event-related potential, Somatosensory Evoked Potentials (SEP), Auditory Evoked Potentials, Classification

Conference: German-Japanese Adaptive BCI Workshop, Kyoto, Japan, 28 Oct - 29 Oct, 2015.

Presentation Type: Oral presentation (Invited speakers)

Topic: Adaptive BCI

Citation: Rutkowski TM (2015). Brain–Robot and Speller Interfaces Using Spatial Multisensory Brain-computer Interface Paradigms. Front. Comput. Neurosci. Conference Abstract: German-Japanese Adaptive BCI Workshop. doi: 10.3389/conf.fncom.2015.56.00014

Received: 11 Oct 2015; Published Online: 04 Nov 2015.

* Correspondence: Dr. Tomasz M Rutkowski, University of Tsukuba, Life Science of TARA, Tsukuba, Ibaraki, 160-0022, Japan, tomek@bci-lab.info

© 2007 - 2017 Frontiers Media S.A. All Rights Reserved