A Novel EOG/EEG Hybrid Human–Machine Interface Adopting Eye Movements and ERPs: Application to Robot Control

This study presents a novel human-machine interface (HMI) based on both electrooculography (EOG) and electroencephalography (EEG). The hybrid interface works in two modes: an EOG mode that recognizes eye movements such as blinks, and an EEG mode that detects event-related potentials (ERPs) such as the P300. Although eye movements and ERPs have each been used separately to implement assistive interfaces that help patients with motor disabilities perform daily tasks, the proposed hybrid interface integrates the two so that they complement each other, providing better efficiency and a wider scope of application. In this study, we design a threshold algorithm that recognizes four kinds of eye movements: blink, wink, gaze, and frown. In addition, an oddball paradigm with inverted-face stimuli is used to evoke multiple ERP components, including the P300, N170, and VPP. To verify the effectiveness of the proposed system, two online experiments are carried out: one controls a multifunctional humanoid robot, and the other controls four mobile robots. In both experiments, the subjects complete the tasks effectively using the proposed interface, and the best completion time is relatively short and close to that achieved by manual operation.
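
To make the threshold-based eye-movement recognition concrete, the sketch below shows one way such a detector could be structured. It is a minimal illustration assuming two bipolar EOG channels (vertical and horizontal) sampled at 256 Hz; the channel layout, amplitude thresholds, and decision rules are illustrative assumptions, not the parameters actually used in the study.

```python
# Minimal sketch of a threshold-based EOG event detector (illustrative only).
# Assumes two bipolar channels: vertical EOG (veog) and horizontal EOG (heog),
# sampled at 256 Hz. Thresholds and rules are assumptions, not the paper's values.
import numpy as np

FS = 256            # assumed sampling rate (Hz)
V_THRESH = 100e-6   # assumed vertical amplitude threshold (V) for blink/frown
H_THRESH = 80e-6    # assumed horizontal amplitude threshold (V) for gaze/wink

def classify_eog_window(veog: np.ndarray, heog: np.ndarray) -> str:
    """Classify a short EOG window into blink, wink, gaze, frown, or rest
    using simple peak-amplitude thresholds."""
    v_peak = np.max(veog) - np.median(veog)            # upward vertical deflection
    v_trough = np.median(veog) - np.min(veog)          # downward vertical deflection
    h_peak = np.max(np.abs(heog - np.median(heog)))    # lateral deflection

    if v_peak > V_THRESH and h_peak < H_THRESH:
        # strong symmetric upward deflection on the vertical channel -> blink
        return "blink"
    if v_peak > V_THRESH and h_peak >= H_THRESH:
        # vertical deflection accompanied by lateral asymmetry -> single-eye wink
        return "wink"
    if h_peak >= H_THRESH:
        # dominant horizontal deflection -> lateral gaze shift
        return "gaze"
    if v_trough > V_THRESH:
        # downward vertical deflection -> frown (brow lowering)
        return "frown"
    return "rest"

if __name__ == "__main__":
    # Synthetic example: a 0.5 s window containing a blink-like vertical pulse.
    t = np.arange(0, 0.5, 1.0 / FS)
    veog = 150e-6 * np.exp(-((t - 0.25) ** 2) / (2 * 0.03 ** 2))
    heog = np.zeros_like(t)
    print(classify_eog_window(veog, heog))  # -> "blink"
```

In practice, such window-level decisions would be made continuously on the streamed EOG signal and mapped to interface commands, with the EEG/ERP mode handling selections that eye movements alone cannot express.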
