Exploring Combinations of Different Color and Facial Expression Stimuli for Gaze-Independent BCIs

Background: Previous studies have shown that conventional visual brain-computer interfaces (BCIs) based on overt attention cannot be used effectively when users lack eye-movement control. To address this problem, visual BCI systems based on covert attention and feature attention, known as gaze-independent BCIs, have been proposed. Differences in color and shape between stimuli and backgrounds have generally been used in gaze-independent BCIs. Recently, a paradigm based on facial expression changes was presented and achieved high performance. However, some facial expressions were so similar that users could not tell them apart, especially when presented at the same position in a rapid serial visual presentation (RSVP) paradigm, which reduced BCI performance.

New Method: In this paper, we combined facial expressions and colors to optimize stimulus presentation in a gaze-independent BCI. This optimized paradigm is called the colored dummy face pattern. We hypothesized that combining different colors and facial expressions would help users locate the target and evoke larger event-related potentials (ERPs). To evaluate the performance of this new paradigm, two comparison paradigms were also presented: the gray dummy face pattern and the colored ball pattern.

Comparison with Existing Method(s): The key question determining the value of colored dummy face stimuli in BCI systems was whether they could achieve higher performance than gray dummy face or colored ball stimuli. Ten healthy participants (seven male, aged 21–26 years, mean 24.5 ± 1.25 years) took part in our experiment. Online and offline results for each paradigm were obtained and comparatively analyzed.

Results: The colored dummy face pattern evoked larger P300 and N400 ERP amplitudes than the gray dummy face pattern and the colored ball pattern. Online results showed that the colored dummy face pattern had a significant advantage over the other two patterns in terms of classification accuracy (p < 0.05) and information transfer rate (p < 0.05).

Conclusions: The colored dummy face paradigm, which combines color and facial expression stimuli, showed a significant advantage over the colored ball and gray dummy face paradigms in terms of evoked P300 and N400 amplitudes, and yielded high classification accuracies and information transfer rates.
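Because the abstract reports information transfer rate (ITR) alongside classification accuracy, a minimal sketch of how ITR is conventionally computed for spellers of this kind may be helpful. The sketch below uses the standard Wolpaw formula; the class count, accuracy, and selection rate in the usage example are hypothetical placeholders for illustration only, not values taken from this study.

```python
import math


def wolpaw_itr(n_classes: int, accuracy: float, selections_per_min: float) -> float:
    """Information transfer rate in bits/min using the standard Wolpaw formula.

    n_classes          -- number of selectable targets (hypothetical here; the
                          abstract does not state the speller's matrix size)
    accuracy           -- online classification accuracy in [0, 1]
    selections_per_min -- number of selections the BCI makes per minute
    """
    if accuracy <= 1.0 / n_classes:
        return 0.0  # at or below chance level, no information is transferred
    bits = math.log2(n_classes) + accuracy * math.log2(accuracy)
    if accuracy < 1.0:
        bits += (1 - accuracy) * math.log2((1 - accuracy) / (n_classes - 1))
    return bits * selections_per_min


# Hypothetical illustration values, not results reported in the paper:
print(round(wolpaw_itr(n_classes=12, accuracy=0.90, selections_per_min=4.0), 2))
```

Note that this formula assumes all targets are equiprobable and that errors are distributed uniformly over the non-target classes, which is the usual convention when ITR is reported for ERP-based spellers.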
