Computer vision enhances mobile eye-tracking to expose expert cognition in natural-scene visual-search tasks

Mobile eye-tracking offers a rare opportunity to record and elucidate cognition in action. In our research, we search for patterns in, and distinctions between, the visual-search performance of experts and novices in the geosciences. Traveling to regions shaped by various geological processes as part of an introductory geology field course, we record the first-person gaze patterns of experts and novices as they are asked to determine the modes of geological activity that formed the scene before them. Recording eye video and scene video in natural settings produces complex imagery that demands advanced computer-vision methods to register and map between the views of separate observers. With such mappings in place, many observers can be brought into a single mathematical space, where we can spatio-temporally analyze inter- and intra-subject fixations, saccades, and head motions. While working toward perfecting these mappings, we developed an updated experimental setup that allows us to statistically analyze intra-subject eye-movement events without the need for a common domain, and through these analyses we are finding statistical differences between novices and experts on these visual-search tasks. In the course of this research we have also developed a unified, open-source software framework for processing, visualizing, and interacting with mobile eye-tracking data and high-resolution panoramic imagery.
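The core geometric step in placing observers into a single mathematical space is mapping each gaze point from an observer's scene-camera frame into shared panorama coordinates. A minimal sketch of that mapping, assuming a planar-homography registration (the function name and the illustrative matrix below are our own, not from the paper's pipeline):

```python
# Hedged sketch: projecting a 2D gaze point from one observer's scene-video
# frame into a common panoramic reference frame via a 3x3 homography H.
# In practice H would be estimated by feature matching between the scene
# frame and the panorama; here we only show how the point transform works.

def map_gaze_to_panorama(H, gaze_xy):
    """Apply homography H to a gaze point using homogeneous coordinates."""
    x, y = gaze_xy
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)  # perspective divide back to 2D

# An identity homography leaves the gaze point unchanged:
I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(map_gaze_to_panorama(I, (320.0, 240.0)))  # (320.0, 240.0)

# A pure-translation homography shifts it into panorama coordinates:
T = [[1.0, 0.0, 10.0], [0.0, 1.0, -5.0], [0.0, 0.0, 1.0]]
print(map_gaze_to_panorama(T, (320.0, 240.0)))  # (330.0, 235.0)
```

Once every observer's fixations are expressed in the panorama's coordinate system, inter-subject comparisons reduce to spatio-temporal statistics over a single domain.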
