An Environment to Acknowledge the Interface between Affect and Cognition

Human intelligence is increasingly being redefined to include the pervasive effect of emotions on what was once considered 'pure reason'. With recent progress in computer vision, speech/prosody recognition, and bio-feedback, real-time recognition of affect could considerably enhance human-computer interaction, as well as support further progress in the development of new emotion theories. We propose an adaptive system architecture designed to integrate the output of various multimodal subsystems. Based on the user's perceived state, the agent can adapt its interface to respond most appropriately to the user's current needs and provide intelligent multimodal feedback. We concentrate on one aspect of the implementation of such an environment: facial expression recognition. We present preliminary results from our approach, which uses a neural network.
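The paper does not publish its network architecture; as a minimal sketch, a feedforward classifier mapping a facial-feature vector to emotion categories might look like the following. The feature dimension, hidden size, and the six Ekman-style labels are all assumptions for illustration, not the authors' design.

```python
import numpy as np

# Hypothetical label set (Ekman-style basic emotions); the paper's
# actual output categories are not specified here.
EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

class ExpressionNet:
    """One-hidden-layer MLP; all dimensions are illustrative only."""

    def __init__(self, n_features=64, n_hidden=16, seed=0):
        rng = np.random.default_rng(seed)
        # Small random initial weights; training (e.g. backprop) omitted.
        self.W1 = rng.normal(0.0, 0.1, (n_hidden, n_features))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.1, (len(EMOTIONS), n_hidden))
        self.b2 = np.zeros(len(EMOTIONS))

    def forward(self, x):
        # Hidden layer with tanh nonlinearity, softmax output over emotions.
        h = np.tanh(self.W1 @ x + self.b1)
        return softmax(self.W2 @ h + self.b2)

net = ExpressionNet()
features = np.random.default_rng(1).normal(size=64)  # stand-in feature vector
probs = net.forward(features)
label = EMOTIONS[int(np.argmax(probs))]
```

In a full system the input vector would come from an upstream face-tracking or feature-extraction stage, and the predicted distribution would feed the adaptive agent's decision about how to respond to the user.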
