TAME: Time-Varying Affective Response for Humanoid Robots

This paper describes the design of a complex, time-varying affective software architecture. It is an expansion of the TAME architecture (Traits, Attitudes, Moods, and Emotions) as applied to humanoid robotics. In particular, it is intended to promote effective human-robot interaction by conveying the robot's affective state to the user in an easy-to-interpret manner.
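
To make the relationship among the four affective components concrete, the sketch below shows one way they could be represented and combined to modulate a single expressive parameter on a humanoid robot. All class names, fields, decay constants, and the mixing rule are illustrative assumptions for this sketch only, not the formulation used in the TAME architecture itself.

```python
from dataclasses import dataclass
import math
import time

# Hypothetical sketch of the four TAME affective components: traits are
# static personality parameters, attitudes are object-directed sentiments,
# moods vary slowly, and emotions are fast, stimulus-driven responses that
# decay over time. The update rules below are placeholders, not the paper's.

@dataclass
class Traits:
    # Big-Five-style personality traits, fixed over the robot's lifetime.
    extraversion: float = 0.7
    neuroticism: float = 0.3

@dataclass
class Attitudes:
    # Object-directed sentiment, e.g., liking toward the current user.
    liking_of_user: float = 0.6

@dataclass
class Mood:
    # Slowly varying positive/negative affect dimensions.
    positive: float = 0.5
    negative: float = 0.2

@dataclass
class Emotion:
    # Short-lived, stimulus-triggered response that decays after onset.
    intensity: float = 0.0
    onset: float = 0.0

    def trigger(self, intensity: float) -> None:
        self.intensity = intensity
        self.onset = time.time()

    def current(self, half_life: float = 5.0) -> float:
        # Exponential decay since onset; an assumed decay law for illustration.
        elapsed = time.time() - self.onset
        return self.intensity * math.exp(-elapsed * math.log(2) / half_life)

def gesture_amplitude(t: Traits, a: Attitudes, m: Mood, e: Emotion) -> float:
    """Illustrative mapping from affective state to one nonverbal expression
    parameter (e.g., how expansive an arm gesture is), clamped to [0, 1]."""
    base = 0.4 + 0.3 * t.extraversion            # personality sets the baseline
    social = 0.1 * a.liking_of_user              # attitude toward the partner
    mood_shift = 0.2 * (m.positive - m.negative)  # mood biases it slowly
    return max(0.0, min(1.0, base + social + mood_shift + 0.3 * e.current()))

if __name__ == "__main__":
    traits, attitudes, mood, emotion = Traits(), Attitudes(), Mood(), Emotion()
    emotion.trigger(0.8)  # e.g., a joy response to a greeting
    print(f"gesture amplitude: {gesture_amplitude(traits, attitudes, mood, emotion):.2f}")
```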
