Content-based search overviews and exploratory browsing of movies with MovieClouds

Videos, and especially movies, often engage viewers perceptually, cognitively and emotionally by combining diverse symbol systems, such as images, text, music and narration, to tell stories. As one of the biggest sources of entertainment, in both individual and social contexts, movies are increasingly accessible as enormous collections over the internet, in social media and on iTV. These rich environments demand new and more powerful ways to search, browse and view videos, which may benefit from content-based video analysis and classification techniques. In this paper, we describe and evaluate MovieClouds and its core and extended features of content processing, interactive search, overview and browsing, designed to access, explore and visualise movies based on the information conveyed in their different content tracks, from overview clouds at the level of the movie space down to individual movies. MovieClouds adopts tag clouds as a unifying paradigm, extending to movies the power, flexibility, engagement and fun usually associated with clouds.
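As a minimal illustration of the tag-cloud paradigm described above, the sketch below derives tag weights from a movie's subtitle track by term frequency, mapping counts to relative font sizes. This is a toy assumption-laden example (the stopword list, the log-scaled size mapping and the function name are all hypothetical), not the actual MovieClouds processing pipeline.

```python
import math
import re
from collections import Counter

def tag_cloud_weights(subtitle_text, top_n=10):
    """Toy sketch: weight candidate tags by term frequency in a subtitle
    track, then map counts to relative font sizes in [0.5, 1.0] on a
    log scale. Illustrative only; not the MovieClouds implementation."""
    stopwords = {"the", "a", "an", "and", "to", "of", "is", "it", "in", "you"}
    words = [w for w in re.findall(r"[a-z']+", subtitle_text.lower())
             if w not in stopwords and len(w) > 2]
    counts = Counter(words)
    max_count = max(counts.values())
    return {w: 0.5 + 0.5 * math.log1p(c) / math.log1p(max_count)
            for w, c in counts.most_common(top_n)}
```

In a cloud view, the returned weights would drive the rendered size of each tag, so the most frequent terms in the track dominate the overview visually.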
