SMaRT: the Smart Meeting Room Task at ISL

As computational and communications systems become increasingly smaller, faster, more powerful, and more integrated, the goal of interactive, integrated meeting support rooms is slowly becoming reality. It is already possible, for instance, to rapidly locate task-related information during a meeting, filter it, and share it with remote users. Unfortunately, the technologies that provide such capabilities are as obstructive as they are useful: they force humans to focus on the tool rather than the task. The veneer of utility thus often hides the true cost of use: longer, less focused human interactions. To address this issue, we present our current research efforts toward SMaRT, the Smart Meeting Room Task. The goal of SMaRT is to provide meeting support services that require no explicit human-computer interaction. Instead, by monitoring the activities in the meeting room through both video and audio analysis, the room can react appropriately to users' needs, allowing users to focus on their own goals.
