Interactive Demonstration of Pointing Gestures for Virtual Trainers
Abstract:
While interactive virtual humans are becoming widely used in education, training, and the delivery of instructions, building the animations required for such interactive characters in a given scenario remains complex and time consuming. One key problem is that most systems controlling virtual humans rely on pre-defined animations, which must be rebuilt by skilled animators specifically for each scenario. To improve this situation, this paper proposes a framework based on the direct demonstration of motions via a simplified, easy-to-wear set of motion capture sensors. The proposed system integrates motion segmentation, clustering, and interactive motion blending in order to provide a seamless interface for programming motions by demonstration.
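The blending step of such a pipeline can be illustrated with a minimal sketch. The snippet below is only an assumption of how demonstrated pointing motions might be combined, not the paper's actual algorithm: it computes inverse-distance weights over the pointing targets recorded with each demonstration and takes a per-frame weighted average of time-aligned joint-angle trajectories. All names, data shapes, and the weighting scheme are hypothetical.

```python
import numpy as np

# Pointing target recorded with each of the k=3 demonstrated motions
# (hypothetical 3D positions; real data would come from the sensors).
targets = np.array([[0.5, 1.2,  0.3],
                    [0.8, 1.0,  0.0],
                    [0.2, 1.4, -0.2]])

# Placeholder for captured joint-angle trajectories: k motions,
# each time-aligned to 60 frames over 20 degrees of freedom.
motions = np.random.rand(3, 60, 20)

def blend_weights(targets, query, eps=1e-8):
    """Inverse-distance weights: demonstrations whose recorded targets
    lie closer to the desired pointing target contribute more."""
    d = np.linalg.norm(targets - query, axis=1)
    w = 1.0 / (d + eps)
    return w / w.sum()

def blend_motions(motions, weights):
    """Frame-by-frame weighted average of the time-aligned examples."""
    return np.einsum('k,kfd->fd', weights, motions)

# Synthesize a pointing motion toward a new, unseen target.
w = blend_motions_input = blend_weights(targets, np.array([0.6, 1.1, 0.1]))
pointing_motion = blend_motions(motions, w)  # shape (60, 20)
```

In a full system, the example motions would first be segmented and clustered from the continuous capture stream before being blended; uniform random data stands in here only to keep the sketch self-contained.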
Paper:
Video:
Bibtex:
@inproceedings{huang09hcii,
  author    = {Yazhou Huang and Marcelo Kallmann},
  title     = {Interactive Demonstration of Pointing Gestures for Virtual Trainers},
  booktitle = {Human-Computer Interaction, Part II, Proceedings of HCI International 2009, LNCS},
  year      = {2009},
  pages     = {178--187},
  location  = {San Diego, California},
  publisher = {Springer},
  address   = {Berlin}
}