Center for Human Modeling and Simulation
Document Type
Conference Paper
Date of this Version
May 2000
Abstract
We describe a new paradigm in which a user can produce a wide range of expressive, natural-looking movements of animated characters by specifying their manners and attitudes with natural language verbs and adverbs. A natural language interpreter, a Parameterized Action Representation (PAR), and an expressive motion engine (EMOTE) are designed to bridge the gap between natural language instructions issued by the user and expressive movements carried out by the animated characters. By allowing users to customize basic movements with natural language terms to support individualized expressions, our approach may eventually lead to the automatic generation of expressive movements from speech text, a storyboard script, or a behavioral simulation.
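The pipeline the abstract describes (natural language interpreter → PAR → EMOTE) can be illustrated with a minimal sketch. The class names, fields, and adverb table below are illustrative assumptions rather than the authors' actual representation; only the idea of attaching manner qualifiers (here, Laban-style Effort factors on a -1..+1 scale, which EMOTE builds on) to a parameterized action is taken from the abstract.

```python
# Hypothetical sketch of the NL -> PAR -> EMOTE pipeline described above.
# ParameterizedAction, EffortParameters, and ADVERB_TO_EFFORT are
# illustrative assumptions, not the paper's actual API.
from dataclasses import dataclass, field

@dataclass
class EffortParameters:
    """Laban Effort factors, each on a -1.0 .. +1.0 scale."""
    space: float = 0.0   # indirect (-1) .. direct (+1)
    weight: float = 0.0  # light (-1) .. strong (+1)
    time: float = 0.0    # sustained (-1) .. sudden (+1)
    flow: float = 0.0    # free (-1) .. bound (+1)

@dataclass
class ParameterizedAction:
    """Minimal stand-in for a PAR: a basic action plus manner qualifiers."""
    verb: str                                          # e.g. "walk", "point"
    agent: str                                         # character performing the action
    effort: EffortParameters = field(default_factory=EffortParameters)

# Illustrative adverb-to-Effort table; in the described system these
# settings would come from the natural language interpreter.
ADVERB_TO_EFFORT = {
    "forcefully": EffortParameters(space=0.8, weight=0.9, time=0.5),
    "gently":     EffortParameters(weight=-0.8, time=-0.5, flow=-0.3),
    "hastily":    EffortParameters(time=0.9, flow=-0.4),
}

def interpret(verb: str, adverb: str, agent: str) -> ParameterizedAction:
    """Build a parameterized action from a verb/adverb instruction."""
    effort = ADVERB_TO_EFFORT.get(adverb, EffortParameters())
    return ParameterizedAction(verb=verb, agent=agent, effort=effort)

if __name__ == "__main__":
    par = interpret("walk", "forcefully", agent="Jack")
    print(par)  # a motion engine like EMOTE would turn this into expressive movement
```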
Keywords
Laban Movement Analysis, gesture, interactive computer animation, natural language control
Recommended Citation
Zhao, L., Costa, M., & Badler, N. I. (2000). Interpreting Movement Manner. Retrieved from https://repository.upenn.edu/hms/47
Date Posted: 18 July 2007
This document has been peer reviewed.
Comments
Copyright 2000 IEEE. Reprinted from Proceedings of Computer Animation 2000 (CA 2000), May 2000, pages 98-103.
Publisher URL: http://dx.doi.org/10.1109/CA.2000.889052
This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Pennsylvania's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.