Center for Human Modeling and Simulation

Document Type

Conference Paper

Date of this Version

July 2002

Comments

Postprint version. Copyright ACM, 2002. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in Symposium on Computer Animation, pages 65-71.
Publisher URL: http://doi.acm.org/10.1145/545261.545272

Abstract

We propose a control mechanism for facial expressions by applying a few carefully chosen parametric modifications to preexisting expression data streams. This approach applies to any facial animation resource expressed in the general MPEG-4 form, whether taken from a library of preset facial expressions, captured from live performance, or created entirely by hand. The MPEG-4 Facial Animation Parameters (FAPs) represent a facial expression as a set of parameterized muscle actions, given as the intensities of individual muscle movements over time. Our system varies expressions by changing the intensities and scope of sets of MPEG-4 FAPs. It creates variations in “expressiveness” across the face model rather than simply scaling, interpolating, or blending facial mesh node positions. The parameters are adapted from the Effort parameters of Laban Movement Analysis (LMA); we developed a mapping from their values onto sets of FAPs. The FacEMOTE parameters thus perturb a base expression to create a wide range of expressions. Such an approach could allow real-time face animations to change underlying speech or facial expression shapes dynamically according to current agent affect or user interaction needs.
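To make the general idea concrete, the sketch below (a hypothetical illustration, not the authors' implementation) shows one way an Effort-like parameter could scale the intensities of a chosen subset of MPEG-4 FAP channels in a base expression stream; the function name, channel indices, and linear mapping are all assumptions for demonstration only.

```python
# Hypothetical sketch in the spirit of the FacEMOTE approach: perturb a base
# expression by scaling selected FAP intensity channels with an Effort-like value.
import numpy as np

def perturb_expression(fap_stream: np.ndarray,
                       channel_indices: list,
                       effort: float) -> np.ndarray:
    """Scale the chosen FAP channels by a factor derived from an Effort value.

    fap_stream is assumed to be laid out as [frame, channel] intensities;
    effort is assumed to lie in [-1, 1], where negative values attenuate the
    selected muscle actions and positive values exaggerate them.
    """
    gain = 1.0 + 0.5 * effort          # illustrative linear mapping only
    out = fap_stream.copy()
    out[:, channel_indices] *= gain    # adjust intensity of the chosen FAPs
    return out

# Usage example: exaggerate a hypothetical brow-raising FAP group on a short clip.
base = np.zeros((50, 68))              # 50 frames, 68 FAP channels (illustrative)
base[:, 31:36] = 40.0                  # some nonzero brow activity
varied = perturb_expression(base, [31, 32, 33, 34, 35], effort=0.8)
```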

Keywords

Facial animation, Animation systems, MPEG


Date Posted: 24 July 2007

This document has been peer reviewed.