Real-Time Virtual Humans
Subject
human modeling
computer animation
virtual reality
autonomous agents
language and action
computer graphics
Abstract
The last few years have seen great maturation in the computation speed and control methods needed to portray 3D virtual humans suitable for real interactive applications. We first describe the state of the art, then focus on the particular approach taken at the University of Pennsylvania with the Jack system. Various aspects of real-time virtual humans are considered, such as appearance and motion, interactive control, autonomous action, gesture, attention, locomotion, and multiple individuals. The underlying architecture consists of a sense-control-act structure that permits reactive behaviors to be locally adaptive to the environment, and a PaT-Net parallel finite-state machine controller that can be used to drive virtual humans through complex tasks. We then argue for a deep connection between language and animation and describe current efforts in linking them through two systems: the Jack Presenter and the JackMOO extension to LambdaMOO. Finally, we outline a Parameterized Action Representation for mediating between language instructions and animated actions.
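To make the architecture sentence above more concrete, the sketch below shows a toy parallel finite-state controller in the spirit of the PaT-Net idea, embedded in a sense-control-act loop: each net senses the world, fires a condition-action transition, and all nets are stepped once per tick. This is a minimal illustration only; the class and function names (PatNet, run_parallel, the "attend" net) are hypothetical and do not reflect the Jack system's actual implementation or API.

```python
# Illustrative sketch of a PaT-Net-style parallel finite-state controller.
# All names here are hypothetical, not the Jack system's real interfaces.

class PatNet:
    """One finite-state net: states map to (condition, action, next_state) rules."""
    def __init__(self, name, start, transitions):
        self.name = name
        self.state = start
        # transitions: {state: [(condition, action, next_state), ...]}
        self.transitions = transitions

    def step(self, world):
        # Sense: test each rule's condition against the sensed world state.
        for condition, action, next_state in self.transitions.get(self.state, []):
            if condition(world):
                action(world)          # Act: issue a behavior/motion command.
                self.state = next_state
                return


def run_parallel(nets, world, ticks):
    """Control loop: every net gets one step per tick, so nets run 'in parallel'."""
    for _ in range(ticks):
        for net in nets:
            net.step(world)


# Usage: a trivial 'attention' net that turns the gaze toward a nearby speaker.
world = {"speaker_near": True, "gaze": "ahead"}
attend = PatNet(
    "attend",
    start="idle",
    transitions={
        "idle":    [(lambda w: w["speaker_near"],
                     lambda w: w.update(gaze="speaker"), "looking")],
        "looking": [(lambda w: not w["speaker_near"],
                     lambda w: w.update(gaze="ahead"), "idle")],
    },
)
run_parallel([attend], world, ticks=3)
print(world["gaze"])  # -> "speaker"
```

In the sketch, running several such nets side by side (e.g. attention, gesture, locomotion) gives the locally adaptive, reactive layering the abstract attributes to the sense-control-act structure, while each individual net stays a small, readable state machine.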