Center for Human Modeling and Simulation

Document Type

Journal Article

Date of this Version

December 1996

Comments

Copyright 1996 IEEE. Reprinted from IEEE Transactions on Visualization and Computer Graphics, Volume 2, Issue 4, December 1996, pages 283-298.

This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Pennsylvania's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.


Abstract

We describe a new framework for efficiently computing and storing global illumination effects for complex, animated environments. The new framework allows the rapid generation of sequences representing an arbitrary path in a "view space" within an environment in which both the viewer and objects move. The global illumination is stored as time sequences of range-images at base locations that span the view space. We present algorithms for determining locations for these base images, and the time steps required to adequately capture the effects of object motion. We also present algorithms for computing the global illumination in the base images that exploit spatial and temporal coherence by considering direct and indirect illumination separately. We discuss an initial implementation using the new framework. Results and analysis of our implementation demonstrate the effectiveness of the individual phases of the approach; we conclude with an application of the complete framework to a complex environment that includes object motion.
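The storage scheme the abstract describes — time sequences of range-images held at base locations spanning a view space — can be sketched as a simple data structure. This is a minimal illustrative sketch, not the paper's implementation: the names (`ViewSpace`, `BaseLocation`, `RangeImage`) and the nearest-neighbor lookups are assumptions; a full system would warp and blend several nearby base images rather than pick one.

```python
from dataclasses import dataclass, field
import math

@dataclass
class RangeImage:
    # One stored frame: radiance plus per-pixel range (depth) for reprojection.
    time: float              # animation time step this image captures
    radiance: list = field(default_factory=list)  # placeholder pixel data
    depth: list = field(default_factory=list)     # placeholder range data

@dataclass
class BaseLocation:
    # A base viewpoint in the view space, holding its time sequence of images.
    position: tuple                               # (x, y, z) viewpoint
    frames: list = field(default_factory=list)    # RangeImages, sorted by time

    def frame_at(self, t):
        """Return the stored frame whose time step is closest to t."""
        return min(self.frames, key=lambda f: abs(f.time - t))

class ViewSpace:
    # The set of base locations spanning the region of viewer motion.
    def __init__(self):
        self.bases = []

    def add_base(self, base):
        self.bases.append(base)

    def nearest_base(self, viewpoint):
        """Pick the base location closest to the query viewpoint.
        (Hypothetical simplification: the paper's framework would
        reconstruct a view from multiple base images.)"""
        return min(self.bases, key=lambda b: math.dist(b.position, viewpoint))
```

Under this sketch, rendering a frame along a walk-through path reduces to two lookups: the nearest base location for the current viewpoint, then the stored frame nearest the current animation time.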


Keywords

animation, global illumination, image-based rendering, radiosity, ray tracing, walk-throughs



Date Posted: 18 July 2007

This document has been peer reviewed.