A Temporal Image-Based Approach to Motion Reconstruction for Globally Illuminated Animated Environments
Abstract
This paper presents an approach to motion sampling and reconstruction for globally illuminated animated environments (under fixed viewing conditions) based on sparse spatio-temporal scene sampling, a resolution-independent temporal file format, and a Delaunay triangulation pixel reconstruction method. The argument is made that motion usually achieved by rendering complete images of a scene at a high frame rate (i.e., flipbook-style frame-based animation) can be adequately reconstructed using many fewer samples (often on the order of the number required to generate a single, complete, high-quality frame) from the sparse image data stored in bounded slices of our temporal file. The scene is rendered using a ray tracing algorithm modified to randomly sample over space, i.e., the image plane (x, y), and time (t), yielding (x, y, t) samples that are stored in our spatio-temporal images. Object motion is reconstructed, i.e., a picture of the scene at a desired time is produced, by projecting the (x, y, t) samples onto the desired temporal plane with the appropriate weighting, constructing the 2D Delaunay triangulation of the sample points, and Gouraud (or Phong) shading the resulting triangles. Both first- and higher-order visual effects (illumination and visibility) are handled. Silhouette edges and other discontinuities are more difficult to track but can be addressed with a combination of triangle filtering and image postprocessing.
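The reconstruction step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the Gaussian temporal weighting kernel, its `sigma` parameter, and the function names are assumptions introduced here; the paper's projection weighting and shading pipeline may differ in detail.

```python
import numpy as np
from scipy.spatial import Delaunay

def reconstruct_frame(samples, t0, sigma=0.1):
    """Project (x, y, t) samples onto the temporal plane t = t0.

    Returns the 2D Delaunay triangulation of the projected (x, y)
    points and a per-sample temporal weight (a Gaussian falloff is
    assumed here; samples near t0 dominate the reconstruction).
    """
    w = np.exp(-((samples[:, 2] - t0) / sigma) ** 2)
    tri = Delaunay(samples[:, :2])  # triangulate projected points
    return tri, w

def shade_point(tri, colors, w, p):
    """Gouraud-style shading of a query point p: barycentric
    interpolation of the temporally weighted vertex colors of the
    triangle containing p. Returns None if p lies outside the hull."""
    s = tri.find_simplex(np.asarray(p))
    if s < 0:
        return None
    verts = tri.simplices[s]
    # Barycentric coordinates of p within simplex s.
    T = tri.transform[s]
    b = T[:2].dot(np.asarray(p, dtype=float) - T[2])
    bary = np.append(b, 1.0 - b.sum())
    # Convex combination of vertex colors, modulated by temporal weight.
    wb = w[verts] * bary
    return (colors[verts] * wb[:, None]).sum(axis=0) / wb.sum()
```

Evaluating `shade_point` at every pixel center of the desired frame yields the reconstructed image; because the weights and barycentric coordinates form a convex combination, interpolated colors stay within the range of the contributing sample colors.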