Pipeline Rendering: Interaction and Realism Through Hardware-Based Multi-Pass Rendering
While large investments are made in sophisticated graphics hardware, most realistic rendering is still performed off-line using ray-tracing or radiosity systems. A coordinated use of hardware-provided bitplanes and rendering pipelines can, however, approximate ray-trace-quality illumination effects in a user-interactive environment, as well as provide the tools necessary for a user to declutter such a complex scene. A variety of common ray-tracing and radiosity illumination effects are presented using multi-pass rendering in a pipeline architecture. We provide recursive reflections through the use of secondary viewpoints, and present a method for using a homogeneous 2-D projective image mapping to extend this method to refractive transparent surfaces. This paper then introduces the Dual Z-buffer, or DZ-buffer, an evolutionary hardware extension which, along with current frame-buffer functions such as stencil planes and accumulation buffers, provides the hardware platform to render non-refractive transparent surfaces in back-to-front or front-to-back order. We extend the traditional use of shadow volumes to provide reflected and refracted shadows as well as specular light reclassification. The shadow and lighting effects are then incorporated into our recursive viewpoint paradigm. Global direct illumination is provided through a shadow-blending technique. Hardware surface illumination is fit to a physically-based BRDF to provide a better local direct model, and the framework permits incorporation of a radiosity solution for indirect illumination as well. Additionally, we incorporate material properties including translucency, light scattering, and non-uniform transmittance to provide a general framework for creating realistic renderings. The DZ-buffer also provides decluttering facilities such as transparency and clipping. This permits selective scene viewing through arbitrary view-dependent and non-planar clipping and transparency surfaces in real time.
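The homogeneous 2-D projective image mapping mentioned above can be illustrated with a minimal sketch. This is not the paper's code: the 3x3 matrix values and the function name `project` are invented for illustration, showing only the standard homography-with-perspective-divide arithmetic such a mapping performs on image points.

```python
# Illustrative sketch: applying a homogeneous 2-D projective mapping,
# represented by a 3x3 matrix H, to an image point (x, y).

def project(H, x, y):
    """Map (x, y) through the 3x3 matrix H, then perform the perspective divide."""
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xh / w, yh / w

# The identity matrix leaves points unchanged:
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert project(I, 3.0, 4.0) == (3.0, 4.0)

# A non-trivial bottom row makes the divide depend on position,
# which is what distinguishes a projective warp from an affine one
# (example values only):
H = [[1, 0, 0], [0, 1, 0], [0.1, 0, 1]]
print(project(H, 3.0, 4.0))  # both coordinates divided by 1 + 0.1*x
```

Because such a mapping is a single matrix per surface, it fits naturally into a hardware transformation pipeline rather than requiring per-pixel ray computation.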
The combination of these techniques provides understandable, realistic scene rendering at rates typically 5-50 times those of comparable ray-traced images. In addition, the pixel-parallel nature of these methods leads to the exploration of further hardware rendering-engine extensions that can exploit this coherence.
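The ordered rendering of transparent surfaces described above amounts, per pixel, to compositing fragments with the standard "over" operator. The following sketch is my illustration of that front-to-back accumulation, not the DZ-buffer implementation; the function name and the single-channel colors are invented for brevity.

```python
# Illustrative sketch: compositing one pixel's transparent fragments in
# front-to-back order with the "over" operator. Each fragment is a
# (color, alpha) pair, nearest surface first; a single color channel
# stands in for RGB.

def composite_front_to_back(fragments):
    color = 0.0      # accumulated pixel color
    remaining = 1.0  # transmittance left for surfaces farther back
    for frag_color, frag_alpha in fragments:
        color += remaining * frag_alpha * frag_color
        remaining *= (1.0 - frag_alpha)
    return color, remaining

# Two 50%-opaque layers: the front layer contributes half its color,
# the back layer a quarter of its color.
c, t = composite_front_to_back([(1.0, 0.5), (0.2, 0.5)])
print(c, t)
```

Front-to-back order has the practical advantage that accumulation can stop early once `remaining` falls near zero, since farther surfaces can no longer affect the pixel.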