Departmental Papers (CIS)

Date of this Version

April 2002

Document Type

Journal Article

Comments

Copyright 2002 IEEE. Reprinted from IEEE Transactions on Visualization and Computer Graphics, Volume 8, Issue 2, April-June 2002, pages 171-182.
Publisher URL: http://ieeexplore.ieee.org/xpl/tocresult.jsp?isNumber=21552&puNumber=2945

This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Pennsylvania's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.

Abstract

This paper presents a simple approach to capturing the appearance and structure of immersive scenes based on the imagery acquired with an omnidirectional video camera. The scheme proceeds by combining techniques from structure-from-motion with ideas from image-based rendering. An interactive photogrammetric modeling scheme is used to recover the locations of a set of salient features in the scene (points and lines) from image measurements in a small set of keyframe images. The estimates obtained from this process are then used as a basis for estimating the position and orientation of the camera at every frame in the video clip. By augmenting the video sequence with pose information, we provide the end-user with the ability to index the video sequence spatially as opposed to temporally. This allows the user to explore the immersive scene by interactively selecting the desired viewpoint and viewing direction.
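Below is a minimal sketch, not taken from the paper, of the spatial-indexing idea described in the abstract: once each frame of the omnidirectional video has been augmented with an estimated camera pose, an interactive viewer can map a requested viewpoint to the frame captured nearest to it, and the requested viewing direction is then used to render a view from that panoramic frame. The data layout and nearest-neighbour lookup are assumptions made purely for illustration.

# Illustrative sketch only; the frame structure and distance metric are
# assumptions, not the authors' implementation.
from dataclasses import dataclass
import numpy as np


@dataclass
class PosedFrame:
    index: int            # position of the frame within the video clip
    position: np.ndarray  # estimated camera centre, shape (3,)


def frame_nearest_to(frames: list, viewpoint: np.ndarray) -> PosedFrame:
    """Return the frame whose estimated camera centre is closest to `viewpoint`.

    Because every frame is omnidirectional, the viewing direction does not
    constrain the lookup; it only determines which part of the chosen
    panorama is rendered afterwards.
    """
    return min(frames, key=lambda f: np.linalg.norm(f.position - viewpoint))


if __name__ == "__main__":
    # Three frames recorded while moving along a corridor (hypothetical poses).
    frames = [
        PosedFrame(0, np.array([0.0, 0.0, 0.0])),
        PosedFrame(1, np.array([1.0, 0.0, 0.0])),
        PosedFrame(2, np.array([2.0, 0.0, 0.0])),
    ]
    chosen = frame_nearest_to(frames, np.array([0.9, 0.0, 0.1]))
    print(chosen.index)  # -> 1

In this sketch the lookup is purely positional, which reflects the omnidirectional setting: every stored frame already contains all viewing directions, so the user's chosen direction only selects the portion of the panorama to display.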

Keywords

Reconstruction, immersive environments, omnidirectional video, pose estimation


Date Posted: 24 November 2004

This document has been peer reviewed.