GRASP Lab Camera Systems and Their Effects on Algorithms
Penn collection: General Robotics, Automation, Sensing and Perception Laboratory
Abstract
When image processing techniques are applied to real images, differences in information content between the horizontal and vertical directions should be considered. The minimum spatial scale at which an algorithm can be applied isotropically is limited by the characteristics of the imaging system. Below that scale, threshold values used within algorithms depend on the orientation of objects in a scene. The response of an electro-optic system has traditionally been described in terms of the ability of a human observer to detect changes in intensity. However, when the observer is a machine rather than a human, the size of the response required for detection is likely to be larger. In addition, the orientation dependence of the magnitude of intensity changes must be considered. This report briefly reviews three sources of differences between the horizontal and vertical directions in real images, using equipment in the University of Pennsylvania General Robotics and Active Sensory Perception (GRASP) Laboratory: the pixels are not square, the pixel data are not independent, and the spatial frequency response differs between the two directions. Other image acquisition errors, such as blooming, CCD blemishes, and the uniform-illumination signature, are addressed in [1]. The impact of these differences on several basic image processing algorithms is discussed.
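To illustrate one of the effects mentioned above, the sketch below shows how a non-square pixel shape can be compensated before an isotropic threshold is applied to a gradient magnitude. It is a minimal illustration only, not taken from the report: the function name gradient_magnitude_anisotropic and the pixel_aspect value of 1.2 are placeholders, and the actual width-to-height ratio depends on the particular camera and digitizer combination in use.

```python
import numpy as np

def gradient_magnitude_anisotropic(image, pixel_aspect=1.2):
    """Gradient magnitude on a grid with non-square pixels.

    pixel_aspect is the assumed ratio of physical pixel width to
    pixel height (an illustrative value, not a measured one).
    """
    # Finite differences are computed per pixel, so the horizontal
    # derivative must be rescaled by the pixel width relative to the
    # height before the two components are combined.
    gy, gx = np.gradient(image.astype(float))
    gx /= pixel_aspect          # convert "per pixel" to "per unit length"
    return np.hypot(gx, gy)

# A fixed threshold applied to the corrected magnitude treats
# horizontal and vertical edges more consistently than one applied
# to raw per-pixel differences.
frame = np.random.rand(480, 512)   # synthetic stand-in for a digitized frame
edges = gradient_magnitude_anisotropic(frame) > 0.5
```

Without such a correction, the same intensity edge produces a larger per-pixel difference in one direction than the other, so a single threshold value cannot be applied isotropically below the scale set by the imaging system.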