Date of Award


Degree Type


Degree Name

Doctor of Philosophy (PhD)

Graduate Group

Mechanical Engineering & Applied Mechanics

First Advisor

Katherine J. Kuchenbecker


The haptic sensations one feels when interacting with physical objects create a rich and varied impression of the objects, allowing one to gather information about texture, shape, compressibility, and other physical characteristics. The human sense of touch excels at sensing and interpreting these haptic cues, even when the object is felt through an intermediary tool instead of directly with a bare finger. Dragging, pressing, and tapping a tool on the object allow one to sense the object's roughness, slipperiness, and hardness as a combination of vibrations and forces. Unfortunately, the richness of these interaction cues is missing from many virtual environments, leading to a less satisfying and less immersive experience than one encounters in the physical world. However, we can create the perceptual illusion of touching a real object by displaying the appropriate haptic signals during virtual interactions.

This thesis presents methods for creating haptic models of textured surfaces from acceleration, force, and speed data recorded during physical interactions. The models are then used to synthesize haptic signals that are displayed to the user during rendering through vibrotactile and/or kinesthetic feedback. Because the haptic signals are a function of the interaction conditions during rendering, they must respond realistically to the user's motions in the virtual environment. We conducted human subject studies to test how well our virtual surfaces capture the psychophysical dimensions humans perceive when exploring textured surfaces with a tool.
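One common way to synthesize such texture vibrations is to drive a data-fit autoregressive (AR) filter with white noise, so each output sample depends on recent past samples plus a stochastic excitation. The following sketch illustrates that idea only; the function name, coefficients, and noise level are placeholders, not the thesis's actual models, and the full method would additionally interpolate model parameters by the user's current force and speed.

```python
import numpy as np

def synthesize_vibration(ar_coeffs, noise_std, n_samples, seed=0):
    """Generate a synthetic texture-vibration signal by driving an
    AR filter with white noise. Illustrative sketch, not the thesis's
    exact synthesis pipeline."""
    rng = np.random.default_rng(seed)
    p = len(ar_coeffs)
    out = np.zeros(n_samples + p)  # zero-padded history for the filter
    for n in range(p, n_samples + p):
        # Each sample is a weighted sum of the p previous outputs
        # (most recent first) plus a white-noise excitation term.
        out[n] = np.dot(ar_coeffs, out[n - p:n][::-1]) + rng.normal(0.0, noise_std)
    return out[p:]

# Example: a stable second-order AR model with made-up coefficients.
signal = synthesize_vibration(np.array([0.5, -0.25]), noise_std=0.01, n_samples=1000)
```

In practice the fitted coefficients and noise variance would change with scan speed and normal force, so the synthesized vibration responds to the user's motions.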

Three haptic rendering systems were created for displaying virtual surfaces using these surface models. An initial system displayed virtual versions of textured surfaces on a tablet computer using models of the texture vibrations induced when dragging a tool across the real surfaces. An evaluation of the system showed that displaying the texture vibrations accurately captured the surface's roughness, but additional modeling and rendering considerations were needed to capture the full feel of the surface. Using these results, a second system was created for rendering a more complete three-dimensional version of the haptic surfaces, including surface friction and event-based tapping transients in addition to the texture vibrations. An evaluation of this system showed that we have created the most realistic haptic surfaces to date. The force-feedback haptic device used in this system, however, was not without its limitations, including low surface stiffness and undesired inertia and friction. We developed an ungrounded haptic augmented reality system to overcome these limitations. This system allowed us to change the perceived texture and friction of a physical three-dimensional object using the previously developed haptic surface models.
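A three-dimensional rendering of this kind typically sums several force components per servo tick: a stiffness-based normal force, a friction force opposing tangential motion, and a high-frequency texture vibration. The sketch below is a generic illustration of that decomposition under simple spring and Coulomb models; the function name, stiffness, and friction coefficient are invented placeholders, not the thesis's actual controller.

```python
import math

def render_force(penetration_depth, tangential_speed, vibration_sample,
                 stiffness=500.0, mu=0.3):
    """Return (normal_force, tangential_force) for one servo tick.
    Illustrative only; all parameter values are placeholders."""
    # Spring model: normal force proportional to penetration, never negative.
    normal = max(0.0, stiffness * penetration_depth)
    # Coulomb-style friction opposes the direction of tangential motion.
    friction = -mu * normal * math.copysign(1.0, tangential_speed)
    # Superimpose the synthesized texture vibration on the normal force.
    return normal + vibration_sample, friction

fn, ft = render_force(penetration_depth=0.002, tangential_speed=0.05,
                      vibration_sample=0.01)
# fn = 500 * 0.002 + 0.01 = 1.01; ft = -0.3 * 1.0 = -0.3
```

Event-based tapping transients would be added as a short decaying force burst at the moment of contact, which this sketch omits.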
