Culbertson, Heather Marie
Search Results
Publication: Data-Driven Haptic Modeling and Rendering of Realistic Virtual Textured Surfaces (2015-01-01)
Culbertson, Heather Marie

The haptic sensations one feels when interacting with physical objects create a rich and varied impression of the objects, allowing one to gather information about texture, shape, compressibility, and other physical characteristics. The human sense of touch excels at sensing and interpreting these haptic cues, even when the object is felt through an intermediary tool instead of directly with a bare finger. Dragging, pressing, and tapping a tool on the object allow one to sense the object's roughness, slipperiness, and hardness as a combination of vibrations and forces. Unfortunately, the richness of these interaction cues is missing from many virtual environments, leading to a less satisfying and less immersive experience than one encounters in the physical world. However, we can create the perceptual illusion of touching a real object by displaying the appropriate haptic signals during virtual interactions. This thesis presents methods for creating haptic models of textured surfaces from acceleration, force, and speed data recorded during physical interactions. The models are then used to synthesize haptic signals that are displayed to the user during rendering through vibrotactile and/or kinesthetic feedback. The haptic signals, which are a function of the interaction conditions and motions used during rendering, must respond realistically to the user's motions in the virtual environment. We conducted human subject studies to test how well our virtual surfaces capture the psychophysical dimensions humans perceive when exploring textured surfaces with a tool. Three haptic rendering systems were created for displaying virtual surfaces using these surface models. An initial system displayed virtual versions of textured surfaces on a tablet computer using models of the texture vibrations induced when dragging a tool across the real surfaces. An evaluation of the system showed that displaying the texture vibrations accurately captured the surface's roughness, but additional modeling and rendering considerations were needed to capture the full feel of the surface. Using these results, a second system was created for rendering a more complete three-dimensional version of the haptic surfaces, including surface friction and event-based tapping transients in addition to the texture vibrations. An evaluation of this system showed that we have created the most realistic haptic surfaces to date. The force-feedback haptic device used in this system, however, was not without its limitations, including low surface stiffness and undesired inertia and friction. We developed an ungrounded haptic augmented reality system to overcome these limitations. This system allowed us to change the perceived texture and friction of a physical three-dimensional object using the previously developed haptic surface models.
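The abstract above describes texture models built from recorded acceleration, force, and speed data, whose synthesized vibrations must respond to the user's current motion. As a rough illustration of that idea only, the sketch below drives a toy autoregressive vibration synthesizer with the user's instantaneous force and speed; the model order, coefficients, and force/speed scaling law are invented placeholders, not the models developed in the thesis.

```cpp
// Illustrative sketch only: a toy autoregressive (AR) vibration synthesizer whose
// output power is modulated by the user's normal force and scanning speed.
// The coefficients and scaling law are invented for clarity; they are NOT the
// data-driven models described in the thesis.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstddef>
#include <cstdio>
#include <random>

class ToyTextureVibration {
public:
    // coeffs: AR coefficients a1..a3; gain: excitation scale (both placeholders)
    ToyTextureVibration(std::array<double, 3> coeffs, double gain)
        : a_(coeffs), gain_(gain), hist_{0.0, 0.0, 0.0}, rng_(42), noise_(0.0, 1.0) {}

    // One synthesis step at the haptic update rate (e.g., ~1 kHz).
    // force [N] and speed [mm/s] come from the user's current motion.
    double step(double force, double speed) {
        // Hypothetical scaling: stronger, faster strokes excite larger vibrations.
        double excitation = gain_ * std::sqrt(std::max(force, 0.0) * std::max(speed, 0.0)) * noise_(rng_);
        double y = excitation;
        for (std::size_t i = 0; i < a_.size(); ++i) y += a_[i] * hist_[i];
        // Shift history so hist_[0] always holds the newest sample.
        for (std::size_t i = a_.size() - 1; i > 0; --i) hist_[i] = hist_[i - 1];
        hist_[0] = y;
        return y;  // acceleration-like signal sent to a vibrotactile actuator
    }

private:
    std::array<double, 3> a_;
    double gain_;
    std::array<double, 3> hist_;
    std::mt19937 rng_;
    std::normal_distribution<double> noise_;
};

int main() {
    // Stable toy AR(3) coefficients chosen arbitrarily for illustration.
    ToyTextureVibration tex({0.5, -0.2, 0.1}, 0.05);
    for (int k = 0; k < 5; ++k) {
        double v = tex.step(/*force=*/1.5, /*speed=*/80.0);
        std::printf("sample %d: %.4f\n", k, v);
    }
    return 0;
}
```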
Publication: The Penn Haptic Texture Toolkit for Modeling, Rendering, and Evaluating Haptic Virtual Textures (2014-02-12)
Culbertson, Heather; Lopez Delgado, Juan Jose; Kuchenbecker, Katherine J.

The Penn Haptic Texture Toolkit (HaTT) is a collection of 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render these textures using an impedance-type haptic device such as a SensAble Phantom Omni. This toolkit was developed to provide haptics researchers with a method by which to compare and validate their texture modeling and rendering methods. The included rendering code has the additional benefit of allowing others, both researchers and designers, to incorporate our textures into their virtual environments, which will lead to a richer experience for the user.
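HaTT ships with its own rendering code; the sketch below is only a generic illustration of how an impedance-type rendering loop might combine a spring-like normal force, a per-texture friction force, and a synthesized texture vibration. The struct fields, function names, and constants here are hypothetical and do not reflect HaTT's actual data format or API.

```cpp
// Illustrative sketch only: the general shape of one impedance-type servo-loop
// update (typically ~1 kHz on devices such as the Phantom Omni). All names and
// constants are placeholders, not HaTT's code.
#include <algorithm>
#include <cstdio>

struct TextureModel {
    double frictionCoefficient;  // Coulomb-like friction for this surface (placeholder)
    double vibrationGain;        // scales the synthesized texture vibration (placeholder)
};

struct Force3 { double x, y, z; };

// penetration: how far the device tip has pushed into the virtual surface [m]
// tangentialSpeed: speed of the tip along the surface [m/s]
// vibrationSample: current sample of a synthesized texture vibration
Force3 computeSurfaceForce(const TextureModel& tex, double penetration,
                           double tangentialSpeed, double vibrationSample) {
    const double kSurfaceStiffness = 500.0;  // N/m, placeholder value
    double normal = kSurfaceStiffness * std::max(penetration, 0.0);

    // Friction opposes tangential motion; the sign logic is deliberately simplified.
    double friction = -tex.frictionCoefficient * normal *
                      (tangentialSpeed > 0.0 ? 1.0 : (tangentialSpeed < 0.0 ? -1.0 : 0.0));

    // Texture vibration rendered as a small force added along the surface normal.
    double textureForce = tex.vibrationGain * vibrationSample;

    // Simplified frame: z is the surface normal, x the direction of tangential motion.
    return Force3{friction, 0.0, normal + textureForce};
}

int main() {
    TextureModel sample{0.35, 0.02};  // invented numbers for illustration
    Force3 f = computeSurfaceForce(sample, 0.002, 0.05, 0.4);
    std::printf("force = (%.3f, %.3f, %.3f) N\n", f.x, f.y, f.z);
    return 0;
}
```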