Kuchenbecker, Katherine J.

Publications (showing 10 of 33)

  • Publication
    A GPU-Based Approach for Real-Time Haptic Rendering of 3D Fluids
    (2008-12-01) Yang, Meng; Safonova, Alla; Kuchenbecker, Katherine J.; Zhou, Zehua
    Real-time haptic rendering of three-dimensional fluid flow will improve the interactivity and realism of video games and surgical simulators, but it remains a challenging undertaking due to its high computational cost. In this work we propose an innovative GPU-based approach that enables real-time haptic rendering of high-resolution 3D Navier-Stokes fluids. We show that moving the vast majority of the computation to the GPU allows for the simulation of touchable fluids at resolutions and frame rates significantly higher than those of other recent real-time methods, without the need for pre-computation [Baxter and Lin 2004; Mora and Lee 2008; Dobashi et al. 2006].
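
    A minimal sketch of haptically coupling to a simulated fluid, assuming a simple drag-force model on a cell-centered velocity grid; this illustrates the coupling generically and is not the paper's GPU implementation (the function names, grid layout, and drag coefficient are invented for illustration):
    ```python
    import numpy as np

    def sample_velocity(grid, pos, h):
        """Nearest-cell lookup of fluid velocity at a world position.

        grid: (nx, ny, nz, 3) array of cell-centered velocities.
        pos:  probe position in world units; h: cell size.
        (A real renderer would interpolate, e.g., trilinearly.)
        """
        idx = np.clip((pos / h).astype(int), 0, np.array(grid.shape[:3]) - 1)
        return grid[tuple(idx)]

    def haptic_drag_force(grid, probe_pos, probe_vel, h, b=0.8):
        """Drag-style coupling: force proportional to relative velocity."""
        v_fluid = sample_velocity(grid, probe_pos, h)
        return b * (v_fluid - probe_vel)

    # Example: a 64^3 velocity field with uniform flow in +x.
    grid = np.zeros((64, 64, 64, 3))
    grid[..., 0] = 0.1                      # fluid moving at 0.1 m/s in x
    force = haptic_drag_force(grid, np.array([0.3, 0.3, 0.3]),
                              np.zeros(3), h=0.01)
    print(force)                            # ~[0.08, 0, 0]
    ```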
  • Publication
    The Penn Baxter Face Database
    (2017-03-23) Fitter, Naomi T.; Kuchenbecker, Katherine J.
    The Penn Baxter Face Database is composed of Baxter robot face images designed in a variety of expressions and colors. Each of these images was photographed on the physical Baxter robot and assessed by internet raters (N = 568) in an Amazon Mechanical Turk survey. Raters assessed the pleasantness and energeticness of each robot face and also shared how safe and pleased each face made them feel. This project was published in our ICSR 2016 paper entitled “Designing and Assessing Expressive Open-Source Faces for the Baxter Robot.” After hearing of interest from other researchers, we previously released our Baxter face database on GitHub at https://github.com/nfitter/BaxterFaces. This dataset, now additionally available on Scholarly Commons, includes the developed Baxter faces, photographs used in the Mechanical Turk survey, editable source files for the studied faces, and bonus faces developed in our subsequent design work with Baxter. These contents may benefit any Baxter users who display images on the robot's face. The organization of the database is explained in the included ReadMe file.
  • Publication
    The Penn Hand-Clapping Motion Dataset
    (2016-11-27) Fitter, Naomi T.; Kuchenbecker, Katherine J.
    The Penn Hand-Clapping Motion Dataset is composed of inertial measurement unit (IMU) recordings from the hand motions of 15 naïve people. Each of these individuals participated in an experiment during which they were asked to pantomime various sequences of 10 different motions: back five, clap, double, down five, front five, lap pat, left five, right five, right snap, and up five. The examined motions comprise most typical actions from hand-clapping games like “Pat-a-cake” and “Slide.” This project was published in our IROS 2016 paper entitled “Using IMU Data to Demonstrate Hand-Clapping Games to a Robot.” After hearing of interest from other researchers, we are releasing the corresponding motion dataset, which was originally collected to help us investigate whether we could train highly accurate and rapid classifiers to label hand-clapping game motions performed by everyday people. This dataset, explained further in the included ReadMe file, may interest researchers who investigate human motion.
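
    As a sketch of the kind of classification study this dataset supports, one could window the IMU streams and fit an off-the-shelf classifier; the windowing scheme, features, and random placeholder data below are assumptions, not the dataset's actual format (which is described in its ReadMe):
    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    MOTIONS = ["back five", "clap", "double", "down five", "front five",
               "lap pat", "left five", "right five", "right snap", "up five"]

    def window_features(imu, win=50):
        """Per-axis mean and standard deviation over fixed-length windows.

        imu: (n_samples, 6) array of accelerometer + gyroscope readings.
        Returns one 12-dimensional feature vector per window.
        """
        n = (len(imu) // win) * win
        w = imu[:n].reshape(-1, win, imu.shape[1])
        return np.hstack([w.mean(axis=1), w.std(axis=1)])

    # Placeholder random data standing in for the real recordings.
    rng = np.random.default_rng(0)
    X = window_features(rng.standard_normal((5000, 6)))
    y = rng.integers(0, len(MOTIONS), size=len(X))

    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print(MOTIONS[clf.predict(X[:1])[0]])
    ```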
  • Publication
    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data
    (2012-03-01) Culbertson, Heather; Romano, Joseph M.; Castillo, Pablo; Mintz, Max; Kuchenbecker, Katherine J.
    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
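
    A rough sketch of the rendering idea, simplified to AR-only models for brevity: choose filter parameters by bilinear interpolation among the four recorded models surrounding the current force-speed point, then drive the filter with scaled noise. Note that the paper interpolates line spectral frequencies rather than raw coefficients for stability; the coefficients and gains here are invented:
    ```python
    import numpy as np
    from scipy.signal import lfilter

    def bilerp(corners, fx, fy):
        """Bilinear interpolation among four values on a unit grid cell.

        corners: ((v00, v10), (v01, v11)) for the (force, speed) cell;
        fx, fy:  fractional position of the query inside the cell.
        """
        (v00, v10), (v01, v11) = corners
        bottom = (1 - fx) * np.asarray(v00) + fx * np.asarray(v10)
        top    = (1 - fx) * np.asarray(v01) + fx * np.asarray(v11)
        return (1 - fy) * bottom + fy * top

    # Illustrative 2nd-order AR models stored at four (force, speed) corners.
    ar = (([1, -1.2, 0.5], [1, -1.1, 0.4]),
          ([1, -0.9, 0.3], [1, -0.8, 0.2]))
    gain = ((0.02, 0.05), (0.04, 0.09))   # excitation scale at each corner

    a = bilerp(ar, fx=0.3, fy=0.6)        # model at the current force/speed
    g = bilerp(gain, fx=0.3, fy=0.6)
    noise = np.random.default_rng(1).standard_normal(1000)
    vibration = lfilter([1.0], a, g * noise)  # synthesized texture signal
    ```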
  • Publication
    Design of Body-Grounded Tactile Actuators for Playback of Human Physical Contact
    (2011-06-01) Kuchenbecker, Katherine J.; Stanley, Andrew A.
    We present four wearable tactile actuators capable of recreating physical sensations commonly experienced in human interactions, including tapping on, dragging across, squeezing, and twisting an individual’s wrist. In seeking to create tactile signals that feel natural and are easy to understand, we developed movement control interfaces to play back each of these forms of actual human physical contact. Through iterative design, prototyping, programming, and testing, each of these servo-motor-based mechanisms produces a signal that is gradable in magnitude, can be played in a variety of temporal patterns, is localizable to a small area of skin, and, for three of the four actuators, has an associated direction. Additionally, we have tried to design toward many of the characteristics that have made high-frequency vibration the most common form of wearable tactile feedback, including low cost, light weight, comfort, and small size. Bolstered by largely positive comments from naïve users during an informal testing session, we plan to continue improving these devices for future use in tactile motion guidance.
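
    As an illustration of magnitude-gradable, temporally patterned playback, a servo-based tapper could be commanded with a trajectory like the one below; the pulse shape, angles, and timing are invented for illustration and are not the authors' controller:
    ```python
    import numpy as np

    def tap_trajectory(magnitude, taps=2, period=0.4, dt=0.01,
                       rest_deg=0.0, max_deg=30.0):
        """Servo-angle trajectory for a tapping actuator.

        magnitude in [0, 1] grades the tap depth; each tap is a quick
        press-and-release within one period of the temporal pattern.
        """
        t = np.arange(0, taps * period, dt)
        phase = (t % period) / period           # 0..1 within each cycle
        press = np.clip(1 - np.abs(phase - 0.15) / 0.15, 0, 1)  # brief pulse
        return rest_deg + magnitude * max_deg * press

    angles = tap_trajectory(magnitude=0.7)
    # Stream these angles to the servo at 1/dt Hz with the motor API at hand.
    ```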
  • Publication
    Lessons in Using Vibrotactile Feedback to Guide Fast Arm Motions
    (2011-06-01) Kuchenbecker, Katherine J.; Bark, Karlin; Khanna, Preeya; Irwin, Rikki; Kapur, Pulkit; Jax, Steven A.; Buxbaum, Laurel J.
    We present and evaluate an arm-motion guidance system that uses magnetic tracking sensors and low-cost vibrotactile actuators. The system measures the movement of the user’s arm and provides vibration feedback at the wrist and elbow when the arm strays from the desired motion. An initial study investigated whether adding tactile feedback to visual feedback reduces motion errors when a user is learning a new arm trajectory. Although subjects preferred having the tactile feedback, we found that it did not affect motion-tracking performance. We also found no strong preference or performance differences between attractive and repulsive tactile feedback. Factors that may have influenced these results include the speed and complexity of the tested motions, the type of tactile actuators and drive signals used, and inconsistencies in joint-angle estimation due to Euler-angle gimbal lock. We discuss insights from this analysis and provide suggestions for future systems and studies in tactile motion guidance.
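
    The core guidance logic can be sketched as a per-joint mapping from trajectory error to vibration intensity; the threshold, gain, and attractive/repulsive sign convention below are illustrative assumptions:
    ```python
    import numpy as np

    def vibration_commands(measured, desired, threshold=0.1,
                           gain=1.0, repulsive=True):
        """Map joint-angle error to per-joint vibration intensity in [0, 1].

        measured, desired: joint angles (rad), e.g., for wrist and elbow.
        Repulsive feedback vibrates on the side the user has drifted
        toward; attractive feedback cues the direction to move (sign flip).
        """
        error = np.asarray(measured) - np.asarray(desired)
        intensity = np.clip(gain * (np.abs(error) - threshold), 0, 1)
        direction = np.sign(error) if repulsive else -np.sign(error)
        return intensity, direction

    intensity, direction = vibration_commands([0.9, 1.4], [0.7, 1.45])
    print(intensity, direction)   # wrist vibrates; elbow within tolerance
    ```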
  • Publication
    The AirWand: Design and Characterization of a Large-Workspace Haptic Device
    (2009-05-12) Romano, Joseph M.; Kuchenbecker, Katherine J.
    Almost all commercially available haptic interfaces share a common pitfall: a small, shoebox-sized workspace that stems from their rigid-link manipulator designs. In this paper we outline our design for a new kinesthetic haptic system that drastically increases the usable haptic workspace. We present a proof-of-concept prototype, along with our analysis of its capabilities. Our design uses optical tracking to sense the position of the device and air jet actuation to generate forces. By combining these two technologies, we are able to detach our device from the ground, thus sidestepping many problems that have plagued traditional haptic devices, including workspace size, friction, and inertia. We show that optical tracking and air jet actuation successfully enable kinesthetic haptic interaction with virtual environments. Given a high-pressure air source of sufficient volume and a reasonably high-speed tracking system, this design paradigm has many desirable qualities compared to traditional haptic design schemes.
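
    A schematic of the actuation idea, assuming a virtual wall rendered by two opposing air jets; the stiffness, thrust limit, and valve-command mapping are invented for illustration:
    ```python
    import numpy as np

    def wall_force(pos, wall_z=0.0, stiffness=200.0):
        """Impedance-style virtual wall: push back in +z when penetrating."""
        penetration = wall_z - pos[2]
        return np.array([0.0, 0.0, stiffness * max(penetration, 0.0)])

    def jet_commands(force, max_thrust=2.0):
        """Split a desired axial force between two opposing air jets.

        Each jet can only push, so positive force opens one valve and
        negative force opens the other; commands are duty cycles in [0, 1].
        """
        fz = np.clip(force[2] / max_thrust, -1.0, 1.0)
        return max(fz, 0.0), max(-fz, 0.0)   # (jet_up, jet_down)

    f = wall_force(np.array([0.0, 0.0, -0.004]))   # 4 mm penetration
    print(jet_commands(f))                          # (0.4, 0.0)
    ```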
  • Publication
    The Penn Haptic Texture Toolkit for Modeling, Rendering, and Evaluating Haptic Virtual Textures
    (2014-02-12) Culbertson, Heather; Lopez Delgado, Juan Jose; Kuchenbecker, Katherine J.
    The Penn Haptic Texture Toolkit (HaTT) is a collection of 100 haptic texture and friction models, the recorded data from which the models were made, images of the textures, and the code and methods necessary to render these textures using an impedance-type haptic device such as a SensAble Phantom Omni. This toolkit was developed to provide haptics researchers with a method by which to compare and validate their texture modeling and rendering methods. The included rendering code has the additional benefit of allowing others, both researchers and designers, to incorporate our textures into their virtual environments, which will lead to a richer experience for the user.
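
    At render time, texture feedback of this kind is typically superimposed on conventional impedance forces. The sketch below shows a schematic servo-loop step with a stand-in vibration model; HaTT's actual rendering code ships with the toolkit, and the friction coefficient here is illustrative, not a HaTT model parameter:
    ```python
    import numpy as np

    def texture_force_step(vel, normal_force, render_vibration, mu=0.3):
        """One servo-loop step: kinetic friction plus texture vibration.

        render_vibration(speed, force) -> one vibration sample; it
        stands in for the toolkit's model-based synthesis.
        """
        lateral = np.asarray(vel[:2], dtype=float)
        speed = np.linalg.norm(lateral)
        friction = (-mu * normal_force * lateral / speed
                    if speed > 1e-6 else np.zeros(2))
        vibration = render_vibration(speed, normal_force)
        # Vibration is played along the surface normal (z).
        return np.array([friction[0], friction[1], vibration])

    # Example with a stand-in vibration model:
    f = texture_force_step([0.05, 0.0, 0.0], normal_force=1.0,
                           render_vibration=lambda s, fn: 0.1 * s * fn)
    ```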
  • Publication
    HALO: Haptic Alerts for Low-hanging Obstacles in White Cane Navigation
    (2012-03-01) Kuchenbecker, Katherine J.; Wang, Yunqing
    White canes give the visually impaired the freedom to travel independently in unknown environments, but they cannot warn the user of overhead hazards such as tree branches. This paper presents the development and evaluation of a device that provides haptic cues to warn a visually impaired user of low-hanging obstacles during white cane navigation. The Haptic Alerts for Low-hanging Obstacles (HALO) system is a portable and affordable attachment to traditional white canes. By pairing distance data acquired from an ultrasonic range sensor with vibration feedback delivered by an eccentric mass motor, the device aims to alert users to low-hanging obstacles without interfering with the standard functionality of a white cane. We conducted a preliminary validation study wherein twelve blindfolded subjects navigated a custom obstacle course with and without vibration alerts from HALO. The results showed that this new device is intuitive and highly effective at enabling the user to safely navigate around low-hanging obstacles.
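
    The alerting logic can be sketched as a simple mapping from ultrasonic range readings to vibration intensity; the thresholds below are illustrative, not the values used in the HALO study:
    ```python
    def halo_alert(distance_cm, warn_cm=120.0, min_cm=30.0):
        """Map an ultrasonic range reading to a vibration duty cycle.

        Returns 0 when the path is clear and ramps to full vibration
        as a low-hanging obstacle approaches.
        """
        if distance_cm >= warn_cm:
            return 0.0
        if distance_cm <= min_cm:
            return 1.0
        return (warn_cm - distance_cm) / (warn_cm - min_cm)

    for d in (150, 100, 60, 25):
        print(d, round(halo_alert(d), 2))   # 0.0, 0.22, 0.67, 1.0
    ```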
  • Publication
    Event-Based Haptics and Acceleration Matching: Portraying and Assessing the Realism of Contact
    (2005-04-04) Kuchenbecker, Katherine J.; Fiene, Jonathan P.; Niemeyer, Günter
    Contact in a typical haptic environment resembles the experience of tapping on soft foam, rather than on a hard object. Event-based, high-frequency transient forces must be superimposed on traditional proportional feedback to provide realistic haptic cues at impact. We have developed a new method for matching the accelerations experienced during real contact, inverting a dynamic model of the device to compute appropriate force feedback transients. We evaluated this haptic rendering paradigm by conducting a study in which users blindly rated the realism of tapping on a variety of virtually rendered surfaces as well as on three real objects. Event-based feedback significantly increased the realism of the virtual surfaces, and the acceleration matching strategy was rated similarly to a sample of real wood on a foam substrate. This work provides a new avenue for achieving realism of contact in haptic interactions.
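
    A common way to sketch event-based rendering is a decaying sinusoid scaled by impact velocity, superimposed on the spring force; this stands in for the paper's model-inversion approach, and every constant below is invented for illustration:
    ```python
    import numpy as np

    def contact_force(penetration, impact_vel, t_since_impact,
                      k=1000.0, amp_per_vel=30.0, freq=300.0, decay=60.0):
        """Proportional feedback plus an event-based contact transient.

        The transient fires at impact, scaled by the incoming velocity,
        and decays quickly so steady-state contact feels like a spring.
        """
        spring = k * max(penetration, 0.0)
        transient = (amp_per_vel * impact_vel *
                     np.exp(-decay * t_since_impact) *
                     np.sin(2 * np.pi * freq * t_since_impact))
        return spring + transient

    # Example: 1 mm penetration, 0.2 m/s impact, 2 ms after contact.
    print(contact_force(0.001, 0.2, 0.002))
    ```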