Kuchenbecker, Katherine J.

  • Publication
    A GPU-Based Approach for Real-Time Haptic Rendering of 3D Fluids
    (2008-12-01) Yang, Meng; Safonova, Alla; Kuchenbecker, Katherine J; Zhou, Zehua
    Real-time haptic rendering of three-dimensional fluid flow will improve the interactivity and realism of video games and surgical simulators, but it remains a challenging undertaking due to its high computational cost. In this work we propose an innovative GPU-based approach that enables real-time haptic rendering of high-resolution 3D Navier-Stokes fluids. We show that moving the vast majority of the computation to the GPU allows for the simulation of touchable fluids at resolutions and frame rates that are significantly higher than any other recent real-time methods without a need for pre-computations [Baxter and Lin 2004; Mora and Lee 2008; Dobashi et al. 2006].
  • Publication
    The Penn Baxter Face Database
    (2017-03-23) Fitter, Naomi T.; Kuchenbecker, Katherine J.
    The Penn Baxter Face Database is composed of Baxter robot face images designed in a variety of expressions and colors. Each of these images was photographed on the physical Baxter robot and assessed by internet raters (N = 568) in an Amazon Mechanical Turk survey. Raters assessed the pleasantness and energeticness of each robot face and also shared how safe and pleased each face made them feel. This project was published in our ICSR 2016 paper entitled “Designing and Assessing Expressive Open-Source Faces for the Baxter Robot.” After hearing of interest from other researchers, we previously released our Baxter face database on GitHub at https://github.com/nfitter/BaxterFaces. This dataset, now additionally available on Scholarly Commons, includes the developed Baxter faces, photographs used in the Mechanical Turk survey, editable source files for the studied faces, and bonus faces developed in our subsequent design work with Baxter. These contents may benefit any Baxter users who publish images to the robot's face. The organization of the database is explained in the included ReadMe file.
  • Publication
    The Penn Hand-Clapping Motion Dataset
    (2016-11-27) Fitter, Naomi T; Kuchenbecker, Katherine J
    The Penn Hand-Clapping Motion Dataset is composed of inertial measurement unit (IMU) recordings from the hand motions of 15 naïve people. Each of these individuals participated in an experiment during which they were asked to pantomime various sequences of 10 different motions: back five, clap, double, down five, front five, lap pat, left five, right five, right snap, and up five. The examined motions comprise most typical actions from hand-clapping games like “Pat-a-cake” and “Slide.” This project was published in our IROS 2016 paper entitled “Using IMU Data to Demonstrate Hand-Clapping Games to a Robot.” After hearing of interest from other researchers, we are releasing the corresponding motion dataset, which was originally collected to help us investigate whether we could train highly accurate and rapid classifiers to label hand-clapping game motions performed by everyday people. This dataset, explained further in the included ReadMe file, may interest researchers who investigate human motion.
  • Publication
    Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data
    (2012-03-01) Culbertson, Heather; Romano, Joseph M; Castillo, Pablo; Mintz, Max; Kuchenbecker, Katherine J
    Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control. We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.
  • Publication
    Recreating the Feel of the Human Chest in a CPR Manikin via Programmable Pneumatic Damping
    (2012-03-01) Kuchenbecker, Katherine J; Stanley, Andrew A; Healey, Simon K; Maltese, Matthew R
    It is well known that the human chest exhibits a strong force-displacement hysteresis during CPR, a stark contrast to the non-hysteretic behavior of standard spring manikins. We hypothesize that individuals with experience performing CPR on humans would perceive a manikin with damping as more realistic and better for training. By analyzing data collected from chest compressions on real patients, we created a dynamic model that accounts for this hysteresis with a linear spring and a one-way variable damper, and we built a new high-fidelity manikin to enact the desired force-displacement relationship. A linkage attached to the chest plate converts vertical compression motions to the horizontal displacement of a set of pneumatic dashpot pistons, sending a volume of air into and out of the manikin through a programmable valve. Position and pressure sensors allow a microcontroller to adjust the valve orifice so that the provided damping force closely follows the desired damping force throughout the compression cycle. Eight experienced CPR practitioners tested both the new manikin and an identical-looking standard manikin; the manikin with damping received significantly higher ratings for haptic realism and perceived utility as a training tool.
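The spring-plus-one-way-damper model in this abstract can be sketched as follows: the spring force depends on displacement alone, while the damping force is applied only during the compression stroke, which is what produces the force-displacement hysteresis loop. The parameter values are illustrative, not taken from the paper.

```python
def chest_force(x, v, k=5000.0, b=200.0):
    """Chest reaction force for compression depth x (m) and
    velocity v (m/s): linear spring plus a one-way damper that
    acts only while the chest is being compressed (v > 0).
    Gains k and b are hypothetical placeholders."""
    spring = k * x
    damping = b * v if v > 0 else 0.0  # no damping on the release stroke
    return spring + damping
```

Because the damper contributes force only on the downstroke, a full compression-release cycle traces two different force-displacement curves, matching the hysteresis observed in real patients.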
  • Publication
    Spatially Distributed Tactile Feedback for Kinesthetic Motion Guidance
    (2010-04-08) Kapur, Pulkit; Jensen, Mallory; Kuchenbecker, Katherine J.; Buxbaum, Laurel J.; Jax, Steven A.
    Apraxic stroke patients need to perform repetitive arm movements to regain motor functionality, but they struggle to process the visual feedback provided by typical virtual rehabilitation systems. Instead, we imagine a low-cost sleeve that can measure the movement of the upper limb and provide tactile feedback at key locations. The feedback provided by the tactors should guide the patient through a series of desired movements by allowing him or her to feel limb configuration errors at each instant in time. After discussing the relevant motion capture and actuator options, this paper describes the design and programming of our current prototype, a wearable tactile interface that uses magnetic motion tracking and shaftless eccentric mass motors. The sensors and actuators are attached to the sleeve of an athletic shirt with novel plastic caps, which also help focus the vibration on the user's skin. We connect the motors in current drive for improved performance, and we present a full parametric model for their in situ dynamic response (acceleration output given current input).
  • Publication
    Stiffness Discrimination with Visual and Proprioceptive Cues
    (2009-04-03) Gurari, Netta; Kuchenbecker, Katherine J.; Okamura, Allison M.
    This study compares the Weber fraction for human perception of stiffness among three conditions: vision, proprioceptive motion feedback, and their combination. To make comparisons between these feedback conditions, a novel haptic device was designed that senses the spring behavior through encoder and force measurements, and implements a controller to render linear virtual springs so that the stimuli displayed haptically could be compared with their visual counterparts. The custom-designed, torque-controlled haptic interface non-invasively controls the availability of proprioceptive motion feedback in unimpaired individuals using a virtual environment. When proprioception is available, the user feels an MCP joint rotation that is proportional to his or her finger force. When proprioception is not available, the actual finger is not allowed to move, but a virtual finger displayed graphically moves in proportion to the user's applied force. Visual feedback is provided and removed by turning on and off this graphical display. Weber fractions were generated from an experiment in which users examined pairs of springs and attempted to identify the spring with higher stiffness. To account for slight trial-to-trial variations in the relationship between force and position in the proprioceptive feedback conditions, our analysis uses measurements of the actual rendered stiffness, rather than the commanded stiffness. Results for 10 users give average Weber fractions of 0.056 for vision, 0.036 for proprioception, and 0.039 for their combination, indicating that proprioception is important for stiffness perception for this experimental setup. The long-term goal of this research is to motivate and develop methods for proprioception feedback to wearers of dexterous upper-limb prostheses.
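The Weber-fraction analysis described in this abstract can be sketched in simplified form: from pairwise discrimination data, find the stiffness difference at which the participant reaches a threshold proportion correct (the just-noticeable difference, JND), and divide by the reference stiffness. Real studies fit a psychometric function; the linear interpolation below is a simplified, hypothetical stand-in.

```python
def weber_fraction(reference, deltas, p_correct, threshold=0.75):
    """Estimate the Weber fraction from discrimination data.

    reference: reference stiffness of the standard spring
    deltas:    increasing stiffness differences tested against it
    p_correct: proportion of trials the stiffer spring was identified
    Linearly interpolates the delta at the threshold proportion
    correct (the JND) and normalizes by the reference."""
    points = list(zip(deltas, p_correct))
    for (d0, p0), (d1, p1) in zip(points, points[1:]):
        if p0 <= threshold <= p1:
            jnd = d0 + (threshold - p0) * (d1 - d0) / (p1 - p0)
            return jnd / reference
    raise ValueError("threshold not bracketed by the data")
```

For example, a participant who reaches 75% correct at a 4.5 N/m difference against a 100 N/m reference spring has a Weber fraction of 0.045.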
  • Publication
    High Frequency Acceleration Feedback Significantly Increases the Realism of Haptically Rendered Textured Surfaces
    (2010-04-08) McMahan, William; Romano, Joseph M.; Rahuman, Amal M. Abdul; Kuchenbecker, Katherine J.
    Almost every physical interaction generates high frequency vibrations, especially if one of the objects is a rigid tool. Previous haptics research has hinted that the inclusion or exclusion of these signals plays a key role in the realism of haptically rendered surface textures, but this connection has not been formally investigated until now. This paper presents a human subject study that compares the performance of a variety of surface rendering algorithms for a master-slave teleoperation system; each controller provides the user with a different combination of position and acceleration feedback, and subjects compared the renderings with direct tool-mediated exploration of the real surface. We use analysis of variance to examine quantitative performance metrics and qualitative realism ratings across subjects. The results of this study show that algorithms that include high-frequency acceleration feedback in combination with position feedback achieve significantly higher realism ratings than traditional position feedback alone. Furthermore, we present a frequency-domain metric for quantifying a controller's acceleration feedback performance; given a constant surface stiffness, the median of this metric across subjects was found to have a significant positive correlation with median realism rating.
  • Publication
    Spectral Subtraction of Robot Motion Noise for Improved Event Detection in Tactile Acceleration Signals
    (2012-06-01) Kuchenbecker, Katherine J; McMahan, William
    New robots for teleoperation and autonomous manipulation are increasingly being equipped with high-bandwidth accelerometers for measuring the transient vibrational cues that occur during contact with objects. Unfortunately, the robot's own internal mechanisms often generate significant high-frequency accelerations, which we term ego-vibrations. This paper presents an approach to characterizing and removing these signals from acceleration measurements. We adapt the audio processing technique of spectral subtraction over short time windows to remove the noise that is estimated to occur at the robot's present joint velocities. Implementation for the wrist roll and gripper joints on a Willow Garage PR2 robot demonstrates that spectral subtraction significantly increases signal-to-noise ratio, which should improve vibrotactile event detection in both teleoperation and autonomous robotics.
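The core operation described in this abstract, textbook spectral subtraction on a short window, can be sketched as follows: subtract an estimated noise magnitude spectrum from the window's magnitude spectrum, clamp at a floor, keep the original phase, and transform back. In the paper the noise estimate comes from a model of ego-vibration at the current joint velocities; here it is just an input array. This is an illustrative sketch using a naive DFT, not the paper's implementation.

```python
import cmath

def spectral_subtract(frame, noise_mag, floor=0.0):
    """Subtract an estimated noise magnitude spectrum from one
    short window of acceleration samples, preserving phase.
    noise_mag[k] is the estimated noise magnitude in DFT bin k."""
    n = len(frame)
    # Naive DFT for clarity; an FFT would be used in practice.
    spec = [sum(frame[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                for t in range(n)) for k in range(n)]
    cleaned = []
    for k, X in enumerate(spec):
        mag = max(abs(X) - noise_mag[k], floor)  # subtract, clamp at floor
        cleaned.append(cmath.rect(mag, cmath.phase(X)))
    # Inverse DFT back to the time domain (real part).
    return [sum(cleaned[k] * cmath.exp(2j * cmath.pi * k * t / n)
                for k in range(n)).real / n for t in range(n)]
```

With a zero noise estimate the window passes through unchanged; with a noise estimate that dominates a bin, that bin is suppressed to the floor.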
  • Publication
    Shaping Event-Based Haptic Transients Via an Improved Understanding of Real Contact Dynamics
    (2007-04-02) Fiene, Jonathan P.; Kuchenbecker, Katherine J.
    Haptic interactions with stiff virtual surfaces feel more realistic when a short-duration transient is added to the spring force at contact. But how should this event-based transient be shaped? To answer this question, we present a targeted user study on virtual surface realism that demonstrates the importance of scaling transients correctly and hints at the complexity of this dynamic relationship. We then present a detailed examination of the dynamics of tapping on a rigid surface with a hand-held probe; theoretical modeling is combined with empirical data to determine the influence of impact velocity, impact acceleration, and user grip force on the resulting transient surface force. The derived mathematical relationships provide a formula for generating open-loop, event-based force transients upon impact with a virtual surface. By incorporating an understanding of the dynamics of real interactions into the re-creation of virtual contact, these findings promise to improve the performance and realism of a wide range of haptic simulations.
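An open-loop, event-based transient of the kind this abstract studies is often rendered as a decaying sinusoid whose amplitude scales with the incoming impact velocity. The sketch below is illustrative only: the waveform family and every parameter value are hypothetical, whereas the paper derives the actual scaling from measured tapping dynamics, including impact acceleration and grip force.

```python
import math

def contact_transient(impact_velocity, amplitude_gain=1.0,
                      freq_hz=150.0, decay=80.0, fs=1000, duration=0.05):
    """Open-loop force transient triggered at virtual-surface contact:
    an exponentially decaying sinusoid scaled by impact velocity.
    Returns `duration` seconds of samples at rate fs (Hz).
    All parameter values are illustrative placeholders."""
    n = int(fs * duration)
    return [amplitude_gain * impact_velocity
            * math.exp(-decay * t / fs)          # exponential envelope
            * math.sin(2 * math.pi * freq_hz * t / fs)
            for t in range(n)]
```

At contact, this transient would be superimposed on the usual spring force, so a faster tap produces a proportionally stronger burst, consistent with the velocity scaling the study found important.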