Kuchenbecker, Katherine J.
Search Results: now showing 1 - 10 of 33
Publication: Spectral Subtraction of Robot Motion Noise for Improved Event Detection in Tactile Acceleration Signals (2012-06-01)
Kuchenbecker, Katherine J.; McMahan, William

New robots for teleoperation and autonomous manipulation are increasingly being equipped with high-bandwidth accelerometers for measuring the transient vibrational cues that occur during contact with objects. Unfortunately, the robot's own internal mechanisms often generate significant high-frequency accelerations, which we term ego-vibrations. This paper presents an approach to characterizing and removing these signals from acceleration measurements. We adapt the audio processing technique of spectral subtraction over short time windows to remove the noise that is estimated to occur at the robot's present joint velocities. Implementation for the wrist roll and gripper joints on a Willow Garage PR2 robot demonstrates that spectral subtraction significantly increases signal-to-noise ratio, which should improve vibrotactile event detection in both teleoperation and autonomous robotics.

Publication: The Penn Hand-Clapping Motion Dataset (2016-11-27)
Fitter, Naomi T.; Kuchenbecker, Katherine J.

The Penn Hand-Clapping Motion Dataset is composed of inertial measurement unit (IMU) recordings from the hand motions of 15 naïve people. Each of these individuals participated in an experiment during which they were asked to pantomime various sequences of 10 different motions: back five, clap, double, down five, front five, lap pat, left five, right five, right snap, and up five.
The examined motions comprise most typical actions from hand-clapping games like “Pat-a-cake” and “Slide.” This project was published in our IROS 2016 paper entitled “Using IMU Data to Demonstrate Hand-Clapping Games to a Robot.” After hearing of interest from other researchers, we are releasing the corresponding motion dataset, which was originally collected to help us investigate whether we could train highly accurate and rapid classifiers to label hand-clapping game motions performed by everyday people. This dataset, explained further in the included ReadMe file, may interest researchers who investigate human motion.

Publication: High Frequency Acceleration Feedback Significantly Increases the Realism of Haptically Rendered Textured Surfaces (2010-04-08)
McMahan, William; Romano, Joseph M.; Rahuman, Amal M. Abdul; Kuchenbecker, Katherine J.

Almost every physical interaction generates high frequency vibrations, especially if one of the objects is a rigid tool. Previous haptics research has hinted that the inclusion or exclusion of these signals plays a key role in the realism of haptically rendered surface textures, but this connection has not been formally investigated until now. This paper presents a human subject study that compares the performance of a variety of surface rendering algorithms for a master-slave teleoperation system; each controller provides the user with a different combination of position and acceleration feedback, and subjects compared the renderings with direct tool-mediated exploration of the real surface. We use analysis of variance to examine quantitative performance metrics and qualitative realism ratings across subjects. The results of this study show that algorithms that include high-frequency acceleration feedback in combination with position feedback achieve significantly higher realism ratings than traditional position feedback alone.
Furthermore, we present a frequency-domain metric for quantifying a controller's acceleration feedback performance; given a constant surface stiffness, the median of this metric across subjects was found to have a significant positive correlation with median realism rating.

Publication: Shaping Event-Based Haptic Transients Via an Improved Understanding of Real Contact Dynamics (2007-04-02)
Fiene, Jonathan P.; Kuchenbecker, Katherine J.

Haptic interactions with stiff virtual surfaces feel more realistic when a short-duration transient is added to the spring force at contact. But how should this event-based transient be shaped? To answer this question, we present a targeted user study on virtual surface realism that demonstrates the importance of scaling transients correctly and hints at the complexity of this dynamic relationship. We then present a detailed examination of the dynamics of tapping on a rigid surface with a hand-held probe; theoretical modeling is combined with empirical data to determine the influence of impact velocity, impact acceleration, and user grip force on the resulting transient surface force. The derived mathematical relationships provide a formula for generating open-loop, event-based force transients upon impact with a virtual surface. By incorporating an understanding of the dynamics of real interactions into the re-creation of virtual contact, these findings promise to improve the performance and realism of a wide range of haptic simulations.

Publication: The Penn Baxter Face Database (2017-03-23)
Fitter, Naomi T.; Kuchenbecker, Katherine J.

The Penn Baxter Face Database is composed of Baxter robot face images designed in a variety of expressions and colors. Each of these images was photographed on the physical Baxter robot and assessed by internet raters (N = 568) in an Amazon Mechanical Turk survey. Raters assessed the pleasantness and energeticness of each robot face and also shared how safe and pleased each face made them feel.
This project was published in our ICSR 2016 paper entitled “Designing and Assessing Expressive Open-Source Faces for the Baxter Robot.” After hearing of interest from other researchers, we previously released our Baxter face database on GitHub at https://github.com/nfitter/BaxterFaces. This dataset, now additionally available on Scholarly Commons, includes the developed Baxter faces, photographs used in the Mechanical Turk survey, editable source files for the studied faces, and bonus faces developed in our subsequent design work with Baxter. These contents may benefit any Baxter users who publish images to the robot's face. The organization of the database is explained in the included ReadMe file.

Publication: Refined Methods for Creating Realistic Haptic Virtual Textures from Tool-Mediated Contact Acceleration Data (2012-03-01)
Culbertson, Heather; Romano, Joseph M.; Castillo, Pablo; Mintz, Max; Kuchenbecker, Katherine J.

Dragging a tool across a textured object creates rich high-frequency vibrations that distinctly convey the physical interaction between the tool tip and the object surface. Varying one’s scanning speed and normal force alters these vibrations, but it does not change the perceived identity of the tool or the surface. Previous research developed a promising data-driven approach to embedding this natural complexity in a haptic virtual environment: the approach centers on recording and modeling the tool contact accelerations that occur during real texture interactions at a limited set of force-speed combinations. This paper aims to optimize these prior methods of texture modeling and rendering to improve system performance and enable potentially higher levels of haptic realism. The key elements of our approach are drawn from time series analysis, speech processing, and discrete-time control.
We represent each recorded texture vibration with a low-order auto-regressive moving-average (ARMA) model, and we optimize this set of models for a specific tool-surface pairing (plastic stylus and textured ABS plastic) using metrics that depend on spectral match, final prediction error, and model order. For rendering, we stably resample the texture models at the desired output rate, and we derive a new texture model at each time step using bilinear interpolation on the line spectral frequencies of the resampled models adjacent to the user’s current force and speed. These refined processes enable our TexturePad system to generate a stable and spectrally accurate vibration waveform in real time, moving us closer to the goal of virtual textures that are indistinguishable from their real counterparts.

Publication: Design of Body-Grounded Tactile Actuators for Playback of Human Physical Contact (2011-06-01)
Kuchenbecker, Katherine J.; Stanley, Andrew A.

We present four wearable tactile actuators capable of recreating physical sensations commonly experienced in human interactions, including tapping on, dragging across, squeezing, and twisting an individual’s wrist. In seeking to create tactile signals that feel natural and are easy to understand, we developed movement control interfaces to play back each of these forms of actual human physical contact. Through iterative design, prototyping, programming, and testing, each of these servo-motor-based mechanisms produces a signal that is gradable in magnitude, can be played in a variety of temporal patterns, is localizable to a small area of skin, and, for three of the four actuators, has an associated direction. Additionally, we have tried to design toward many of the characteristics that have made high frequency vibration the most common form of wearable tactile feedback, including low cost, light weight, comfort, and small size.
Bolstered by largely positive comments from naïve users during an informal testing session, we plan to continue improving these devices for future use in tactile motion guidance.

Publication: Haptic Display of Realistic Tool Contact via Dynamically Compensated Control of a Dedicated Actuator (2009-12-15)
McMahan, William; Kuchenbecker, Katherine J.

High frequency contact accelerations convey important information that the vast majority of haptic interfaces cannot render. Building on prior work, we present an approach to haptic interface design that uses a dedicated linear voice coil actuator and a dynamic system model to allow the user to feel these signals. This approach was tested through use in a bilateral teleoperation experiment where a user explored three textured surfaces under three different acceleration control architectures: none, constant gain, and dynamic compensation. The controllers that use the dedicated actuator vastly outperform traditional position-position control at conveying realistic contact accelerations. Analysis of root mean square error, linear regression, and discrete Fourier transforms of the acceleration data also indicate a slight performance benefit for dynamic compensation over constant gain.

Publication: Improving Telerobotic Touch Via High-Frequency Acceleration Matching (2006-06-26)
Kuchenbecker, Katherine J.; Niemeyer, Günter

Humans rely on information-laden high-frequency accelerations in addition to quasi-static forces when interacting with objects via a handheld tool. Telerobotic systems have traditionally struggled to portray such contact transients due to closed-loop bandwidth and stability limitations, leaving remote objects feeling soft and undefined. This work seeks to maximize the user’s feel for the environment through the approach of acceleration matching; high-frequency fingertip accelerations are combined with standard low-frequency position feedback without requiring a secondary actuator on the master device.
In this method, the natural dynamics of the master are identified offline using frequency-domain techniques, estimating the relationship between commanded motor current and handle acceleration while a user holds the device. During subsequent telerobotic interactions, a high-bandwidth sensor measures accelerations at the slave’s end effector, and the real-time controller re-creates these important signals at the master handle by inverting the identified model. The details of this approach are explored herein, and its ability to render hard and rough surfaces is demonstrated on a standard master-slave system. Combining high-frequency acceleration matching with position-error-based feedback of quasi-static forces creates a hybrid signal that closely corresponds to human sensing capabilities, instilling telerobotics with a more realistic sense of remote touch.

Publication: The Touch Thimble: Providing Fingertip Contact Feedback During Point-Force Haptic Interaction (2008-03-14)
Kuchenbecker, Katherine J.; Ferguson, David; Kutzer, Michael; Moses, Matthew; Okamura, Allison M.

Touching a real object with your fingertip provides simultaneous tactile and force feedback, yet most haptic interfaces for virtual environments can convey only one of these two essential modalities. To address this opportunity, we designed, prototyped, and evaluated the Touch Thimble, a new fingertip device that provides the user with the cutaneous sensation of making and breaking contact with virtual surfaces. Designed to attach to the endpoint of an impedance-type haptic interface like a SensAble Phantom, the Touch Thimble includes a slightly oversize cup that is suspended around the fingertip by passive springs. When the haptic interface applies contact forces from the virtual environment, the springs deflect to allow contact between the user's fingertip and the inner surface of the cup.
We evaluated a prototype Touch Thimble against a standard thimble in a formal user study and found that it neither improved nor degraded subjects' ability to recognize smoothly curving surfaces. Although four of the eight subjects preferred it to the standard interface, overall the Touch Thimble made subjects slightly slower at recognizing the presented shapes. Detailed subject comments point out strengths and weaknesses of the current design and suggest avenues for future development of the device.
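As a point of reference for the spectral subtraction technique described in the first result above, the sketch below shows classic magnitude spectral subtraction with short-time windows and overlap-add resynthesis. It is a simplified illustration, not the paper's method: the PR2 implementation estimates noise spectra as a function of the robot's joint velocities, whereas this sketch assumes a single stationary noise recording; the function name and parameters are illustrative.

```python
import numpy as np

def spectral_subtraction(signal, noise_recording, frame_len=256, hop=128, floor=0.01):
    """Remove an estimated stationary noise spectrum from a 1-D signal
    using magnitude spectral subtraction over short Hann windows."""
    window = np.hanning(frame_len)

    # Average magnitude spectrum of the noise-only recording.
    noise_frames = [noise_recording[i:i + frame_len] * window
                    for i in range(0, len(noise_recording) - frame_len, hop)]
    noise_mag = np.mean([np.abs(np.fft.rfft(f)) for f in noise_frames], axis=0)

    out = np.zeros(len(signal))
    norm = np.zeros(len(signal))
    for i in range(0, len(signal) - frame_len, hop):
        frame = signal[i:i + frame_len] * window
        spec = np.fft.rfft(frame)
        mag, phase = np.abs(spec), np.angle(spec)
        # Subtract the noise magnitude estimate; apply a spectral floor
        # so magnitudes never go negative.
        clean_mag = np.maximum(mag - noise_mag, floor * mag)
        clean_frame = np.fft.irfft(clean_mag * np.exp(1j * phase), n=frame_len)
        # Overlap-add with a synthesis window; normalize by the summed
        # squared window afterwards.
        out[i:i + frame_len] += clean_frame * window
        norm[i:i + frame_len] += window ** 2
    return out / np.maximum(norm, 1e-12)
```

A velocity-dependent version in the spirit of the paper would store one `noise_mag` per joint-velocity bin and select the appropriate one for each time window.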