Departmental Papers (MEAM)

Document Type

Conference Paper

Subject Area

GRASP

Date of this Version

4-2-2007

Comments

Suggested Citation:
Kuchenbecker, Katherine J., Netta Gurari, and Allison M. Okamura. (2007). Quantifying the Value of Visual and Haptic Position Feedback During Force-Based Motion Control. Second Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems. Tsukuba, Japan, March 22-24, 2007.

©2007 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.

NOTE: At the time of publication, author Katherine J. Kuchenbecker was affiliated with Johns Hopkins University. Currently, she is a faculty member in the Department of Mechanical Engineering and Applied Mechanics at the University of Pennsylvania.

Abstract

Controlling the motion of a prosthetic upper limb without visual feedback is extremely difficult because the wearer does not know the prosthesis’ configuration. This paper describes an experiment designed to determine the relative importance of visual and haptic position feedback during targeted force-based motion by non-amputee human subjects as an analogy to prosthetic use. Subjects control the angle of a virtual proxy through an admittance relationship by generating torque at the metacarpophalangeal (MCP) joint of the right index finger. During successive repetitions of a target acquisition task, the proxy’s state is selectively conveyed to the user through graphical display, finger motion, and tactile stimulation. Performance metrics for each feedback condition will provide insight into the role of haptic position feedback and may help guide the development of future upper-limb prostheses.
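To illustrate the kind of admittance relationship the abstract describes (a sketch only; the paper's actual dynamics, parameter values, and units are not given here), a measured finger torque could drive a simulated mass-damper proxy whose angle is then rendered to the user:

    # Illustrative admittance relationship, not the authors' implementation:
    # input torque tau drives a virtual proxy governed by
    # inertia * theta_ddot + damping * theta_dot = tau.
    # The inertia, damping, and time-step values are hypothetical.

    def simulate_admittance_proxy(torques, dt=0.001, inertia=0.01, damping=0.05):
        """Integrate the proxy dynamics with forward Euler and return its angle history."""
        theta, theta_dot = 0.0, 0.0
        angles = []
        for tau in torques:
            theta_ddot = (tau - damping * theta_dot) / inertia
            theta_dot += theta_ddot * dt
            theta += theta_dot * dt
            angles.append(theta)
        return angles

    # Example: a constant 0.02 N*m torque applied for one second.
    if __name__ == "__main__":
        trajectory = simulate_admittance_proxy([0.02] * 1000)
        print(f"Final proxy angle: {trajectory[-1]:.3f} rad")

In such a scheme the proxy angle, rather than the finger's own motion, is what gets conveyed back to the subject through the graphical, kinesthetic, or tactile channels under study.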


Date Posted: 18 August 2010

This document has been peer reviewed.