Center for Human Modeling and Simulation

Document Type

Journal Article

Date of this Version

12-2008

Publication Source

Medicine Meets Virtual Reality

Abstract

A myriad of surgical tasks rely on puncturing tissue membranes (Fig. 1) and cutting through tissue mass. Properly training a practitioner for such tasks requires a simulator that can display both the graphical changes and the haptic forces of these deformations, punctures, and cutting actions. This paper documents our work to create a simulator that can model these effects in real time. Generating graphic and haptic output necessitates the use of a predictive model to track the tissue’s physical state. Many finite element methods (FEM) exist for computing tissue deformation [1, 4]. These methods often obtain accurate results, but they can be computationally intensive for complex models. Real-time tasks using this approach are often limited in their complexity and workspace domain due to the large computational overhead of FEM. The computer graphics community has developed a large range of methods for modeling deformable media [5], often trading complete physical accuracy for computational speedup. Casson and Laugier [3] outline a mass-spring mesh model based on these principles, but they do not explore its use with haptic interaction. Gerovich et al. [2] detail a set of haptic interaction rules (Fig. 2) for one-dimensional simulation of multi-layer deformable tissue, but they do not provide strategies for integrating this model with realistic graphic feedback.
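
To make the mass-spring approach referenced above concrete, the sketch below shows a one-dimensional chain of mass-spring nodes integrated with semi-implicit Euler, with the reaction force of the surface spring serving as the force that would be rendered to a haptic device. This is not the paper's implementation; the node count, stiffness, damping, mass, time step, and tool indentation are illustrative assumptions.

```cpp
// Minimal sketch (not the authors' implementation): a 1-D chain of
// mass-spring nodes, semi-implicit Euler integration, and the surface
// spring's reaction force used as haptic feedback. All parameter
// values below are assumptions for illustration only.
#include <cstdio>
#include <vector>

struct Node {
    double pos;  // displacement along the tool axis (m)
    double vel;  // velocity (m/s)
};

int main() {
    const int    N  = 8;       // nodes in the chain (assumed)
    const double k  = 400.0;   // spring stiffness, N/m (assumed)
    const double c  = 2.0;     // damping coefficient, N*s/m (assumed)
    const double m  = 0.01;    // node mass, kg (assumed)
    const double dt = 0.001;   // 1 kHz step, typical for haptic loops

    std::vector<Node> nodes(N, {0.0, 0.0});
    const double toolDepth = 0.005;  // tool indents the surface node by 5 mm

    for (int step = 0; step < 1000; ++step) {
        // The surface node is driven directly by the tool position.
        nodes[0].pos = toolDepth;

        // Semi-implicit Euler for the interior nodes; the last node is
        // anchored to a fixed boundary at zero displacement.
        for (int i = 1; i < N; ++i) {
            double left  = nodes[i - 1].pos - nodes[i].pos;
            double right = (i + 1 < N) ? nodes[i + 1].pos - nodes[i].pos
                                       : -nodes[i].pos;
            double force = k * (left + right) - c * nodes[i].vel;
            nodes[i].vel += (force / m) * dt;
            nodes[i].pos += nodes[i].vel * dt;
        }
    }

    // Reaction force of the surface spring: what the haptic device renders.
    double hapticForce = k * (nodes[0].pos - nodes[1].pos);
    std::printf("haptic feedback force: %.3f N\n", hapticForce);
    return 0;
}
```

In this kind of scheme, puncture events are typically modeled by monitoring the surface spring force and releasing or reattaching the tool constraint once a threshold is exceeded, which is one reason the mass-spring formulation is attractive for real-time haptic rates.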

Copyright/Permission Statement

This is the author's post-print version.
