Departmental Papers (CIS)

Document Type

Conference Paper

Date of this Version

May 2004

Comments

Copyright 2004 IEEE. Reprinted from Proceedings of the 6th IEEE International Conference on Automatic Face and Gesture Recognition 2004 (FGR 2004), pages 875-880.
Publisher URL: http://ieeexplore.ieee.org/xpl/tocresult.jsp?isNumber=28919&page=9

This material is posted here with permission of the IEEE. Such permission of the IEEE does not in any way imply IEEE endorsement of any of the University of Pennsylvania's products or services. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to pubs-permissions@ieee.org. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.

Abstract

We present an algorithm for automatic inference of human upper body motion. We propose a graph model in which motion inference is posed as a mapping problem between state nodes in the graph and features of image patches, and we use belief propagation to perform Bayesian inference in this graph. A multiple-frame inference algorithm is proposed to combine both structural and temporal constraints on human motion. We also present a method for capturing the constraints of human body configuration under different viewing angles. The algorithm is applied in a prototype system that automatically labels upper body motion in video, without manual initialization of body parts.
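The core inference step described above, belief propagation over a graph of body parts, can be illustrated with a toy sum-product sketch. The following is a minimal hypothetical example, not the authors' implementation: the part list, the state count K, and the random unary/pairwise potentials are placeholders standing in for the image-patch features and structural constraints the paper actually derives.

import numpy as np

K = 5  # hypothetical number of discrete states (e.g. candidate image patches) per part

# Tree-structured toy graph: torso connected to head and both arms.
parts = ["torso", "head", "l_arm", "r_arm"]
edges = [("torso", "head"), ("torso", "l_arm"), ("torso", "r_arm")]

rng = np.random.default_rng(0)
# Unary potentials: stand-ins for per-part image evidence.
phi = {p: rng.random(K) + 1e-3 for p in parts}
# Pairwise potentials: stand-ins for structural compatibility between parts.
psi = {e: rng.random((K, K)) + 1e-3 for e in edges}

def neighbors(p):
    return [q for e in edges for q in e if p in e and q != p]

# Directed messages m[(i, j)]: what part i tells part j about j's states.
msgs = {}
for i, j in edges:
    msgs[(i, j)] = np.ones(K)
    msgs[(j, i)] = np.ones(K)

for _ in range(10):  # a few synchronous sweeps suffice on a tree
    new = {}
    for (i, j) in msgs:
        # Orient the pairwise potential so rows index i's states, columns j's.
        pairwise = psi[(i, j)] if (i, j) in psi else psi[(j, i)].T
        # Combine i's local evidence with messages from all neighbors except j.
        incoming = np.prod([msgs[(k, i)] for k in neighbors(i) if k != j], axis=0)
        m = pairwise.T @ (phi[i] * incoming)
        new[(i, j)] = m / m.sum()  # normalize for numerical stability
    msgs = new

# Beliefs: (approximate) posterior marginals over each part's states.
for p in parts:
    b = phi[p] * np.prod([msgs[(q, p)] for q in neighbors(p)], axis=0)
    b /= b.sum()
    print(p, "most likely state:", int(b.argmax()))

On a tree such as this one, the beliefs converge to the exact marginals; in the paper's setting the same message-passing scheme is applied to a Markov network coupling structural and temporal constraints across frames.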

Keywords

Bayes methods, Markov processes, belief networks, graph theory, image motion analysis, object detection, tracking, Bayesian inference, Markov network model, belief propagation, graph model, human motion detection, human motion tracking, human upper body motion, motion energy image, multiple frame motion inference model

Date Posted: 15 November 2004

This document has been peer reviewed.