Badler, Norman I


Search Results

Now showing 1 - 10 of 184
  • Publication
    A Machine Translation System from English to American Sign Language
    (2000-10-10) Zhao, Liwei; Kipper, Karin; Schuler, William; Vogler, Christian; Palmer, Martha; Badler, Norman I
    Research in computational linguistics, computer graphics and autonomous agents has led to the development of increasingly sophisticated communicative agents over the past few years, bringing a new perspective to machine translation research. The engineering of language-based smooth, expressive, natural-looking human gestures can give us useful insights into the design principles that have evolved in natural communication between people. In this paper we prototype a machine translation system from English to American Sign Language (ASL), taking into account not only linguistic but also visual and spatial information associated with ASL signs.
  • Publication
    Crowd Simulation Incorporating Agent Psychological Models, Roles and Communication
    (2005-11-24) Silverman, Barry G; Badler, Norman I; Pelechano, Nuria; O'Brien, Kevin
    We describe a new architecture to integrate a psychological model into a crowd simulation system in order to obtain believable emergent behaviors. Our existing crowd simulation system (MACES) performs high level wayfinding to explore unknown environments and obtain a cognitive map for navigation purposes, in addition to dealing with low level motion within each room based on social forces. Communication and roles are added to achieve individualistic behaviors and a realistic way to spread information about the environment. To expand the range of realistic human behaviors, we use a system (PMFserv) that implements human behavior models from a range of ability, stress, emotion, decision theoretic and motivation sources. An architecture is proposed that combines and integrates MACES and PMFserv to add validated agent behaviors to crowd simulations.
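    The low-level, within-room motion attributed above to social forces can be illustrated with a minimal Helbing-style update; this is only a sketch of the general technique, not the MACES implementation, and every parameter value is invented for illustration:

```python
import math

def social_force(agent_pos, agent_vel, goal, others,
                 desired_speed=1.3, tau=0.5, A=2.0, B=0.3):
    """One social-force step for a single agent.

    A driving term relaxes the current velocity toward the goal
    at desired_speed; an exponentially decaying repulsive term
    pushes away from each nearby agent.  All constants here are
    illustrative, not taken from the paper.
    """
    gx, gy = goal[0] - agent_pos[0], goal[1] - agent_pos[1]
    dist = math.hypot(gx, gy) or 1e-9
    # Driving force toward the goal.
    fx = (desired_speed * gx / dist - agent_vel[0]) / tau
    fy = (desired_speed * gy / dist - agent_vel[1]) / tau
    # Repulsion from other agents.
    for ox, oy in others:
        dx, dy = agent_pos[0] - ox, agent_pos[1] - oy
        d = math.hypot(dx, dy) or 1e-9
        mag = A * math.exp(-d / B)
        fx += mag * dx / d
        fy += mag * dy / d
    return fx, fy
```

    Integrating this force over time (and adding wall repulsion) yields the room-level motion layer on top of which roles and communication operate.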
  • Publication
    Interactive Behaviors for Bipedal Articulated Figures
    (1991) Phillips, Cary B.; Badler, Norman I
    We describe techniques for interactively controlling bipedal articulated figures through kinematic constraints. These constraints model certain behavioral tendencies which capture some of the characteristics of human-like movement, and give us control over such elements as the figures' balance and stability. They operate in near real-time, so they provide behavioral control during interactive manipulation. These constraints form the basis of an interactive motion-generation system that allows the active movement elements to be layered on top of the passive behavioral constraints.
  • Publication
    Virtual Humans for Animation, Ergonomics, and Simulation
    (1997-06-16) Badler, Norman I
    The last few years have seen great maturation in the computation speed and control methods needed to portray 3D virtual humans suitable for real interactive applications. We first describe the state of the art, then focus on the particular approach taken at the University of Pennsylvania with the Jack system. Various aspects of real-time virtual humans are considered, such as appearance and motion, interactive control, autonomous action, gesture, attention, locomotion, and multiple individuals. The underlying architecture consists of a sense-control-act structure that permits reactive behaviors to be locally adaptive to the environment, and a "PaT-Net" parallel finite-state machine controller that can be used to drive virtual humans through complex tasks. Finally, we argue for a deep connection between language and animation and describe current efforts in linking them through the JackMOO extension to lambdaMOO.
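    The "PaT-Net" parallel finite-state machine controller mentioned above can be caricatured in a few lines: several nets run side by side, each advancing one condition-guarded transition per tick. This is only a sketch of the idea, not the Jack implementation, and all net, state, and field names are invented:

```python
class PatNetSketch:
    """Toy parallel finite-state machine controller.

    Each net is a transition table: state -> list of
    (condition, next_state, action) triples.  On every tick,
    every net fires at most one transition whose condition
    holds in the shared world state.
    """
    def __init__(self):
        self.nets = {}   # name -> [transition table, current state]
        self.log = []    # (net, from_state, to_state) history

    def add_net(self, name, table, start):
        self.nets[name] = [table, start]

    def tick(self, world):
        for name, slot in self.nets.items():
            table, state = slot
            for cond, nxt, action in table.get(state, []):
                if cond(world):
                    self.log.append((name, state, nxt))
                    if action:
                        action(world)
                    slot[1] = nxt
                    break
```

    A locomotion net and a gaze net, for example, could each be added with `add_net` and stepped together by a single simulation loop calling `tick`.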
  • Publication
    Pedestrians: Creating Agent Behaviors through Statistical Analysis of Observation Data
    (2001-11-07) Ashida, Koji; Allbeck, Jan; Lee, Seung-Joo; Badler, Norman I; Sun, Harold; Metaxas, Dimitris
    Creating a complex virtual environment with human inhabitants that behave as we would expect real humans to behave is a difficult and time-consuming task. Time must be spent to construct the environment, to create human figures, to create animations for the agents' actions, and to create controls for the agents' behaviors, such as scripts, plans, and decision-makers. Often work done for one virtual environment must be completely replicated for another. The creation of robust, procedural actions that can be ported from one simulation to another would ease the creation of new virtual environments. As walking is useful in many different virtual environments, the creation of natural-looking walking is important. In this paper we present a system for producing more natural-looking walking by incorporating actions for the upper body. We aim to provide a tool that authors of virtual environments can use to add realism to their characters without effort.
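    The statistical-analysis approach above amounts to sampling agent actions in proportion to frequencies observed in real pedestrian data. A minimal sketch of that sampling step (the action names and frequencies here are invented, not the paper's observation data):

```python
import random

def sample_action(freq_table, rng=random):
    """Pick an upper-body action with probability proportional
    to its observed frequency.

    freq_table maps action name -> observed count (or weight).
    """
    actions, weights = zip(*freq_table.items())
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for action, w in zip(actions, weights):
        acc += w
        if r <= acc:
            return action
    return actions[-1]
```

    Calling this once per behavior interval makes frequent observed actions correspondingly frequent in the simulation.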
  • Publication
    Ultrasound Guided Regional Anesthesia Training Simulator Using Microsoft Kinect
    (2014-01-01) Badler, Norman I; Liu, Jiabin; Sairam, Aparajith; Richman, Kenneth; Feng, Jian; Elkassabany, Nabil
    We present a system for the interactive simulation of ultrasound guided peripheral nerve blocks using a Microsoft Kinect®. The system performs motion tracking of both the ultrasound probe and the nerve block needle. Software generates synthetic ultrasound images from previously captured ultrasound images. Details of the software elements in the system are described. Some of the current challenges and future work in this research are discussed.
  • Publication
    Eyes Alive
    (2002-01-01) Badler, Norman I; Badler, Jeremy B; Lee, Sooha Park
    For an animated human face model to appear natural it should produce eye movements consistent with human ocular behavior. During face-to-face conversational interactions, eyes exhibit conversational turn-taking and agent thought processes through gaze direction, saccades, and scan patterns. We have implemented an eye movement model based on empirical models of saccades and statistical models of eye-tracking data. Face animations using stationary eyes, eyes with random saccades only, and eyes with statistically derived saccades are compared to evaluate whether they appear natural and effective while communicating.
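    The contrast above between random saccades and statistically derived saccades can be sketched as two magnitude generators: a uniform baseline versus a skewed distribution reflecting the empirical observation that most saccades are small. The distributions and constants below are illustrative assumptions, not the paper's fitted eye-tracking statistics:

```python
import random

def saccade_magnitude_statistical(rng):
    """Draw a saccade magnitude (degrees) from a skewed
    distribution, so small saccades dominate.  The exponential
    mean (~4 degrees) and 30-degree cap are illustrative."""
    return min(rng.expovariate(1 / 4.0), 30.0)

def saccade_magnitude_random(rng):
    """Uniform magnitudes -- the 'random saccades only' baseline."""
    return rng.uniform(0.0, 30.0)
```

    Driving the eye model with the first generator (plus empirically timed inter-saccade intervals) is what distinguishes the statistically derived condition from the random one.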
  • Publication
    Integrating Anatomy and Physiology for Behavior Modeling
    (1995) DeCarlo, Douglas; Kaye, Jonathan; Metaxas, Dimitris; Webber, Bonnie L.; Clarke, John R.; Badler, Norman I
    In producing realistic, animatable models of the human body, we see much to be gained from developing a functional anatomy that links the anatomical and physiological behavior of the body through fundamental causal principles. This paper describes our current Finite Element Method implementation of a simplified lung and chest cavity during normal quiet breathing and then disturbed by a simple pneumothorax. The lung model interacts with the model of the chest cavity through applied forces. The models are modular, and a second lung and more complex chest wall model can be added without disturbing the model of the other lung. During inhalation, a breathing force (corresponding to exertion of the diaphragm and chest wall muscles) is applied, causing the chest cavity to expand. When this force is removed (at the start of exhalation), the stretched lung recoils, applying pressure forces to the chest wall which cause the chest cavity to contract. To simulate a simple pneumothorax, the intrapleural pressure is set to atmospheric pressure, which removes pressure forces holding the lung close to the chest cavity and results in the lung returning to its unstretched shape.
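    The inhale/recoil cycle described above can be caricatured in one dimension: a breathing force expands the chest, a lung-recoil spring pulls it back, and a pneumothorax removes that recoil coupling. This is only a spring-damper sketch, not the paper's Finite Element Method model, and every constant is invented for illustration:

```python
def simulate_breath(steps=200, dt=0.01, breathing_force=1.0,
                    k_lung=5.0, k_chest=3.0, damping=2.0,
                    pneumothorax=False):
    """1-D caricature of the coupled lung/chest-wall model.

    x is chest-wall displacement from rest.  For the first half
    of the run the breathing force expands the cavity; when it is
    removed, lung recoil (lost under pneumothorax, since the
    intrapleural coupling is gone) helps pull the wall back.
    """
    x, v = 0.0, 0.0
    trace = []
    for i in range(steps):
        f = breathing_force if i < steps // 2 else 0.0
        recoil = 0.0 if pneumothorax else -k_lung * x
        a = f + recoil - k_chest * x - damping * v
        v += a * dt
        x += v * dt
        trace.append(x)
    return trace
```

    With the recoil spring disabled the same breathing force produces a larger chest excursion, mirroring the loss of the pressure forces that normally hold the lung against the chest wall.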
  • Publication
    Automated Analysis of Human Factors Requirements
    (2007-02-01) Allbeck, Jan; Badler, Norman I
    Computational ergonomic analyses are often laboriously tested one task at a time. As digital human models improve, we can partially automate the entire analysis process of checking human factors requirements or regulations against a given design. We are extending our Parameterized Action Representation (PAR) to store requirements and its execution system to drive human models through required tasks. Databases of actions, objects, regulations, and digital humans are instantiated into PARs and executed by analyzers that simulate the actions on digital humans and monitor the actions to report successes and failures. These extensions will allow quantitative but localized design assessment relative to specific human factors requirements.
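    The instantiate-then-analyze pipeline above can be sketched with a toy requirement check; the field names and the reach-envelope test are invented stand-ins for the paper's PAR structures and ergonomic analyzers:

```python
from dataclasses import dataclass

@dataclass
class PAR:
    """Toy Parameterized Action Representation entry: an action,
    the digital human's reach, and the target's distance."""
    action: str
    agent_reach_cm: float
    target_distance_cm: float

def analyze(pars):
    """Run each instantiated PAR through a (trivial) human-factors
    check and report success or failure per action: here the
    requirement is simply that the target lies within reach."""
    return {p.action: p.target_distance_cm <= p.agent_reach_cm
            for p in pars}
```

    A real analyzer would simulate the action on the digital human model and monitor many such constraints, but the report shape (action -> pass/fail) is the same.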
  • Publication
    The Computer Graphics Scene in the United States
    (1984) Badler, Norman I; Carlbom, Ingrid
    We briefly survey the major thrusts of computer graphics activities, examining trends and topics rather than offering a comprehensive survey of all that is happening. The directions of professional activities, hardware, software, and algorithms are outlined. Within hardware we examine workstations, personal graphics systems, high performance systems, and low level VLSI chips; within software, standards and interactive system design; within algorithms, visible surface rendering and shading, three-dimensional modeling techniques, and animation.