Fitter, Naomi T.

  • Publication
    The Penn Baxter Face Database
    (2017-03-23) Fitter, Naomi T.; Kuchenbecker, Katherine J.
    The Penn Baxter Face Database is composed of Baxter robot face images designed in a variety of expressions and colors. Each of these images was photographed on the physical Baxter robot and assessed by internet raters (N = 568) in an Amazon Mechanical Turk survey. Raters assessed the pleasantness and energeticness of each robot face and also shared how safe and pleased each face made them feel. This project was published in our ICSR 2016 paper entitled “Designing and Assessing Expressive Open-Source Faces for the Baxter Robot.” After hearing of interest from other researchers, we previously released our Baxter face database on GitHub at https://github.com/nfitter/BaxterFaces. This dataset, now additionally available on Scholarly Commons, includes the developed Baxter faces, photographs used in the Mechanical Turk survey, editable source files for the studied faces, and bonus faces developed in our subsequent design work with Baxter. These contents may benefit any Baxter users who publish images to the robot's face. The organization of the database is explained in the included ReadMe file.
  • Publication
    The Penn Hand-Clapping Motion Dataset
(2016-11-27) Fitter, Naomi T.; Kuchenbecker, Katherine J.
    The Penn Hand-Clapping Motion Dataset is composed of inertial measurement unit (IMU) recordings from the hand motions of 15 naïve people. Each of these individuals participated in an experiment during which they were asked to pantomime various sequences of 10 different motions: back five, clap, double, down five, front five, lap pat, left five, right five, right snap, and up five. The examined motions comprise most typical actions from hand-clapping games like “Pat-a-cake” and “Slide.” This project was published in our IROS 2016 paper entitled “Using IMU Data to Demonstrate Hand-Clapping Games to a Robot.” After hearing of interest from other researchers, we are releasing the corresponding motion dataset, which was originally collected to help us investigate whether we could train highly accurate and rapid classifiers to label hand-clapping game motions performed by everyday people. This dataset, explained further in the included ReadMe file, may interest researchers who investigate human motion.
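The classification task described above (labeling IMU windows with one of the 10 hand-clapping motions) can be sketched as follows. This is a hypothetical illustration, not the authors' method: the real data format is documented in the dataset's ReadMe, so synthetic stand-in windows are generated here, and the summary-statistic features and random-forest classifier are assumptions chosen for brevity.

```python
# Hypothetical sketch: classifying hand-clapping motions from IMU windows.
# Synthetic stand-in data is used in place of the real dataset; only the
# 10 motion labels come from the dataset description.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

MOTIONS = ["back five", "clap", "double", "down five", "front five",
           "lap pat", "left five", "right five", "right snap", "up five"]

rng = np.random.default_rng(0)

def featurize(window):
    # Simple per-channel summary statistics (mean and std) as features.
    return np.concatenate([window.mean(axis=0), window.std(axis=0)])

# Stand-in recordings: 40 windows per motion, 50 samples x 6 IMU channels
# (3-axis accelerometer + 3-axis gyroscope), separated by class for the demo.
X, y = [], []
for label, motion in enumerate(MOTIONS):
    for _ in range(40):
        window = rng.normal(loc=label, scale=1.0, size=(50, 6))
        X.append(featurize(window))
        y.append(label)
X, y = np.array(X), np.array(y)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)
clf = RandomForestClassifier(random_state=0).fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```

In practice the windowing, feature choice, and classifier would depend on the dataset's actual sampling rate and segmentation, which the ReadMe and the IROS 2016 paper describe.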