Doctor of Philosophy (PhD)
Mechanical Engineering & Applied Mechanics
Katherine J. Kuchenbecker
Human friends commonly connect through handshakes and high fives, and children around the world rejoice at hand-clapping games. As robots enter everyday human spaces, they will have the opportunity to join in such physical interactions, but few current robots are intended to touch humans. How should robots move and react in playful hand-to-hand interactions with people?
We conducted research in four main areas to address this design challenge. First, we implemented and tested an initial hand-clapping robotic system. This effort began by recording sensor data from people performing a variety of hand-clapping activities; the resulting accelerometer and position data taught us how to design appropriate hand-clapping robot motion and logic. Implementation on a Rethink Robotics Baxter Research Robot demonstrated that a robot could move like our human participants and reliably detect hand impacts through its wrist-mounted accelerometers. N = 20 study participants clapped hands with differently configured versions of this robot in random order: the robot’s facial animation, physical reactivity, arm stiffness, and clapping tempo all significantly affected how users perceived the robot.
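The impact detection described above could take many forms; a minimal sketch, assuming threshold-based peak detection on the acceleration magnitude with a refractory gap so a single clap is not double-counted (the thresholds, units, and function name here are illustrative, not the dissertation's implementation):

```python
import numpy as np

def detect_hand_impacts(accel, threshold=3.0, refractory=10):
    """Return sample indices where the acceleration magnitude exceeds
    `threshold` (in g), skipping samples within `refractory` steps of
    the previous detection so one clap triggers only one event.
    `accel` is an (N, 3) array of wrist accelerometer readings."""
    mag = np.linalg.norm(accel, axis=1)
    impacts = []
    last = -refractory
    for i, m in enumerate(mag):
        if m > threshold and i - last >= refractory:
            impacts.append(i)
            last = i
    return impacts
```

A signal with quiet baseline readings and two sharp spikes would yield exactly two detected impacts, one per spike.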
We next sought to create and evaluate more sophisticated robot hand-clapping behaviors. Data from people performing interactive clapping tasks at increasing and decreasing tempos helped us propose candidate timing models and implement adaptive-tempo play on Baxter. In a subsequent experiment with N = 20 users, a mischievous Baxter was equipped with the top-performing tempo adaptation model and chose to play either cooperatively or asynchronously with its human partner. Although a few participants reacted positively to Baxter’s mischief, users overwhelmingly preferred a synchronous, cooperative robot.
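One simple form a tempo adaptation model could take is a first-order tracker that moves the robot's clap period a fraction of the way toward the partner's most recent inter-clap interval; this sketch is an illustrative assumption, not the specific timing models evaluated in the dissertation:

```python
def adapt_tempo(current_period, observed_interval, gain=0.3):
    """Nudge the robot's clap period (seconds) toward the partner's
    latest observed inter-clap interval. `gain` in (0, 1] trades off
    responsiveness against stability: 1.0 copies the partner exactly,
    small values smooth out timing noise."""
    return current_period + gain * (observed_interval - current_period)
```

For example, a robot clapping every 0.5 s whose partner claps every 0.7 s would slow to a 0.56 s period after one update with the default gain, converging toward 0.7 s over repeated claps.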
Third, we set up and conducted a human-robot interaction experiment more similar to everyday human-human hand-clapping interactions. A machine learning pipeline trained on inertial data from human motions demonstrated that linear support vector machines (SVMs) can classify a new person’s hand-clapping actions with an accuracy of about 95%. This technique succeeded for both hand- and wrist-mounted inertial sensors, enabling people to teach the Baxter robot new hand-clapping games. Evaluation of various two-handed clapping play activities by N = 24 users showed that learning games from Baxter was significantly easier than teaching Baxter games, but that the teaching role caused people to consider more teamwork aspects of the gameplay.
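A linear-SVM pipeline of the kind described above can be sketched with scikit-learn; the synthetic feature arrays below stand in for windowed inertial features (e.g., per-axis means and variances), and the dissertation's actual feature set and training procedure may differ:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Hypothetical stand-in data: 6-dimensional feature windows for two
# clapping actions, drawn from well-separated distributions.
X_action_a = rng.normal(loc=2.0, scale=0.3, size=(40, 6))
X_action_b = rng.normal(loc=-2.0, scale=0.3, size=(40, 6))
X = np.vstack([X_action_a, X_action_b])
y = np.array([1] * 40 + [0] * 40)

# Standardize features, then fit a linear SVM classifier.
clf = make_pipeline(StandardScaler(), LinearSVC())
clf.fit(X, y)
acc = clf.score(X, y)
```

In practice, the classifier would be trained on features from many people and evaluated on a held-out person to estimate how well it generalizes to a new user's motions.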
Finally, to broaden the scope of these interactions, we began exploring applications of Baxter in socially assistive robotics. Using many of the same sensing and actuation strategies, we developed a set of six playful hand-to-hand contact-based exercise interactions to be jointly executed between a person and Baxter, along with two similar non-contact games. A proof-of-concept experiment using these exercise games enrolled N = 20 young adults and N = 14 healthy adults over age 53. The results demonstrated that people are willing and motivated to interact with the robot in this way and that different games promote unique physical and cognitive exercise effects.
Overall, this research aims to help shape design processes for socially relevant physical human-robot interaction and reveal new opportunities for socially assistive robotics.
Fitter, Naomi T., "Design And Evaluation Of Interactive Hand-Clapping Robots" (2017). Publicly Accessible Penn Dissertations. 2282.
Available for download on Saturday, August 15, 2020