About
Research Questions
Physical touch is one of the most important channels of interpersonal communication, yet we barely notice the haptic cues in our daily lives. The study of robot-human affective interaction has primarily focused on facial expressions and vocal interaction, but rarely on touch.
What if the robots being developed today and in the future had the ability to express their emotions to humans through physical contact?
Could it be possible for humans to understand robots’ emotions through haptic interactions?
EmotiTactor
EmotiTactor is a research experiment that aims to improve the emotional expressiveness of robots. A tangible interface is constructed to perform haptic stimuli for primary emotions through a series of machine tactors (tactile organs).
Research
Human-Human Touch
Hertenstein et al.’s research [1] indicated that humans can decode distinct emotions communicated via touch. In their study, participants were divided into dyads and randomly assigned the roles of encoder and decoder. The encoder conveyed an assigned emotion by touching the decoder’s forearm, without any visual or vocal communication. The decoder then chose, on a response sheet, the emotion they felt through the cutaneous stimuli. The results showed that anger, fear, disgust, love, gratitude, and sympathy could be decoded at above-chance levels. The study also recorded the most commonly used tactile behaviors for each emotion. These findings inspired me to bring the primary emotions conveyed by human-human touch into an HCI context and create human-robot touch.
Design
To study the emotional expression of robotic contact, I replaced the human encoder with the touches of a robotic tactor (a tactile organ). I hypothesized that humans could decode emotions from robotic tactile stimulation, much as they do from human-human touch.
Prototyping
The prototype is a structure that fits the length of the forearm (18 inches). At the bottom of the structure is a groove in the shape of a hand. Inside the structure are four machine tactors driven by servo motors. Each tactor reproduces one or two of the most frequent types of touch through the motion of its servo motor. The tactile behaviors recorded in Hertenstein et al.’s study include squeezing, trembling, patting, hitting, shaking, and stroking.
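As a rough illustration (not the prototype’s actual firmware), the sketch below shows how a single servo-driven tactor could render two of these behaviors. It assumes a Raspberry Pi driving a hobby servo through the gpiozero library; the GPIO pin, timings, and pattern parameters are illustrative assumptions.

```python
# Minimal sketch: one servo-driven tactor rendering two of the touch
# behaviors from Hertenstein et al. (patting and trembling).
# Assumes a Raspberry Pi + gpiozero; the pin and timings are illustrative.
from time import sleep
from gpiozero import Servo

tactor = Servo(17)  # GPIO pin is an assumption

def pat(times=4, contact_s=0.15, release_s=0.35):
    """Repeated brief presses against the forearm."""
    for _ in range(times):
        tactor.max()      # press toward the arm
        sleep(contact_s)
        tactor.min()      # lift away
        sleep(release_s)

def tremble(duration_s=2.0, step_s=0.05):
    """Rapid small oscillations around light contact."""
    elapsed = 0.0
    while elapsed < duration_s:
        tactor.value = 0.2    # slight press
        sleep(step_s)
        tactor.value = -0.2   # slight release
        sleep(step_s)
        elapsed += 2 * step_s

if __name__ == "__main__":
    pat()
    sleep(1.0)
    tremble()
    tactor.detach()  # stop driving the servo
```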
Design of Tactors
STUDY
Next, I ran two rounds of studies with 10 participants in each round (study 1: 2 male, 8 female, average age 24.6; study 2: 4 male, 5 female, 1 other, average age 24.1) to test the haptic stimuli for primary emotions. Each participant was briefed on the background and the protocols below before participating.
PROTOCOLS
On arrival, the participant sat at a table.
Participants wore noise-reduction earphones.
Participants put their forearm into the machine through a hole at the bottom of an opaque foam board that separated them from the EmotiTactor.
Calibration was carried out before running the test to guarantee that the tactors actually touched the participant’s arm.
A priming step, called the overview, was added in the second-round study: the machine cycled through all of the haptic functions, and participants did not need to respond during this routine.
The machine then cycled through the functions in a random order of the six emotions, generated with p5.js (a minimal equivalent is sketched after this list).
After each function, participants were asked to make a choice on a response sheet.
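The study itself generated the presentation order with p5.js; purely as an illustration, here is a minimal Python equivalent, assuming the six emotion labels used in the study.

```python
# Minimal sketch: produce a random presentation order of the six emotions.
# The original study generated this order with p5.js; this is a Python
# equivalent for illustration only.
import random

EMOTIONS = ["anger", "fear", "disgust", "happiness", "sadness", "sympathy"]

def presentation_order(seed=None):
    rng = random.Random(seed)          # optional seed for reproducibility
    order = EMOTIONS[:]                # copy so the master list stays intact
    rng.shuffle(order)
    return order

print(presentation_order())
# e.g. ['sympathy', 'anger', 'happiness', 'fear', 'disgust', 'sadness']
```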
Result
The results in the table on the right indicate that humans can decode at least five emotions from robotic tactile behaviors (fear, disgust, happiness, anger, and sympathy). The decoding accuracy rates in study 2 ranged from 40% to 100%, well above the chance level (25% [3]). Participants tended to confuse sadness with sympathy, which is consistent with the earlier human-human study [1]. In the interviews, however, we also observed that it was hard for individuals to tell these two emotions apart at a subjective level.
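For clarity on how these figures are derived, below is a minimal sketch of tallying per-emotion decoding accuracy from forced-choice responses and flagging which emotions exceed the 25% chance level; the trial data themselves come from the response sheets and are not reproduced here.

```python
# Minimal sketch: per-emotion decoding accuracy from forced-choice responses.
# Each trial is a (presented_emotion, chosen_emotion) pair from a response sheet.
from collections import defaultdict

CHANCE_LEVEL = 0.25  # conservative chance level for forced choice [3]

def decoding_accuracy(trials):
    """Return {emotion: fraction of trials decoded correctly}."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for presented, chosen in trials:
        total[presented] += 1
        if chosen == presented:
            correct[presented] += 1
    return {e: correct[e] / total[e] for e in total}

def above_chance(accuracy):
    """Emotions whose raw accuracy exceeds the 25% chance level."""
    return [e for e, acc in accuracy.items() if acc > CHANCE_LEVEL]
```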
Iteration
Thesis Show
MFA Design and Technology, class of 2020
Parsons School of Design
Advisors: Harpreet Sareen, Loretta Wolozin, John Sharp, Barbara Morris
May 2020
VR Application
We propose a forearm-mounted robot that performs complementary touches in relation to the behaviors of a companion agent in virtual reality (VR). The robot consists of a series of tactors driven by servo motors that render specific tactile patterns to communicate primary emotions (fear, happiness, disgust, anger, and sympathy) and other notification cues. We showcase this through a VR game with physical-virtual agent interactions that facilitate the player-companion relationship and increase user immersion in specific scenarios. The player collaborates with the agent to complete a mission while receiving affective haptic cues with the potential to enhance sociality in the virtual world.
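The actual integration runs inside the VR game engine; purely as a sketch of the event-to-touch mapping, the snippet below sends single-byte commands to the wearable over a serial link with pyserial. The port name, baud rate, command bytes, and event names are illustrative assumptions.

```python
# Minimal sketch: map companion-agent events in the VR game to tactile
# commands sent to the forearm-mounted robot over serial (pyserial).
# Port, baud rate, command bytes, and event names are assumptions.
import serial

# One command byte per tactile pattern the wearable can render.
PATTERNS = {
    "fear": b"F",
    "happiness": b"H",
    "disgust": b"D",
    "anger": b"A",
    "sympathy": b"S",
    "notify": b"N",   # non-emotional notification cue
}

# Hypothetical mapping from in-game events to tactile patterns.
EVENT_TO_PATTERN = {
    "enemy_nearby": "fear",
    "mission_complete": "happiness",
    "player_low_health": "sympathy",
    "item_found": "notify",
}

def send_cue(port, event):
    """Send the tactile command associated with an in-game event, if any."""
    pattern = EVENT_TO_PATTERN.get(event)
    if pattern is not None:
        port.write(PATTERNS[pattern])

if __name__ == "__main__":
    with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as port:
        send_cue(port, "enemy_nearby")
```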
Future Applications
PUBLICATION
Ran Zhou and Harpreet Sareen. 2020. EmotiTactor: Emotional Expression of Robotic Physical Contact. In Companion Publication of the 2020 ACM Designing Interactive Systems Conference (DIS ’20).
Ran Zhou, Yanzhe Wu, and Harpreet Sareen. 2020. HexTouch: A Wearable Haptic Robot for Complementary Interactions to Companion Agents in Virtual Reality. In SIGGRAPH Asia 2020 Emerging Technologies (SA ’20).
Ran Zhou, Yanzhe Wu, and Harpreet Sareen. 2020. HexTouch: Affective Robot Touch for Complementary Interactions to Companion Agents in Virtual Reality. In 26th ACM Symposium on Virtual Reality Software and Technology (VRST ’20). Best Demo Award.
Selected References
[1] Matthew J. Hertenstein, Dacher Keltner, Betsy App, Brittany A. Bulleit, and Ariane R. Jaskolka. 2006. Touch communicates distinct emotions. Emotion 6, 3 (2006), 528–533. DOI:http://dx.doi.org/10.1037/1528-3542.6.3.528
[2] Amol Deshmukh, Bart Craenen, Alessandro Vinciarelli, and Mary Ellen Foster. 2018. Shaping Robot Gestures to Shape Users’ Perception: The Effect of Amplitude and Speed on Godspeed Ratings. In Proceedings of the 6th International Conference on Human-Agent Interaction (HAI ’18). Association for Computing Machinery, New York, NY, USA, 293–300. DOI:https://doi.org/10.1145/3284432.3284445
[3] Mark G. Frank and Janine Stennett. 2001. The forced-choice paradigm and the perception of facial expressions of emotion. Journal of Personality and Social Psychology 80, 1 (2001), 75–85. DOI:http://dx.doi.org/10.1037/0022-3514.80.1.75