Michael Kraus from Yale University’s School of Management talks with us about his research examining the role of the voice in our capacity to accurately estimate the emotions of others. His open-access article, “Voice-Only Communication Enhances Empathic Accuracy,” was published in American Psychologist in October 2017.

Empathic Accuracy - Michael Kraus
A footnote on the first page of Michael’s article dedicates it to Zoe, so we began our conversation by asking Michael who Zoe is and why his paper is dedicated to her.

Michael’s article covers a series of unusual experiments on empathic accuracy, the details of which we’ll discuss in a moment. Next, Michael explains what inspired this line of research and summarizes how the various studies fit together.

Michael’s first experiment revolved around previously recorded conversations between two friends who were teasing each other. He recruited 300 people to either watch these videos with the sound on, watch them with the audio muted, or listen to the audio without the video, then assess how each of the friends felt during the conversation. Finding some support for the idea that people perceive emotions more accurately through voice-only than through visual-only or multi-sense communication, Michael carried out a second experiment in which 266 people were paired together and videotaped having conversations either in a lighted room or in a darkened room with the camera’s night-vision feature activated. Again, participants were somewhat better at judging their partner’s self-assessed emotions when they could only hear their voice. Then, in his third experiment, Michael recruited 600 people to watch and listen to the lighted-room recording, watch a video of an interaction recorded using night vision, or listen to an audio-only recording of the interaction. Here, he talks with us about why he chose to do this and what he found.

Doug and I wondered about those recordings of two friends teasing each other in Michael’s first study, so we circled back to ask what led him to choose that kind of social interaction as a stimulus.

Similarly, Ryan and I were curious to learn what led Michael to have participants in the second study interact with each other in the dark, as well as what kinds of situations this might mimic in the real world outside the lab. Michael explains.

Touch can also be a primary sense for understanding the emotions of others: a gentle touch to one’s cheek evokes emotions quite different from those of a slap to the face. We asked Michael where touch fits in his research on empathic accuracy.

Studies prior to Michael’s had explored the role of voice alongside visual cues in empathic accuracy. That research suggested that voice played little part in participants’ ability to correctly identify the emotions others were feeling. Michael, however, questioned some of these findings and decided to explore questions similar to those of the earlier studies. Here he explains why and how he did so.

Michael’s studies involved a variety of unique interventions, so Doug and I asked about these variations and why he believes they were important.

Michael’s fourth study examined empathic accuracy in a real-world context where voice-only communication is common: voice chat in the workplace. He found that this setting, too, led participants to focus more attention on speech content and vocal cues than on facial expressions. The study was preregistered with the Center for Open Science at a time when preregistration was less common than it is today, and Michael talked with us about what led him to preregister it.

Reading the emotions of others is a major hurdle for the kinds of artificial intelligence systems we interact with in everyday life, such as Amazon Alexa and Google Home. Here, Michael explains what implications he predicts future empathic accuracy research might have for the design of AI systems.

Small effects, especially when meta-analyzed as Michael did across his studies, can point toward big benefits, so we wondered whether Michael believes his research might have value for researchers in other disciplines.

To wrap things up, Doug and I were interested in knowing where Michael might place his research within the bigger context of emotions research. Here, he discusses whether emotion is a naturally evolved human trait rather than one that we invented.

▲ Light and dark conditions from Study #2


▲ Audio sample of digitized speech from Study #5 (NSFW)

Bonus clips


Patrons of Parsing Science gain exclusive access to bonus clips from all our episodes and can also download MP3s of every individual episode.

Support us for as little as $1 per month on Patreon. Cancel anytime.

Patrons can access bonus content here.


We’re not a registered tax-exempt organization, so unfortunately gifts aren’t tax deductible.


Hosts / Producers

Doug Leigh & Ryan Watkins

How to Cite

Leigh, D., Watkins, R., & Kraus, M. (2018, March 6). Parsing Science – Empathic Accuracy. figshare. https://doi.org/10.6084/m9.figshare.5956207.v2

Music

What’s The Angle? by Shane Ivers