Human sounds convey emotions more clearly and faster than words
// NEW SAVANNA
It takes just one-tenth of a second for our brains to begin to recognize emotions conveyed by vocalizations, according to researchers from McGill. It doesn't matter whether the non-verbal sounds are growls of anger, the laughter of happiness or cries of sadness. More importantly, the researchers have also discovered that we pay more attention when an emotion (such as happiness, sadness or anger) is expressed through vocalizations than we do when the same emotion is expressed in speech.

The researchers believe that the speed with which the brain 'tags' these vocalizations, and the preference given to them compared to language, is due to the potentially crucial role that decoding vocal sounds has played in human survival.

"The identification of emotional vocalizations depends on systems in the brain that are older in evolutionary terms," says Marc Pell, Director of McGill's School of Communication Sciences and Disorders and the lead author on the study that was recently published in Biological Psychology. "Understanding emotions expressed in spoken language, on the other hand, involves more recent brain systems that have evolved as human language developed."
The primary research report:
M.D. Pell, K. Rothermich, P. Liu, S. Paulmann, S. Sethi, S. Rigoulot, Preferential decoding of emotion from human non-linguistic vocalizations versus speech prosody, Biological Psychology, Volume 111, October 2015, Pages 14–25

Abstract: This study used event-related brain potentials (ERPs) to compare the time course of emotion processing from non-linguistic vocalizations versus speech prosody, to test whether vocalizations are treated preferentially by the neurocognitive system. Participants passively listened to vocalizations or pseudo-utterances conveying anger, sadness, or happiness as the EEG was recorded. Simultaneous effects of vocal expression type and emotion were analyzed for three ERP components (N100, P200, late positive component). Emotional vocalizations and speech were differentiated very early (N100), and vocalizations elicited stronger, earlier, and more differentiated P200 responses than speech. At later stages (450–700 ms), anger vocalizations evoked a stronger late positivity (LPC) than other vocal expressions, which was similar but delayed for angry speech. Individuals with high trait anxiety exhibited early, heightened sensitivity to vocal emotions (particularly vocalizations). These data provide new neurophysiological evidence that vocalizations, as evolutionarily primitive signals, are accorded precedence over speech-embedded emotions in the human voice.