News Release

Understanding emotions without language

Does understanding emotions depend on the language we speak, or is our perception the same regardless of language and culture?

Peer-Reviewed Publication

Max-Planck-Gesellschaft

Emotions and Language

image: The researchers' findings suggest that understanding emotional signals is not based on the words we have in our language to describe emotions. Instead, emotions appear to have evolved as a set of basic human mechanisms.

Credit: Oliver Le Guen/MPI for Psycholinguistics

According to a new study by researchers from the MPI for Psycholinguistics and the MPI for Evolutionary Anthropology, you don't need to have words for emotions to understand them. The results of the study were published online on October 17 in Emotion, a journal of the American Psychological Association. The study provides new evidence that the perception of emotional signals is not driven by language, supporting the view that emotions constitute a set of biologically evolved mechanisms.

The study compared German speakers to speakers of Yucatec Maya, a Mayan language spoken on the Yucatán Peninsula of Mexico. "In Yucatec Maya, there is no word for the emotion 'disgust'," explains co-author Oliver Le Guen, an anthropologist based in Mexico. When Yucatec Maya speakers were shown photographs of emotional faces and asked what they thought the person in each photo was feeling, they used the same words to describe angry and disgusted faces. German-speaking participants, however, used different words for angry and disgusted faces. This confirmed that the two languages differ in the words speakers have available to describe these emotions.

Mix of emotions

Participants from the two language groups were also asked to perform a task using photographs of people showing mixed emotions. The photographs were digitally manipulated to control the mix of emotions in each face, so that the two photos in every pair differed from each other by the same amount. Participants were shown a photo of a mixed-emotion face, which was then replaced by a pair of photos. One member of the pair was the original photo; the other featured the same person, but with a slightly different mix of emotions. In some pairs, the dominant emotion in the two photos was different, while in other pairs it was the same. For many such pairs, participants were asked which of the two pictures they had just seen.

"Earlier research has found that people who have different words for two emotions do better on this task when the dominant emotion in the two photographs is different, like when one is mainly angry and the other one is mainly disgusted," explains Disa Sauter. "But is this because they internally label the faces angry and disgusted, or is it because emotions are processed by basic human mechanisms that have categories like anger and disgust regardless of whether we have words for those feelings?"

Basic human categories

The crucial test in Sauter and colleagues' study was how the Yucatec Maya speakers would perform, since they have only one word for both disgust and anger. The results showed that they behaved just like the German speakers, performing better on the task when the two faces they had to choose between were dominated by different emotions.

"Our results show that understanding emotional signals is not based on the words you have in your language to describe emotions," Sauter says. "Instead, our findings support the view that emotions have evolved as a set of basic human mechanisms, with emotion categories like anger and disgust existing regardless of whether we have words for those feelings."

###

Original publication

Sauter, D. A., Le Guen, O., & Haun, D. B. M. (2011, October 17). Categorical Perception of Emotional Facial Expressions Does Not Require Lexical Categories. Emotion. Advance online publication. doi: 10.1037/a0025336.
