The Role of Language in the Perception of Emotions

Leda Blaires Ciotti
10 min read · May 11, 2021


Many people would agree that emotions exist as a fixed set of basic states that are expressed through certain signals, such as sweating or blushing. What is more, most of us think of facial expressions as a sort of key we can use to “read” what other individuals are feeling, and believe that certain people are especially skilled at doing this. This approach to emotions is referred to as the “basic emotion theory”, one of the most influential theories of emotion.

To illustrate how this theory works, imagine the following scenario: You are walking down the street to get some toasted bagels and start your day with energy. It’s beautifully sunny and warm outside, and you’ve decided to wear a brand-new suit because why not? After you get your bagels, you sit on a bench and are about to start eating. But suddenly, you feel something warm and liquidy on your shoulder. Agh! A dove decided to, um, poop while flying right above you. You automatically lower your eyebrows, bulge your eyes, press your lips together, and let out a grunt. If someone had asked you what you felt at that moment, you would probably have answered that you felt angry. Moreover, if a witness to this unfortunate event had been paying attention to your reactions, she would have been able to recognize that you were feeling angry based on the expressions on your face.

In this situation, an external event (a dove pooping on your new suit) triggered the “emotion” of anger, or perhaps frustration, which produced some typical behavioral and physiological outcomes (lowered eyebrows, a grunt, a rising heart rate). By noticing the affective state you were in and the behavioral and physiological responses your body produced, you recognized that you felt angry at the situation.

The basic emotion theory broadly suggests that traditional emotion categories such as anger, fear, and sadness are universal biological states that are triggered by an external event and produce a typical set of physiological and behavioral responses, expressed above all in facial movements. On this view, our faces encode emotions such as anger unambiguously through facial muscle movements, and perceivers can automatically and effortlessly “decode” those movements, so that information about a person’s emotional state is communicated. This model is supported, for example, by Paul Ekman’s work on the universality of emotions, which was taken to show that people can accurately perceive emotions on others’ faces across cultural contexts (Lindquist & Gendron, 2013).

Dr. Paul Ekman identified the six basic emotions as anger, fear, disgust, surprise, enjoyment, and sadness. (Though no one makes these exaggerated expressions in real life, right?)

However, the basic emotion approach is challenged by a growing body of research showing that, in everyday situations, people don’t consistently make the faces this model predicts for particular emotions. In some instances, people report feeling happiness but don’t smile, even though smiling is one of the facial movements said to be specific to happiness. In other instances, people make the facial muscle movements hypothesized to be specific to a single emotion like disgust, but report feeling multiple emotions such as disgust, pain, and fear. Another example is that the facial muscle movements produced by congenitally blind individuals do not conform to those predicted by the model. Lastly, it has been shown that when the combinations of facial muscle movements predicted by the model do occur, they tend to do so in social contexts (Lindquist & Gendron, 2013).

In a study examining the facial behavior of judo winners at medal ceremonies, smiling was found to be frequent only during interactive, social moments of the ceremony. This suggests that happiness alone is not sufficient to produce a smile (Crivelli, Carrera, & Fernández-Dols, 2015).

All of these examples show that emotions are not straightforwardly readable from the face, and that facial muscle movements viewed without context or conceptual knowledge about an individual’s affective state can be ambiguous to the perceiver. In other words, facial expressions do not provide all of the information we need to infer an emotion, contrary to what the basic emotion approach predicts. Research suggests, however, that even when no evidence is found for combinations of facial muscle movements corresponding to specific, discrete emotions, these movements do consistently correspond to generally pleasant versus unpleasant feelings (Lindquist & Gendron, 2013).

So what do we make of this? We still think of ourselves as quite skilled at categorizing our own and other people’s behaviors into specific, discrete emotions, yet research suggests that the facial muscle movements we see on other people’s faces express little more than simple pleasant or unpleasant feelings. Hmmm… so how do we effortlessly categorize what we see in other people’s faces into a discrete emotion such as anger or fear, if the facial muscle movements alone are not enough? An answer to these questions is offered by a framework known as the constructionist theory of emotion.

Drawing on recent findings in psychology, cognitive science, and neuroscience, this constructionist approach argues that language plays a constitutive role in emotion perception because words serve to “acquire, organize, and use the concept knowledge that is an essential element in emotional perception” (Lindquist et al., 2015). According to this view, experiencing an emotion results from conceptualizing basic affective responses during a categorization process, one driven by knowledge about emotions acquired from previous experiences. Language, on this account, supports that conceptual knowledge and helps construct emotion perception (Lindquist et al., 2015). In other words, emotion words help us reduce the uncertainty present in natural facial expressions by making meaning out of their ambiguity, enabling us to quickly categorize them into perceptions of discrete emotions.

The constructionist theory of emotion thus suggests that emotion words serve as a form of context for categorizing the behaviors associated with certain affective states, along with instances of ambiguous facial muscle movements, into perceptions of discrete emotions like anger or fear. This hypothesis, often referred to as the language-as-context hypothesis, holds that language plays an important and constitutive role in emotion perception.

Clear evidence for the way language helps construct emotion perception is found in how typical emotion perception studies are run. In such studies, participants are presented with pictures of individuals exhibiting specific facial muscle movements, such as a wrinkled nose or a smile, and are asked to match them to words like “anger”, “sadness”, “disgust”, etc. Giving participants emotion words as response options seems inconsequential, but it has been shown that offering such options inflates participants’ “accuracy” at identifying emotions. (Accuracy here refers to the participants’ ability to match a facial expression, e.g. a smile, to what the experimenter wants them to perceive, e.g. smile = happy.) Concretely, participants identify emotions “accurately” at above-chance levels when emotion words are given as options. However, when no such options are available and participants are asked to freely label a facial expression, their accuracy is quite low (Lindquist et al., 2015).
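
To make the logic concrete, here is a toy simulation — a sketch with entirely made-up numbers and a hypothetical perceive step, not a model from any of the cited papers. It shows how a short list of response options can inflate measured accuracy even when the face itself carries exactly the same information in both tasks:

```python
# Toy simulation: faces are "clear" 40% of the time, ambiguous otherwise.
# All probabilities and vocabularies are invented for illustration.
import random
from typing import Optional

random.seed(42)

EMOTIONS = ["anger", "fear", "disgust", "sadness", "happiness", "surprise"]
# A hypothetical larger vocabulary participants draw on when labeling freely.
FREE_VOCABULARY = EMOTIONS + [
    "frustration", "contempt", "boredom", "pain",
    "confusion", "embarrassment", "awe", "amusement",
]

def perceive(target: str, p_clear: float = 0.4) -> Optional[str]:
    """With probability p_clear the face unambiguously signals the target
    emotion; otherwise the percept is ambiguous (None)."""
    return target if random.random() < p_clear else None

def forced_choice(target: str) -> str:
    """With a short word list on screen, an ambiguous percept is resolved
    by guessing among the six offered words."""
    percept = perceive(target)
    return percept if percept is not None else random.choice(EMOTIONS)

def free_label(target: str) -> str:
    """Without options, ambiguous percepts scatter over a larger vocabulary,
    and even a clear percept may be named with a synonym ("mad") that the
    experimenter scores as wrong."""
    percept = perceive(target)
    if percept is None:
        return random.choice(FREE_VOCABULARY)
    return percept if random.random() < 0.6 else "a-synonym"

def accuracy(respond, n_trials: int = 20_000) -> float:
    hits = 0
    for _ in range(n_trials):
        target = random.choice(EMOTIONS)
        hits += respond(target) == target
    return hits / n_trials

print(f"forced choice: {accuracy(forced_choice):.2f}")  # ~0.50, above 1/6 chance
print(f"free labeling: {accuracy(free_label):.2f}")     # ~0.28
```

Nothing about the simulated faces changes between the two conditions; only the response format does, which is exactly the methodological worry.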

Moreover, if participants are asked to judge whether two facial expressions portray the same emotion but are not given emotion words as response options, their accuracy is even lower (Gendron et al., 2012). Lastly, providing certain emotion words as response options can even lead participants to perceive a facial expression as an “incorrect” emotion: for instance, participants perceive a scowling facial expression as “disgust” instead of “anger” if the word “disgust” is provided as an option but “anger” is not (Lindquist et al., 2015). All of these studies support the language-as-context hypothesis: emotion words help construct perception by providing the context needed to reduce the ambiguity of facial expressions and to make meaning out of them through our conceptual knowledge of emotions.

In a typical emotion perception study, participants are shown pictures of individuals exhibiting specific facial expressions and are asked to match those pictures to given words like “anger”, “sad”, “disgust”, “surprise”, etc. Research has shown that providing such options influences participants’ perception of emotion. It’s the range for me…

In addition to the support lent by the methodology of typical emotion perception studies, several experimental findings provide evidence for language’s constitutive role in the perception of emotions. A concrete example is the study by Maria Gendron and her colleagues (2012), which provided solid evidence that the conceptual knowledge associated with the meanings of emotion words affects the initial encoding of emotion percepts. The researchers tested the idea that emotion word meanings influence how an individual constructs perceptual representations of facial expressions, such that if emotion words are not accessible, the perceptual encoding of a discrete emotion in a face is altered.

They did this by exploiting the repetition priming phenomenon: people respond faster to a stimulus after having been presented with that same stimulus before. Conversely, if the representation of a stimulus differs across presentations, repetition priming will not occur, i.e. there will be no decrease in reaction times. To decrease the accessibility of emotion word meanings, the researchers used a procedure called semantic satiation. This procedure, which consists of repeating a word out loud 30 times, temporarily reduces access to the word’s meaning, interfering with judgments of category membership, semantic relatedness, and other assessments (Gendron et al., 2012).

Gendron et al.’s experimental procedure was broadly the following: participants first studied a set of faces depicting anger, sadness, fear, or disgust, each in either a weak or an intense version. Following this, each participant underwent semantic satiation (e.g. repeated the word “anger” 30 times), followed by a repetition priming procedure in which a face depicting an emotion (e.g. a scowling face depicting anger) was presented once as a prime and once as a target. Finally, participants were shown both the weak and the intense versions of the same face and had to judge which of the two they had seen earlier.
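
To lay the design out explicitly, here is a schematic sketch of the trial structure just described. The 3-repetition control condition, the block count, and the field names are my assumptions for illustration, not the authors’ actual materials:

```python
# Schematic of the satiation/priming design; parameters are assumed, not
# taken from Gendron et al. (2012).
import itertools
import random
from dataclasses import dataclass

EMOTIONS = ["anger", "sadness", "fear", "disgust"]
# Satiation: say the word 30 times aloud; a common control in satiation
# studies is saying it only a few times, assumed here to be 3.
CONDITIONS = {"satiated": 30, "control": 3}

@dataclass
class Trial:
    emotion: str        # word repeated aloud; also the emotion the faces depict
    condition: str      # "satiated" or "control"
    n_repetitions: int  # times the word is spoken before the faces appear

def build_trials(n_blocks: int = 4) -> list[Trial]:
    """Cross emotion x condition and shuffle; each trial is then followed by
    a prime face, a target face, and a weak-vs-intense perceptual judgment."""
    trials = [
        Trial(emotion, condition, reps)
        for emotion, (condition, reps) in itertools.product(
            EMOTIONS, CONDITIONS.items()
        )
    ] * n_blocks
    random.shuffle(trials)
    return trials

for trial in build_trials(n_blocks=1)[:3]:
    print(trial)
```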

Gendron and her colleagues predicted that repetition priming would occur in control trials (in which semantic satiation was not induced), because participants would have unconstrained access to the relevant emotion word during the encoding of both the initial (prime) face and the target face. By contrast, they predicted that repetition priming would not occur in the satiation trials, i.e. that participants would take longer to make the judgment, given that the relevant emotion word was inaccessible during the initial encoding. In line with these predictions, the researchers found that participants took significantly longer to make perceptual judgments after the emotion word had been semantically satiated than when it had not. These results suggest that making an emotion word inaccessible through satiation changes the basic perceptual encoding of emotional facial expressions, implying that conceptual knowledge about emotions (anchored by emotion words) is active during the creation of an emotional percept (Gendron et al., 2012).
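
The core comparison boils down to a difference in mean response times between satiated and control trials. Here is a minimal sketch of that logic with fabricated response times; the real study, of course, used many trials per participant and proper within-subject statistics:

```python
# Fabricated per-trial response times (ms) for one hypothetical participant.
from statistics import mean

control_rts = [612, 598, 640, 605, 587, 630]    # emotion word still accessible
satiated_rts = [689, 702, 665, 710, 694, 671]   # word meaning satiated

slowdown = mean(satiated_rts) - mean(control_rts)
print(f"control mean:  {mean(control_rts):.1f} ms")
print(f"satiated mean: {mean(satiated_rts):.1f} ms")
print(f"slowdown when the word is inaccessible: {slowdown:.1f} ms")
# A reliably positive slowdown is what Gendron et al. interpret as reduced
# repetition priming when the emotion word's meaning is unavailable.
```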

These results are fascinating: they suggest that language does more than simply label emotional expressions on faces (especially since the task did not require participants to explicitly categorize the faces) and actually affects the way emotions are perceived.

In the study phase of Gendron et al.’s experiment, participants were first familiarized with a set of pictures depicting facial expressions. During the test phase, participants repeated a relevant emotion word 30 times, were presented with a picture depicting a facial expression as a prime, and were then asked to determine which of two faces had appeared earlier.

Taken together, evidence from typical studies of emotion perception and from studies manipulating the accessibility of emotion words supports the constructionist approach to emotion and the language-as-context hypothesis, which hold that language plays a constitutive role in emotion perception. More concretely, language not only serves as a context that disambiguates facial expressions, but also influences the way in which an emotional percept is initially encoded. Put another way, emotion words are linked to the conceptual knowledge that is essential for making meaning out of ambiguous facial expressions and affective states, and they thus play an important role in emotion perception.

Nevertheless, several questions remain about the nature of emotion perception and the role language plays in it. For example, it would be interesting to consider how else emotion perception studies could be conducted. As we have seen, the typical inclusion of response options inflates the accuracy with which subjects perceive emotions. Similarly, it is possible that the use of posed pictures depicting very intense emotional expressions affects participants’ judgments in such a way that the results might not hold outside lab settings, where such exaggerated expressions (see the images above) are rare. Furthermore, according to the researchers involved in this area, future work should investigate the exact extent to which language influences the perception of emotions, and through what neural mechanisms it does so. Meta-analyses of neuroimaging studies have revealed, for example, that the amygdala is consistently activated in emotion studies and tasks that do not make use of emotion words. Since the amygdala is involved in signaling uncertainty, this suggests that when emotion concept knowledge is not made accessible through emotion words, there is uncertainty about the meaning of the affective states being perceived (Brooks et al., 2016).

This is your amygdala being confused about having to figure out which emotion you’re perceiving when no emotion words are present. Poor thing.

A last exciting question is whether emotion perception can be improved by building a richer emotion vocabulary, and what effects this might have on our wellbeing. Wouldn’t it be amazing to see tangible benefits to our wellbeing from working on our emotional vocabulary? This is precisely what the blooming area of “emotional granularity” is all about, and recent findings suggest that a greater ability to differentiate between one’s emotional experiences with precision is related to better self-regulation and action planning. If you are interested in learning more (you should be!!), you can read this New York Times op-ed written by the psychologist who developed the concept.

References

Lindquist, K. A., Satpute, A. B., & Gendron, M. (2015). Does Language Do More Than Communicate Emotion? Current Directions in Psychological Science, 24(2), 99–108. https://doi.org/10.1177/0963721414553440

Lindquist, K. A., & Gendron, M. (2013). What’s in a Word? Language Constructs Emotion Perception. Emotion Review, 5(1), 66–71. https://doi.org/10.1177/1754073912451351

Gendron, M., Lindquist, K. A., Barsalou, L., & Barrett, L. F. (2012). Emotion words shape emotion percepts. Emotion, 12(2), 314–325. https://doi.org/10.1037/a0026007

Crivelli, C., Carrera, P., & Fernández-Dols, J. M. (2015). Are smiles a sign of happiness? Spontaneous expressions of judo winners. Evolution and Human Behavior, 36(1), 52–58. https://doi.org/10.1016/j.evolhumbehav.2014.08.009

Brooks, J. A., Shablack, H., Gendron, M., Satpute, A. B., Parrish, M. H., & Lindquist, K. A. (2016). The role of language in the experience and perception of emotion: a neuroimaging meta-analysis. Social Cognitive and Affective Neuroscience, nsw121. https://doi.org/10.1093/scan/nsw121

Written by Leda Blaires Ciotti

Leda is a sophomore at Yale University studying psychology.
