Can't Keep Your Face and Voice Out of My Head: Neural Correlates of an Attentional Bias Toward Nonverbal Emotional Cues
Cereb Cortex (2013) 24 (6): 1460-1473.
Published: 04 February 2013
Abstract
Emotional information can be conveyed by verbal and nonverbal cues, with the latter often suggested to exert a greater influence in shaping our perceptions of others. The present functional magnetic resonance imaging study sought to explore attentional biases toward nonverbal signals by investigating the interaction of verbal and nonverbal cues. The results underline previous suggestions of a “nonverbal dominance” in emotion communication by evidencing implicit effects of nonverbal cues on emotion judgments even when attention is directed away from nonverbal signals and focused on verbal cues. Attentional biases toward nonverbal signals appeared to be reflected in increased activation of the dorsolateral prefrontal cortex (DLPFC), assumed to reflect increasing difficulty in suppressing nonverbal cues during task conditions that required shifting attention away from nonverbal signals. Aside from the DLPFC, the results suggest that the right amygdala plays a role in attention control mechanisms related to the processing of emotional cues. Analyses conducted to determine the cerebral correlates of the individual ability to shift attention between verbal and nonverbal sources of information indicated that higher task-switching abilities seem to be associated with up-regulation of right amygdala activation during explicit judgments of nonverbal cues, whereas difficulties in task-switching seem to be related to down-regulation.
Introduction
Emotions play a central role in nearly every facet of human life: Emotions guide human behavior, shape social interaction, and define interpersonal relationships (Parkinson 1996; Van Kleef 2009) and, in a sense, understanding emotions—particularly those of our fellow human beings—may constitute the most challenging and rewarding task we are faced with on a day-to-day basis.
In deciphering the emotional states of communication partners, human beings rely on a variety of different cues exchanged during social interactions. Spoken words and different so-called nonverbal signals such as facial expressions, changes in the tone of a voice, or nonverbal vocalizations like laughter and crying provide us with the information needed to understand what others might feel, desire, or intend to do (Adolphs 2002; Meyer et al. 2005; Dietrich et al. 2006, 2007, 2008; Szameitat et al. 2010; Brück, Kreifelts, Wildgruber 2011; Jacob et al. 2012a). While a single word or sentence, or a single facial expression, may already “say it all” and suffice in getting the “message across,” everyday social interaction frequently requires considering and weighting several different affective cues observed at the same time. By integrating the various verbal and nonverbal signals received simultaneously, observers may not only exploit redundancies to increase the accuracy of their inferences; slight inconsistencies between verbal and nonverbal messages detected in the process may also add further information and even alter our impression of the sender's state of mind. Signs of anger conveyed both through a facial expression (e.g. glaring eyes, flushed red face, tightened lips, and clenched brows) and a harsh tone of voice while speaking may enhance the effect of a threatening verbal message such as “Get out of here, or else!” and may convince the recipient of the sender's anger; whereas emotional discrepancies spotted between nonverbal expressions of envy and verbal testimonies of shared joy (e.g. “I'm so happy for you”), for instance, may reveal the speaker to be lying.
Considering the relative impact of the different sources of information, research findings obtained in different behavioral studies often identify nonverbal cues to “trump” verbal signals in shaping our perceptions of others' emotional states (Mehrabian and Ferris 1967; Mehrabian and Wiener 1967; Argyle et al. 1970, 1971; for review see Noller 1985). Presumably, this stronger impact of nonverbal cues on emotional communication is phylogenetically founded: Current theories of language development, in fact, assume that, in human evolution, nonverbal communication may have preceded verbal communication (McBride 1975; Dew and Jensen 1977), which enabled social exchange even before human beings were able to speak and thereby might have contributed to the survival of our species. Aside from phylogenetic aspects of language development, the suggested primacy of nonverbal communication is also reflected in the ontogenetic development of human communication abilities where, again, nonverbal signals precede the use of verbal cues (i.e. language): As long as infants have not learned to speak, communication necessarily has to take place at nonverbal levels, rendering nonverbal cues such as facial expressions, pointing gestures, or nonverbal vocalizations like crying the primary means of expressing and deciphering various needs (e.g. food, sleep, safety, and nurturing) that require supply or relief (McNeill 1970).
Based on the latter examples suggesting a superior importance of nonverbal signals, one may assume that nonverbal cues take precedence in information processing and bias information detection in the sense that they capture attention when competing with other sources of information. In corroboration of the latter assumption, behavioral observations obtained across a considerable number of studies evidence affective significance or emotional salience to guide attention (for review see Vuilleumier 2005; Pourtois et al. 2012), with recent discussions on the matter suggesting that “[e]motional biases are probably [even] stronger with ‘biologically prepared’ stimuli” (Vuilleumier 2005, p. 586) such as faces. However, questions remain as to how this supposed nonverbal dominance in emotion communication (and the attention bias associated therewith) may be reflected in brain mechanisms underlying the processing of affective signals.
Considering modulating effects of emotional salience on information processing, recent reviews detailing brain mechanisms associated with the rapid selection of affectively significant stimuli (for review see Compton 2003; Vuilleumier 2005; Pourtois et al. 2012) outline the idea that “emotional attention” may rely on a distinct attention system centered around the amygdala that operates in addition to the voluntary attention system mediated by frontoparietal brain structures (Vuilleumier and Huang 2009; Pourtois et al. 2012): Serving as a central hub in a circuit of brain structures, output signals generated by the amygdala are assumed to boost the representation of emotionally salient information by modulating the activation of a broad network of sensory cortical, parietal, and frontal brain regions. Such “boosting effects”, in turn, may be amplified or attenuated by top-down modulations of several frontal brain regions such as the dorsolateral prefrontal cortex (DLPFC), ventromedial prefrontal cortex, or orbitofrontal cortex (OFC)—possibly reflecting cerebral mechanisms related to voluntary control and the allocation of attention resources to task-relevant aspects of the environment (for review see Compton 2003; Vuilleumier and Huang 2009; Pourtois et al. 2012).
Concluding from these models of emotional attention, one may assume attention biases related to a nonverbal dominance in emotion communication to be mirrored in 2 distinct cerebral mechanisms: 1) An increasing activation of several sensory areas along the processing path associated with an enhancement of sensory processing that is mediated through bottom-up inputs from subcortical pathways involving the amygdala, and 2) an increasing recruitment of dorsolateral and medial frontal “voluntary control areas” linked to increasing efforts to suppress the processing of nonverbal cues when nonverbal signals are competing with “weaker”—yet task-relevant—cues that need to be subjected to a more elaborate processing.
Proceeding from these outlined hypotheses, the present study aimed to address propositions of a nonverbal dominance by studying the interplay between nonverbal cues and attention as exemplified in attentional biases toward nonverbal affective signals. Assuming that such biases in attention might be reflected in an inability to ignore or suppress the processing of nonverbal cues, our study approached the issue by examining and comparing brain responses associated with the explicit evaluation of nonverbal cues and the involuntary processing of nonverbal affective information when observers are asked to suppress the analysis of nonverbal cues and instead focus on verbal cues presented alongside these signals.
As far as questions regarding the cerebral substrates of such an involuntary processing of nonverbal cues are concerned, research findings delineating different cerebral networks that contribute to the “explicit” or “implicit” processing of nonverbal affective signals (i.e. facial and vocal-prosodic cues) may, in fact, provide first tentative clues: While the involuntary processing of nonverbal affective cues presented outside of the attention focus has been suggested to be linked to increased limbic and medial frontal activation (Wildgruber et al. 2006, 2009; Brück, Kreifelts, Wildgruber 2011), the effortful, cognitively controlled evaluation of nonverbal cues, in contrast, has been associated with decreases in subcortical limbic activation paralleled by increased activation of a broad network of cortical brain structures (Wildgruber et al. 2006, 2009; Brück, Kreifelts, Wildgruber 2011), including the DLPFC and OFC, the posterior superior temporal cortex (pSTC), and cue-specific brain regions such as face-sensitive aspects of the fusiform gyrus (i.e. fusiform face areas, FFA; Kanwisher et al. 1997) and voice-sensitive areas of the mid-superior temporal cortex (i.e. temporal voice areas, TVA; Belin et al. 2000).
To investigate the cerebral processing of nonverbal affective cues under different processing conditions, we used functional magnetic resonance imaging (fMRI). Healthy volunteers were scanned while performing 2 different emotion judgment tasks based on a set of video stimuli capturing different speakers expressing different emotional states: One task required participants to base their respective judgments on nonverbal indicators displayed by the respective speakers (and ignore verbal content), while the other asked participants to focus on spoken words (and ignore nonverbal messages) in reaching a decision. Investigations into the targeted brain responses were guided by contrasts evaluating stimulus-driven effects (i.e. brain activation elicited by emotional nonverbal cues irrespective of attention focus) as well as modulations of these effects by the 2 different task instructions employed in the study.
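The contrast logic behind this design can be illustrated numerically. Given per-condition parameter estimates (betas) from a general linear model, a stimulus-driven effect corresponds to the response difference between emotional and neutral cues averaged over both tasks, while a task modulation corresponds to the difference of that effect between the two task instructions. The beta values below are toy numbers for a single voxel, not data from the study; this is a minimal sketch of the contrast arithmetic, not the study's analysis pipeline.

```python
import numpy as np

# Hypothetical GLM beta estimates for one voxel (illustrative values only):
# rows = task (0: judge nonverbal cues, 1: judge verbal content)
# cols = stimulus type (0: neutral, 1: emotional)
betas = np.array([[0.2, 1.1],
                  [0.3, 0.9]])

# Stimulus-driven effect: emotional > neutral, averaged across both tasks
stimulus_effect = (betas[:, 1] - betas[:, 0]).mean()

# Task modulation: does the emotional > neutral effect differ between tasks?
task_modulation = (betas[0, 1] - betas[0, 0]) - (betas[1, 1] - betas[1, 0])

print(round(stimulus_effect, 2))  # 0.75
print(round(task_modulation, 2))  # 0.3
```

In this toy case the voxel responds more to emotional than neutral cues under both instructions (a stimulus-driven effect), and that response is larger when nonverbal cues are attended (a task modulation).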
Based on previous research, we assumed that the perception of emotional nonverbal signals would rely on a widespread network of brain regions including limbic brain structures, in particular, the amygdala, as well as cortical brain areas, particularly, the anterior rostral medial frontal cortex (arMFC), FFA, TVA, pSTC, OFC, and DLPFC. Research findings published on the functional characteristics of several of these brain regions, moreover, suggest that the amygdala, FFA, and TVA may play a role in the stimulus-driven processing of affective nonverbal information occurring regardless of task instructions, while the right pSTC may, in contrast, contribute to a task-related analysis of nonverbal information when nonverbal affective cues are in the focus of attention (Wildgruber et al. 2006, 2009; Brück, Kreifelts, Wildgruber 2011). Finally, building on current research conducted on both the implicit processing of affective cues (Wildgruber et al. 2006, 2009; Brück, Kreifelts, Wildgruber 2011) and recent models of emotional attention (for review see Compton 2003; Vuilleumier 2005; Pourtois et al. 2012), we assumed that “control regions” located in the DLPFC as well as regions within the arMFC or the limbic system (particularly the amygdala), may play a role during the involuntary processing of affective nonverbal cues when nonverbal signals are not in the focus of attention.
Another concern of the study was to explore how individual differences in the ability to shift attention between verbal and nonverbal sources of emotional information—in other words, the ability to modulate attentional biases toward nonverbal information—are related to brain activation associated with the processing of affective cues. To this end, we conducted correlation analyses aimed at unraveling relationships between brain responses and behavioral measures reflecting an individual's ability to shift her/his attention focus between verbal and nonverbal cues. Based on evidence provided in current reviews suggesting a predominant role of the amygdala in emotional attention (for review see Vuilleumier and Huang 2009; Pourtois et al. 2012), our analysis focused on modulations of amygdala activation as a potential cerebral correlate of the ability to shift attention among competing sources of emotional information.
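The across-subject logic of such a brain–behavior correlation can be sketched as follows: each participant contributes one behavioral task-switching score and one amygdala activation estimate, and a Pearson correlation quantifies their relationship across the sample. All numbers and variable names below are invented for illustration; the study's actual measures were derived from fMRI parameter estimates and behavioral task performance.

```python
import numpy as np

# Hypothetical per-subject values (illustrative only):
# behavioral index of task-switching ability (higher = better switching)
switch_score = np.array([0.9, 0.7, 0.8, 0.4, 0.5, 0.3, 0.6, 0.2])
# right-amygdala activation estimate during explicit nonverbal judgments
amygdala_beta = np.array([1.2, 0.9, 1.0, 0.5, 0.6, 0.4, 0.8, 0.3])

# Pearson correlation across subjects
r = np.corrcoef(switch_score, amygdala_beta)[0, 1]
print(r > 0)  # a positive r would mirror the reported up-regulation pattern
```

A positive correlation in this toy sample corresponds to the pattern described in the abstract: better task-switchers showing up-regulated right amygdala activation during explicit judgments of nonverbal cues.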