Readings

The readings listed below are the foundation of this course. Where available, journal article abstracts from PubMed (an online database providing access to citations from biomedical literature) are included.

Reading List

Adolphs, R. “Neural systems for recognizing emotion.” Curr Opin Neurobiol 12 (2002): 169-177.

PubMed abstract: Recognition of emotion draws on a distributed set of structures that include the occipitotemporal neocortex, amygdala, orbitofrontal cortex and right frontoparietal cortices. Recognition of fear may draw especially on the amygdala and the detection of disgust may rely on the insula and basal ganglia. Two important mechanisms for recognition of emotions are the construction of a simulation of the observed emotion in the perceiver, and the modulation of sensory cortices via top-down influences.

Adolphs, R., H. Damasio, D. Tranel, G. Cooper, and A. R. Damasio. “A role for somatosensory cortices in the visual recognition of emotion as revealed by three-dimensional lesion mapping.” J Neurosci 20 (2000): 2683-2690.

PubMed abstract: Although lesion and functional imaging studies have broadly implicated the right hemisphere in the recognition of emotion, neither the underlying processes nor the precise anatomical correlates are well understood. We addressed these two issues in a quantitative study of 108 subjects with focal brain lesions, using three different tasks that assessed the recognition and naming of six basic emotions from facial expressions. Lesions were analyzed as a function of task performance by coregistration in a common brain space, and statistical analyses of their joint volumetric density revealed specific regions in which damage was significantly associated with impairment. We show that recognizing emotions from visually presented facial expressions requires right somatosensory-related cortices. The findings are consistent with the idea that we recognize another individual’s emotional state by internally generating somatosensory representations that simulate how the other individual would feel when displaying a certain facial expression. Follow-up experiments revealed that conceptual knowledge and knowledge of the name of the emotion draw on neuroanatomically separable systems. Right somatosensory-related cortices thus constitute an additional critical component that functions together with structures such as the amygdala and right visual cortices in retrieving socially relevant information from faces.

Adolphs, R., D. Tranel, S. Hamann, A. W. Young, A. J. Calder, E. A. Phelps, A. Anderson, G. P. Lee, and A. R. Damasio. “Recognition of facial emotion in nine individuals with bilateral amygdala damage.” Neuropsychologia 37 (1999): 1111-1117.

PubMed abstract: Findings from several case studies have shown that bilateral amygdala damage impairs recognition of emotions in facial expressions, especially fear. However, one study did not find such an impairment, and, in general, comparison across studies has been made difficult because of the different stimuli and tasks employed. In a collaborative study to facilitate such comparisons, we report here the recognition of emotional facial expressions in nine subjects with bilateral amygdala damage, using a sensitive and quantitative assessment. Compared to controls, the subjects as a group were significantly impaired in recognizing fear, although individual performances ranged from severely impaired to essentially normal. Most subjects were impaired on several negative emotions in addition to fear, but no subject was impaired in recognizing happy expressions. An analysis of response consistency showed that impaired recognition of fear could not be attributed simply to mistaking fear for another emotion. While it remains unclear why some subjects with amygdala damage included here are not impaired on our task, the results overall are consistent with the idea that the amygdala plays an important role in triggering knowledge related to threat and danger signaled by facial expressions.

Banse, R. “Affective priming with liked and disliked persons: Prime visibility determines congruency and incongruency effects.” Cognition and Emotion 15 (2001): 501-520.

Canli, T., Z. Zhao, J. Brewer, J. D. E. Gabrieli, and L. Cahill. “Event-related activation in the human amygdala associates with later memory for individual emotional experience.” J Neurosci 20 (2000): RC99.

PubMed abstract: The role of the amygdala in enhancing declarative memory for emotional experiences has been investigated in a number of animal, patient, and brain imaging studies. Brain imaging studies, in particular, have found a correlation between amygdala activation during encoding and subsequent memory. Because of the design of these studies, it is unknown whether this correlation is based on individual differences between participants or within-subject variations in moment-to-moment amygdala activation related to individual stimuli. In this study, participants saw neutral and negative scenes and indicated how emotionally intense they found each scene. Separate functional magnetic resonance imaging responses in the amygdala for each scene were related to the participants’ report of their experience at study and to performance in an unexpected memory test 3 weeks after scanning. The amygdala had the greatest response to scenes rated as most emotionally intense. The degree of activity in the left amygdala during encoding was predictive of subsequent memory only for scenes rated as most emotionally intense. These findings support the view that amygdala activation reflects moment-to-moment subjective emotional experience and that this activation enhances memory in relation to the emotional intensity of an experience.

Critchley, H., E. Daly, M. Phillips, M. Brammer, E. Bullmore, S. Williams, T. Van Amelsvoort, D. Robertson, A. David, and D. Murphy. “Explicit and implicit neural mechanisms for processing of social information from facial expressions: A functional magnetic resonance imaging study.” Human Brain Mapping 9 (2000): 93-105.

PubMed abstract: The processing of changing nonverbal social signals such as facial expressions is poorly understood, and it is unknown if different pathways are activated during effortful (explicit), compared to implicit, processing of facial expressions. Thus we used fMRI to determine which brain areas subserve processing of high-valence expressions and if distinct brain areas are activated when facial expressions are processed explicitly or implicitly. Nine healthy volunteers were scanned (1.5T GE Signa with ANMR, TE/TR 40/3,000 ms) during two similar experiments in which blocks of mixed happy and angry facial expressions (“on” condition) were alternated with blocks of neutral faces (control “off” condition). Experiment 1 examined explicit processing of expressions by requiring subjects to attend to, and judge, facial expression. Experiment 2 examined implicit processing of expressions by requiring subjects to attend to, and judge, facial gender, which was counterbalanced in both experimental conditions. Processing of facial expressions significantly increased regional blood oxygenation level-dependent (BOLD) activity in fusiform and middle temporal gyri, hippocampus, amygdalohippocampal junction, and pulvinar nucleus. Explicit processing evoked significantly more activity in temporal lobe cortex than implicit processing, whereas implicit processing evoked significantly greater activity in amygdala region. Mixed high-valence facial expressions are processed within temporal lobe visual cortex, thalamus, and amygdalohippocampal complex. Also, neural substrates for explicit and implicit processing of facial expressions are dissociable: explicit processing activates temporal lobe cortex, whereas implicit processing activates amygdala region. Our findings confirm a neuroanatomical dissociation between conscious and unconscious processing of emotional information.

Globisch, J., A. O. Hamm, F. Esteves, and A. Öhman. “Fear appears fast: Temporal course of startle reflex potentiation in animal fearful subjects.” Psychophysiology 36 (1999): 66-75.

PubMed abstract: The temporal course of startle reflex modulation and autonomic response patterns to fear-relevant and fear-irrelevant pictures in subjects with high and low levels of animal fear was investigated. Thirty-eight high-fear and 48 low-fear volunteers viewed photos of snakes and spiders and pictures of neutral and pleasant content. The slides were presented for 6 s or for only 150 ms, depending on the group. Acoustic startle probes were presented at five different times after slide onset. Relative potentiation of the startle responses started 300 ms after onset of snake/spider pictures in fearful subjects. This fear-potentiated startle effect was maintained for the later probe times and was identical in the 150-ms condition. Fear-relevant pictures also prompted a sympathetically dominated autonomic response profile in fearful persons. These data support the idea that fear can be activated very rapidly, requiring only minimal stimulus input.

LeDoux, J. E. “Sensory systems and emotion: A model of affective processing.” Integrative Psychiatry 4 (1986): 237-243.

Liu, L., A. A. Ioannides, and M. Streit. “Single trial analysis of neurophysiological correlates of the recognition of complex objects and facial expressions of emotion.” Brain Topography 11 (1999): 291-303.

PubMed abstract: In an earlier experiment, we have used the BTi twin MAGNES system (2 x 37 channels) to record the evoked magnetic field from five healthy right-handed male volunteers using two tasks: visual recognition of complex objects including faces and facial expressions of emotion. We have repeated the experiment with one of the five subjects using the BTi whole head system (148 channels). Magnetic field tomography (MFT) was used to extract 3D estimates of brain activity millisecond by millisecond from the recorded magnetoencephalographic (MEG) signals. Results from the MFT analysis of the average signals of the five subjects have been reported elsewhere (Streit et al. 1997; Streit et al. 1999). In this paper, we present results of the detailed single trial analysis for the subject recorded from the whole head system. We found activations in areas extending from the occipital pole to anterior areas. Regions of interest (ROIs) were defined entirely on functional criteria and confirmed independently by the location of the maximum activity on the MRI. Activation curves for each ROI were computed and objective statistical measures (Kolmogorov-Smirnov test) were then used to identify time segments for which the ROI activity showed significant differences both within the same and across different object/emotion categories. Emphasis is placed on the quantification of the activity from two ROIs, fusiform gyrus (FG) and amygdala (AM), which have been best studied in the context of processing of faces and facial expressions of emotion, respectively. We found no face-specific area as such, but instead areas like the FG were activated by all complex objects at roughly similar latencies and varying strengths. The amygdala activity was significantly different between 150 and 180 ms for fearful expression, and even earlier for happy expression.

Morris, J. S., A. Öhman, and R. J. Dolan. “Conscious and unconscious emotional learning in the human amygdala.” Nature 393 (1998): 467-470.

PubMed abstract: If subjects are shown an angry face as a target visual stimulus for less than forty milliseconds and are then immediately shown an expressionless mask, these subjects report seeing the mask but not the target. However, an aversively conditioned masked target can elicit an emotional response from subjects without being consciously perceived. Here we study the mechanism of this unconsciously mediated emotional learning. We measured neural activity in volunteer subjects who were presented with two angry faces, one of which, through previous classical conditioning, was associated with a burst of white noise. In half of the trials, the subjects’ awareness of the angry faces was prevented by backward masking with a neutral face. A significant neural response was elicited in the right, but not left, amygdala to masked presentations of the conditioned angry face. Unmasked presentations of the same face produced enhanced neural activity in the left, but not right, amygdala. Our results indicate that, first, the human amygdala can discriminate between stimuli solely on the basis of their acquired behavioural significance, and second, this response is lateralized according to the subjects’ level of awareness of the stimuli.

Murphy, S. T., and R. B. Zajonc. “Affect, cognition, and awareness: Affective priming with optimal and suboptimal stimulus exposures.” Journal of Personality and Social Psychology 64 (1993): 723-739.

PubMed abstract: The affective primacy hypothesis (R. B. Zajonc, 1980) asserts that positive and negative affective reactions can be evoked with minimal stimulus input and virtually no cognitive processing. The present work tested this hypothesis by comparing the effects of affective and cognitive priming under extremely brief (suboptimal) and longer (optimal) exposure durations. At suboptimal exposures only affective primes produced significant shifts in Ss’ judgments of novel stimuli. These results suggest that when affect is elicited outside of conscious awareness, it is diffuse and nonspecific, and its origin and address are not accessible. Having minimal cognitive participation, such gross and nonspecific affective reactions can therefore be diffused or displaced onto unrelated stimuli. At optimal exposures this pattern of results was reversed such that only cognitive primes produced significant shifts in judgments. Together, these results support the affective primacy hypothesis.

Murphy, S. T., J. L. Monahan, and R. B. Zajonc. “Additivity of nonconscious affect: Combined effects of priming and exposure.” Journal of Personality and Social Psychology 69 (1995): 589-602.

PubMed abstract: Affect deriving from 2 independent sources–repeated exposure and affective priming–was induced, and the combined effects were examined. In each of 4 studies, participants were first shown 72 Chinese ideographs in which the frequency of exposure was varied (0, 1, or 3). In the second phase participants rated ideographs that were primed either positively, negatively, or not at all. The 4 studies were identical except that the exposure duration–suboptimal (4 ms) or optimal (1 s)–of both the initial exposure phase and the subsequent priming phase was orthogonally varied. Additivity of affect was obtained only when affective priming was suboptimal, suggesting that nonconscious affect is diffuse. Affect whose source was apparent was more constrained. Interestingly, increases in liking generated through repeated exposures did not differ as a function of exposure duration.

Ogawa, T., and N. Suzuki. “Response differentiation to facial expression of emotion as increasing exposure duration.” Perceptual and Motor Skills 89 (1999): 557-563.

PubMed abstract: This study investigated whether the underlying structure of responses to facial expressions of emotion would emerge when the exposure time was increased. 25 participants judged facial photographs presented for varying durations of exposure, ranging from 4 msec. to 64 msec. in 4-msec. steps. A dual scaling method was carried out to analyze possible response differentiation as a function of exposure time. Two major components were extracted. Based on the configuration of variables they were interpreted as valence (hedonic tone) and activation. Results indicated that a positive emotion and a highly activated emotion such as surprise and fear were easily recognized under a relatively brief exposure to the stimuli.

Phillips, M. L., N. Medford, A. W. Young, L. Williams, S. C. R. Williams, E. T. Bullmore, J. A. Gray, and M. J. Brammer. “Time courses of left and right amygdalar responses to fearful facial expressions.” Human Brain Mapping 12 (2001): 193-202.

PubMed abstract: Despite the many studies highlighting the role of the amygdala in fear perception, few have examined differences between right and left amygdalar responses. Using functional magnetic resonance imaging (fMRI), we examined neural responses in three groups of healthy volunteers (n = 18) to alternating blocks of fearful and neutral faces. Initial observation of extracted time series of both amygdalae to these stimuli indicated more rapid decreases of right than left amygdalar responses to fearful faces, and increasing magnitudes of right amygdalar responses to neutral faces with time. We compared right and left responses statistically by modeling each time series with (1) a stationary fit model (assuming a constant magnitude of amygdalar response to consecutive blocks of fearful faces) and (2) an adaptive model (no assumptions). Areas of significant sustained nonstationarity (time series points with significantly greater adaptive than stationary model fits) were demonstrated for both amygdalae. There was more significant nonstationarity of right than left amygdalar responses to neutral, and left than right amygdalar responses to fearful faces. These findings indicate significant variability over time of both right and left amygdalar responses to fearful and neutral facial expressions and are the first demonstration of specific differences in time courses of right and left amygdalar responses to these stimuli. Copyright 2001 Wiley-Liss, Inc.

Pizzagalli, D., M. Regard, and D. Lehmann. “Rapid emotional face processing in the human right and left brain hemispheres: An ERP study.” Neuroreport 10 (1999): 2691-2698.

PubMed abstract: Imaging work has begun to elucidate the spatial organization of emotions; the temporal organization, however, remains unclear. Adaptive behavior relies on rapid monitoring of potentially salient cues (typically with high emotional value) in the environment. To clarify the timing and speed of emotional processing in the two human brain hemispheres, event-related potentials (ERPs) were recorded during hemifield presentation of face images. ERPs were separately computed for disliked and liked faces, as individually assessed by postrecording affective ratings. After stimulation of either hemisphere, personal affective judgements of face images significantly modulated ERP responses at early stages, 80-116 ms after right hemisphere and 104-160 ms after left hemisphere stimulation. This is the first electrophysiological evidence for valence-dependent, automatic, i.e. pre-attentive emotional processing in humans.

Raccuglia, R. A., and R. H. Phaf. “Asymmetric affective evaluation of words and faces.” British Journal of Psychology 88 (1997): 93-116.

PubMed abstract: In two experiments the relationship between direct and indirect forms of affective evaluation was investigated within the framework of a dual-pathway model (LeDoux, 1986, 1989). Emotionally valenced faces were hypothesized to be more directly evaluated affectively than valenced words. A Stroop-like asymmetry was expected, with faces interfering more with word evaluation than vice versa. Similar to experiments investigating affective influences of words on words (Greenwald, Klinger & Liu, 1989), a backward dichoptic pattern-masking technique was used in both experiments, with lateralized presentations of targets and masked primes in a short and a long presentation condition. In Expt 1, priming of emotionally negative, neutral and positive faces on the affective evaluation of emotionally negative, neutral and positive words was investigated in a two-alternative forced-choice task. In Expt 2, primes and targets were reversed. A clear asymmetry occurred in both subliminal and supraliminal conditions, but completely opposite to the one expected. Implications for a dual-pathway model are discussed.

Stapel, D. A., W. Koomen, and K. I. Ruys. “The effects of diffuse and distinct affect.” Journal of Personality and Social Psychology 83 (2002): 60-74.

PubMed abstract: In a series of suboptimal priming studies, it was shown that both affective and nonaffective reactions to a stimulus may occur without awareness. Moreover, it was demonstrated that affective information is detected earlier than nonaffective information. Therefore, early reactions to an affect-laden stimulus (e.g., a smiling man) are cognitively unappraised and thus diffuse (e.g., “positive”), whereas later affective reactions can be more specific and distinct (e.g., “a smiling man”). Through variations of prime exposure (extremely short, moderately short) the impact of early diffuse and late distinct affect on judgment was investigated. Findings show that distinctness (and prime-target similarity) is an essential determinant of whether the effect of affect is null, assimilation, or contrast. Furthermore, whether affect priming activates diffuse or distinct reactions is a matter of a fraction of seconds.

Vuilleumier, P., and S. Schwartz. “Beware and be aware: Capture of spatial attention by fear-related stimuli in neglect.” Neuroreport 12 (2001): 1119-1122.

PubMed abstract: Stimuli with threat significance may be privileged in summoning attention, allowing fast detection even outside the field of attention. We studied patients with unilateral neglect and visual extinction, who usually remain unaware of contralesional stimuli presented together with concurrent ipsilesional stimuli, to learn whether emotional stimuli might differentially be affected by contralesional extinction. Pictures of spiders or flowers with similar features were presented in right, left, or both fields. On bilateral trials, the patients detected emotional stimuli (spiders) on the left side much more often than neutral pictures (flowers). While mechanisms of spatial attention are impaired after parietal damage in neglect patients, intact visual pathways to the ventral temporal lobe and amygdala might still mediate distinct mechanisms of emotional attention.

Vuilleumier, P., J. L. Armony, J. Driver, and R. J. Dolan. “Effects of attention and emotion on face processing in the human brain: An event-related fMRI study.” Neuron 30 (2001): 829-841.

PubMed abstract: We used event-related fMRI to assess whether brain responses to fearful versus neutral faces are modulated by spatial attention. Subjects performed a demanding matching task for pairs of stimuli at prespecified locations, in the presence of task-irrelevant stimuli at other locations. Faces or houses unpredictably appeared at the relevant or irrelevant locations, while the faces had either fearful or neutral expressions. Activation of fusiform gyri by faces was strongly affected by attentional condition, but the left amygdala response to fearful faces was not. Right fusiform activity was greater for fearful than neutral faces, independently of the attention effect on this region. These results reveal differential influences on face processing from attention and emotion, with the amygdala response to threat-related expressions unaffected by a manipulation of attention that strongly modulates the fusiform response to faces.

Whalen, P. J., S. L. Rauch, N. L. Etcoff, S. C. McInerney, M. B. Lee, and M. A. Jenike. “Masked presentations of emotional facial expressions modulate amygdala activity without explicit knowledge.” J Neurosci 18 (1998): 411-418.

PubMed abstract: Functional magnetic resonance imaging (fMRI) of the human brain was used to study whether the amygdala is activated in response to emotional stimuli, even in the absence of explicit knowledge that such stimuli were presented. Pictures of human faces bearing fearful or happy expressions were presented to 10 normal, healthy subjects by using a backward masking procedure that resulted in 8 of 10 subjects reporting that they had not seen these facial expressions. The backward masking procedure consisted of 33 msec presentations of fearful or happy facial expressions, their offset coincident with the onset of 167 msec presentations of neutral facial expressions. Although subjects reported seeing only neutral faces, blood oxygen level-dependent (BOLD) fMRI signal in the amygdala was significantly higher during viewing of masked fearful faces than during the viewing of masked happy faces. This difference was composed of significant signal increases in the amygdala to masked fearful faces as well as significant signal decreases to masked happy faces, consistent with the notion that the level of amygdala activation is affected differentially by the emotional valence of external stimuli. In addition, these facial expressions activated the sublenticular substantia innominata (SI), where signal increases were observed to both fearful and happy faces–suggesting a spatial dissociation of territories that respond to emotional valence versus salience or arousal value. This study, using fMRI in conjunction with masked stimulus presentations, represents an initial step toward determining the role of the amygdala in nonconscious processing.

Zajonc, R. B. “Feeling and thinking: Closing the debate over the independence of affect.” In Feeling and Thinking: The Role of Affect in Social Cognition. Edited by J. P. Forgas. New York, NY: Cambridge University Press, 2000, pp. 31-58.
