International Journal of Brain and Cognitive Sciences

p-ISSN: 2163-1840    e-ISSN: 2163-1867

2013; 2(2): 23-37

doi:10.5923/j.ijbcs.20130202.03

Distinct Attentional Mechanisms for Social Information Processing as Revealed by Event-related Potential Studies

Anna Katharina Mueller, Christine Michaela Falter, Oliver Tucha

Department of Clinical and Developmental Neuropsychology, University of Groningen, Groningen, The Netherlands

Correspondence to: Anna Katharina Mueller, Department of Clinical and Developmental Neuropsychology, University of Groningen, Groningen, The Netherlands.

Copyright © 2012 Scientific & Academic Publishing. All Rights Reserved.

Abstract

Phenomena such as the vanishing ball illusion, in which a ball seems to disappear into thin air while the magician merely uses cues that capture social attention in order to deceive the audience, show how susceptible human visual perception is to social information. In this review, the most recent event-related potential (ERP) studies of visual social information processing will be discussed. The available literature will be analysed with respect to the question of whether the processing of social information uniquely modulates attentional mechanisms (indexed in ERP amplitudes and latencies) compared to non-social information. The review demonstrates that ERPs indeed show functional consistency in amplitude and latency modulation on some components (e.g., P1, N170) across the different types of social stimuli tested. Importantly, these unique attentional modulations in response to social as opposed to non-social stimuli were found beyond modulation by other stimulus characteristics. The attentional mechanisms reflected by these components are believed to facilitate social information processing in neurotypicals and point towards a special role of the social domain in human information processing.

Keywords: Event-Related Potentials, Social Vision, Face, Eye, Gaze, Emotion, Body Perception

Cite this paper: Anna Katharina Mueller, Christine Michaela Falter, Oliver Tucha, Distinct Attentional Mechanisms for Social Information Processing as Revealed by Event-related Potential Studies, International Journal of Brain and Cognitive Sciences, Vol. 2 No. 2, 2013, pp. 23-37. doi: 10.5923/j.ijbcs.20130202.03.

1. Introduction

Over the last decade, cognitive studies focusing on the processing of social information and its underlying neuronal markers have been on the increase[1]. However, the question of whether social information plays a special role for human attention and information processing, distinct from non-social information processing, remains unresolved. Visually presented social information is associated with greater activation of specific brain areas, for example the medial prefrontal cortex (mPFC), the superior temporal sulcus (STS), the medial parietal cortex (mPC), and the temporoparietal junction (TPJ)[2]. These brain areas show a unique metabolic resting activity that is believed to facilitate social-perceptual processes[3]. The comparison of social and non-social information processing with respect to visual selection, organisation, and interpretation therefore appears of interest. To approach this comparison, the current review focuses on studies reporting event-related potentials (ERPs) associated with social stimuli. ERPs are electroencephalographic (EEG) recordings in response to experimental stimuli that are believed to express the sum of postsynaptic potentials from simultaneously firing neurons[4]. The amplitude of the ERP wave represents the amount of associated psychological processing involved in the operation, with the component's peak latency indicating the point in time at which the psychological operation has been terminated[1].
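To make the measures discussed throughout this review concrete: an ERP is obtained by averaging many EEG epochs time-locked to stimulus onset, so that activity phase-locked to the stimulus survives while ongoing background EEG averages out; amplitude and peak latency are then read off the averaged waveform within a component-specific search window. The following minimal sketch illustrates this with synthetic data in Python/NumPy; the sampling rate, component shape, and search window are illustrative assumptions, not values from any study reviewed here.

    # Minimal sketch of ERP averaging and peak measurement (synthetic data).
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 500                                  # sampling rate in Hz (assumed)
    t = np.arange(-0.1, 0.5, 1 / fs)          # epoch from -100 to 500 ms

    # Simulated "true" component: a negativity peaking ~170 ms after stimulus onset
    component = -4e-6 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))

    # 60 single trials = component + ongoing EEG "noise"
    epochs = component + rng.normal(0.0, 10e-6, size=(60, t.size))

    erp = epochs.mean(axis=0)                 # averaging yields the ERP

    # Peak amplitude and latency within an a-priori N170-like search window
    window = (t >= 0.13) & (t <= 0.22)
    peak = np.argmin(erp[window])             # most negative point in the window
    amplitude_uv = erp[window][peak] * 1e6
    latency_ms = t[window][peak] * 1e3
    print(f"peak: {amplitude_uv:.1f} uV at {latency_ms:.0f} ms post-stimulus onset")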
ERP studies are especially interesting for the purpose of this review as they provide temporal information concerning processes such as attention. Earlier ERP components, such as the P100, N100, P200, N170, and N200, usually relate to attentional selection mechanisms (e.g.,[5]), whereas later components are more often associated with organisation and interpretation of the stimulus, such as the N400 modulation by incongruency of person identity[6]. Given that this review aims to discuss whether attention to social information is associated with ERP modulations that are distinct from the processing of non-social information, the focus will mainly be on early ERP modulations.
Social information can be any information related to the outer appearance of humans (e.g., posture, gesture, facial expression), associated with the understanding (e.g., emotional cues, social gestures) and representation of other people (e.g., social group membership), up to verbal or symbolic information associated with human culture (e.g., stop signs). The current review will be confined to ERPs in response to social information in terms of faces, emotions displayed by faces, eyes and eye gaze information, as well as human posture and biological motion. The concluding discussion at the end of the review compares ERP findings across stimulus types in order to answer the question of whether social information processing is indeed associated with unique ERP modulations.

2. Faces

2.1. Review of ERPs Sensitive to Human Faces

Studies captured by recent reviews on social information processing (e.g.,[7]) focused on the uniqueness of human face perception, facial identity recognition, and the discrimination of facial emotions. Overall, it was suggested that faces are generally detected more accurately and earlier than objects (e.g.,[8]), words (e.g.,[9],[10]), and inverted or scrambled faces (e.g.,[11],[12]). Moreover, performance on emotion categorisation tasks was enhanced if facial expressions represented unambiguous emotional information, compared, for example, to scheming[13] or blurred faces[14].
Given these findings, it is likely that the behavioural enhancement in response to social information, when compared to non-social information processing, is paralleled by unique ERP modulations.
In a recent review[7] it was reported that the P80 was affected by participants' emotional disposition towards faces (liked vs. disliked faces,[15]) or by the stimulus' emotional valence[16, 17]. The P110 was modulated by happy compared to sad faces[18], and a unique response of the frontal P135 component was related to successful face detection[10, 12, 19]. Subsequent ERP components between 140-270 ms post-stimulus onset (PSO) were related to emotion registration and emotion discrimination (e.g.,[20],[21]). Furthermore, recognition of individuals (familiar vs. unfamiliar faces, learned vs. new faces) affected N170 amplitudes and latencies[22, 23]. The N170 modulation by familiarity of the stimulus has been replicated in studies focusing on eye and eye gaze processing (for a review see[24]).
Unfortunately, previous studies on facial perception only sparsely used non-facial stimuli as a control condition, making the identification of unique ERP responses to faces more difficult. Nevertheless, there are a few exceptions to this methodological critique, such as studies comparing faces with clock stimuli[25], a woman caricature with Necker cubes[26], faces with cars[27], and schematic faces with objects[28]. For instance, Ishizu et al.[25] compared ERPs in response to intact, split, and inverted neutral faces with similarly manipulated clock images (intact, split, inverted) and identified a selective impact of face configuration on the N170 amplitude. In contrast to N170 latency, which was delayed for both stimulus types (split faces and split clocks), the N170 amplitude was uniquely enhanced by split faces when compared to split clocks and by inverted faces when compared to neutral faces.
Figure 1. Illustration of stimulus diversity; Physically contrasting social stimuli that modulate P1 amplitudes[28, 74, 86] or P1 latencies[102]
In general, face recognition has been shown to be foremost driven by holistic perceptual processes that draw on information from the complete face (e.g.,[29, 30]). A unique N170 enhancement as seen by face inversion or face splitting[25] is therefore suggested to represent the neural response to violations of the holistic configuration of a face[25, 27, 31].
ERP signals towards neutral faces and cars in two different age groups (a child/young adolescent group ranging from 4 to 17 years and an adult group) showed that, independent of age (when taking children's N250 as an equivalent of the adult N170), N170 amplitudes were enhanced and peak latencies were earlier when participants were exposed to faces compared to cars[27]. Furthermore, phase-scrambling of cars and faces, which causes the holistic information of an image (overall shape) to be lost, diminished the typical N170 response[27]. By taking away the face's holistic information, the face loses its "perceptual advantage" of being distinguished faster than other non-facial stimuli (e.g., cars,[27]). Bentin et al.[28] associated increased N170 amplitudes with automatic, early bottom-up activation that facilitates initial face detection. Subsequent competition between local and global facial features thereupon promotes more advanced perceptual decisions as to whether attention has to be allocated to local features (eye region) or to the whole face (see Figure 1). These decisions are proposed to be based on the presence or absence of a configural conflict (eye regions containing eyes vs. eye regions containing small faces). The authors further assume that, if local and global features fit a typical face, face identification is the subsequent step in facial perception[28]. A decrease in N170 amplitude was further associated with a perceptual facilitation of extracting faces from a stream of irrelevant stimuli, thereby reflecting more accurate face recognition performance under greater attentional load[32].
Positive peaking ERPs, such as the P1, showed significantly faster latencies and increased amplitudes for facial than for non-facial stimuli (intact vs. phase-scrambled cars), independent of the participant's age or stimulus composition (intact vs. phase-scrambled facial stimuli,[27]). Integrity of the stimulus (intact vs. scrambled) yielded a significant increase in P1 amplitudes, irrespective of the social or non-social content of the stimulus[27], which is in contrast to findings by Kornmeier and Bach[26], who identified prioritised perceptual processes in favour of faces when compared to geometric cubes. Kornmeier and Bach[26] compared ERPs in response to the famous old/young woman caricature with those in response to 3x3 Necker cube modulations. Facial stimuli provoked enhanced positive ERP peaks around 130 ms PSO, whereas responses to geometric cubes peaked 50 ms later (180 ms PSO). Moreover, face stimuli elicited earlier mean peaks and greater mean amplitudes in the positive direction, whereas cubes elicited a negative wave that was lower in amplitude[26]. Given that early potentials such as the P1 are often interpreted as reflecting automatic attention to motivationally salient stimuli[33], an enhanced response to faces underlines the idea that human perception is particularly adapted to social information. Moreover, Kornmeier and Bach's[26] findings emphasise that perceptual mechanisms in response to social stimuli are not only temporally distinct (occurring earlier) but are also distinct with regard to the direction of their amplitude (positive wave) when compared to objects (negative wave).
Studies that did not make use of inanimate control stimuli such as objects or geometric figures focused on how implicit changes in face configuration[34], participants' demographic characteristics, and other stimulus characteristics impact on ERPs[32, 35]. In addition, pain perception on the basis of one's own or someone else's face has been assessed[36]. The following results were obtained. Implicit face processing in a masked priming paradigm with upright and inverted faces showed that face inversion effects were accompanied by ERP modulation as early as 80 ms PSO[34]. The face-inversion effect is one of the most often replicated phenomena in social perception studies, with a decrease in behavioural performance (e.g.,[37]) and an accompanying modulation of the N170 (e.g.,[38]) and P1. According to[34], early ERP enhancement, starting 80 ms PSO and remaining up to the P300 response, represents an orienting effect of attention towards facial stimuli. Face priming effects, however, were more prominent in N2 and P3 amplitude increases[34]. As mentioned before, though, no non-social control stimuli were used in this study, which does not allow the conclusion that the priming effect is unique to faces[34].
Perception of facial familiarity was associated with a frontal P3 response peaking 350 ms PSO, which was taken to reflect automatic, speeded neural reactions towards the emotional information immanent to faces of acquaintances[34]. A somewhat later peaking centro-parietal P3b response (500 ms PSO) was observed for unknown but previously learned faces[39]. Bobes et al.[39] interpreted the later neural response to be in line with the more elaborate attentional processes[40, 41] required for the actual recognition of an individual. Tacikowski and Nowicka[42] agreed with Bobes et al.'s[39] interpretation of a P300-enhancing effect of the personal significance of a stimulus and showed increased P300 amplitudes with shorter latencies for self-name and self-face stimuli in contrast to less personally significant names and faces of famous people from the media.

2.2. Discussion

Taken together, the P1 has been interpreted as responsive to exogenous low-level visual features, irrespective of stimulus type (social vs. non-social/facial vs. non-facial;[27]), but it also responded to facial inversion[34], facial misalignment[43], and the level of stimulus ambiguity[26]. Similarly, P2 and P3 amplitudes were modulated by face inversion[25, 34]. With regard to the question of whether attention to social information is special, clear ERP modulation by social versus non-social stimuli was found for the N1. The N1 showed increased amplitudes in response to faces compared to letters, even when attention was actively turned away from faces[32] and when faces were inverted[34], emphasising the assumption of a perceptually facilitating process in favour of faces. Similar to the N1, the N170 was modulated by social versus non-social information[25, 28, 32] and by rearrangement of facial features such as inversion (e.g.,[25, 34]) and spatial misalignment of faces (bottom part of the face misaligned with regard to the upper part,[43]). The N2 was modulated by face inversion and by incongruency of prime face and target face[34]. The authors concluded that whereas earlier components (P1, N1, N170) most likely represent configural changes of facial stimuli at low-level visual perceptual encoding stages associated with attentional mechanisms, N2 modulations already represent facial identification[34], most likely driven by an implicit "attention-orienting response"[44].
Overall, the findings from priming and cueing paradigms with facial stimuli support the idea of distinct attentional processes in facial perception. Facial processing was marked by rapid attentional processes, as characterised by the distinct responsiveness of early components to faces. Moreover, face inversion effects and studies demonstrating enhanced ERPs in response to faces compared to objects while controlling for overall stimulus complexity (e.g.,[27, 28]) showed that facial and therefore social perception is different from non-social perception. Future studies are encouraged to control for confounding variables such as luminance and contrast as well as the personal significance of the stimuli. Control for luminance and contrast seems warranted as early ERPs have been shown to be sensitive to low-level visual features of a stimulus (e.g.,[5, 34]). Finally, it would be interesting to find out whether experts in a particular field, who are often exposed to a particular stimulus category, show ERP responses to pictures within their domain of expertise similar to those to human faces.

3. Emotional Expressions

3.1. Review of ERPs Sensitive to Emotional Expressions

Previous research (see review[7]) showed that fearful, threatening, and disgusted facial expressions speeded up visual processing and were associated with distinct ERP modulations. Automatic visual attention and distinct ERP responses towards neutral, happy, surprised, and sad faces, however, revealed less unequivocal results[7]. Studies included in this review report an overall increase in ERP positivity around 100-180 ms PSO in response to emotional facial expressions when contrasted with ERPs to neutral faces (e.g.,[45, 46, 47]). Moreover, inversion of emotional faces delayed the onset latency of these early positive potentials[45]. However, when participants did not pay explicit attention to faces of different emotional valence, no evidence was found for emotion-specific ERP modulations[48], suggesting that facial emotion processing does not operate preattentively and that explicit spatial attention is crucial for emotion identification, in contrast to pure face identification[45].
Besides studies on face perception, the N170 was also investigated for its role in emotion processing from faces. Both fearful and happy faces, in contrast to neutral faces, increased the N170 response[49], leading to the suggestion that enhanced N170 amplitudes indicate perceptual processing of facial emotional information in general, with no further differentiation of the actual valence of the emotion[49]. In the current review's section on face perception, however, it was outlined that changes in the N170 amplitude reflect manipulations of face configurality. Given that every emotional expression other than neutral (e.g., happy, fearful) is associated with at least some physical changes in facial features, it remains open whether the reported N170 increase in response to fearful and happy faces is actually due to the emotional valence of the stimulus.
Enhanced amplitudes of the P80, the vertex positive potential (VPP; the positive counterpart of the N170), and the P300 in response to fearful faces were interpreted to reflect more detailed processing of threat-related information[49]. The perception of fear enhanced earlier potentials in general[49] and led some authors to conclude that fearful information is given priority in perceptual processing when compared to happy and neutral faces (e.g.,[50, 51]).
Increased Nc amplitudes of 7-month-old infants indicate biased overt attentional processes towards fearful faces when compared to neutral faces[52] and prioritised processing of angry faces when compared to fearful faces[53]. This ERP response to the emotional load of the face was still absent in 5-month-old infants[52]. The authors conclude that infants' increased sensitivity to angry facial expressions might be associated with anger addressing the viewer more directly than fearful faces do, since fear does not have to relate to the observer per se but is more likely to be linked to environmental threat[53]. Likewise, 4- to 6-year-old children's Nc amplitudes were maximal in response to their own mother's face when the mother's face expressed anger[54]. The authors interpreted the increase in Nc amplitude as enhanced attention driven by the personal significance of the stimulus and recognition memory driven by its familiarity[54]. These findings suggest that early accelerated potentials (e.g., N170, Nc) are related to person identification on the basis of familiarity and that emotional content (in this case anger) is more likely to be reflected in later components (>500 ms PSO; LPC response). Later effects of emotion, as marked by an LPC modulation[54], support the idea that late positive potentials are sensitive to negatively loaded stimuli in general (e.g.,[55, 56]).
However, this interpretation contrasts with findings by Utama et al.[46], who showed speeded early ERP responses (P100) to the emotional significance of facial stimuli in adults, with emotional intensity (N170) and face identification being reflected in potentials after this initial emotion-related increase in attention. Notably, a recent study on adolescents showed that N170 effects of emotion might not be detectable until the "adult N170 morphology" is completely developed[47], which is believed to be the case in the later teenage years[57]. As Todd et al.'s[54] sample consisted of 4- to 6-year-olds and Utama et al.[46] base their argumentation on adults' ERPs, the contrasting results are likely to be associated with neuronal differences between the diverging age groups.
An emotional negativity bias in human perception was detected and indexed by increased N170 amplitudes as well as P200 and P300 amplitude and latency modulations[59]. Moreover, N170, P200, and P300 modulation by the intensity of facial emotion[46, 59] adds to the observation that emotion-related fine-tuned perceptual processes are reflected in distinct ERP responses. Extreme negative facial expressions were associated with increased N170 amplitudes[59] and with participants' ratings of the emotional intensity of the face[46, 47]. This effect was not present in ERP responses when positive faces were varied in intensity[59]. Moreover, P2 amplitudes were significantly lower and P2 latencies shorter for extreme negative facial expressions compared to moderate negative or neutral facial expressions. Finally, P3 amplitudes were decreased in response to extreme negative facial expressions, which was not the case for moderate negative or neutral facial expressions[59].
It has to be mentioned that most studies on emotion processing from facial cues did not control for other task characteristics that could likewise modulate ERPs, such as low-level visual (colour, luminance) and spatial characteristics of the stimuli. The identified ERP modulations might therefore index attentional mechanisms unrelated to stimulus category (social vs. non-social). In order to address this issue, more refined studies emerged that controlled for low-level visual features (luminance) and spatial alignment of the stimuli (e.g.,[60, 61]) and could nevertheless replicate previous findings on P1 and N170 modulation by fearful faces[46, 49]. The comparison of low-spatial-frequency (LSF) with high-spatial-frequency (HSF) filtered fearful faces showed that the typical P100 and N170 amplitude modulation by facial emotion (e.g.,[15, 62, 63]) is mainly mediated by the LSF information of the stimuli[61]. The study by Vlamings et al.[61] further controlled for luminance and contrast levels of the HSF and LSF facial cues, showing that the reliance on LSF features in facial expression decoding is not associated with contrast or luminance alterations across spatial frequencies (HSF vs. LSF) of the stimuli. Based on previous work (e.g.,[17, 44, 64, 65, 66]), enhanced P1 responses were interpreted as a perceptual mechanism providing attentional resources to enhance the processing of emotionally significant information[61]. In contrast to early potentials, later ERPs were shown to be insensitive to spatial frequency filtering[60], suggesting that human perception is marked by a "preattentive neural mechanism for the processing of motivationally relevant stimuli" ([60], p. 3223) that is activated by LSF features.
In contrast, Eimer and Holmes[45] reported no modulating effect of LSF or HSF filtering of fearful or neutral faces on ERPs. The diverging findings might stem from differences in stimuli, given that Alorda et al.[60] used rather complex scenes whereas Eimer and Holmes[45] used facial stimuli.
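As an aside on method, the LSF/HSF manipulation described above is commonly implemented by filtering the stimulus image in the spatial-frequency domain, for example with a Gaussian blur as the low-pass step, followed by equating mean luminance and contrast across versions. The following is a minimal sketch under these assumptions; the cutoff value and the target luminance/contrast below are arbitrary placeholders, whereas published studies specify cutoffs in cycles per degree on calibrated displays.

    # Sketch of LSF/HSF stimulus construction with luminance/contrast matching.
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def make_lsf_hsf(image, sigma=6.0):
        """Split a grayscale image (2D float array) into LSF and HSF versions."""
        lsf = gaussian_filter(image, sigma=sigma)  # low spatial frequencies
        hsf = image - lsf                          # high spatial frequencies
        return lsf, hsf

    def match_luminance_contrast(image, mean=0.5, std=0.15):
        """Equate mean luminance and RMS contrast across stimulus versions."""
        z = (image - image.mean()) / (image.std() + 1e-12)
        return np.clip(z * std + mean, 0.0, 1.0)

    face = np.random.default_rng(1).random((256, 256))  # placeholder "face" image
    lsf, hsf = make_lsf_hsf(face)
    lsf_stim = match_luminance_contrast(lsf)
    hsf_stim = match_luminance_contrast(hsf)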
Facial expressions and eye gaze direction have been shown to be of high significance for shared attention and social referencing. Fichtenholtz et al.[67] therefore assessed the distinct and combined effects of facial emotion (fearful/happy) and eye gaze direction (left/right) on attentional orienting towards positively (a baby) or negatively loaded (a snake) targets. In addition, the impact of validly/invalidly cued eye gaze orientation on target processing was tested. Increased N180 amplitudes were found for a facial emotion-gaze direction interaction (happy leftward gazing faces,[67]). Moreover, increased P135 and N180 amplitudes were found for both target types (baby, snake) if preceded by a happy face. The authors interpreted these effects in line with an early priming effect of facial emotion[67]. Later ERP modulations, such as a P300 amplitude increase, were associated with invalidly cued targets that were preceded by a happy face. Whereas P300 amplitudes in the early time window (250-550 ms PSO) were increased in response to positively loaded targets (the baby), later P300 amplitudes (550-650 ms PSO) were increased in response to negatively loaded targets (the snake). Fichtenholtz et al.[67] concluded that facial emotion affects ERP responses first (early after stimulus onset), followed by an interaction of facial emotion and gaze location (spatial attention), and a subsequent interaction of facial emotion and gaze validity (whether the stimulus actually occurred at the cued location). These results provide insight into the complexity of attentional precedence mechanisms in the processing of emotion, gaze, and cueing information.
Other effects of negative facial expressions on the P300 were interpreted to reflect sustained attention to the task of recognising and categorising the emotional valence of a face[68]. Another attentional cueing paradigm showed that contextual non-facial information preceding the actual facial cue influenced ERPs associated with facial emotion[69]. P1 amplitudes increased in response to fearful faces when preceded by intact objects as compared to scrambled objects[69]. Whereas P1 latencies did not respond to facial emotion, N170 latencies were longer in response to fearful faces when preceded by an intact object. Later effects on P2 amplitude and latency were most pronounced when fearful faces were cued by intact fear-eliciting objects (a bee). As the authors did not directly compare cueing effects on social (e.g., faces) and non-social (e.g., cars) stimuli, it is difficult to say whether the observed attentional gain is due to stimulus category (social vs. non-social) or to the preceding cue's emotional loading (bee vs. baby).
In contrast to previous studies showing a speeded attentional advantage towards fearful facial expressions[20, 70], Fichtenholtz et al.[71] reported modulations of early ERPs by happy emotions, with fearful facial expressions leading to rather late attention-related effects on the P300 (550-650 ms PSO). This time-dependent dissociation in the ERP modulation of facial stimuli in response to diverging emotional valence (happy vs. fearful) was interpreted to reflect a reduction in selective attention towards approach-oriented stimuli (happy faces) and a special facilitation of spatial attentional processes towards fearful expressions, when cued by eye gaze[67].
Manipulation of participants’ attentional load by contrasting ERPs under single and dual-task performance showed that P100, VPP, N170, N300 and P300 amplitudes in response to emotional faces increased under single-task but not under dual-task conditions[72]. Additionally, P1 latencies were longer when attentional demands were increased (dual-task condition,[72]). Participants’ ERP responses in a face-in-the-crowd task revealed that fearful faces were detected faster than happy or angry faces[73]. This behavioural advantage for the detection of threat information was accompanied by an increased N2pc amplitude and shorter latency when compared to perception of angry or happy faces[73]. The N2pc component was preceded by an increase in amplitude at 160 ms PSO, signalling an increase in attention allocation in accordance with type of emotion (again favouring threatening faces,[73]). The authors concluded that fast and highly automatic neuronal processes bias attention towards threat loaded cues, even above mechanisms sensitive to anger and even when many social distractors were present, hence when attentional demands were high (crowd of faces,[73]). Findings of the studies by Luo et al.[72] and Feldmann-Wuestefeld et al.[73] emphasise that ERPs in response to social stimuli can be affected by attentional mechanisms associated with the task. However, both studies did not compare interactional effects of stimulus category (social vs. non-social) and task condition (single s. dual). The findings therefore give no information about whether social perception is less affected by attentional demands than non-social perception. Nevertheless, the idea that social stimuli access unique attentional mechanisms is supported by a study by Santos et al. ([74]; see Figure 1) in which the authors focused on the effects of object-based attention towards either facial expressions or spatial alignment of superimposed houses. They showed that speeded identification of fearful faces as indicated by an enhanced N170 was mediated by selective attention to faces. A sustained positivity starting 160 ms PSO was only present in the processing of fearful faces, even if attention was not directed explicitly to the stimulus[74]. This result underlines that human perception is characterised by distinct attentional mechanisms for socially relevant information (fearful faces) on a subconscious level. Future studies should follow Santos et al.’s[74] design with respect to the control of diverse types of task dependent attentional mechanisms (e.g., object-based vs. spatial attention) in order to broaden the understanding of how attention to social stimuli is modulated by task dependent constrains.
Implicit and explicit attention manipulation towards affective words and affective faces further showed that negative facial expressions modulated the N170 response independent of the focus of attention, whereas emotional words were only reflected in somewhat later occurring ERPs (EPN) if explicitly attended[75]. Late ERP responses (LPP modulation), were equally affected by both stimulus categories[75]. These results emphasise the distinctiveness of attentional processes in response to social stimuli. Whereas earlier ERPs seem to be sensitive to the stimulus category (social vs. non-social), late ERPs have been shown to be less selective.

3.2. Discussion

In summary, greater P100 amplitudes were reported for facial stimuli expressing fear compared to neutral faces (e.g.,[71, 74]), in response to low-spatially filtered unpleasant stimuli[60], and in response to happy faces when compared to fearful faces[67, 76]. Moreover, P100 amplitudes were increased for intact unfiltered emotional faces[60, 69] and when stimuli were previously cued[74].
Intact emotional faces elicited greater P2 amplitudes when compared to scrambled stimuli, with an increase in P200 amplitude being further associated with emotion congruency of previously presented cues and facial stimuli[69]. Moreover, angry as compared to neutral faces increased the P200 amplitude, corroborating Eimer and Holmes’[45] proposal that emotional faces are associated with greater P200 amplitudes.
Increased N100 amplitudes were reported for explicitly attended fearful faces[74] and happy expressions[67] when compared to neutral faces. Infants' Nc amplitudes were sensitive to face familiarity[54] and to unknown faces expressing anger[53] or fear[52]. The Nc amplitude enhancement was interpreted to reflect enhanced attention towards emotionally relevant stimuli[52]. This attentional enhancement is not detectable in ERP recordings until infants reach the age of 7 months[52]. N170 amplitudes correlated with the level of negativity expressed by a face[76] or with the general intensity level of emotion[46]. The majority of studies reported increased N170 responses to fearful faces (e.g.,[72, 77]), with only one study reporting increased N170 amplitudes in response to happy faces[72]. Increased N170 amplitudes were also found for neutral compared to pleasant expressions[60] and for strangers' compared to mothers' faces in children[54]. The diversity of emotions that impact on N170 modulation indicates that the N170 apparently responds to changes in facial features while being rather unspecific to actual emotional valence.
Later ERP responses were shown to reflect more elaborate processes such as empathy (P300 enhancement,[36]) or face familiarity (shorter P300 latencies,[39]). Nevertheless, P300 amplitudes were also associated with sustained emotion processing from faces[68, 72]. In contrast to Luo et al.[72], Fichtenholtz et al.[67] identified increased P300 amplitudes in response to happy as compared to fearful faces, an effect that was reversed for the P300 response in later time windows (550-650 ms PSO). Contrasting results, however, are in this case due to differing uses of terminology across research groups. It has to be stressed that a consensus about which peak and latency is associated with which ERP component is highly desirable but not always achieved.
To conclude, unique effects of attention and perception of social stimuli in the case of emotionally loaded facial stimuli are reflected best in the study by Santos et al.[74]. The recording of ERPs in response to either explicitly or implicitly attended emotional faces that were superimposed by a non-social stimulus (house) showed that fearful faces are associated with a speeded neuronal response immediately after stimulus-onset up to 160 ms PSO, even when participants were instructed to pay attention to the superimposed house. A special role of emotional social stimuli was also supported by a unique N170 response to emotional faces as compared to affective words[75]. This N170 enhancement was even seen, when participants did not directly attend to the facial expression per se[75].
Apparently, human information processing is marked by attentional mechanisms that favour and facilitate the processing of social stimuli. Again, early potentials (e.g., P100, N170) seem more sensitive to the social category of the stimulus than later potentials. ERP evidence supports the presence of a unique facilitation in the form of speeded and implicit attention to social stimuli (e.g.,[74, 75]). Future studies should aim to directly compare early ERPs associated with emotionally loaded social stimuli (e.g., faces) with those to emotionally significant objects (e.g., cultural signs, such as stop or first-aid signs), while keeping luminance and contrast levels constant across conditions.

4. Eyes and Eye Gaze

4.1. Review of ERPs Sensitive to Eyes and Eye Gaze

Similar to facial stimuli, information from eyes and eye gaze cues is a crucial source of information about social situations. The eye region has been shown to be the most attended-to facial area in social perceptual processes (e.g.,[78, 79]). Moreover, information from the eyes seems to be a prerequisite for emotion recognition (e.g.,[80, 81]), especially in processing signals of fear[82, 83].
However, some behavioural findings question the distinctiveness of eye stimuli in capturing participants' attention and show similar attentional enhancement using simple arrow stimuli (e.g.,[84, 85]). ERP responses to an ambiguous cue that was presented twice, introduced either as an eye in profile or as an arrowhead, showed that even though cued targets always produced a behavioural advantage in reaction time, irrespective of the social (eye gaze) or non-social (arrowhead) interpretation of the stimulus, the P1 was only enhanced in response to the cue if it was introduced as an eye ([86], see Figure 1). Given that the stimuli in the two conditions (eye vs. arrowhead) did not differ physically, low-level visual properties cannot account for the observed difference in sensory/attentional gain. The authors concluded that eyes are indeed associated with a special attentional gain mechanism. This gain mechanism is likely to derive from the higher amount of social information associated with eyes in comparison to arrows[87].
Feng et al.[88] found enhanced and earlier P120 responses when high-intensity eye whites were compared to low-intensity eye whites, resembling different levels of fear. This effect was absent for pixel-matched control squares, suggesting that this unique attentional effect of eye stimuli on the P120 is related to the social relevance of the stimuli[88].
ERP effects of inversion seen in face perception (see Section 2.1) were also found for eye stimuli[89]. Whereas the pure perception of eye gaze affected early attentional mechanisms (around 100 ms PSO), inversion of eyes was associated with N170 modulation comparable to that found in studies on face inversion (e.g.,[38]). In line with an attention-enhancing effect due to change detection[24], the detection of variations in eye movements was also reflected in N170 modulations[90]. In addition to N170 modulation by eye inversion, P100 latencies were delayed for inverted eyes in upright faces but unaffected when inverted eyes were presented in inverted faces[89].
Investigating interactional effects of emotion and gaze perception revealed a speeding-up of reaction times towards fearful faces and towards targets that were preceded by a leftward gazing character[71]. This behavioural effect corresponded to an increased P135 amplitude in response to fearful expressions and enhanced N190 amplitudes in response to rightward gazing faces. An enhanced P325 amplitude was interpreted to represent an attentional bias towards the left visual field in facial emotion processing[71]. Given that emotional expressions were more likely than eye gaze direction to enhance target processing, the authors concluded that the emotional state of a person is of primary importance during social attention, with actual gaze direction being processed subordinately[71]. In contrast, Holmes et al.[91] did not find any interactional effects on ERPs due to the combined presentation of gaze direction and emotional facial expression. Overall, enhanced ERPs to fearful compared to neutral facial expressions were interpreted to represent regular perceptual alerting effects[91]. A marked effect on the EDAN in reaction to the spatial direction of the gaze cues, however, was in line with the gaze cues' capacity to capture participants' attention automatically and to keep attention held at the gazed-at position[91]. Whereas Fichtenholtz et al.[71] used the social stimulus (face and gaze variation) as a cueing stimulus and two rectangles as target stimuli, Holmes et al.[91] presented neutral faces that moved their eyes either to the right or to the left, changing their facial expression to happy, to fearful, or not at all (neutral) upon the eye movement. The contrast in findings might therefore be due to differences in study design.
Modulating effects on the P350 and P500 were found to depend on the social context in which eye gaze stimuli were presented[92]. In this study, the social context was manipulated by presenting three face stimuli that gazed at each other, gazed at the participant, or avoided gaze exchange. Shorter P350 latencies were found for gaze cues that made eye contact with the participant compared to gaze cues that avoided gaze exchange. P350 latencies were also shorter for faces that avoided the gaze of the participant but made eye contact with each other, compared to faces that avoided eye gaze with both the participant and each other[92]. P500 amplitudes were smaller for faces that made eye contact with the participant and for faces that looked at each other but not at the participant, compared to the gaze-avoiding faces[92]. Apparently, information from social context is not reflected in early attention-related ERPs but only in later ERPs (P350, P500). Later ERP modulations (P250, P450, P600) were assumed to reflect higher-order cognitive processes associated with social interactions, such as the participants' level of interest or arousal associated with avoidance/non-avoidance of gaze exchange[90]. Finally, a gaze adaptation paradigm showed that a decreased negativity of the ERP response in the time window 250-350 ms PSO was associated with participants' overt judgements of gaze direction, representing an attentional after-effect of spatial adaptation to social cues (eye gaze), with a subsequent response of the LPC (400-600 ms PSO,[93]). The initial ERP signal of the after-effect of adaptation to eye gaze (250-350 ms PSO) was interpreted to reflect participants' sustained attention to eye gaze information associated with social exchange. According to the authors, the amplitudes and latencies of the subsequent LPC in response to eye gaze cues were similar to typical P300 responses. P300 modulation was formerly shown to reflect an attentional enhancement associated with a comparison of previous and current perceptual events in working memory (for a recent review, see[94]). Consequently, LPC modulation by eye gaze cues was interpreted to index novelty detection under perceptual spatial adaptation[93]. This observation, again, is in line with the idea that late ERPs are more sensitive to higher-order processes such as social information integration and organisation and less to alerting effects.

4.2. Discussion

In line with an enhancing effect of social cues on attention, increased P1 amplitudes were found for cued eyes[86] and variations in eye-white intensities[88]. Changes in eye gaze were associated with shorter P1 latencies for upright and intact eye-face contexts, when compared to eye gazes in inverted faces[89].
Directly gazing eyes elicited increased P3 amplitudes when preceded by right-ward gazing eyes compared to eye gaze cues that did not change their gaze direction[93]. Invalidly cued eye gazes elicited greater P300 amplitudes in general indicating possible task-dependent effects as described in previous sections[34, 71].
P400 amplitudes were increased in response to leftward gazing eyes in a social gaze exchange[90]. Moreover, smaller P500 amplitudes were observed when comparing processing of two persons gazing at each other to the perception of three persons that avoid gaze with each other and the participant[92]. P300, P400 and P500 modulations are therefore associated with the process of deducing social meaning from eye gaze[92], with EDAN modulations facilitating attention keeping mechanisms towards socially relevant locations[91].
N170 latencies were delayed for inverted eyes in upright faces[89] and eye blinks elicited decreased amplitudes as compared to closed, leftward or upward gazing eyes[90]. Finally, being exposed to a real person gazing directly at participants compared to inanimate dummies increased N170 amplitudes[95] showing sensitivity of the N170 for actual social context. However, the N170 was not modulated by the level of familiarity of eye gaze stimuli[39], adaptation to eye stimuli[93], and whether two or more facial stimuli shared attention as indicated by eye gaze direction[92].
Strong support for the idea of unique attentional processes for social compared to non-social perception is provided by the study by Tipper et al.[86] demonstrating how social loading of physically identical stimuli elicits distinct ERP modulations in favour of unique social perception. P1 amplitudes were only enhanced when participants were instructed that the presented cue represents an eye. Given that luminance, contrast and even shape of the stimuli were the same and only the social/non-social dimension was varied, it must have been the social information eliciting the unique attentional response.
Similarly, comparing ERPs associated with eye-white intensities to pixel-matched squares[88] enabled more straightforward comparisons between social and non-social stimuli. Again, early attention-related P120 modulations were found for the social stimulus (the eye-white stimuli) only, with the non-social matched control squares eliciting no unique ERP modulations. These two studies are rather exemplary for the usefulness of designs in which social stimulus processing is directly compared to non-social stimulus processing. In contrast to earlier potentials that seem to reflect enhanced perceptual mechanisms for social information, late ERP modulations (>300 ms PSO) were shown to rather reflect higher-order processes associated with social cognition (organisation and interpretation of social cues; e.g.,[90, 92]).

5. Human Posture and Biological Motion

5.1. Review of ERPs Sensitive to Human Posture and Biological Motion

As shown above, studies on social perception as assessed by face, eye, and eye gaze stimuli have grown extensively during the last decade. Less attention has been devoted to the perception of social information from other cues (e.g., hands, gestures, or body movements). A recent review by Peelen and Downing[96] indicated that visual body perception, similar to face perception, is marked by distinct attentional responses. N170 responses towards human bodies were similar to those known from studies on face perception[97, 98]. Inverted bodies[77] and changes in body configurality affected the N170 amplitude even across age groups (3-month-olds vs. adults[99, 100]). Moreover, earlier N170 latencies were found for human bodies when compared to faces, with latencies towards objects being delayed overall[77]. Earlier N170 latencies might therefore be interpreted to reflect the attentional modulation in response to the social dimension inherent to both human bodies and faces compared to objects, with further unique attentional responses to emotional content within each class of social stimuli (human bodies vs. faces).
Whereas attention to emotion from facial expressions was reflected in N170, P2, and N2 modulation, emotional content of body cues affected the VPP (the positive equivalent of the N170) and was reflected in a sustained negativity 300-500 ms PSO. Increased N170 amplitudes were found in response to point-light displays depicting standard body movements compared to randomly scrambled point-lights[101]. An early body movement detection mechanism was associated with a positive ERP shift between 100-200 ms PSO[101]. P400 amplitudes were decreased when a typical sequence of human movement was processed compared to random movements[101]. These findings underline that it is the social dimension of the stimulus (human movement) that modulated the attentional response (P100-P200, P400).
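The scrambled point-light control mentioned above is typically constructed so that each dot keeps its own local motion trajectory but is re-anchored at a random display position, destroying the global body configuration while preserving local motion. The following is a minimal sketch under that assumption; the array shapes and scatter range are illustrative, and the 13-dot walker is merely a conventional example.

    # Sketch of spatially scrambling a point-light display.
    import numpy as np

    def scramble_point_lights(trajectories, scatter=1.0, seed=0):
        """trajectories: array of shape (n_dots, n_frames, 2) with x/y positions."""
        rng = np.random.default_rng(seed)
        # Express each dot's motion relative to its own starting position...
        relative = trajectories - trajectories[:, :1, :]
        # ...then re-anchor every dot at a random location in the display.
        starts = rng.uniform(-scatter, scatter, size=(trajectories.shape[0], 1, 2))
        return relative + starts

    walker = np.random.default_rng(2).random((13, 120, 2))  # placeholder walker
    scrambled = scramble_point_lights(walker)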
In contrast to earlier studies reporting no effect of emotion on body perception[97, 98], Van Heijnsbergen et al.[102] observed earlier P1 latencies for intact fearful bodies, whereas scrambled versions of the stimuli did not modulate P1 responses[102]. The VPP showed faster latencies for bodies expressing fear as compared to neutral bodies. Again, the scrambled stimulus versions did not impact on N170 or VPP temporal and spatial topography[102]. The authors concluded that fear information derived from body cues is associated with similarly speeded neuronal responses as seen in fearful face processing[102]. Recently, Wang et al.[103] suggested a hierarchy in LPC amplitude enhancement according to the level of social content of complex scenes. Accordingly, the LPC was enhanced for scenes of persons interacting with one another compared to scenes depicting inanimate objects. Moreover, LPC amplitudes towards Theory of Mind (ToM) scenes were even higher than for both aforementioned stimulus categories. These results show that attentional mechanisms indexed by an increase in LPC amplitude reflect the complexity and amount of social information in a scene.
Finally, larger N400 peaks in response to abnormal sequences of an actual goal-directed body movement (eating) were interpreted by Reid and Striano[104] as reflecting a neuronal correlate sensitive to the semantic information of a typical body movement. This gain in N400 amplitude is likely to reflect an attentional enhancement of social perceptual mechanisms serving to distinguish untypical from typical social behaviour.

5.2. Discussion

The perception of human posture and body movements has been shown to be associated with diverse ERP modulations. Shorter P1 latencies were found for body postures expressing fear compared to neutral body postures[102]. Person perception and ToM-eliciting scenes increased P2 amplitudes[103]. If participants attended to human movement, N2 amplitudes increased[101]. Finally, enhanced N400 amplitudes were shown to be sensitive not only to violations of verbal semantics (e.g.,[105]) but also to violations of the semantics of biological motion[104].
Particularly informative for the question of unique attentional mechanisms for social versus non-social stimuli is the finding of earlier N170 latencies in response to bodies and faces as compared to objects[77], which can be taken as experimental evidence for a predisposition of human perception towards social information. Future studies should aim to directly compare ERPs in response to body postures, human movement, and complex social scenes with ERPs to other social stimuli (such as faces and eyes) and to non-social objects, in order to outline similarities in attentional processes across social stimulus types, corroborating the idea of generally prioritised attention in social perception.

6. General Discussion

The aim of the current literature review was to investigate whether the processing of social information modulates attentional mechanisms (indexed in ERP amplitudes and latencies) uniquely, that is, differently from non-social information. The previous literature on social perception on the basis of ERPs focused foremost on face or eye gaze processing, in particular on the processing of emotion from faces or eyes. These previous studies almost unanimously report selective attention-related effects of face processing on the N170 component (see review[7]).
More recently, the direct comparison of faces with objects was associated with unique attention-related enhancement of early ERPs (P1, N1; e.g.,[32]). This attentional enhancement is likely to be related to facilitating processes in response to social information, supporting the idea that human perception is predisposed towards social cues. Late ERPs such as the N400 (e.g.,[104]) or the P350 and P500 (e.g.,[92]), in contrast, reflect more elaborate socio-cognitive processes such as the evaluation or interpretation of social information[106, 107].
Recent studies on social perception have evolved methodologically, as manipulations of implicit or explicit attention to the stimulus (e.g.,[88]), spatial- or object-based attention to the stimulus category (e.g.,[74]), and stimulus complexity (e.g.,[46, 88, 103]) were introduced instead of measuring ERPs to passively presented social stimuli. These refinements of measurement made it possible to address methodological confounds unrelated to the processing of social information (e.g., the influence of luminance or contrast;[60, 61, 86]).
Given that unique attentional mechanisms were still found for social information after these methodological refinements (e.g.,[60, 86]), the evidence that human perceptual processes are distinct for social perception is strengthened. In addition, the focus on body perception has increased (for a recent review see[96]), acknowledging the fact that social information is not only related to faces or eye cues. This development towards increasing stimulus diversity in social perceptual studies enhanced the conceptual understanding of ERP functionality, as attentional mechanisms can now be compared across types of social stimuli. As an example, the comparison of attentional mechanisms in response to emotional cues across different stimuli (face stimuli vs. body posture vs. body movement/gestures) showed that the VPP, which appears to be the positive equivalent of the N170, seems to be the neural correlate of emotion processing from body cues[77, 102], whereas the N170 is the functional emotion-sensitive equivalent for facial cues (e.g.,[49, 72]).
Itier and Batty[24] addressed the methodological limitations of the discussed literature and questioned the suitability of cueing paradigms for attentional studies. The authors' criticism appears important, as they speculated that the reflexive nature of participants' orientation to social stimuli such as eyes might be a consequence of the task and not of the social dimension of the stimulus[24]. Future designs should therefore aim to address actual task-dependent influences on ERP modulation in order to clarify whether there are interactional effects of stimulus category (social vs. non-social) and task-dependent factors.
The majority of the reviewed studies used small sample sizes, and only a minority controlled for the influence of gender. Because of females' heightened sensitivity to emotionally arousing information[108], it would be desirable for studies on emotion processing from social cues to assess whether neuronal correlates differ between male and female participants. Yet, irrespective of emotional load, a recent ERP study revealed that females in general show increased cueing effects[109]. Hence, not only ERPs in response to social stimuli might be influenced by sample selection, but also non-social ERPs.
Finally, a consensus in terminology across research groups with regard to which amplitude and latency can be attributed to which ERP component is lacking. For example, whereas some (e.g.,[72]) refer to the P3 response as occurring at around 300 ms PSO, others (e.g.,[67]) refer to early (250-550 ms PSO) and late (550-650 ms PSO) P3 complex time windows. A comparison of ERP functionality across studies solely on the basis of ERP component labels is therefore difficult.
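One way to make such comparisons tractable is to report component measures together with explicit, pre-defined time windows. The sketch below illustrates how the same synthetic waveform yields different "P3" values depending on the window chosen; the windows mirror those cited above, while the waveform itself is an arbitrary placeholder.

    # Sketch: the measured mean amplitude depends on the chosen time window.
    import numpy as np

    fs = 500
    t = np.arange(0.0, 0.8, 1 / fs)
    # Synthetic positivity with an early and a late sub-peak (illustrative only)
    erp = (3e-6 * np.exp(-((t - 0.30) ** 2) / (2 * 0.04 ** 2))
           + 2e-6 * np.exp(-((t - 0.60) ** 2) / (2 * 0.05 ** 2)))

    def mean_amplitude(erp, t, start_s, end_s):
        """Mean amplitude within an explicit time window (in seconds)."""
        mask = (t >= start_s) & (t <= end_s)
        return erp[mask].mean()

    for label, (start, end) in {"P3 around 300 ms": (0.25, 0.35),
                                "early P3 window": (0.25, 0.55),
                                "late P3 window": (0.55, 0.65)}.items():
        print(f"{label}: {mean_amplitude(erp, t, start, end) * 1e6:.2f} uV")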
Nevertheless, the current review showed striking similarities of ERP modulations across a wide range of social stimuli, supporting a special role of social information processing. For instance, the amplitude of the early-peaking P1 was not only increased in response to inverted[34], misaligned[43], fearful (e.g.,[71, 74]), happy[67, 76], and cued faces[74] but also in response to abstract stimuli introduced as representing an eye[86] or to different levels of fear in eye whites[88]. Similarly, the N170 amplitude was increased in response to attended[25], ignored[32], inverted (e.g.,[25, 34]), misaligned[43], and schematic faces[28], but also to bodies[77] and to interactions of participants with an actor making eye contact[95]. Consequently, we propose that it is the social dimension of stimuli that leads to the described ERP modulations, rather than other stimulus characteristics, given that the discussed stimuli varied in their outer appearance but were accompanied by similar attentional responses. For illustration, Figure 1 depicts a visual comparison of physically different stimuli that nevertheless had comparable effects on the P1[28, 74, 86, 102].
In conclusion, current studies on social perception indicate that the processing of various types of social information shows functional consistency in ERP modulations on some components (e.g., P1, N170). These functional modulations of early ERPs can be found across different paradigms when using social information as target cues. As shown by paradigms that directly compared ERPs in response to social information with ERPs to non-social cues that were matched in luminance and contrast, these early modulations were uniquely associated with the perception of social information. Modulations of late potentials (e.g., P350, P500) are more likely to reflect social-cognitive processes such as integration of social context (e.g., social referencing effects as seen in social attention studies). Overall, the current review’s integrative approach of identifying similarities in ERP modulation in response to a diversity of social stimuli emphasised the existence of distinct attentional mechanisms for social information processing.
Figure 2. ERP components, their associated functions, and articles per ERP component citing modulation effects associated with social stimuli

References

[1]  Todorov, A., Fiske, S. T., & Prentice, D. A. (Eds.) (2011). Social neuroscience: Toward understanding the underpinnings of the social mind. New York, NY: Oxford University Press.
[2]  Gusnard, D. A., & Raichle, M. E. (2001). Searching for a baseline: Functional imaging and the resting human brain. Nature Reviews Neuroscience, 2(10), 685-694.
[3]  Jenkins, A. C., & Mitchell, J. P. (2011). How has cognitive neuroscience contributed to social psychological theory? In A. Todorov, S. T. Fiske, & D. A. Prentice (Eds.), Social neuroscience: Toward understanding the underpinnings of the social mind (pp. 3-13). New York, NY: Oxford University Press.
[4]  Fabiani, M., Gratton, G., & Federmeier, K. D. (2007). Event-related brain potentials: Methods, theory, and applications. Handbook of Psychophysiology (3rd ed.), 85-119.
[5]  Wijers, A., Mulder, G., Okita, T., Mulder, L. J. M., & Scheffers, M. (1989). Attention to color: An analysis of selection, controlled search, and motor activation, using event-related potentials. Psychophysiology, 26, 89-109.
[6]  Foecker, J., Hoelig, C., Best, A., & Roeder, B. (2011). Crossmodal interaction of facial and vocal person identity information: An event-related potential study. Brain Research, 1385, 229-245.
[7]  Palermo, R., & Rhodes, G. (2007). Are you always on my mind? A review of how face perception and attention interact. Neuropsychologia, 45(1), 75-92.
[8]  Ro, T., Russell, C., & Lavie, N. (2001). Changing faces: A detection advantage in the flicker paradigm. Psychological Science, 12, 94-99.
[9]  Halgren, E., Baudena, P., Heit, G., Clarke, J. M., & Marinkovic, K. (1994). Spatio-temporal stages in face and word processing. 1: Depth-recorded potentials in the human occipital, temporal and parietal lobes. Journal of Physiology, 88, 1-50.
[10]  Pegna, A. J., Khateb, A., Michel, C. M., & Landis, T. (2004). Visual recognition of faces, objects, and words using degraded stimuli: Where and when it occurs. Human Brain Mapping, 22, 300-311.
[11]  Purcell, D. G., & Stewart, A. L. (1988). The face-detection effect: Configuration enhances detection. Perception & Psychophysics, 43, 355-366.
[12]  Yamamoto, S., & Kashikura, K. (1999). Speed of face recognition in humans: An event-related potentials study. NeuroReport, 10(17), 3531-3534.
[13]  Calvo, M. G., & Esteves, F. (2005). Detection of emotional faces: Low perceptual threshold and wide attentional span. Visual Cognition, 12(1), 13-27.
[14]  Streit, M., Ioannides, A., Liu, L., Wolwer, W., Dammers, J., Gross, J., et al. (1999). Neurophysiological correlates of the recognition of facial expressions of emotion as revealed by magnetoencephalography. Cognitive Brain Research, 7, 481-491.
[15]  Pizzagalli, D., Regard, M., & Lehmann, D. (1999). Rapid emotional face processing in the human right and left brain hemispheres: An ERP study. NeuroReport, 10, 2691-2698.
[16]  Eger, E., Jednyak, A., Iwaki, T., & Skrandies, W. (2003). Rapid extraction of emotional expression: Evidence from evoked potential fields during brief presentation of face stimuli. Neuropsychologia, 41, 808-817.
[17]  Pourtois, G., Grandjean, D., Sander, D., & Vuilleumier, P. (2004). Electrophysiological correlates of rapid spatial orienting towards fearful faces. Cerebral Cortex, 14(6), 619-633.
[18]  Halgren, E., Raij, T., Marinkovic, K., Jousmaeki, V., & Hari, R. (2000). Cognitive response profile of the human fusiform face area as determined by MEG. Cerebral Cortex, 10(1), 69-81.
[19]  Rousselet, G. A., Macé, M. J.-M., & Fabre-Thorpe, M. (2003). Is it an animal? Is it a human face? Fast processing in upright and inverted natural scenes. Journal of Vision, 3, 440-455.
[20]  Batty, M., & Taylor, M. J. (2003). Early processing of the six basic facial emotional expressions. Cognitive Brain Research, 17(3), 613-620.
[21]  Esslen, M., Pascual-Marqui, R. D., Hell, D., Kochi, D., & Lehmann, D. (2004). Brain areas and the time course of emotional processing. NeuroImage, 21(4), 1189-1203.
[22]  Bentin, S., & Carmel, D. (2002). Accounts for the N170 face-effect: A reply to Rossion, Curran, & Gauthier. Cognition, 85, 197-202.
[23]  Itier, R. J., & Taylor, M. J. (2004). N170 or N1? Spatiotemporal differences between object and face processing using ERPs. Cerebral Cortex, 14(2), 132-142.
[24]  Itier, R. J., & Batty, M. (2009). Neural bases of eye and gaze processing: The core of social cognition. Neuroscience and Biobehavioral Reviews, 33(6), 843-863.
[25]  Ishizu, T., Ayabe, T., & Kojima, S. (2008). Configurational factors in the perception of faces and non-facial objects: An ERP study. International Journal of Neuroscience, 118(7), 955-966.
[26]  Kornmeier, J., & Bach, M. (2009). Object perception: When our brain is impressed but we do not notice it. Journal of Vision, 9(1), 7.
[27]  Kuefner, D., De Heering, A., Jacques, C., Palmero-Soler, E., & Rossion, B. (2010). Early visually evoked electrophysiological responses over the human brain (P1, N170) show stable patterns of face-sensitivity from 4 years to adulthood. Frontiers in Human Neuroscience, 3, 67.
[28]  Bentin, S., Golland, Y., Flevaris, A., Robertson, L. C., & Moscovitch, M. (2006). Processing the trees and the forest during initial stages of face perception: Electrophysiological evidence. Journal of Cognitive Neuroscience, 18(8), 1406-1421.
[29]  Caharel, S., Fiori, N., Bernard, C., Lalonde, R., & Rebai, M. (2006). The effects of inversion and eye displacements of familiar and unknown faces on early and late-stage ERPs. International Journal of Psychophysiology, 62(1), 141-151.
[30]  Haxby, J. V., Ungerleider, L. G., Clark, V. P., Schouten, J. L., Hoffman, E. A., & Martin, A. (1999). The effect of face inversion on activity in human neural systems for face and object perception. Neuron, 22(1), 189-199.
[31]  Rossion, B., & Jacques, C. (2008). Does physical interstimulus variance account for early electrophysiological face sensitive responses in the human brain? Ten lessons on the N170. Neuroimage, 39, 1959-1979.
[32]  Deiber, M. P., Rodriguez, C., Jaques, D., Missonnier, P., Emch, J., Millet, P., Gold, G., Giannakopoulos, P., & Ibanez, V. (2010). Aging effects on selective attention-related electroencephalographic patterns during face encoding. Neuroscience, 171(1), 173-186.
[33]  Taylor, M. J. (2002). Non-spatial attentional effects on P1. Clinical Neurophysiology, 113(12), 1903-1908.
[34]  Pesciarelli, F., Sarlo, M., & Leo, I. (2011). The time course of implicit processing of facial features: An event-related potential study. Neuropsychologia, 49(5), 1154-1161.
[35]  Ebner, N. C., He, Y., Fichtenholtz, H. M., McCarthy, G., & Johnson, M. K. (2011). Electrophysiological correlates of processing faces of younger and older individuals. Social Cognitive and Affective Neuroscience, 6(4), 526-535.
[36]  Ibanez, A., Hurtado, E., Lobos, A., Escobar, J., Trujillo, N., Baez, S., Huepe, D., Manes, F., & Decety, J. (2011). Subliminal presentation of other faces (but not own face) primes behavioral and evoked cortical processing of empathy for pain. Brain Research, 1398, 72-85.
[37]  Yin, R. K. (1969). Looking at upside-down faces. Journal of Experimental Psychology, 81 (1), 141.
[38]  Bentin, S., Allison, T., Puce, A., Perez, E., & McCarthy, G. (1996). Electrophysiological studies of face perception in humans. Journal of Cognitive Neuroscience, 8, 551-565.
[39]  Bobes, M. A., Quiñonez, I., Perez, J., Leon, I., & Valdés-Sosa, M. (2007). Brain potentials reflect access to visual and emotional memories for faces. Biological Psychology, 75 (2), 146-153.
[40]  Picton, T. W. (1992). The P300 wave of the human event-related potential. Journal of Clinical Neurophysiology, 9, 456-479.
[41]  Ruchkin, D. S., Johnson, R., Canoune, H. L., Ritter, W., & Hammer, M. (1990). Multiple sources of P3b associated with different types of information. Psychophysiology, 27, 157-176.
[42]  Tacikowski, P., & Nowicka, A. (2010). Allocation of attention to self-name and self-face: An ERP study. Biological Psychology, 84(2), 318-324.
[43]  Jacques, C., & Rossion, B. (2009). The initial representation of individual faces in the right occipito-temporal cortex is holistic: Electrophysiological evidence from the composite face illusion. Journal of Vision, 9(6), 11.
[44]  Liddell, B. J., Williams, L. M., Rathjen, J., Shevrin, H., & Gordon, E. (2004). A temporal dissociation of subliminal versus supraliminal fear perception: An event-related potential study. Journal of Cognitive Neuroscience, 16, 479-486.
[45]  Eimer, M., & Holmes, A. (2007). Event-related brain potential correlates of emotional face processing. Neuropsychologia, 45(1), 15-31.
[46]  Utama, N. P., Takemoto, A., Koike, Y., & Nakamura, K. (2009). Phased processing of facial emotion: An ERP study. Neuroscience Research, 64(1), 30-40.
[47]  Wong, T. K. W., Fung, P. C. W., McAlonan, G. M., & Chua, S. E. (2009). Spatiotemporal dipole source localization of face processing ERPs in adolescents: A preliminary study. Behavioral and Brain Functions, 5.
[48]  Holmes, A., Vuilleumier, P., & Eimer, M. (2003). The processing of emotional facial expression is gated by spatial attention: Evidence from event-related brain potentials. Cognitive Brain Research, 16, 174-184.
[49]  Williams, L. M., Palmer, D., Liddell, B. J., Song, L., & Gordon, E. (2006). The ‘when’ and ‘where’ of perceiving signals of threat versus non-threat. NeuroImage, 31 (1), 458-467.
[50]  Ito, T. A., Larsen, J. T., Smith, N. K., & Cacioppo, J. T. (1998). Negative information weighs more heavily on the brain: The negativity bias in evaluative categorizations. Journal of Personality and Social Psychology, 75(4), 887-900. doi: 10.1037/0022-3514.75.4.887
[51]  Smith, N. K., Cacioppo, J. T., Larsen, J. T., & Chartrand, T. L. (2003). May I have your attention, please: Electrocortical responses to positive and negative stimuli. Neuropsychologia, 41(2), 171-183. doi: 10.1016/S0028-3932(02)00147-1
[52]  Peltola, M. J., Leppänen, J. M., Mäki, S., & Hietanen, J. K. (2009). Emergence of enhanced attention to fearful faces between 5 and 7 months of age. Social Cognitive and Affective Neuroscience, 4(2), 134-142.
[53]  Kobiella, A., Grossmann, T., Reid, V. M., & Striano, T. (2008). The discrimination of angry and fearful facial expressions in 7-month-old infants: An event-related potential study. Cognition & Emotion, 22(1), 134-146.
[54]  Todd, R. M., Lewis, M. D., Meusel, L., & Zelazo, P. D. (2008). The time course of social-emotional processing in early childhood: ERP responses to facial affect and personal familiarity in a go-nogo task. Neuropsychologia, 46(2), 595-613.
[55]  Cunningham, W. A., Espinet, S. D., DeYoung, C. G., & Zelazo, P. D. (2005). Attitudes to the right – and left: Frontal ERP asymmetries associated with stimulus valence and processing goals. NeuroImage, 28, 827-834.
[56]  Lewis, M. D., Lamm, C., Segalowitz, S. J., Stieben, J., & Zelazo, P. D. (2006). Neurophysiological correlates of emotion regulation in children and adolescents. Journal of Cognitive Neuroscience, 18(3), 430-443. doi: 10.1162/jocn.2006.18.3.430
[57]  Taylor, M. J., Batty, M., & Itier, R. J. (2004). The faces of development: A review of early face processing over childhood. Journal of Cognitive Neuroscience, 16, 1426-1442.
[58]  Yuan, J., Hong, L., Antao, C., & Yuejia, L. (2007). Neural correlates underlying humans' differential sensitivity to emotionally negative stimuli of varying valences: An ERP study. Progress in Natural Science, 17, 115-121.
[59]  Yuan, J., Zhang, Q., Chen, A., Li, H., Wang, Q., Zhuang, Z., & Jia, S. (2007). Are we sensitive to valence differences in emotionally negative stimuli? Electrophysiological evidence from an ERP study. Neuropsychologia, 45(12), 2764-2771. doi: 10.1016/j.neuropsychologia.2007.04.018
[60]  Alorda, C., Serrano-Pedraza, I., Campos-Bueno, J. J., Sierra-Vazquez, V., & Montoya, P. (2007). Low spatial frequency filtering modulates early brain processing of affective complex pictures. Neuropsychologia, 45(14), 3223-3233.
[61]  Vlamings, P. H. J. M., Goffaux, V., & Kemner, C. (2009). Is the early modulation of brain activity by fearful facial expressions primarily mediated by coarse low spatial frequency information? Journal of Vision, 9(5), 12.
[62]  Kawasaki, H., Kaufman, O., Damasio, H., Damasio, A. R., Granner, M., Bakken, H., et al. (2001). Single-neuron responses to emotional visual stimuli recorded in human ventral prefrontal cortex. Nature Neuroscience, 4, 15-16.
[63]  Pizzagalli, D. A., Lehmann, D., Hendrick, A. M., Regard, M., Pascual-Marqui, R. D., & Davidson, R. J. (2002). Affective judgments of faces modulate early activity (approximately 160 ms) within the fusiform gyri. NeuroImage, 16, 663-677.
[64]  Eimer, M., & Holmes, A. (2002). An ERP study on the time course of emotional face processing. NeuroReport, 13 (4), 427-431.
[65]  Pourtois, G., Dan, E. S., Grandjean, D., Sander, D., & Vuilleumier, P. (2005). Enhanced extrastriate visual response to bandpass spatial frequency filtered fearful faces: Time course and topographic evoked-potentials mapping. Human Brain Mapping, 26(1), 65-79. doi: 10.1002/hbm.20130
[66]  Williams, L. M., Liddell, B. J., Rathjen, J., Brown, K. J., Gray, J., Phillips, M., . . . Gordon, E. (2004). Mapping the time course of nonconscious and conscious perception of fear: An integration of central and peripheral measures. Human Brain Mapping, 21(2), 64-74. doi: 10.1002/hbm.10154
[67]  Fichtenholtz, H. M., Hopfinger, J. B., Graham, R., Detwiler, J. M., & LaBar, K. S. (2007). Happy and fearful emotion in cues and targets modulate event-related potential indices of gaze-directed attentional orienting. Social Cognitive and Affective Neuroscience, 2(4), 323-333.
[68]  Kubota, J. T., & Ito, T. A. (2007). Multiple cues in social perception: The time course of processing race and facial expression. Journal of Experimental Social Psychology, 43 (5), 738-752.
[69]  Hirai, M., Watanabe, S., Honda, Y., Miki, K., & Kakigi, R. (2008). Emotional object and scene stimuli modulate subsequent face processing: An event-related potential study. Brain Research Bulletin, 77(5), 264-273.
[70]  Krolak-Salmon, P., Fischer, H., Vighetto, A., & Mauguiere, F. (2001). Processing of facial emotional expression: Spatio-temporal data as assessed by scalp event-related potentials. European Journal of Neuroscience, 13, 987-994.
[71]  Fichtenholtz, H. M., Hopfinger, J. B., Graham, R., Detwiler, J. M., & LaBar, K. S. (2009). Event-related potentials reveal temporal staging of dynamic facial expression and gaze shift effects on attentional orienting. Social Neuroscience, 4(4), 317-331.
[72]  Luo, W., Feng, W., He, W., Wang, N., & Luo, Y. (2010). Three stages of facial expression processing: ERP study with rapid serial visual presentation. NeuroImage, 49(2), 1857-1867.
[73]  Feldmann-Wuestefeld, T., Schmidt-Daffy, M., & Schuboe, A. (2011). Neural evidence for the threat detection advantage: Differential attention allocation to angry and happy faces. Psychophysiology, 48(5), 697-707.
[74]  Santos, I. M., Iglesias, J., Olivares, E. I., & Young, A. W. (2008). Differential effects of object-based attention on evoked potentials to fearful and disgusted faces. Neuropsychologia, 46(5), 1468-1479.
[75]  Fruehholz, S., Jellinghaus, A., & Herrmann, M. (2011). Time course of implicit processing and explicit processing of emotional faces and emotional words. Biological Psychology, 87(2), 265-274.
[76]  Fruehholz, S., Fehr, T., & Herrmann, M. (2009). Early and late temporo-spatial effects of contextual interference during perception of facial affect. International Journal of Psychophysiology, 74(1), 10.
[77]  Stekelenburg, J. J., & de Gelder, B. (2004). The neural correlates of perceiving human bodies: An ERP study on the body-inversion effect. NeuroReport, 15, 777-780.
[78]  Itier, R. J., Alain, C., Sedore, K., & McIntosh, A. R. (2007). Early face processing specificity: It’s in the eyes! Journal of Cognitive Neuroscience, 19(11), 1815-1826.
[79]  Janik, S. W., Wellens, A. R., Goldberg, M. L., & Dellosso, L. F. (1978). Eyes as the center of focus in the visual examination of human faces. Perceptual and Motor Skills, 47(3), 857-858.
[80]  Calder, A. J., Young, A. W., Keane, J., & Dean, M. (2000). Configural information in facial expression perception. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 527-551.
[81]  Ekman, P., & Friesen, W. V. (1978). Facial Action Coding System: Investigator’s Guide. Consulting Psychologists Press. Palo Alto, CA
[82]  Adolphs, R., Gosselin, F., Buchanan, T. W., Tranel, D., Schyns, P., & Damasio, A. R. (2005). A mechanism for impaired fear recognition after amygdala damage. Nature, 433(7021), 68-72.
[83]  Leppänen, J. M., Hietanen, J. K., & Koskinen, K. (2008). Differential early ERPs to fearful versus neutral facial expressions: A response to the salience of the eyes? Biological Psychology, 78(2), 150-158.
[84]  Ristic, J., & Kingstone, A. (2005). Taking control of reflexive social attention. Cognition, 94(3), B55-65.
[85]  Tipples, J. (2002). Eye gaze is not unique: Automatic orienting in response to uninformative arrows. Psychonomic Bulletin and Review, 9(2), 314-318.
[86]  Tipper, C. M., Handy, T. C., Giesbrecht, B., & Kingstone, A. (2008). Brain responses to biological relevance. Journal of Cognitive Neuroscience, 20(5), 879-891.
[87]  Birmingham, E., Bischof, W. F., & Kingstone, A. (2008). Gaze selection in complex social scenes. Visual Cognition, 16(2-3), 341-355.
[88]  Feng, W., Luo, W., Liao, Y., Wang, N., Gan, T., & Luo, Y. (2009). Human brain responsivity to different intensities of masked fearful eye whites: An ERP study. Brain Research, 1286, 147-154.
[89]  Doi, H., Sawada, R., & Masataka, N. (2007). The effects of eye and face inversion on the early stages of gaze direction perception - an ERP study. Brain Research, 1183, 83-90.
[90]  Brefczynski-Lewis, J. A., Berrebi, M. E., McNeely, M. E., Prostko, A. L., & Puce, A. (2011). In the blink of an eye: Neural responses elicited to viewing the eye blinks of another individual. Frontiers in Human Neuroscience, 5, 68.
[91]  Holmes, A., Mogg, K., Monje, G. L., & Bradley, B. P. (2010). Neural activity associated with attention orienting triggered by gaze cues: A study of lateralized ERPs. Social Neuroscience, 5(3), 285-295.
[92]  Carrick, O. K., Thompson, J. C., Epling, J. A., & Puce, A. (2007). It's all in the eyes: Neural responses to socially significant gaze shifts. NeuroReport, 18(8), 763-766.
[93]  Kloth, N., & Schweinberger, S. R. (2010). Electrophysiological correlates of eye gaze adaptation. Journal of Vision, 10(12), 17.
[94]  Polich, J. (2007). Updating P300: an integrative theory of P3a and P3b. Clinical Neurophysiology, 118, 2128-2148.
[95]  Ponkanen, L. M., Alhoniemi, A., Leppänen, J. M., & Hietanen, J. K. (2011). Does it make a difference if I have an eye contact with you or with your picture? An ERP study. Social Cognitive and Affective Neuroscience, 6(4), 486-494.
[96]  Peelen, M. V., & Downing, P. E. (2007). The neural basis of visual body perception. Nature Reviews Neuroscience, 8(8), 636-648.
[97]  Kovács, G., et al. (2005). Electrophysiological correlates of visual adaptation to faces and body parts in humans. Cerebral Cortex, 16, 742-753.
[98]  Mouchetant-Rostaing, Y., Giard, M. H., Delpuech, C., Echallier, J. F., & Pernier, J. (2000). Early signs of visual categorization for biological and non-biological stimuli in humans. NeuroReport, 11, 2521-2525.
[99]  Gliga, T., & Dehaene-Lambertz, G. (2005). Structural encoding of body and face in human infants and adults. Journal of Cognitive Neuroscience, 17, 1328-1340.
[100]  Hirai, M., & Hiraki, K. (2005). An event-related potentials study of biological motion perception in human infants. Cognitive Brain Research, 22, 301-304.
[101]  Krakowski, A. I., Ross, L. A., Snyder, A. C., Sehatpour, P., Kelly, S. P., & Foxe, J. J. (2011). The neurophysiology of human biological motion processing: A high-density electrical mapping study. NeuroImage, 56(1), 373-383.
[102]  van Heijnsbergen, C. C. R. J., Meeren, H. K. M., Grezes, J., & de Gelder, B. (2007). Rapid detection of fear in body expressions, an ERP study. Brain Research, 1186, 233-241.
[103]  Wang, Y. W., Lin, C. D., Yuan, B., Huang, L., Zhang, W. X., & Shen, D. L. (2010). Person perception precedes theory of mind: An event-related potential analysis. Neuroscience, 170(1), 238-246.
[104]  Reid, V. M., & Striano, T. (2008). N400 involvement in the processing of action sequences. Neuroscience Letters, 433(2), 93-97.
[105]  Luck, S.J. (2005). An Introduction to the Event-Related Potential Technique. The MIT Press, Cambridge.
[106]  Liu, D., Sabbagh, M. A., Gehring, W. J., & Wellman, H. M. (2004). Decoupling beliefs from reality in the brain: An ERP study of theory of mind. NeuroReport, 15, 991-995.
[107]  Puce, A., Epling, J. A., Thompson, J. C., & Carrick, O. K. (2007). Neural responses elicited to face motion and vocalization pairings. Neuropsychologia, 45, 93-106.
[108]  Proverbio, A. M., Zani, A., & Adorni, R. (2008). Neural markers of a greater female responsiveness to social stimuli. BMC Neuroscience, 9.
[109]  Feng, Q., Zheng, Y., Zhang, X., Song, Y., Luo, Y., Li, Y., & Talhelm, T. (2011). Gender differences in visual reflexive attention shifting: Evidence from an ERP study. Brain Research, 1401, 59-65. doi: 10.1016/j.brainres.2011.05.041
[110]  Botvinick, M. M., Braver, T. S., Barch, D. M., Carter, C. S., & Cohen, J. D. (2001). Conflict monitoring and cognitive control. Psychological Review, 108, 624-652.
[111]  Cassia, V. M., Kuefner, D., Westerlund, A., & Nelson, C. A. (2006). A behavioural and ERP investigation of 3-month-olds' face preferences. Neuropsychologia, 44(11), 2113-2125.
[112]  Ito, T. A., & Urland, G. R. (2005). The influence of processing objectives on the perception of faces: An ERP study of race and gender perception. Cognitive, Affective, & Behavioral Neuroscience, 5(1), 21-36.
[113]  Puce, A., Syngeniotis, A., Thompson, J. C., Abbott, D. F., Wheaton, K. J., & Castiello, U. (2003). The human temporal lobe integrates facial form and motion: Evidence from fMRI and ERP studies. NeuroImage, 19(3), 861-869.
[114]  Harter, M. R., Miller, S. L., Price, N. J., LaLonde, M. E., & Keyes, A. L. (1989). Neural processes involved in directing attention. Journal of Cognitive Neuroscience, 1, 223-237.
[115]  Praamstra, P., Boutsen, L., & Humphreys, G. W. (2005). Frontoparietal control of spatial attention and motor intention in human EEG. Journal of Neurophysiology, 94(1), 764-774.
[116]  de Haan, M., Johnson, M. H., & Halit, H. (2003). Development of face-sensitive event-related potentials during infancy: A review. International Journal of Psychophysiology, 51(1), 45-58.
[117]  Key, A. P. F., Stone, W., & Williams, S. M. (2009). What do infants see in faces? ERP evidence of different roles of eyes and mouth for face perception in 9-month-old infants. Infant and Child Development, 18(2), 149-162.
[118]  Hietanen, J. K., Leppänen, J. M., Nummenmaa, L., & Astikainen, P. (2008). Visuospatial attention shifts by gaze and arrow cues: An ERP study. Brain Research, 1215, 123-136.
[119]  Righart, R., Burra, N., & Vuilleumier, P. (2011). Face perception in the mind’s eye. Brain Topography, 24(1), 9-18.
[120]  Kok, A. (2001). On the utility of P3 amplitude as a measure of processing capacity. Psychophysiology, 38, 557-577.
[121]  Sabbagh, M. A., Moulson, M. C., & Harkness, K. L. (2004). Neural correlates of mental state decoding in human adults: An event-related potential study. Journal of Cognitive Neuroscience, 16(3), 415-426.
[122]  Hopf, J. M., & Mangun, G. R. (2000). Shifting visual attention in space: An electrophysiological analysis using high spatial resolution mapping. Clinical Neurophysiology, 111(7), 1241-1257.
[123]  Pollak, S. D., Klorman, R., Thatcher, J. E., & Cicchetti, D. (2001). P3b reflects maltreated children’s reactions to facial displays of emotion. Psychophysiology, 38, 267-274.