Abstract
Face perception plays an important role in our daily social interactions, as it is essential to recognizing emotions. The N170 event-related potential (ERP) component has been widely identified as a major face-sensitive neuronal marker. However, despite extensive investigation of this electroencephalographic pattern, there is still no agreement regarding its sensitivity to the content of facial expressions.

Here, we aim to clarify the EEG signatures of the recognition of facial expressions by investigating ERP components that we hypothesize to be associated with this cognitive process. We asked whether the recognition of facial expressions is encoded by the N170 as well as at the level of the P100 and P250. To test this hypothesis, we analysed differences in amplitudes and latencies for these three ERP components in a sample of 20 participants, using a visual paradigm requiring explicit recognition of happy, sad and neutral faces. The facial cues were explicitly controlled to vary only in their mouth and eye components. We found that non-neutral emotional expressions elicit a response difference in the amplitude of the N170 and P250, but not the P100, thereby excluding a role for low-level factors.

Our study sheds new light on the controversy over whether emotional facial expressions modulate early visual response components, which have often been analysed separately. The results support the tenet that neutral and emotional faces evoke distinct N170 patterns, but go further by revealing that this is also true for the P250, unlike the P100.
Publisher
Cold Spring Harbor Laboratory