Event-related potentials for the implicit and explicit processing of emotional facial expressions as basic level- and subordinate level-stimulus categories

Date

2014

Authors

Nomi, Jason S., author
Troup, Lucy J., advisor
Davalos, Deana, advisor
Kraiger, Kurt, committee member
Draper, Bruce A., committee member

Abstract

The two dominant models of face perception propose that independent mechanisms are responsible for initial face perception (discriminating a face from an object), identity recognition (recognizing a specific face), and emotional expression perception (processing an expression). The models differ in how these mechanisms relate: Bruce and Young (1986) propose a linear model in which identity recognition and expression perception operate in parallel after initial face perception, while Haxby, Hoffman, and Gobbini (2000) propose an interactive model in which all three mechanisms interact within a non-linear core system. Event-related potential (ERP) studies demonstrate that initial face perception is reflected by the occipito-temporal P1 and N170, while identity recognition is reflected by the anterior N250. Some studies have found an influence of expression on the P1 and N170 while others have not, providing mixed support for each model. The current study examined how facilitation of basic-level and subordinate-level category processing of emotional expressions may have influenced the results of previous studies. Research on stimulus category processing demonstrates that faces are typically processed at the subordinate level (e.g., my friend "Joe" rather than the basic level of "face"), while objects are processed at the basic level (e.g., "car" rather than the subordinate level of "Nissan Sentra"). However, there has been little research exploring how the processing of expressions may be influenced by category processing. In Experiment 1, happy, neutral, and sad expressions were presented in isolation to facilitate processing of expressions at the basic level (all faces were unfamiliar, with expression being the only varying feature), while in Experiment 2 the same expressions were presented alongside cars, houses, and butterflies to facilitate subordinate-level processing (basic level: faces vs. objects; subordinate level: happy, neutral, and sad expressions, and cars, houses, and butterflies). Experiment 1 found P1 and N170 modulations by happy, neutral, and sad expressions that were not influenced by the implicit or explicit processing condition; no such modulations were found in Experiment 2. Additionally, both experiments showed early expression-related ERP modulations in the 30-80 ms range, with explicit processing mediating the face and object differences found in that range in Experiment 2. The results of the current study support the Haxby, Hoffman, and Gobbini model, in which expression perception mechanisms can modulate the early ERP components reflecting initial face perception, and they further show that this modulation depends on the presence or absence of comparison object stimuli. When comparison stimuli were absent (Experiment 1), expressions processed as a basic-level stimulus category mainly influenced ERPs in the 140-400 ms range, reflecting enhanced processing of the specific expression. When comparison object stimuli were present (Experiment 2), expressions processed as a subordinate-level stimulus category mainly influenced ERPs in the 30-140 ms range, reflecting quicker categorization driven by the presence of object stimuli rather than processing of the specific emotional expression.

Subject

category processing
face perception
ERPs
emotional expressions