Specific EEG/ERP responses to animated facial expressions in virtual reality environments

M Simões1, C Amaral1, P Carvalho2, M Castelo-Branco1
1IBILI, Faculty of Medicine, University of Coimbra, Portugal
Visual event-related potentials (ERPs) to facial expressions (FEs) have usually been studied with static stimuli presented after a nonspecific black screen serving as the baseline. However, when studying social events, the limited ecological validity of the environment and stimuli can bias the results. Virtual reality offers a way to improve ecological validity while retaining stimulus control. We propose a new approach to studying responses to FEs: a human avatar in a virtual environment (a plaza) performs the six universal FEs over time. The setup consisted of a 3D projection system coupled with a precision position tracker. Subjects (N=7, mean age=25.6y) wore a 32-channel EEG/ERP cap together with 3D glasses and two infrared emitters for position tracking. The environment adapted in real time to each subject's position, creating a feeling of immersion. Each animation consisted of an instantaneous morph to the FE, which was maintained for one second before 'unmorphing' back to the neutral expression. The inter-stimulus interval (ISI) was set to three seconds. Over the occipito-temporal region, we found an asymmetrical negativity at [200-300] ms after stimulus onset, followed by a positivity over the centro-parietal region at a latency of [450-600] ms. Given the neutral-face baseline, these observations suggest two distinct neural processes specific to facial expressions.
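To make the stimulus protocol concrete, the following is a minimal sketch of the trial timing described above (instantaneous morph, one-second hold, unmorph to neutral, three-second ISI). The `avatar` and `eeg` objects and their methods are hypothetical stand-ins for the virtual-environment and acquisition software; the abstract does not specify the actual implementation.

```python
import random
import time

# The six universal facial expressions performed by the avatar.
EXPRESSIONS = ["anger", "disgust", "fear", "happiness", "sadness", "surprise"]

MORPH_HOLD_S = 1.0   # expression is maintained for one second
ISI_S = 3.0          # inter-stimulus interval of three seconds


def run_block(avatar, eeg, n_trials):
    """Present n_trials expression animations with the reported timing.

    `avatar` and `eeg` are assumed interfaces (not from the study):
    avatar.morph_to(name), avatar.morph_to_neutral(), eeg.send_trigger(name).
    """
    for _ in range(n_trials):
        expression = random.choice(EXPRESSIONS)
        eeg.send_trigger(expression)   # mark stimulus onset in the EEG record
        avatar.morph_to(expression)    # instantaneous morph to the expression
        time.sleep(MORPH_HOLD_S)       # hold the expression for one second
        avatar.morph_to_neutral()      # 'unmorph' back to the neutral face
        time.sleep(ISI_S)              # wait out the inter-stimulus interval
```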
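The two reported effects could be quantified by epoching the EEG around expression onset and averaging within the stated windows. Below is a sketch of such an analysis using the MNE-Python library; the file name, event extraction, and channel selections are illustrative assumptions, not the montage or pipeline used in the study.

```python
import mne

# Hypothetical recording; triggers assumed stored as annotations.
raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
events, event_id = mne.events_from_annotations(raw)

# Epoch around expression onset; the pre-onset neutral face is the baseline.
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(-0.2, 0.0), preload=True)
evoked = epochs.average()

# Illustrative channel groups for the two regions of interest.
occipito_temporal = ["P7", "P8", "PO9", "PO10"]
centro_parietal = ["CP1", "CP2", "Cz", "Pz"]

# Mean amplitude in the two windows reported in the abstract.
neg = evoked.copy().pick(occipito_temporal).crop(0.20, 0.30).data.mean()
pos = evoked.copy().pick(centro_parietal).crop(0.45, 0.60).data.mean()
print(f"occipito-temporal [200-300] ms mean amplitude: {neg:.2e} V")
print(f"centro-parietal  [450-600] ms mean amplitude: {pos:.2e} V")
```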