Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.598105
Title: Multimodal emotion perception from facial and vocal signals
Author: Cox, A. G.
Awarding Body: University of Cambridge
Current Institution: University of Cambridge
Date of Award: 2006
Availability of Full Text:
Full text unavailable from EThOS.
Please contact the current institution’s library for further details.
Abstract:
The perception of emotion in other people is a fundamental part of social communication. Emotional expressions are often multimodal in nature and, as with human speech, both auditory and visual components are used for comprehension. To date, however, the majority of emotion research has focused on the perception of emotion from facial or vocal expressions in isolation. This thesis investigated the behavioural and neural consequences of perceiving emotion from facial and vocal emotional signals simultaneously. Initial experiments demonstrated that a congruent, but unattended, vocal expression produced faster emotion-categorisation decisions to facial expressions, relative to incongruent or neutral voices. Similarly, simultaneously presented facial expressions had the same effect on the categorisation of vocal expressions. Subsequent experiments showed that other pairings of emotional stimuli (vocal expressions and emotion pictures; facial expressions and emotion pictures) did not have bi-directional effects on each other, but rather asymmetric effects consistent with interactions between these stimuli at post-perceptual stages of processing. Facial and vocal signals form a naturalistic pairing, and evidence that these signals are integrated at a ‘perceptual’ level was provided by a final experiment using functional magnetic resonance imaging. Congruent facial-vocal pairings produced enhanced activity in the superior temporal sulcus, a region implicated in cross-modal integration of sensory inputs. The data from this thesis suggest that facial and vocal signals of emotion are automatically integrated at a perceptual processing stage to create a single unified percept that facilitates social communication.
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.598105
DOI: Not available