Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.703013
Title: How do people with autism process multisensory information?
Author: Poole, Daniel
ISNI:       0000 0004 6059 9684
Awarding Body: University of Manchester
Current Institution: University of Manchester
Date of Award: 2016
Availability of Full Text: Access through EThOS; Access through Institution
Abstract:
Our experience of the world is based on information received from multiple sensory sources. To create a coherent representation of our environment, information from the different senses is combined in a way that maximises the reliability of the resulting percept, and typically only stimuli that occur close together in time and space interact. Ineffective multisensory processing has been proposed as a possible explanation for the sensory differences experienced by many people with Autism Spectrum Condition (ASC). However, studies to date have produced mixed findings and have generally focused on the interaction between the visual and auditory modalities. The aim of the work presented in this thesis was to improve the characterisation of multisensory processing in adults with ASC, exploring the interaction between vision and touch for the first time. Specifically, we compared the temporal and spatial limits of multisensory processing, and the optimal combination of multisensory cues, between participants with ASC and matched controls. In Experiment 1, performance on a visual-haptic size judgement task was compared with the predictions of a statistically optimal model in which unisensory cues are combined additively, with the weight of each cue determined by its reliability. For both participants with ASC and controls, multisensory performance differed from the predictions of this optimal model but resembled a non-optimal model in which participants switch stochastically between cues from trial to trial. The commonly used crossmodal congruency task was adapted to explore individual differences in visual-tactile interactions. This task was used to examine how the influence of visual distractors on tactile judgements is modulated by their relative timing (Experiment 4). As in controls, participants with ASC exhibited interactions only for simultaneous stimuli. Experiment 6 further explored uni- and multisensory temporal sensitivity across vision, touch and hearing. No between-group differences were observed, suggesting that the temporal processing of crossmodal stimuli is typical in adults with ASC. The spatial limits of visual-tactile interactions were also investigated. A visual distractor positioned far from the stimulated hand influenced tactile judgements in the group with ASC, but not in controls (Experiments 5 and 8). However, this reduced spatial modulation was not observed for purely visual judgements (Experiment 9), and both groups of participants benefited from spatial separation in an alternate visual-tactile task (Experiment 7). These findings suggest that visual-tactile selective attention is affected in ASC. This work has improved the characterisation of multisensory processing in ASC, including a number of previously unexplored processes. It is hoped that work of this nature might ultimately lead to the development of effective sensory interventions for people with ASC.
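The "statistically optimal" model described in the abstract (cues combined additively, each weighted by its reliability) corresponds to the standard reliability-weighted, maximum-likelihood cue-combination scheme used in visual-haptic integration research. As a minimal illustrative sketch, not a reproduction of the thesis's exact model or parameters, for a visual size estimate \hat{S}_V and a haptic estimate \hat{S}_H with noise variances \sigma_V^2 and \sigma_H^2:

    \hat{S}_{VH} = w_V \hat{S}_V + w_H \hat{S}_H, \qquad
    w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad
    w_H = \frac{1/\sigma_H^2}{1/\sigma_V^2 + 1/\sigma_H^2}

    \sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \le \min(\sigma_V^2, \sigma_H^2)

Under this scheme the combined estimate is never noisier than the better single cue. By contrast, under the stochastic cue-switching account mentioned in the abstract, the observer relies on only one cue on any given trial, so the variance of multisensory judgements is a mixture of the unisensory variances and need not show this reduction.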
Supervisor: Not available
Sponsor: Medical Research Council
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.703013
DOI: Not available