Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.626630
Title: Contextual modulations of visual perception and visual cortex activity in humans
Author: de Haas, B.
ISNI: 0000 0004 5362 6824
Awarding Body: University College London (University of London)
Current Institution: University College London (University of London)
Date of Award: 2014
Availability of Full Text:
Access from EThOS:
Full text unavailable from EThOS.
Access from Institution:
Abstract:
Visual perception and neural processing depend on more than retinal stimulation alone. They are modulated by contextual factors such as cross-modal input, the current focus of attention, or previous experience. In this thesis I investigate ways in which these factors affect vision. A first series of experiments investigates how co-occurring sounds modulate vision, with an emphasis on temporal aspects of visual processing. In three behavioral experiments I find that participants are unable to ignore the duration of co-occurring sounds when making visual duration judgments. Furthermore, prolonged sound duration also improves detection sensitivity for visual stimuli, so the effect extends beyond duration judgments per se. I go on to test a cross-modal illusion in which the perceived number of flashes in a rapid series is affected by the number of co-occurring beeps (the sound-induced flash illusion). Combining data from structural magnetic resonance imaging (MRI) and a behavioral experiment, I find that individual proneness to this illusion is linked to lower grey matter volume in early visual cortex. Finally, I test how co-occurring sounds affect the cortical representation of more natural visual stimuli. A functional MRI (fMRI) experiment investigates patterns of activation evoked by short video clips in visual areas V1-3. The trial-by-trial reliability of such patterns is reduced for videos accompanied by mismatching sounds. Turning from cross-modal effects to more intrinsic sources of contextual modulation, I test how attention affects visual representations in V1-3. Using fMRI and population receptive field (pRF) mapping, I find that high perceptual load at fixation coarsens spatial tuning for the surrounding visual field and radially repels pRFs. In a final behavioral and fMRI experiment I find that the perception of face features is modulated by retinal stimulus location. Eye and mouth stimuli are recognized better, and evoke more discriminable patterns of activation in face-sensitive patches of cortex, when they are presented at canonical locations. Taken together, these experiments underscore the importance of contextual modulation for vision, reveal some previously unknown modulating factors, and point to possible neural mechanisms underlying them. Finally, they argue for an understanding of vision as a process that uses all available cues to arrive at optimal estimates of the causes of sensory events.
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.626630
DOI: Not available