Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.574546
Title: Visual-haptic integration during tool use
Author: Takahashi, Chie
Awarding Body: Bangor University (Prifysgol Bangor)
Current Institution: Bangor University
Date of Award: 2012
Abstract:
To integrate visual and haptic information effectively, the brain should combine only information that refers to the same object. It must therefore solve a 'correspondence problem': determining whether or not two signals relate to the same object. This could be achieved by considering the similarity of the two sensory signals in time and space. For example, if two size estimates are spatially separated or conflicting, it is unlikely that they originate from the same object, so sensory integration should not occur. Humans, however, are adept at using tools such as pliers, which can systematically change the spatial relationship between (visual) object size and the opening of the hand. Here we investigate whether and how the brain solves this visual-haptic correspondence problem during tool use. In a series of psychophysical experiments we measured object-size discrimination performance and compared it to statistically optimal predictions derived from a computational model of sensory integration. We manipulated the spatial offset between seen and felt object positions, as well as the relative gain between object size and hand opening. When a tool was used, we changed these spatial properties by manipulating tool length and, for a pliers-like tool, the pivot position. We found that the brain integrates visual and haptic information near-optimally when using tools, independent of spatial offset and size conflict between the raw sensory signals, but only when the hand opening was appropriately remapped onto object coordinates by the tool geometry. This suggests that visual-haptic integration is based not on the similarity between raw sensory signals but on the similarity between the distal causes of the visual and haptic estimates. We also showed that both the perceived size from haptics and the haptic reliability changed with tool gain. Moreover, the cue weights for the same object size were altered by the tool geometry, suggesting that the brain dynamically takes spatial changes into account when using a tool. These findings can be explained within a Bayesian framework of multisensory integration. We conclude that the brain takes the dynamics and geometry of tools into account, allowing the visual-haptic correspondence problem to be solved correctly under a range of circumstances. We explore the theoretical implications of this for understanding sensory integration, as well as practical implications for the design of visual-haptic interfaces.
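The 'statistically optimal' predictions referred to above correspond, in this literature, to the standard minimum-variance (maximum-likelihood) cue-combination model. A minimal sketch follows, with the symbols \hat{s}_V, \hat{s}_H, \sigma_V, \sigma_H and the tool gain g introduced here purely for illustration (they are not defined in the record itself):

\[
\hat{s}_{VH} = w_V \hat{s}_V + w_H \hat{s}_H, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_H^2}, \qquad
w_H = 1 - w_V,
\]
\[
\sigma_{VH}^2 = \frac{\sigma_V^2\,\sigma_H^2}{\sigma_V^2 + \sigma_H^2} \;\le\; \min\!\left(\sigma_V^2,\, \sigma_H^2\right).
\]

Under this model, a pliers-like tool with gain g (object size = g × hand opening) remaps the haptic hand-opening estimate \hat{h} into object coordinates as \hat{s}_H = g\,\hat{h}, with variance \sigma_H^2 = g^2 \sigma_h^2. Both the haptic reliability and the cue weights therefore change with tool geometry, consistent with the findings reported in the abstract.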
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.574546
DOI: Not available