Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.622079
Title: Audiovisual scene synthesis
Author: Mital, Parag Kumar
ISNI: 0000 0004 5360 975X
Awarding Body: Goldsmiths College (University of London)
Current Institution: Goldsmiths College (University of London)
Date of Award: 2014
Availability of Full Text:
Access through EThOS: Full text unavailable from EThOS. Please try access through the institution below.
Access through Institution:
Abstract:
This thesis attempts to open a dialogue around fundamental questions of perception such as: how do we represent our ongoing auditory or visual perception of the world using our brain; what could these representations explain and not explain; and how can these representations eventually be modeled by computers? Rather than answer these questions scientifically, we will attempt to develop a computational arts practice presenting these questions to participants. The approach this thesis takes is computational scene synthesis: a computationally generative collage process where the units of the collage are built using perceptually-inspired representations. We explain in detail how scene synthesis is built and relate it to an existing lineage of collage-based practitioners. Then, working in the auditory and visual domains separately, in order to bring questions of perception to the experience of the artwork, this thesis makes significant interdisciplinary strides: from reviewing fundamental issues in perception in terms of experimental psychology and cognitive neuroscience, to formulating and developing perceptually-inspired computational models of large databases of audiovisual material, to finally integrating these models into a computationally generative collage-based arts practice. Two final practical outputs using audiovisual scene synthesis will be explored: (1) a short film series which attempts to recreate the number 1 video of the week on YouTube using only the audiovisual content from the remaining top 10 videos; and (2) a real-time augmented reality experience, presented through a virtual reality headset and headphones, that presents a scene synthesis of a participant's surroundings using only previously learned audiovisual fragments. Results from both outputs demonstrate the ability of scene synthesis to provoke meaningful engagements with one's own process of perception. The results further demonstrate that scene synthesis is capable of highlighting both theoretical and practical gaps in our current understanding of human perception and its computational implementations.
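The collage process described in the abstract resembles concatenative, corpus-based synthesis, in which fragments of a target scene are matched against a corpus of previously learned fragments in a feature space. The following is a minimal sketch of that matching step only, assuming perceptually-inspired feature vectors have already been extracted elsewhere; the function name, array shapes, and the use of Euclidean nearest neighbours are illustrative assumptions, not the thesis's actual model.

```python
import numpy as np

def synthesize_scene(target_features: np.ndarray,
                     corpus_features: np.ndarray) -> np.ndarray:
    """For each target fragment, return the index of the closest corpus
    fragment under Euclidean distance in feature space (hypothetical sketch)."""
    # Pairwise distances between every target and every corpus fragment:
    # shape (num_target, num_corpus).
    distances = np.linalg.norm(
        target_features[:, None, :] - corpus_features[None, :, :], axis=-1)
    # Index of the nearest corpus fragment for each target fragment.
    return distances.argmin(axis=1)

# Illustrative usage with random values standing in for real audiovisual descriptors.
rng = np.random.default_rng(0)
target = rng.normal(size=(8, 16))    # 8 target fragments, 16-D features
corpus = rng.normal(size=(100, 16))  # 100 learned corpus fragments
matches = synthesize_scene(target, corpus)
print(matches)  # chosen corpus fragment for each target fragment
```

In a full collage system, the matched corpus fragments would then be concatenated or layered to re-render the target scene; that rendering stage, and the perceptual feature extraction itself, are beyond the scope of this sketch.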
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.622079
DOI: Not available