Use this URL to cite or link to this record in EThOS: http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.572561
Title: Computation in balanced networks
Author: Barrett, D.
Awarding Body: University College London (University of London)
Current Institution: University College London (University of London)
Date of Award: 2012
Abstract:
In the cortex, neural activity is noisy, irregular and asynchronous – a consequence of dynamically balancing excitatory and inhibitory input to neurons. Despite this noisy balancing, the brain is capable of performing a vast array of incredibly difficult computations. This is mysterious, because noise and irregularity are usually associated with poor performance. We ask: how can the cortex compute in a noisy background? The observation of orientation tuning in the visual cortex suggests that structured connectivity is important. We propose a unifying model of cortical connectivity in which weak structured connectivity is embedded in strong random background connectivity. This connectivity can simultaneously produce orientation tuning and irregular, asynchronous dynamics. We find that structure can boost computational performance by amplifying orientation tuning. We then ask: why is cortical activity noisy? Surprisingly, we find that balanced network noise can also improve computational performance, by increasing the computational operating range of the cortex. The mechanism is simple: noise allows very large signals to become available for computation, despite the small operating range of individual neurons. However, this improvement comes at a price: for small signals, balanced network noise degrades performance. This exemplifies a performance–stability trade-off. As a corollary, we find that the contrast invariance of orientation-tuned cells in the visual cortex is a consequence of this computational stability. Finally, we ask: does noise co-variability impair computation? It is known that correlated variability can degrade the computational performance of a network, especially if many neurons are strongly co-variant. We find that correlations in balanced networks are weak, but not weak enough to be ignored in computation, because they affect decoding.
Together, these results constitute an important link between neural computation and dynamics, opening the door to a reconciliation between conflicting theories of randomness and structure.
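The balanced regime summarised above can be illustrated with a minimal simulation in the style of classic binary-neuron balanced networks: each neuron receives many strong random excitatory and inhibitory inputs of order 1/√K, and the two roughly cancel, leaving irregular fluctuation-driven activity. This is an illustrative sketch only; the network sizes, synaptic strengths and external drive below are assumed parameters, not taken from the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 400                  # neurons per population (assumed size, for speed)
K = 40                   # average number of inputs per neuron
J = 1.0 / np.sqrt(K)     # synapses scale as O(1/sqrt(K)): individually strong
theta = 1.0              # firing threshold

# Sparse random connectivity: ~K excitatory inputs of strength +J and
# ~K inhibitory inputs of strength -2J per neuron (inhibition-dominated).
W_E = J * (rng.random((N, N)) < K / N)
W_I = -2.0 * J * (rng.random((N, N)) < K / N)
np.fill_diagonal(W_E, 0.0)
np.fill_diagonal(W_I, 0.0)

# External drive of order sqrt(K)*J = O(1), so the *net* input stays O(1)
# even though excitation and inhibition are each O(sqrt(K)).
ext = 1.2 * np.sqrt(K) * J

# Random initial binary states for the E and I populations.
sE = (rng.random(N) < 0.1).astype(float)
sI = (rng.random(N) < 0.1).astype(float)

T = 2000
rates = []
for t in range(T):
    # Asynchronous updates: one randomly chosen neuron per population per step.
    i = rng.integers(N)
    hE = W_E[i] @ sE + W_I[i] @ sI + ext
    sE[i] = float(hE > theta)
    j = rng.integers(N)
    hI = W_E[j] @ sE + W_I[j] @ sI + ext
    sI[j] = float(hI > theta)
    rates.append(sE.mean())

# After a transient, activity settles at a low, fluctuating level: the large
# excitatory and inhibitory inputs cancel on average, and firing is driven
# by residual fluctuations rather than by the mean input.
mean_rate = np.mean(rates[T // 2 :])
print(f"mean E activity: {mean_rate:.2f}")
```

Despite every individual synapse being strong, the population neither saturates nor dies out: the dynamics self-adjust so that the mean input hovers near threshold, which is the hallmark of the balanced state.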
Supervisor: Not available
Sponsor: Not available
Qualification Name: Thesis (Ph.D.)
Qualification Level: Doctoral
EThOS ID: uk.bl.ethos.572561
DOI: Not available