Event Abstract

Representation of environmental statistics by neural populations

  • 1 NYU, Center for Neural Science, United States
  • 2 NYU, United States

The efficient coding hypothesis proposes that sensory systems have evolved to efficiently represent the signals encountered in natural environments (Barlow, 1961). Quantitatively, this hypothesis can be expressed as the goal of maximizing the mutual information between environmental signals (as characterized by a prior probability density) and the responses of a neural population, subject to resource constraints. Here we consider both the perceptual and physiological implications of efficient coding constrained by the size of the population and the average firing rate. Mutual information is difficult to optimize directly for a multi-dimensional neural system, so we rely on two approximations. First, we use the Fisher bound on mutual information (Brunel et al., 1998), which can be rewritten in terms of the KL divergence between the prior and the square root of the Fisher information (McDonnell, 2008). The bound becomes tighter as the signal-to-noise ratio of the population code increases (e.g., with a larger number of neurons and/or higher firing rates). Second, we solve for the optimum of this objective function, ignoring the normalization that would be needed to make the root Fisher information a proper density. Under these approximations, an efficient population code should achieve a Fisher information proportional to the square of the prior, while satisfying the resource constraints. Since the inverse of the root Fisher information also provides a lower bound on the discriminability of stimuli (Series et al., 2009), this solution yields a direct perceptual prediction: efficient coding implies that the inverse of perceptual discriminability should be proportional to the prior. We tested this prediction for two different stimulus attributes: orientation and speed. In the first case, we show that the inverse of orientation discriminability (data from Girshick et al., VSS2009) closely resembles the prior distribution of orientations that we measured from natural images.
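The argument above can be made explicit. Writing J(s) for the population Fisher information about a stimulus s with prior p(s), a worked version (our notation, following the Brunel and McDonnell formulations cited above) is:

```latex
% Fisher lower bound on mutual information (Brunel et al., 1998):
I(s; r) \;\geq\; H[p] \;-\; \int p(s)\,\tfrac{1}{2}\log\frac{2\pi e}{J(s)}\,ds

% Treat the normalized root Fisher information as a density:
q(s) \;=\; \frac{\sqrt{J(s)}}{Z}, \qquad Z \;=\; \int \sqrt{J(s')}\,ds'

% Substituting, the bound becomes
I(s; r) \;\geq\; \log Z \;-\; \tfrac{1}{2}\log(2\pi e) \;-\; D_{\mathrm{KL}}\!\left(p \,\|\, q\right)

% which, for fixed Z (the resource constraint), is maximized when q = p:
\sqrt{J(s)} \;\propto\; p(s) \quad\Longrightarrow\quad J(s) \;\propto\; p(s)^{2}

% Discriminability is bounded below by the inverse root Fisher information,
\delta(s) \;\gtrsim\; \frac{1}{\sqrt{J(s)}} \;\propto\; \frac{1}{p(s)}
```

The last line is the perceptual prediction: the inverse of discriminability should be proportional to the prior.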
In the speed case, we examined both discrimination data (Stocker et al., 2006) and the responses of a population of macaque MT cells to stimuli of different speeds (data from Majaj et al., Cosyne2007). The latter data set was collected explicitly to assess the embedding of prior information in MT populations (Stocker, VSS2009). We can therefore use the efficient coding result to make two independent predictions for the prior on speed. The first is obtained by inverting the discrimination thresholds, as in the orientation case. The second is obtained by computing the normalized root Fisher information of the MT population, which we accomplish by assuming independent Poisson noise and parameterized tuning curves fit to the data. Remarkably, the two predicted priors agree with each other, and with previous findings that human visual speed perception is well described by a Bayesian observer using a prior that favors slower speeds (e.g., Stocker et al., 2006). The efficient coding hypothesis thus allows us to establish a quantitative link between environmental statistics, neural processing, and perception. Although we have resorted to several approximations to obtain these results, we are currently exploring reformulations that eliminate the need for such approximations while reaching similar conclusions.
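As a concrete sketch of the second prediction, the snippet below computes the normalized root Fisher information of a hypothetical population of independent Poisson neurons with Gaussian tuning curves in log-speed (a common MT parameterization; the centers, widths, and gains here are illustrative placeholders, not the fits to the Majaj et al. data):

```python
import numpy as np

def fisher_info(s, centers, widths, gains):
    """Fisher information J(s) of independent Poisson neurons:
    J(s) = sum_n f_n'(s)^2 / f_n(s), with tuning curves f_n that are
    Gaussian in log-speed (illustrative parameterization)."""
    J = np.zeros_like(s)
    for c, w, g in zip(centers, widths, gains):
        f = g * np.exp(-0.5 * ((np.log(s) - c) / w) ** 2)  # mean rate f_n(s)
        df = -f * (np.log(s) - c) / (w ** 2 * s)           # derivative f_n'(s)
        J += df ** 2 / f                                   # Poisson term
    return J

# Hypothetical population: preferred speeds at octave spacing (deg/s).
s = np.linspace(0.5, 32.0, 2000)
centers = np.log(2.0 ** np.arange(5))  # log preferred speeds: 1, 2, 4, 8, 16
J = fisher_info(s, centers, widths=np.full(5, 1.0), gains=np.full(5, 30.0))

# Predicted prior: root Fisher information, normalized to integrate to
# one over the sampled speed range.
ds = s[1] - s[0]
root_J = np.sqrt(J)
prior_hat = root_J / (root_J.sum() * ds)
```

For tuning curves that tile log-speed uniformly, `prior_hat` falls off roughly as 1/s over the tiled range, consistent with a prior favoring slow speeds.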

Conference: Computational and Systems Neuroscience 2010, Salt Lake City, UT, United States, 25 Feb - 2 Mar, 2010.

Presentation Type: Poster Presentation

Topic: Poster session III

Citation: Ganguli D and Simoncelli EP (2010). Representation of environmental statistics by neural populations. Front. Neurosci. Conference Abstract: Computational and Systems Neuroscience 2010. doi: 10.3389/conf.fnins.2010.03.00240

Received: 04 Mar 2010; Published Online: 04 Mar 2010.

* Correspondence: Deep Ganguli, NYU, Center for Neural Science, New York, United States, dganguli@cns.nyu.edu
