Title: Synergy, redundancy, and independence in population codes
Authors: Elad Schneidman, William Bialek, Michael J. Berry II
Publication Year: 2003
Type: Journal Article
Abstract: A key issue in understanding the neural code for an ensemble of neurons is the nature and strength of correlations between neurons and how these correlations are related to the stimulus. The issue is complicated by the fact that there is not a single notion of independence or lack of correlation. We distinguish three kinds: (1) activity independence; (2) conditional independence; and (3) information independence. Each notion is related to an information measure: the information between cells, the information between cells given the stimulus, and the synergy of cells about the stimulus, respectively. We show that these measures form an interrelated framework for evaluating contributions of signal and noise correlations to the joint information conveyed about the stimulus and that at least two of the three measures must be calculated to characterize a population code. This framework is compared with others recently proposed in the literature. In addition, we distinguish questions about how information is encoded by a population of neurons from how that information can be decoded. Although information theory is natural and powerful for questions of encoding, it is not sufficient for characterizing the process of decoding. Decoding fundamentally requires an error measure that quantifies the importance of the deviations of estimated stimuli from actual stimuli. Because there is no a priori choice of error measure, questions about decoding cannot be put on the same level of generality as for encoding.
Keywords: Animals; Action Potentials; Models, Neurological; Neurons; Synaptic Transmission; Nerve Net
Journal: J Neurosci
Volume: 23
Issue: 37
Pages: 11539-11553
Date Published: December 2003
Alternate Journal: J. Neurosci.
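
The abstract's three notions of independence, and the claim that at least two of the three measures must be calculated, can be made concrete for a pair of cells. The sketch below uses generic notation (responses R1 and R2, stimulus S, mutual information I) rather than the paper's own symbols; the closing identity follows from the chain rule for mutual information.

```latex
% Three notions of independence for a pair of cells with responses R1, R2
% and stimulus S (notation assumed here, not taken from the paper):
%
%   (1) activity independence       P(r1, r2)     = P(r1) P(r2)
%   (2) conditional independence    P(r1, r2 | s) = P(r1|s) P(r2|s)
%   (3) information independence    I(R1, R2; S)  = I(R1; S) + I(R2; S)
%
% Each notion is quantified by the corresponding information measure:
\[
  I(R_1; R_2), \qquad
  I(R_1; R_2 \mid S), \qquad
  \mathrm{Syn}(R_1, R_2; S) \equiv I(R_1, R_2; S) - I(R_1; S) - I(R_2; S).
\]
% By the chain rule for mutual information, the three are linked:
\[
  \mathrm{Syn}(R_1, R_2; S) = I(R_1; R_2 \mid S) - I(R_1; R_2).
\]
```

The identity shows why two of the three measures suffice: computing any two of them determines the third, while any single measure leaves the population code undercharacterized.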
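For readers who want to evaluate these quantities on data, a naive plug-in estimate from a table of joint stimulus/response counts is sketched below. This is an illustration under the assumption of discretized responses, not the paper's estimation procedure, and it omits the sampling-bias corrections that a real analysis would need.

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a probability array (zero entries skipped)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def population_measures(counts):
    """Plug-in estimates of I(R1;R2), I(R1;R2|S), and the synergy.

    counts[s, r1, r2] holds joint occurrence counts over stimuli and the
    discretized responses of two cells.
    """
    p = counts / counts.sum()        # joint P(s, r1, r2)
    p_s   = p.sum(axis=(1, 2))       # P(s)
    p_r1  = p.sum(axis=(0, 2))       # P(r1)
    p_r2  = p.sum(axis=(0, 1))       # P(r2)
    p_r12 = p.sum(axis=0)            # P(r1, r2)
    p_sr1 = p.sum(axis=2)            # P(s, r1)
    p_sr2 = p.sum(axis=1)            # P(s, r2)

    # I(R1;R2): correlation between the cells, ignoring the stimulus
    i_r1r2 = entropy(p_r1) + entropy(p_r2) - entropy(p_r12)
    # I(R1;R2|S): correlation that remains once the stimulus is known
    i_r1r2_given_s = (entropy(p_sr1) + entropy(p_sr2)
                      - entropy(p) - entropy(p_s))
    # Synergy = I(R1,R2;S) - I(R1;S) - I(R2;S), which by the chain rule
    # equals I(R1;R2|S) - I(R1;R2)
    synergy = i_r1r2_given_s - i_r1r2
    return i_r1r2, i_r1r2_given_s, synergy
```

A positive synergy indicates the pair conveys more information jointly than the sum of its parts; a negative value indicates redundancy; zero corresponds to information independence.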