%A Pehlevan, Cengiz
%A Zhao, Xinyuan
%A Sengupta, Anirvan M.
%A Chklovskii, Dmitri
%D 2020
%J Frontiers in Computational Neuroscience
%G English
%K neural networks, Canonical correlation analysis (CCA), Hebbian plasticity, pyramidal neuron, biologically plausible learning
%R 10.3389/fncom.2020.00055
%N 55
%8 2020-June-30
%9 Original Research
%! Neurons as canonical correlation analyzers
%T Neurons as Canonical Correlation Analyzers
%U https://www.frontiersin.org/article/10.3389/fncom.2020.00055
%V 14
%0 JOURNAL ARTICLE
%@ 1662-5188
%X Normative models of neural computation offer simplified yet lucid mathematical descriptions of murky biological phenomena. Previously, online Principal Component Analysis (PCA) was used to model a network of single-compartment neurons accounting for weighted summation of upstream neural activity in the soma and Hebbian/anti-Hebbian synaptic learning rules. However, synaptic plasticity in biological neurons often depends on the integration of synaptic currents over a dendritic compartment rather than total current in the soma. Motivated by this observation, we model a pyramidal neuronal network using online Canonical Correlation Analysis (CCA). Given two related datasets represented by distal and proximal dendritic inputs, CCA projects them onto the subspace which maximizes the correlation between their projections. First, adopting a normative approach and starting from a single-channel CCA objective function, we derive an online gradient-based optimization algorithm whose steps can be interpreted as the operation of a pyramidal neuron. To model networks of pyramidal neurons, we introduce a novel multi-channel CCA objective function, and derive from it an online gradient-based optimization algorithm whose steps can be interpreted as the operation of a pyramidal neuron network including its architecture, dynamics, and synaptic learning rules. Next, we model a neuron with more than two dendritic compartments by deriving its operation from a known objective function for multi-view CCA. Finally, we confirm the functionality of our networks via numerical simulations. Overall, our work presents a simplified but informative abstraction of learning in a pyramidal neuron network, and demonstrates how such networks can integrate multiple sources of inputs.