%A Berthiaume, David
%A Paffenroth, Randy
%A Guo, Lei
%D 2020
%J Frontiers in Applied Mathematics and Statistics
%G English
%K deep learning, neural network, generalization, Sard's theorem, multilayer perceptron
%R 10.3389/fams.2020.572539
%N 52
%8 2020-October-27
%9 Original Research
%! Understanding Deep Learning
%T Understanding Deep Learning: Expected Spanning Dimension and Controlling the Flexibility of Neural Networks
%U https://www.frontiersin.org/article/10.3389/fams.2020.572539
%V 6
%0 JOURNAL ARTICLE
%@ 2297-4687
%X Neural networks (NNs) provide state-of-the-art performance in many problem domains. They can accommodate a vast number of parameters and still perform well, whereas classic machine learning techniques provided with the same number of parameters would tend to overfit. To further the understanding of such incongruities, we develop a metric called the expected spanning dimension (ESD), which allows one to measure the intrinsic flexibility of an NN. We analyze NNs ranging from small networks, in which the ESD can be computed exactly, to large real-world networks with millions of parameters, in which we demonstrate how the ESD can be numerically approximated efficiently. The small NNs we study can be understood in detail, their ESD can be computed analytically, and they provide opportunities for understanding their performance from a theoretical perspective. On the other hand, applying the ESD to large-scale NNs sheds light on their relative generalization performance and provides suggestions as to how such NNs may be improved.