Exploring the Properties and Evolution of Neural Network Eigenspaces during Training

Published in Machine Vision and Image Processing (MVIP), 2022


We investigate properties and the evolution of the emergent inference process inside neural networks using layer saturation and logistic regression probes. We demonstrate that the difficulty of a problem, defined by the number of classes and the complexity of the visual domain, and the number of parameters in neural network layers affect predictive performance in an antagonistic manner. We further show that this relationship can be measured using saturation, which opens the possibility of detecting over- and under-parameterization of neural networks. We also show that the observed effects are independent of previously reported pathological patterns such as the “tail pattern” described in [1]. Finally, we study the emergence of saturation patterns during training and show that they appear early, which allows for early analysis and potentially shorter experiment cycle times.
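As a rough illustration of the saturation metric discussed above, the sketch below estimates layer saturation as the fraction of eigendirections of a layer's activation covariance matrix needed to explain a given share of the variance, relative to the layer's width. This is a minimal NumPy reconstruction under our own assumptions (function name, the 99% variance threshold `delta`, and the covariance estimator are illustrative choices), not the paper's reference implementation.

```python
import numpy as np

def layer_saturation(activations, delta=0.99):
    """Estimate layer saturation for one layer.

    activations: (num_samples, num_features) array of layer outputs.
    delta: share of variance the retained eigendirections must explain.
    Returns the number of required eigendirections divided by the
    layer's feature dimensionality (a value in (0, 1]).
    """
    # Center the activations and form the empirical covariance matrix.
    centered = activations - activations.mean(axis=0)
    cov = centered.T @ centered / (len(centered) - 1)

    # Covariance matrices are symmetric, so eigvalsh applies;
    # sort eigenvalues in descending order.
    eigvals = np.linalg.eigvalsh(cov)[::-1]

    # Smallest k such that the top-k eigenvalues explain >= delta
    # of the total variance.
    explained = np.cumsum(eigvals) / eigvals.sum()
    k = int(np.searchsorted(explained, delta)) + 1
    return k / activations.shape[1]
```

Under this reading, a low value suggests the layer's feature space has many unused directions (a candidate for over-parameterization), while a value near 1 suggests the layer uses nearly its full capacity.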

Download paper here

Recommended citation: Richter, M. L., Malihi, L., Windler, A. K. P., Krumnack, U. (2022). “Exploring the Properties and Evolution of Neural Network Eigenspaces during Training.” Machine Vision and Image Processing (MVIP). https://ieeexplore.ieee.org/document/9738741