Theoretical analysis of skip connections and batch normalization from generalization and optimization perspectives

Yasutaka Furusho, Nara Institute of Science and Technology, Japan, Kazushi Ikeda, Nara Institute of Science and Technology, Japan, kazushi@is.naist.jp
 
Suggested Citation
Yasutaka Furusho and Kazushi Ikeda (2020), "Theoretical analysis of skip connections and batch normalization from generalization and optimization perspectives", APSIPA Transactions on Signal and Information Processing: Vol. 9: No. 1, e9. http://dx.doi.org/10.1017/ATSIP.2020.7

Publication Date: 27 Feb 2020
© 2020 Yasutaka Furusho and Kazushi Ikeda
 
Keywords
Deep neural networks; ResNet; Skip connections; Batch normalization
 

Open Access

This article is published under the terms of the Creative Commons Attribution licence.

In this article:
I. INTRODUCTION 
II. PROBLEM FORMULATION 
III. GENERALIZATION GAP 
IV. EMPIRICAL RISK 
V. CONCLUSION 

Abstract

Deep neural networks (DNNs) share the same basic structure as the neocognitron proposed in 1979 but achieve much better performance, largely because DNNs incorporate many heuristic techniques such as pre-training, dropout, skip connections, batch normalization (BN), and stochastic depth. However, the reasons why these techniques improve performance are not fully understood. Recently, two tools for theoretical analysis have been proposed. One evaluates the generalization gap, defined as the difference between the expected loss and the empirical loss, by calculating the algorithmic stability; the other evaluates the convergence rate by calculating the eigenvalues of the Fisher information matrix of DNNs. This overview paper briefly introduces these tools and demonstrates their usefulness by explaining why skip connections and BN improve performance.
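To make the first tool concrete, the generalization gap can be estimated numerically as the difference between a Monte Carlo estimate of the expected loss (on a large held-out sample) and the empirical loss (on the training sample). The sketch below is a hypothetical illustration, not code from the paper: it uses an overparameterized least-squares model, and all names, sample sizes, and the noise level are assumptions chosen for demonstration.

```python
import numpy as np

# Hypothetical setup (assumption, not from the paper): a linear model
# with d parameters fit by least squares to n_train noisy samples.
rng = np.random.default_rng(0)
d, n_train, n_test = 15, 20, 10000
w_true = rng.normal(size=d)

def sample(n):
    """Draw n samples from a noisy linear model y = X w_true + noise."""
    X = rng.normal(size=(n, d))
    y = X @ w_true + 0.5 * rng.normal(size=n)
    return X, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

# Least-squares fit; with d close to n_train the model nearly
# interpolates the training data, so the gap is visibly positive.
w_hat, *_ = np.linalg.lstsq(X_tr, y_tr, rcond=None)

empirical_risk = np.mean((X_tr @ w_hat - y_tr) ** 2)  # training loss
expected_risk = np.mean((X_te @ w_hat - y_te) ** 2)   # Monte Carlo estimate
gap = expected_risk - empirical_risk
print(f"generalization gap ~ {gap:.3f}")
```

The paper's contribution is to bound this gap analytically, via algorithmic stability, rather than estimate it from held-out data as done here.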

DOI:10.1017/ATSIP.2020.7