
Understanding convolutional neural networks via discriminant feature analysis

Hao Xu, University of Southern California, USA, iamxuhao@gmail.com; Yueru Chen, University of Southern California, USA; Ruiyuan Lin, University of Southern California, USA; C.-C. Jay Kuo, University of Southern California, USA
 
Suggested Citation
Hao Xu, Yueru Chen, Ruiyuan Lin and C.-C. Jay Kuo (2018), "Understanding convolutional neural networks via discriminant feature analysis", APSIPA Transactions on Signal and Information Processing: Vol. 7: No. 1, e20. http://dx.doi.org/10.1017/ATSIP.2018.24

Publication Date: 11 Dec 2018
© 2018 Hao Xu, Yueru Chen, Ruiyuan Lin and C.-C. Jay Kuo
 
Keywords
Convolutional neural networks, Object recognition, Empirical analysis
 

Open Access

This article is published under the terms of the Creative Commons Attribution licence.

In this article:
I. INTRODUCTION 
II. RELATED WORK 
III. EVALUATION METRICS 
IV. EXPERIMENTAL RESULTS 
V. CONCLUSION 

Abstract

The trained features of a convolutional neural network (CNN) at different convolutional layers are analyzed using two quantitative metrics in this work. We first show mathematically that the Gaussian confusion measure (GCM) can be used to identify the discriminative ability of an individual feature. Next, we generalize this idea, introduce another measure called the cluster purity measure (CPM), and use it to analyze the discriminative ability of multiple features jointly. The discriminative ability of trained CNN features is validated by experimental results. Analysis of CNNs with the GCM and CPM tools offers important insights into their operational mechanism, including the behavior of trained CNN features and the good detection performance of some object classes that were considered difficult in the past. Finally, the trained feature representations of different CNN structures are compared to explain the superiority of deeper networks.
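The abstract does not spell out how GCM and CPM are computed; the full definitions appear in Section III of the paper. As an illustrative sketch only, and not the paper's exact formulation, the snippet below assumes that GCM measures the pairwise overlap of per-class 1-D Gaussians fitted to a single feature response, and that CPM is the purity of k-means clusters formed in a multi-feature space. Function names, the Bhattacharyya-style overlap, and the k-means step are all assumptions made for illustration.

```python
# Illustrative sketch only: the paper's exact GCM/CPM formulas may differ.
import numpy as np
from sklearn.cluster import KMeans


def gaussian_confusion_measure(feature, labels):
    """Hypothetical GCM: fit a 1-D Gaussian to each class along one feature
    dimension and average the pairwise Bhattacharyya overlaps.
    Lower values indicate a more discriminative individual feature."""
    classes = np.unique(labels)
    stats = [(feature[labels == c].mean(), feature[labels == c].std() + 1e-8)
             for c in classes]
    overlaps = []
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            m1, s1 = stats[i]
            m2, s2 = stats[j]
            # Bhattacharyya coefficient between two 1-D Gaussians
            bc = np.sqrt(2 * s1 * s2 / (s1**2 + s2**2)) * \
                 np.exp(-(m1 - m2)**2 / (4 * (s1**2 + s2**2)))
            overlaps.append(bc)
    return float(np.mean(overlaps))


def cluster_purity_measure(features, labels, n_clusters):
    """Hypothetical CPM: cluster samples in a multi-feature space and report
    purity, i.e. the fraction of samples whose cluster's majority class
    matches their own integer label."""
    assignments = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(features)
    correct = 0
    for k in range(n_clusters):
        members = labels[assignments == k]
        if members.size:
            correct += np.bincount(members).max()
    return correct / labels.size
```

In this sketch, `feature` would be the response of one trained CNN filter across a set of samples, `features` stacks several such responses as columns, and `labels` holds integer class indices; both measures are then computed per convolutional layer to track how discriminability evolves with depth.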

DOI: 10.1017/ATSIP.2018.24