
Towards Visible and Thermal Drone Monitoring with Convolutional Neural Networks

Ye Wang, University of Southern California, USA, Yueru Chen, University of Southern California, USA, Jongmoo Choi, University of Southern California, USA, C.-C. Jay Kuo, University of Southern California, USA, cckuo@sipi.usc.edu
 
Suggested Citation
Ye Wang, Yueru Chen, Jongmoo Choi and C.-C. Jay Kuo (2019), "Towards Visible and Thermal Drone Monitoring with Convolutional Neural Networks", APSIPA Transactions on Signal and Information Processing: Vol. 8: No. 1, e5. http://dx.doi.org/10.1017/ATSIP.2018.30

Publication Date: 15 Jan 2019
© 2019 Ye Wang, Yueru Chen, Jongmoo Choi and C.-C. Jay Kuo
 
Keywords
Deep learning; Detection; Tracking; Drone; Integrated system
 

Open Access

This article is published under the terms of the Creative Commons Attribution licence.

In this article:
I. INTRODUCTION 
II. RELATED WORK 
III. DATA COLLECTION AND AUGMENTATION 
IV. DRONE MONITORING SYSTEM 
V. EXPERIMENTAL RESULTS 
VI. CONCLUSION 

Abstract

This paper reports a visible and thermal drone monitoring system that integrates deep-learning-based detection and tracking modules. The biggest challenge in adopting deep learning methods for drone detection is the paucity of training drone images, especially thermal drone images. To address this issue, we develop two data augmentation techniques. One is a model-based drone augmentation technique that automatically generates visible drone images with a bounding box label on the drone's location. The other exploits an adversarial data augmentation methodology to create thermal drone images. To track a small flying drone, we utilize the residual information between consecutive image frames. Finally, we present an integrated detection and tracking system that outperforms each individual module operating with detection or tracking alone. The experiments show that, even when trained on synthetic data, the proposed system performs well on real-world drone images with complex backgrounds. The USC drone detection and tracking dataset, with user-labeled bounding boxes, is available to the public.
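The residual idea mentioned in the abstract — localizing a small moving target from the difference between consecutive frames — can be illustrated with a minimal sketch. This is not the paper's code: it assumes grayscale NumPy frames, and the function names (`residual_map`, `bounding_box`) and the fixed threshold are illustrative choices only.

```python
import numpy as np


def residual_map(prev_frame: np.ndarray, curr_frame: np.ndarray,
                 threshold: float = 25.0) -> np.ndarray:
    """Absolute-difference residual between two consecutive grayscale
    frames, thresholded to a binary mask of changed (moving) pixels."""
    residual = np.abs(curr_frame.astype(np.float32) -
                      prev_frame.astype(np.float32))
    return (residual > threshold).astype(np.uint8)


def bounding_box(mask: np.ndarray):
    """Tightest (x0, y0, x1, y1) box around nonzero residual pixels,
    or None when no motion was detected."""
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())


# Toy example: a bright 4x4 "drone" shifts by two pixels between frames.
prev = np.zeros((64, 64), dtype=np.uint8)
prev[10:14, 10:14] = 255
curr = np.zeros((64, 64), dtype=np.uint8)
curr[12:16, 12:16] = 255

box = bounding_box(residual_map(prev, curr))
# box covers the union of the two positions: (10, 10, 15, 15)
```

In the full system described by the paper, such a residual cue would feed a learned tracker rather than a raw threshold; the sketch only shows why inter-frame residuals make a small flying object stand out against a static background.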

DOI:10.1017/ATSIP.2018.30