In "Knowledge distillation: A good teacher is patient and consistent," Beyer et al. investigate various existing setups for performing knowledge distillation and show that all of them lead to ...
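For context, a minimal sketch of the standard logit-distillation objective (Hinton et al.) that such setups build on: the student matches the teacher's temperature-softened output distribution while also fitting the hard labels. The `temperature` and `alpha` values here are illustrative assumptions, not settings from the paper.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=4.0, alpha=0.5):
    """Soft KL term against the teacher plus hard cross-entropy.

    `temperature` and `alpha` are illustrative hyperparameters.
    """
    # Soften both distributions with the temperature before comparing.
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale the KL term by T^2 so its gradients keep a comparable magnitude.
    kd = F.kl_div(log_soft_student, soft_teacher,
                  reduction="batchmean") * temperature ** 2
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1.0 - alpha) * ce
```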
This is an implementation of our work "Feature Distillation: DNN-Oriented JPEG Compression Against Adversarial Examples" (https://arxiv.org/pdf/1803.05787.pdf). We add ...
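To illustrate the general idea of a JPEG-based input-transformation defense, here is a minimal sketch that simply re-encodes an input image through standard JPEG before classification. This is only a baseline: the paper's method goes further and redesigns the JPEG quantization tables to be DNN-oriented, which this sketch does not reproduce. The `quality` setting is an illustrative assumption.

```python
from io import BytesIO

import numpy as np
from PIL import Image

def jpeg_preprocess(image_uint8, quality=75):
    """Re-encode a uint8 HxWx3 image through JPEG and decode it back.

    Plain JPEG compression as an input-transformation baseline; the paper
    replaces the standard quantization tables, which is not done here.
    """
    buf = BytesIO()
    Image.fromarray(image_uint8).save(buf, format="JPEG", quality=quality)
    buf.seek(0)
    # The lossy round trip discards high-frequency detail, which tends to
    # remove part of an adversarial perturbation.
    return np.asarray(Image.open(buf))
```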
Shaanxi Provincial Coal Geology Group Co. Ltd., Key Laboratory of Coal Resources Exploration and Comprehensive Utilization, Ministry of Natural Resources, Xi’an 710026, China ...
Abstract: Deep neural networks (DNNs) have long been a popular base model in many image classification tasks. However, some recent works suggest that certain man-made images can easily lead ...
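To make the threat concrete, a minimal sketch of the fast gradient sign method (Goodfellow et al., 2015), one common way such misleading images (adversarial examples) are crafted; `model`, the input batch `x`, labels `y`, and `epsilon` are assumed inputs, not details from this abstract.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon=0.03):
    """Craft adversarial examples with the fast gradient sign method.

    `epsilon` bounds the per-pixel perturbation; inputs are assumed to
    lie in [0, 1].
    """
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    loss.backward()
    # Step in the direction that increases the loss, then clamp to the
    # valid pixel range.
    x_adv = x + epsilon * x.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()
```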