Comprehensive Attention Self-Distillation for Weakly-Supervised Object Detection

Zeyi Huang, Yang Zou, Vijayakumar Bhagavatula, and Dong Huang
Conference Paper, Proceedings of Neural Information Processing Systems (NeurIPS), December 2020

Abstract

Weakly Supervised Object Detection (WSOD) has emerged as an effective tool for training object detectors using only image-level category labels. However, without object-level labels, WSOD detectors are prone to detecting bounding boxes on salient objects, clustered objects, and discriminative object parts. Moreover, image-level category labels do not enforce consistent object detection across different transformations of the same images. To address these issues, we propose a Comprehensive Attention Self-Distillation (CASD) training approach for WSOD. To balance feature learning among all object instances, CASD computes a comprehensive attention map aggregated from multiple transformations and feature layers of the same images. To enforce consistent spatial supervision on objects, CASD conducts self-distillation on the WSOD network, such that the comprehensive attention is approximated simultaneously across multiple transformations and feature layers of the same images. CASD produces new state-of-the-art WSOD results on standard benchmarks such as PASCAL VOC 2007/2012 and MS-COCO.
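
To make the aggregation-then-distillation idea concrete, below is a minimal PyTorch sketch, not the authors' implementation: the channel-averaged attention computation, the element-wise max aggregation, and the MSE distillation penalty are illustrative assumptions, and the sketch aggregates only over image transformations (the paper also aggregates over feature layers and operates on proposal-level features). The names attention_map and casd_loss are hypothetical.

import torch
import torch.nn.functional as F

def attention_map(feature):
    # Collapse a feature map (B, C, H, W) to a spatial attention
    # map (B, 1, H, W) by channel averaging, then min-max
    # normalize each image's map to [0, 1].
    att = feature.abs().mean(dim=1, keepdim=True)
    b = att.size(0)
    lo = att.view(b, -1).min(dim=1)[0].view(b, 1, 1, 1)
    hi = att.view(b, -1).max(dim=1)[0].view(b, 1, 1, 1)
    return (att - lo) / (hi - lo + 1e-6)

def casd_loss(views):
    # `views` holds feature maps of the same images under different
    # transformations (e.g. original, flipped, rescaled), already
    # aligned back to a common spatial grid. The comprehensive
    # attention is the element-wise max over the per-view maps, and
    # each view's map is regressed toward it. Detaching the target
    # makes this self-distillation rather than mutual averaging.
    atts = [attention_map(f) for f in views]
    comprehensive = torch.stack(atts).max(dim=0)[0].detach()
    return sum(F.mse_loss(a, comprehensive) for a in atts) / len(atts)

# Toy usage with three views of a 2-image batch.
views = [torch.rand(2, 256, 32, 32, requires_grad=True) for _ in range(3)]
loss = casd_loss(views)
loss.backward()

The element-wise max is what gives the target its "comprehensive" character: a region that any single view attends to enters the target, so every view is pushed to cover the union of activated object regions rather than only the most discriminative part it found on its own.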

BibTeX

@conference{Huang-and-Huang-2020-125332,
author = {Zeyi Huang and Yang Zou and Vijayakumar Bhagavatula and Dong Huang},
title = {Comprehensive Attention Self-Distillation for Weakly-Supervised Object Detection},
booktitle = {Proceedings of Neural Information Processing Systems (NeurIPS)},
year = {2020},
month = {December},
keywords = {Weakly Supervised Learning, Object Detection, Attention-based Training},
}