Learning from Small Sample Sets by Combining Unsupervised Meta-Training with CNNs

Yuxiong Wang and Martial Hebert
30th Conference on Neural Information Processing Systems (NIPS), December, 2016

Abstract

This work explores CNNs for the recognition of novel categories from few examples. Inspired by the transferability properties of CNNs, we introduce an additional unsupervised meta-training stage that exposes multiple top-layer units to large amounts of unlabeled real-world images. By encouraging these units to learn diverse sets of low-density separators across the unlabeled data, we capture a more generic, richer description of the visual world, which decouples these units from ties to a specific set of categories. We propose an unsupervised margin maximization objective that jointly estimates compact high-density regions and infers low-density separators. The low-density separator (LDS) modules can be plugged into any or all of the top layers of a standard CNN architecture. The resulting CNNs significantly improve performance on scene classification, fine-grained recognition, and action recognition with small training samples.
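The paper describes LDS modules only at this level of abstraction here. As a purely illustrative sketch (not the authors' method), the idea of finding a low-density separator over unlabeled features and plugging it in as an extra top-layer unit can be imagined as follows; the toy data, the random-direction search, and the gap-based margin score are all assumptions made for this example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unlabeled "top-layer" features: two high-density clusters (toy data).
X = np.vstack([rng.normal(-2.0, 0.5, size=(200, 16)),
               rng.normal(+2.0, 0.5, size=(200, 16))])

def low_density_separator(X, n_candidates=500, rng=rng):
    """Crude stand-in for unsupervised margin maximization: among random
    unit directions, keep the one whose 1-D projection leaves the widest
    empty interval (a low-density region) around a roughly balanced split."""
    best_w, best_b, best_margin = None, 0.0, -np.inf
    for _ in range(n_candidates):
        w = rng.normal(size=X.shape[1])
        w /= np.linalg.norm(w)
        proj = np.sort(X @ w)
        gaps = np.diff(proj)            # empty intervals along this direction
        i = int(np.argmax(gaps))
        frac = (i + 1) / len(proj)      # fraction of points on the left side
        if 0.2 < frac < 0.8 and gaps[i] > best_margin:
            best_margin = gaps[i]
            best_w = w
            best_b = -(proj[i] + proj[i + 1]) / 2.0  # midpoint of the gap
    return best_w, best_b

# Existing top layer: 10 supervised units over 16-dim features.
W, b = rng.normal(size=(10, 16)), np.zeros(10)

# "Plug in" one LDS unit alongside the supervised units.
w_lds, b_lds = low_density_separator(X)
W_aug = np.vstack([W, w_lds])
b_aug = np.append(b, b_lds)
print(W_aug.shape)  # (11, 16)
```

In the paper, many such units are learned jointly and diversity among them is encouraged; the single-unit, random-search version above only conveys the placement of LDS units relative to a standard CNN's top layers.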

BibTeX Reference
@conference{Wang-2016-26050,
  title     = {Learning from Small Sample Sets by Combining Unsupervised Meta-Training with CNNs},
  author    = {Yuxiong Wang and Martial Hebert},
  booktitle = {30th Conference on Neural Information Processing Systems (NIPS)},
  month     = {December},
  year      = {2016},
}