Technical Report MSC-2006-13

TR#: MSC-2006-13
Class: MSC
Title: Automatic Generation Of Near Misses For Active Learning Of Visual Concepts
Authors: Nela Gurevich
Supervisors: Shaul Markovitch, Ehud Rivlin
PDF: MSC-2006-13.pdf
Abstract: Assume that we are trying to build a visual recognizer for a particular class of objects - chairs, for example - using existing induction methods. Assume the assistance of a human teacher who can label an image of an object as a positive or a negative example.

As positive examples, we can obviously use images of real chairs. It is not clear, however, what types of objects we should use as negative examples. Are images of a monkey, a galaxy and a pen good candidates for this purpose? This is an example of a common problem where the concept we are trying to learn represents a small fraction of a large universe of instances. In this work we suggest learning with the help of near misses - negative examples that differ from the learned concept in only a small number of significant points, and we propose a framework for automatic generation of such examples. We observe that a learning process involves two spaces - the instance space and the feature space. We show that generating near misses in the feature space is problematic in some domains, and propose a methodology for generating examples directly in the instance space using modification operators - functions over the instance space that produce new instances by slightly modifying existing ones. The generated instances are evaluated by mapping them into the feature space and measuring their utility using known active learning techniques. We apply the proposed framework to the task of learning visual concepts from range images.
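
To make the framework concrete, the following Python sketch shows one possible generate-and-select loop. It is purely illustrative, not the thesis's actual implementation: it assumes that modification operators are plain functions from instances to instances, that extract_features maps an instance into the feature space, and that the current classifier exposes a scikit-learn-style decision_function whose magnitude approximates distance from the decision boundary (an uncertainty-based active-learning utility). All names are hypothetical.

```python
import random

def generate_near_miss_queries(positives, operators, extract_features,
                               classifier, n_candidates=50, n_queries=5):
    """Illustrative generate-and-select loop for near-miss candidates.

    positives        -- labeled positive instances (e.g., chair models)
    operators        -- modification operators: functions instance -> instance
    extract_features -- maps an instance into the feature space
    classifier       -- current model with a decision_function(features) method
    """
    candidates = []
    for _ in range(n_candidates):
        base = random.choice(positives)          # start from a known positive
        op = random.choice(operators)            # pick a modification operator
        candidate = op(base)                     # slightly modify the instance
        features = extract_features(candidate)   # map into the feature space
        # Uncertainty-style utility: instances close to the decision
        # boundary are the most informative queries for the teacher.
        utility = -abs(classifier.decision_function([features])[0])
        candidates.append((utility, candidate))

    # Ask the human teacher to label the most informative generated instances.
    candidates.sort(key=lambda pair: pair[0], reverse=True)
    return [candidate for _, candidate in candidates[:n_queries]]
```

Generating candidates in the instance space and only then scoring them in the feature space is the key point: the operators guarantee the candidates are realizable instances, while the feature-space utility decides which of them are worth the teacher's attention.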

We examine the problem of defining modification operators over the instance space of range images and solve it by using an intermediate instance space - the functional representation space. The efficiency of the proposed framework for object recognition is demonstrated by testing it on real-world recognition tasks.
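
The abstract does not spell out the functional representation of range images, so the sketch below is a loose stand-in only: it treats a range image as samples of a surface z = f(x, y), fits a low-order polynomial in place of that functional representation, and defines a hypothetical modification operator that slightly perturbs the fitted coefficients before resampling the image. Every detail here is an assumption, not the thesis's method.

```python
import numpy as np

def perturb_functional_representation(range_image, degree=3,
                                      noise_scale=0.05, rng=None):
    """Hypothetical modification operator on a range image.

    Fits a polynomial surface z = f(x, y) to the depth values (a stand-in
    for a functional representation), perturbs the fitted coefficients
    slightly, and resamples a new range image from the modified surface.
    """
    rng = np.random.default_rng() if rng is None else rng
    h, w = range_image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x, y, z = xs.ravel() / w, ys.ravel() / h, range_image.ravel()

    # Design matrix of monomials x^i * y^j up to the given total degree.
    terms = [(i, j) for i in range(degree + 1)
                    for j in range(degree + 1 - i)]
    A = np.column_stack([x**i * y**j for i, j in terms])
    coeffs, *_ = np.linalg.lstsq(A, z, rcond=None)

    # Slightly modify the surface by perturbing its coefficients.
    coeffs = coeffs * (1.0 + noise_scale * rng.standard_normal(coeffs.shape))

    # Resample the perturbed surface back into a range image.
    return (A @ coeffs).reshape(h, w)
```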

Copyright: The above paper is copyrighted by the Technion, the author(s), or others. Please contact the author(s) for more information.

Remark: Any link to this technical report should be to this page (http://www.cs.technion.ac.il/users/wwwb/cgi-bin/tr-info.cgi/2006/MSC/MSC-2006-13), rather than to the URL of the PDF files directly. The latter URLs may change without notice.


Computer Science Department, Technion