
Energy Minimization with Label Costs and Applications in Multi-Model Fitting

By Yuri Boykov

The α-expansion algorithm has had a significant impact in computer vision due to its generality, effectiveness, and speed. Until recently, it could only minimize energies that involve unary, pairwise, and specialized higher-order terms. We propose an extension of α-expansion that can simultaneously optimize "label costs" with certain optimality guarantees. An energy with label costs can penalize a solution based on the set of labels that appear in it. The simplest special case is to penalize the number of labels in the solution, but the proposed energy is significantly more general than this. The usefulness of label costs is demonstrated by a number of specific applications in vision (e.g. in object recognition) that appeared in the last year. Our work (see CVPR 2010, IJCV submission) studies label costs from a general perspective, including a discussion of multiple algorithms, optimality bounds, extensions, and fast special cases (e.g. UFL heuristics). In this talk we focus on natural generic applications of label costs in multi-model fitting and demonstrate several examples: homography detection, motion segmentation, unsupervised image segmentation, compression, and FMM. We also discuss a method (PEARL) for effective exploration of the continuum of labels, an important practical obstacle for α-expansion in model fitting. We discuss why our optimization-based approach to multi-model fitting is significantly more robust than standard extensions of RANSAC (e.g. sequential RANSAC) currently dominant in vision.
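To make the kind of energy discussed above concrete, here is a minimal sketch in the spirit of the CVPR 2010 / IJCV formulation cited in the abstract (the notation below is ours, not necessarily the paper's): standard unary and pairwise terms are combined with a cost h_l that is paid once if label l appears anywhere in the labeling f.

    E(f) = \sum_{p} D_p(f_p) + \sum_{(p,q)\in\mathcal{N}} V_{pq}(f_p, f_q) + \sum_{l\in\mathcal{L}} h_l\,\delta_l(f),
    \qquad
    \delta_l(f) = \begin{cases} 1 & \text{if } f_p = l \text{ for some } p,\\ 0 & \text{otherwise.} \end{cases}

Setting all h_l to the same constant penalizes the number of labels used in the solution, which is the simplest special case mentioned above; the more general form also allows costs on subsets of labels.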
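The PEARL procedure mentioned above can be read, roughly, as a propose / assign / re-estimate loop. Below is a simplified Python sketch for fitting 2D line models; it replaces α-expansion with a plain per-point assignment (no pairwise smoothness term) and uses a crude pruning rule as a stand-in for the label-cost term, so it only illustrates the overall structure, not the actual optimization used in the talk. All function and variable names here are our own.

    import numpy as np

    def fit_line(points):
        # Total least squares line through 2D points: returns (unit normal, offset).
        centroid = points.mean(axis=0)
        _, _, vt = np.linalg.svd(points - centroid)
        normal = vt[-1]                       # direction of smallest variance
        return normal, -normal @ centroid

    def residuals(points, line):
        # Perpendicular distance of each point to the line normal.x + offset = 0.
        normal, offset = line
        return np.abs(points @ normal + offset)

    def pearl_lines(points, n_proposals=50, label_cost=5.0, outlier_cost=1.0,
                    n_iters=10, seed=None):
        rng = np.random.default_rng(seed)
        # 1. Propose: sample candidate lines from random point pairs (RANSAC-style).
        models = []
        for _ in range(n_proposals):
            i, j = rng.choice(len(points), size=2, replace=False)
            models.append(fit_line(points[[i, j]]))
        for _ in range(n_iters):
            # 2. Assign: data cost = residual; an extra row is the outlier label.
            costs = np.stack([residuals(points, m) for m in models] +
                             [np.full(len(points), outlier_cost)])
            labels = costs.argmin(axis=0)     # stand-in for alpha-expansion with label costs
            # 3. Re-estimate: refit each supported model; drop models whose inliers
            #    would be cheaper as outliers (a crude proxy for the label-cost term).
            kept = []
            for k in range(len(models)):
                inliers = points[labels == k]
                if len(inliers) < 2:
                    continue
                refit = fit_line(inliers)
                if residuals(inliers, refit).sum() + label_cost < len(inliers) * outlier_cost:
                    kept.append(refit)
            if not kept:
                break
            models = kept
        # Final assignment against the surviving models.
        costs = np.stack([residuals(points, m) for m in models] +
                         [np.full(len(points), outlier_cost)])
        return models, costs.argmin(axis=0)

Usage would look like models, labels = pearl_lines(points) for an (N, 2) array of points, where the last label index marks outliers. The key structural point, as in the abstract, is that models compete for support under a per-model cost, so unsupported proposals die out instead of being fit sequentially as in sequential RANSAC.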
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/a77b37630806b6b3c9121c659fc37693/
Course Home http://videolectures.net/nipsworkshops2010_boykov_eml/