Using Interior Point Methods for Optimization in Training Very Large Scale Support Vector Machines

In this talk we will discuss issues arising when Interior Point Methods (IPMs) are applied to the optimization problems that arise in very large-scale Support Vector Machine (SVM) training. First, we will briefly introduce IPMs for linear and quadratic programming and comment on their advantages: (a) polynomial complexity, (b) the ability to solve very large problems, (c) excellent practical behaviour (much better than that predicted by the worst-case complexity analysis), and (d) suitability for parallelisation [1]. We will then address specific features of the optimization problems arising in SVM training; in particular, the large datasets involved are stored as dense matrices, which makes these problems well-suited to IPMs. We will survey numerical techniques applicable in this context, such as suitable factorization methods, and comment on several interesting developments of the last decade aimed at using IPMs for SVM training. We will demonstrate that the key to success in applying IPMs is the ability to reformulate SVM problems as separable quadratic optimization problems, which can then be tackled successfully by the appropriate linear algebra tools of IPMs [2,3]. Finally, we will comment on the remaining challenges for IPMs in the SVM training context and touch on a number of research issues that are still open: the need to develop new algorithms that exploit modern multi-core architectures, the challenge of ever-larger problems, and the use of non-linear and indefinite kernels. This is joint work with Kristian Woodsend.
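To make the key reformulation concrete, here is a minimal sketch for the linear-kernel case, in the spirit of [2,3]; the notation (data matrix $X$, label vector $y$, $Y = \mathrm{diag}(y)$, all-ones vector $e$) is assumed here rather than taken from the talk. The standard SVM dual is the quadratic program

\[
\max_{\alpha}\; e^{T}\alpha - \tfrac{1}{2}\,\alpha^{T} Y X X^{T} Y \alpha
\quad \text{s.t.} \quad y^{T}\alpha = 0,\quad 0 \le \alpha \le C e,
\]

whose Hessian $Y X X^{T} Y$ is dense and of the order of the number of training points. Introducing the auxiliary variable $w = X^{T} Y \alpha$ gives the equivalent separable program

\[
\min_{\alpha,\,w}\; \tfrac{1}{2}\, w^{T} w - e^{T}\alpha
\quad \text{s.t.} \quad w - X^{T} Y \alpha = 0,\quad y^{T}\alpha = 0,\quad 0 \le \alpha \le C e,
\]

whose Hessian is diagonal, so the dense linear algebra at each IPM iteration involves systems only of the order of the (typically much smaller) feature dimension.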
Attribution: The Open Education Consortium
http://www.ocwconsortium.org/courses/view/164df4c0b3a1c5d4e1f9c411eecc5ea8/
Course Home: http://videolectures.net/icml09_gondzio_ituipm/