SVM Optimization and Steepest-Descent Line Search.
Abstract. We consider (a subclass of) convex quadratic optimization problems and analyze decomposition algorithms that perform, at least approximately, steepest-descent exact line search. We show that these algorithms, when implemented properly, come within $\epsilon$ of optimality after $O(\log 1/\epsilon)$ iterations for strictly convex cost functions, and after $O(1/\epsilon)$ iterations in the general case. Our analysis is general enough to cover the algorithms used in software packages such as SVMTorch and (first- or second-order) LibSVM. To the best of our knowledge, this is the first paper to establish a convergence rate for these algorithms without imposing unnecessarily restrictive assumptions.
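To make the central ingredient concrete, the following is a minimal sketch (not the paper's decomposition algorithm) of steepest descent with exact line search on a strictly convex quadratic $f(x) = \frac{1}{2} x^\top A x - b^\top x$; for such a cost the exact step length has the closed form $\alpha = g^\top g / (g^\top A g)$ with gradient $g = Ax - b$. The function name and parameters are illustrative choices, not from the paper.

```python
import numpy as np

def steepest_descent_exact(A, b, x0, tol=1e-10, max_iter=10_000):
    """Minimize f(x) = 0.5 x^T A x - b^T x for symmetric positive
    definite A, using steepest descent with exact line search."""
    x = x0.astype(float).copy()
    for _ in range(max_iter):
        g = A @ x - b                    # gradient of the quadratic
        if np.linalg.norm(g) < tol:      # (near-)stationary point reached
            break
        alpha = (g @ g) / (g @ (A @ g))  # exact line-search step length
        x -= alpha * g
    return x

# Small strictly convex example: the minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x = steepest_descent_exact(A, b, np.zeros(2))
```

In the strictly convex case sketched here, the iterates converge linearly, matching the $O(\log 1/\epsilon)$ iteration bound stated in the abstract.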