Many modern machine learning algorithms reduce to solving large-scale linear, quadratic, or semi-definite mathematical programming problems. Optimization has thus become a crucial tool for learning, and learning a major application of optimization. Furthermore, a systematic recasting of learning and estimation problems in the framework of mathematical programming has encouraged the use of advanced techniques from optimization such as convex analysis, Lagrangian duality, and large-scale linear algebra. This has allowed much sharper theoretical analyses and greatly increased the size and range of problems that can be handled. Several key application domains have developed explosively, notably text and web analysis, machine vision, and speech, all fuelled by ever-expanding data resources easily accessible via the web.
This special topic is intended to bring the optimization and machine learning communities closer together for further algorithmic progress, particularly in developing large-scale learning methods capable of handling massive document and image datasets.
Topics of interest include:
- Mathematical programming approaches to machine learning problems, such as semi-definite programming, interior-point methods, sequential convex programming, and gradient-based methods.
- Optimization over graphical models for machine learning, including belief propagation.
- Efficient training of Support Vector Machines, incremental SVMs, and optimization over kernels.
- Convex relaxations of machine learning problems.
- Applications involving large-scale databases, such as data mining, bioinformatics, and multimedia.
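SVM training is perhaps the most familiar instance of the reduction from learning to mathematical programming: the soft-margin SVM dual is a box-constrained quadratic program. The following is a minimal sketch, not part of this call; the toy data, the penalty value C, and the use of projected gradient ascent (rather than a production solver such as SMO or an interior-point method) are all illustrative assumptions. For simplicity it uses the bias-free dual, which drops the equality constraint.

```python
import numpy as np

# Toy, linearly separable 2-D data (illustrative only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

C = 10.0                     # soft-margin penalty (assumed value)
Z = X * y[:, None]
K = Z @ Z.T                  # Q_ij = y_i * y_j * <x_i, x_j>

# Bias-free soft-margin SVM dual:
#   maximize  sum_i a_i - 0.5 * a^T Q a   subject to  0 <= a_i <= C.
# Solved here by projected gradient ascent purely for illustration.
a = np.zeros(len(y))
lr = 0.01
for _ in range(2000):
    a += lr * (1.0 - K @ a)  # gradient of the concave dual objective
    a = np.clip(a, 0.0, C)   # project back onto the box constraints

w = Z.T @ a                  # primal weights: sum_i a_i * y_i * x_i
margins = y * (X @ w)        # all positive => training data separated
```

Because the dual objective is concave and the box constraint is trivial to project onto, even this naive iteration converges on small problems; large-scale variants of exactly this structure are what decomposition and interior-point methods exploit.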