Jacek Gondzio is Professor of Optimization at the School of Mathematics at the University of Edinburgh. Prof Gondzio is interested in the theory and implementation of optimization methods for linear, quadratic and nonlinear programming.
He is best known for his contributions to interior point methods for very large scale optimization. He also works on the development of new algorithms for combinatorial optimization and on the use of linear algebra techniques and sparse matrix factorisation methods in optimization. His interests include the use of parallel and distributed computing for solving very large real-life optimization problems arising in a variety of applications.
Abstract
Continuation in Optimization: From interior point methods to Big Data
In this talk we will discuss similarities between two homotopy-based approaches:
- (inexact) primal-dual interior point method for LP/QP, and
- preconditioned Newton conjugate gradient method for big data optimization.
Both approaches rely on clever exploitation of the curvature of the functions being optimized and deliver efficient techniques for solving optimization problems of unprecedented sizes. We will address both theoretical and practical aspects of these methods.
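As background for the second approach, the inner solver in a Newton conjugate gradient method is a preconditioned conjugate gradient iteration applied to the Newton system. The sketch below is a generic, illustrative implementation of preconditioned CG for a symmetric positive definite system, not the specific method of the talk; the Jacobi (diagonal) preconditioner used in the usage example is an assumption chosen only for simplicity.

```python
import numpy as np

def preconditioned_cg(A, b, M_inv, tol=1e-10, max_iter=100):
    """Preconditioned conjugate gradient for an SPD matrix A.

    M_inv is (an approximation to) the inverse of the preconditioner M.
    Illustrative sketch only; a practical code would apply A and M_inv
    as linear operators rather than forming them as dense matrices.
    """
    x = np.zeros_like(b)
    r = b - A @ x              # initial residual
    z = M_inv @ r              # preconditioned residual
    p = z.copy()               # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)  # step length along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv @ r
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # new conjugate direction
        rz = rz_new
    return x

# Usage: solve a small SPD system with a Jacobi preconditioner.
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
M_inv = np.diag(1.0 / np.diag(A))  # Jacobi: inverse of diag(A)
x = preconditioned_cg(A, b, M_inv)
```

A good preconditioner clusters the eigenvalues of the preconditioned matrix, which is what makes the inexact Newton systems arising in both interior point and big data settings tractable at large scale.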
References:
[1] J. Gondzio, Interior point methods 25 years later, European Journal of Operational Research 218 (2012), pp. 587--601. DOI: 10.1016/j.ejor.2011.09.017
[2] J. Gondzio, Convergence analysis of an inexact feasible interior point method for convex quadratic programming, SIAM Journal on Optimization 23 (2013), no. 3, pp. 1510--1527. DOI: 10.1137/120886017
[3] K. Fountoulakis and J. Gondzio, A second-order method for strongly convex L1-regularization problems, Mathematical Programming 156 (2016), pp. 189--219. DOI: 10.1007/s10107-015-0875-4
[4] K. Fountoulakis and J. Gondzio, Performance of first- and second-order methods for L1-regularized least squares problems, Computational Optimization and Applications 65 (2016), pp. 605--635. DOI: 10.1007/s10589-016-9853-x