First order methods for the large convex optimization problem arising in sparse inverse covariance selection

CIS Colloquium, Jun 13, 2012, 11:00AM – 12:00PM, Wachman 447

Katya Scheinberg, Lehigh University

Many problems arising in signal processing, statistics, and machine learning are solved by casting them as very large-scale convex optimization problems. These problems typically involve large, dense data, so traditional second-order optimization methods cannot be applied. Often, however, they exhibit special structure (e.g., sparsity) in the solution. First-order methods with favorable convergence rates have recently become a focal point of research on such problems: they have low per-iteration complexity and can exploit problem structure. We will discuss several convex optimization problems arising in the context of machine learning. In particular, we will focus on the sparse inverse covariance selection problem and discuss several first-order methods for it.
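To make the setting concrete: sparse inverse covariance selection is commonly posed as minimizing -log det(X) + ⟨S, X⟩ + λ‖X‖₁ over positive definite matrices X, where S is the empirical covariance. The sketch below is a minimal proximal-gradient (ISTA-style) iteration for this objective, illustrating the flavor of first-order methods the talk concerns; it is not one of the speaker's algorithms, and the step-size handling is a simplistic assumption for illustration only.

```python
import numpy as np

def soft_threshold(A, tau):
    """Entrywise soft-thresholding: the proximal operator of tau * ||.||_1."""
    return np.sign(A) * np.maximum(np.abs(A) - tau, 0.0)

def graphical_lasso_ista(S, lam, step=0.1, n_iter=500):
    """Illustrative proximal-gradient sketch for
        min_{X > 0}  -log det(X) + <S, X> + lam * ||X||_1.
    The smooth part has gradient S - inv(X); the l1 term is handled by
    its prox (soft-thresholding). The step is halved whenever an update
    would leave the positive-definite cone (checked via Cholesky)."""
    p = S.shape[0]
    X = np.eye(p)
    for _ in range(n_iter):
        grad = S - np.linalg.inv(X)
        t = step
        while True:
            X_new = soft_threshold(X - t * grad, t * lam)
            try:
                # Cholesky succeeds iff X_new is positive definite
                np.linalg.cholesky(X_new)
                break
            except np.linalg.LinAlgError:
                t *= 0.5  # backtrack toward the current (feasible) iterate
        X = X_new
    return X
```

Each iteration costs one matrix inversion plus elementwise operations, which is the "low per-iteration complexity" trade-off of first-order schemes: cheap steps, many of them, with sparsity of the estimated inverse covariance produced directly by the soft-thresholding.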

Bio: Katya Scheinberg is an associate professor in the Industrial and Systems Engineering Department at Lehigh University. A native of Moscow, she earned her undergraduate degree in operations research from Lomonosov Moscow State University in 1992 and her Ph.D. in operations research from Columbia University in 1997. Scheinberg was a Research Staff Member at the IBM T.J. Watson Research Center for over a decade, where she worked on applied and theoretical problems in optimization, before returning to Columbia as a visiting professor in 2009 and later moving to Lehigh. Her main research interests lie in developing practical algorithms, and their theoretical analysis, for problems in continuous optimization, including convex optimization, derivative-free optimization, machine learning, and quadratic programming. She co-authored the 2008 book Introduction to Derivative-Free Optimization with Andrew R. Conn and Luis N. Vicente. She is currently the editor of Optima, the MOS newsletter, and an associate editor of SIOPT.