
Nonsmooth, Nonconvex Optimization

Michael Overton
Courant Institute of Mathematical Sciences

Abstract:

There are many algorithms for minimization when the objective function is differentiable, convex, or has some other known structure, but few options when none of the above hold, particularly when the objective function is nonsmooth at minimizers, as is often the case in applications. We describe two simple algorithms for minimization of nonsmooth, nonconvex functions. Gradient Sampling is a relatively new method that, although computationally intensive, has a nice convergence theory. The method is robust and the convergence theory has recently been extended to constrained problems. BFGS is an old method, developed for smooth problems, for which we have very limited theoretical results, but some remarkable empirical observations, extensive success in applications, and a rather bold conjecture. Limited Memory BFGS is a popular extension for large problems, and it too is applicable to the nonsmooth case, although our experience with it is more mixed.
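The sketch below illustrates the Gradient Sampling idea in Python. It is a minimal illustration, not the speaker's implementation: the sample size, radius schedule, line-search constants, and the nonsmooth Rosenbrock-style test function are all assumptions made for the example. At each iterate it samples gradients at randomly perturbed nearby points (where the objective is differentiable with probability one), takes the minimum-norm element of their convex hull as a stabilized descent direction, and backtracks along it, shrinking the sampling radius once approximate stationarity is detected.

    # Minimal sketch of Gradient Sampling for nonsmooth, nonconvex
    # minimization, in the spirit of Burke, Lewis, and Overton.
    # Parameter choices are illustrative, not recommended settings.
    import numpy as np
    from scipy.optimize import minimize

    def min_norm_in_hull(G):
        # Smallest-norm vector in the convex hull of the rows of G:
        # solve the QP  min ||G^T lam||^2  s.t. lam >= 0, sum(lam) = 1.
        m = G.shape[0]
        res = minimize(
            lambda lam: (G.T @ lam) @ (G.T @ lam),
            np.full(m, 1.0 / m),
            jac=lambda lam: 2.0 * (G @ (G.T @ lam)),
            bounds=[(0.0, None)] * m,
            constraints=[{"type": "eq", "fun": lambda lam: lam.sum() - 1.0}],
            method="SLSQP",
        )
        return G.T @ res.x

    def gradient_sampling(f, grad, x, eps=0.1, eps_min=1e-6, tol=1e-6,
                          max_iter=1000, seed=0):
        rng = np.random.default_rng(seed)
        n = x.size
        for _ in range(max_iter):
            # Gradients at x and at 2n points in the eps-ball around x
            # (2n + 1 >= n + 1 samples, as the theory requires); f is
            # differentiable at random points with probability one.
            U = rng.standard_normal((2 * n, n))
            U /= np.linalg.norm(U, axis=1, keepdims=True)
            R = eps * rng.random((2 * n, 1)) ** (1.0 / n)
            pts = np.vstack([x, x + R * U])
            g = min_norm_in_hull(np.array([grad(p) for p in pts]))
            if np.linalg.norm(g) <= tol:
                if eps <= eps_min:       # approximately stationary at
                    return x             # the finest sampling radius
                eps *= 0.1               # otherwise shrink the radius
                continue
            d = -g / np.linalg.norm(g)   # stabilized descent direction
            t = 1.0                      # Armijo backtracking line search
            while f(x + t * d) > f(x) - 1e-4 * t * np.linalg.norm(g) and t > 1e-12:
                t *= 0.5
            x = x + t * d
        return x

    # Nonsmooth Rosenbrock-style test function: minimizer (1, 1), where
    # f is not differentiable; grad is valid wherever x[1] != x[0]**2.
    f = lambda x: 10.0 * abs(x[1] - x[0] ** 2) + (1.0 - x[0]) ** 2
    grad = lambda x: np.array([
        -20.0 * x[0] * np.sign(x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        10.0 * np.sign(x[1] - x[0] ** 2),
    ])
    print(gradient_sampling(f, grad, np.array([-1.0, 1.0])))

As the abstract suggests for BFGS, no modification is needed even to attempt the smooth-problem method on such an objective: for instance, scipy.optimize.minimize(f, x0, method="BFGS") will run on the same f, though its line search is designed for smooth functions and carries no guarantees here.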

Thursday, February 4, 2010
11:00 AM, AP&M 6402