Applied Math And Analysis Seminar
Monday, February 5, 2018, 3:15pm, 119 Physics
Ju Sun (Stanford University)
When Are Nonconvex Optimization Problems Not Scary?
Abstract:
Many problems arising from scientific and engineering applications can be naturally formulated as optimization problems, most of which are nonconvex. For nonconvex problems, obtaining even a local minimizer is computationally hard in theory, let alone a global minimizer. In practice, however, simple numerical methods often work surprisingly well at finding high-quality solutions to the specific problems at hand.

In this talk, I will describe our recent efforts to bridge the mysterious theory-practice gap for nonconvex optimization. I will highlight a family of nonconvex problems that can be solved to global optimality using simple numerical methods, independent of initialization. This family has a characteristic global structure: (1) all local minimizers are global, and (2) all saddle points have directions of negative curvature. Problems in this family arise in applications across machine learning, signal processing, scientific imaging, and more. I will focus on two examples we have worked out: learning sparsifying bases for massive data and recovering complex signals from phaseless measurements. In both examples, the benign global structure allows us to derive geometric insights and computational results that are inaccessible to previous methods. In contrast, alternative approaches to solving nonconvex problems often entail either expensive convex relaxation (e.g., solving large-scale semidefinite programs) or delicate problem-specific initializations.
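To make the benign geometry concrete, here is a minimal numerical sketch of the second example, assuming a simplified real-valued phase retrieval setup with a nonconvex least-squares objective; the problem size, step size, and iteration count are illustrative choices, not parameters from the talk. With enough random Gaussian measurements, plain gradient descent from a random starting point typically reaches a global minimizer, i.e., the true signal up to sign.

```python
# Minimal sketch (assumptions: real-valued signal, Gaussian measurements,
# plain gradient descent; the talk's results concern more general settings).
# Generalized phase retrieval: recover x from y_k = (a_k^T x)^2 by minimizing
# the nonconvex objective f(z) = (1/4m) * sum_k ((a_k^T z)^2 - y_k)^2.
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 500                       # signal dimension, number of measurements
x = rng.standard_normal(n)
x /= np.linalg.norm(x)               # ground-truth signal, normalized
A = rng.standard_normal((m, n))      # measurement vectors a_k as rows of A
y = (A @ x) ** 2                     # phaseless (magnitude-squared) data

def grad(z):
    """Gradient of f at z: (1/m) * sum_k ((a_k^T z)^2 - y_k) (a_k^T z) a_k."""
    Az = A @ z
    return A.T @ ((Az ** 2 - y) * Az) / m

z = rng.standard_normal(n)
z /= np.linalg.norm(z)               # random initialization, no spectral init
step = 0.05
for _ in range(3000):
    z -= step * grad(z)

# Phaseless measurements determine x only up to a global sign.
err = min(np.linalg.norm(z - x), np.linalg.norm(z + x))
print(f"relative error up to sign: {err:.2e}")
```

In this simplified experiment, no convex relaxation and no careful initialization are used, yet the iterates avoid getting stuck and reach a global solution, mirroring the behavior described above.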

Completing and enriching this framework is an active research endeavor in several research communities. At the end of the talk, I will discuss open problems that must be tackled to move forward. [video]
