
Thu Dec 08 11:00 PM -- 09:30 AM (PST) @ Area 5 + 6
Nonconvex Optimization for Machine Learning: Theory and Practice
Hossein Mobahi · Anima Anandkumar · Percy Liang · Stefanie Jegelka · Anna Choromanska
Workshop Home Page

A large body of machine learning problems requires solving nonconvex optimization problems. These include deep learning, Bayesian inference, clustering, and so on. The objective functions in all these instances are highly nonconvex, and it is an open question whether provable, polynomial-time algorithms exist for these problems under realistic assumptions.

A diverse set of approaches has been devised to solve nonconvex problems. They range from simple local-search methods such as gradient descent and alternating minimization to more involved frameworks such as simulated annealing, continuation methods, convex hierarchies, Bayesian optimization, and branch and bound. Moreover, for special classes of nonconvex problems there are efficient methods such as quasiconvex optimization, star-convex optimization, submodular optimization, and matrix/tensor decomposition.
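The difficulty with even the simplest of these local-search methods can be seen on a toy problem. The following is a minimal sketch (not from the workshop; the objective, step size, and iteration count are illustrative choices) showing gradient descent on a one-dimensional nonconvex function, where different initializations converge to different local minima:

```python
# Illustrative sketch: gradient descent on a 1-D nonconvex objective.
# f has two local minima separated by a local maximum, so plain
# gradient descent converges to whichever basin it starts in.

def f(x):
    return x**4 - 3 * x**2 + x

def grad_f(x):
    # Derivative of f, computed by hand.
    return 4 * x**3 - 6 * x + 1

def gradient_descent(x0, lr=0.01, steps=500):
    """Run fixed-step gradient descent from x0 (parameters are arbitrary)."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_f(x)
    return x

# Two initializations on opposite sides of the local maximum
# land in different stationary points with different objective values.
left = gradient_descent(-2.0)   # near the global minimum (x ~ -1.30)
right = gradient_descent(2.0)   # near a worse local minimum (x ~ 1.13)
```

This basin-dependence is exactly what makes global guarantees for nonconvex problems hard: a local method certifies only a stationary point, and which one it finds depends on initialization.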

There has been a burst of recent research activity in all these areas. This workshop brings together researchers from these vastly different domains and aims to create a dialogue among them. In addition to the theoretical frameworks, the workshop will also feature practitioners, especially in the area of deep learning, who are developing new methodologies for training large-scale neural networks. The result will be a cross-fertilization of ideas from diverse areas and schools of thought.

Opening Remarks (Talk)
Learning To Optimize (Talk)
Morning Poster Spotlight (Spotlight)
Morning Poster Session (Posters)
Coffee Break (Break)
The moment-LP and moment-SOS approaches in optimization and some related applications (Talk)
Non-convexity in the error landscape and the expressive capacity of deep neural networks (Talk)
Leveraging Structure in Bayesian Optimization (Talk)
Lunch Break (Break)
Submodular Optimization and Nonconvexity (Talk)
Submodular Functions: from Discrete to Continuous Domains (Talk)
Taming non-convexity via geometry (Talk)
Break (Coffee Break)
Discussion Panel
Afternoon Poster Spotlight (Spotlight)
Afternoon Poster Session (Posters)