Plenary Talk in Workshop: Order up! The Benefits of Higher-Order Optimization in Machine Learning

Tensor Methods for Nonconvex Optimization
Coralia Cartis
Abstract:
We consider the advantages of having and incorporating higher- (than second-) order derivative information inside regularization frameworks, generating higher-order regularization algorithms that have better worst-case complexity, enjoy universal properties, and can certify higher-order criticality of candidate solutions. Time permitting, we also discuss inexact settings in which problem information and smoothness assumptions are weakened without affecting the algorithms' complexity. The efficient solution of some higher-order polynomial subproblems will also be discussed.
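For concreteness, one standard instance of such higher-order regularization frameworks (a sketch for orientation, not taken from the talk itself) is an ARp-type method, which at iterate $x_k$ computes a step $s$ by approximately minimizing a regularized $p$th-order Taylor model:

\[
  m_k(s) \;=\; f(x_k) \;+\; \sum_{j=1}^{p} \frac{1}{j!}\,\nabla^j f(x_k)[s]^j \;+\; \frac{\sigma_k}{p+1}\,\|s\|^{p+1},
\]

where $\nabla^j f(x_k)[s]^j$ denotes the $j$th-derivative tensor of $f$ at $x_k$ applied to $s$ in each of its $j$ arguments, and $\sigma_k > 0$ is an adaptively updated regularization weight. Taking $p = 2$ recovers cubic regularization; $p \ge 3$ gives tensor methods of the kind referenced in the title, whose improved worst-case complexity (of order $\epsilon^{-(p+1)/p}$ evaluations for approximate first-order criticality, under Lipschitz continuity of the $p$th derivative) motivates the use of higher-order derivative information.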