Tutorial

Stochastic Search and Optimization

James Spall

Emerald Bay A, Harveys Convention Center Floor (CC)

Abstract:

Stochastic search and optimization (SS&O) methods are widely used in many areas of computational science. Online algorithms, such as stochastic gradient descent, are a prominent example of SS&O. The speaker will discuss some general issues related to how SS&O contributes to the analysis and control of modern systems as a way of: (i) coping with inherent system noise, (ii) providing algorithms that are relatively insensitive to modeling uncertainty, and (iii) providing algorithms that are able to find a global solution from among multiple local solutions. As a specific example of SS&O, the speaker will discuss the simultaneous perturbation stochastic approximation (SPSA) algorithm for difficult multivariate optimization problems arising in stochastic systems. The essential feature of SPSA, which accounts for its power and relative ease of use in difficult multivariate optimization problems, is the underlying gradient approximation that requires only two objective function measurements regardless of the dimension of the optimization problem. This talk will focus on the basic ideas and motivation behind SPSA without dwelling on the mathematical details. As time permits, the speaker will also include some discussion on contrasts with other algorithms (genetic algorithms, simulated annealing, etc.) and will briefly discuss some recent advances in areas such as discrete optimization and adaptive (second-order) search with or without stochastic gradients.
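
To make the "two measurements regardless of dimension" point concrete, here is a minimal SPSA sketch in Python. The gain-sequence exponents (0.602 and 0.101) follow commonly cited guidelines, but the specific constants `a`, `c`, and `A`, the function names, and the toy objective are illustrative assumptions, not part of the tutorial; in practice these gains must be tuned to the problem.

```python
import numpy as np


def spsa_minimize(f, theta0, num_iters=1000, a=0.1, c=0.1,
                  alpha=0.602, gamma=0.101, A=100, seed=0):
    """Minimize a (possibly noisy) scalar function f with basic SPSA.

    Each iteration uses exactly two evaluations of f, independent of
    the dimension of theta. Gain constants here are illustrative.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    for k in range(num_iters):
        ak = a / (k + 1 + A) ** alpha      # step-size gain sequence
        ck = c / (k + 1) ** gamma          # perturbation-size gain sequence
        # Simultaneous +/-1 (Rademacher) perturbation of all coordinates at once
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # Only two objective measurements per iteration, regardless of dimension
        y_plus = f(theta + ck * delta)
        y_minus = f(theta - ck * delta)
        # SPSA gradient estimate: one scalar difference spread over all components
        g_hat = (y_plus - y_minus) / (2.0 * ck * delta)
        theta = theta - ak * g_hat
    return theta


# Usage sketch: a noisy 50-dimensional quadratic (hypothetical test problem)
dim = 50
noise_rng = np.random.default_rng(1)
noisy_quadratic = lambda th: np.sum(th ** 2) + noise_rng.normal(scale=0.1)
theta_hat = spsa_minimize(noisy_quadratic, theta0=np.ones(dim))
print(np.linalg.norm(theta_hat))  # distance to the true minimizer at the origin
```

Note the contrast with a standard finite-difference gradient, which would need on the order of twice the problem dimension function evaluations per iteration; SPSA perturbs every coordinate simultaneously and still pays for only two.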
