

Tutorial

The Smoothed Analysis of Algorithms

Daniel A. Spielman

Regency D

Abstract:

Theorists have long been challenged by the existence of remarkable algorithms that are known by scientists and engineers to work well in practice, but whose theoretical analyses have been negative or unconvincing. The root of the problem is that algorithms are usually analyzed in one of two ways: by worst-case or average-case analysis. The former can improperly suggest that an algorithm will perform poorly, while the latter can be unconvincing because the random inputs it considers may fail to resemble those encountered in practice.

Smoothed analysis can help explain the success of many algorithms that neither worst-case nor average-case analysis can. In smoothed analysis, we measure the performance of an algorithm under slight random perturbations of arbitrary inputs, and bound the performance as a function of the size of the input and the magnitude of the perturbation. If an algorithm has low smoothed complexity, then it should perform well if its inputs are subject to random noise.
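One common way to make this precise (an illustrative formalization, not quoted from the abstract) is to define the smoothed complexity of an algorithm A with cost measure C_A, on inputs of size n under Gaussian perturbations of standard deviation \sigma, as the worst case over inputs of the expected cost on a perturbed input:

    C_{\mathrm{smoothed}}(n,\sigma) \;=\; \max_{\|x\|\le 1}\; \mathbb{E}_{g \sim \mathcal{N}(0,\sigma^2 I)}\bigl[\, C_A(x+g) \,\bigr].

Worst-case analysis corresponds to \sigma \to 0, and average-case analysis to dropping the maximum and drawing the entire input at random; smoothed analysis interpolates between the two.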

We will explain how smoothed analysis has been used to analyze the behavior of many algorithms and heuristics, including the simplex algorithm, the Perceptron algorithm, Gaussian Elimination with partial pivoting, and k-means clustering.
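As a rough illustration of the methodology (a sketch, not code from the tutorial), the following estimates a smoothed performance measure empirically for k-means: it perturbs a fixed input with Gaussian noise of standard deviation sigma and averages the number of Lloyd iterations to convergence. It assumes NumPy and scikit-learn are available; the base input, number of clusters, and trial count are arbitrary choices made for the example.

    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)

    def iterations_to_converge(points, k=3):
        # n_init=1 so we observe a single run of Lloyd's algorithm.
        km = KMeans(n_clusters=k, n_init=1, max_iter=1000, random_state=0).fit(points)
        return km.n_iter_

    def smoothed_iterations(base_points, sigma, trials=50):
        # Average the iteration count over Gaussian perturbations of the same input.
        counts = []
        for _ in range(trials):
            perturbed = base_points + rng.normal(scale=sigma, size=base_points.shape)
            counts.append(iterations_to_converge(perturbed))
        return float(np.mean(counts))

    base = rng.uniform(size=(200, 2))  # stand-in for an arbitrary fixed input
    for sigma in (0.0, 0.01, 0.1):
        print(sigma, smoothed_iterations(base, sigma))

Taking the maximum of this quantity over all base inputs of a given size would correspond to the smoothed complexity described above; the sketch only estimates the inner expectation for one input.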
