Smoothed Analysis: An Attempt to Explain the Behavior of Algorithms in Practice
Uploaded by: Maximimi
Upload date: 2019-02-25 18:04:14
Edited at: 2020-12-30 23:07:53
Edited by: Maximimi

Comments:

Great paper by the founders of Smoothed Analysis! Some nice quotes:

- "if one can prove that an algorithm performs well in the worst case, then one can be confident that it will work well in every domain. However, there are many algorithms that work well in practice that do not work well in the worst case. Smoothed analysis provides a theoretical framework for explaining why some of these algorithms do work well in practice."
- "The performance profiles of algorithms across the landscape of input instances can differ greatly and can be quite irregular."
- "If a single input instance triggers an exponential run time, the algorithm is called an exponential-time algorithm."
- "While polynomial time algorithms are usually viewed as being efficient, we clearly prefer those whose run time is a polynomial of low degree, especially those that run in nearly linear time."
- "Developing means for predicting the performance of algorithms and heuristics on real data and on real computers is a grand challenge in algorithms."
- "We hope that theoretical explanations will be found for the success in practice of many of these algorithms, and that these theories will catalyze better algorithm design."
- "there are many problems that need to be solved in practice for which we do not know algorithms with good worst-case performance. Instead, scientists and engineers typically use heuristic algorithms to solve these problems. Many of these algorithms work well in practice, in spite of having a poor, sometimes exponential, worst-case running time. Practitioners justify the use of these heuristics by observing that worst-case instances are usually not “typical” and rarely occur in practice. The worst-case analysis can be too pessimistic."
- "heuristics are often used to speed up the practical performance of implementations that are based on algorithms with polynomial worst-case complexity. These heuristics might in fact worsen the worst-case performance, or make the worst-case complexity difficult to analyze."
- "While one would ideally choose the distribution of inputs that occur in practice, this is difficult as it is rare that one can determine or cleanly express these distributions, and the distributions can vary greatly between one application and another. Instead, average-case analyses have employed distributions with concise mathematical descriptions, such as Gaussian random vectors, uniform {0, 1} vectors, and Erdos-Renyi random graphs. The drawback of using such distributions is that the inputs actually encountered in practice may bear very little resemblance to the inputs that are likely to be generated by such distributions."
- "Because of the intrinsic difficulty in defining practical distributions, we consider an alternative approach to modeling real data. The basic idea is to identify typical properties of practical data, define an input model that captures these properties, and then rigorously analyze the performance of algorithms assuming their inputs have these properties. Smoothed analysis is a step in this direction. It is motivated by the observation that practical data are often subject to some small degree of random noise."
- "At a high level, each input is generated from a two-stage model: In the first stage, an instance is generated and in the second stage, the instance from the first stage is slightly perturbed. The perturbed instance is the input to the algorithm." (a toy sketch of this two-stage model is below)
- "we hope insights gained from smoothed analysis will lead to new ideas in algorithm design [...] we suggest that it might be possible to solve some problems more efficiently by perturbing their inputs."
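
To make the two-stage model quoted above concrete, here is a minimal Python sketch (my own illustration, not code from the paper): stage 1 fixes an adversarial instance, stage 2 adds small Gaussian noise to it, and the cost of the algorithm is averaged over the noise. The function names (smoothed_cost, quicksort_comparisons), the choice of Gaussian value noise, and the use of an already-sorted array as the worst case for first-element-pivot quicksort are my assumptions for this toy example, not the paper's setup (the paper analyzes the simplex method under Gaussian perturbations).

```python
import random

def quicksort_comparisons(values):
    """Deterministic quicksort (first element as pivot), counting comparisons.
    Iterative with an explicit stack so deep recursion on adversarial
    inputs does not hit Python's recursion limit."""
    comparisons = 0
    stack = [list(values)]
    while stack:
        part = stack.pop()
        if len(part) <= 1:
            continue
        pivot = part[0]
        left, right = [], []
        for x in part[1:]:
            comparisons += 1
            if x < pivot:
                left.append(x)
            else:
                right.append(x)
        stack.append(left)
        stack.append(right)
    return comparisons

def smoothed_cost(cost_fn, instance, sigma, trials=20):
    """Stage 2 of the two-stage model: average the algorithm's cost over
    small Gaussian perturbations of a fixed (adversarial) instance."""
    total = 0
    for _ in range(trials):
        perturbed = [x + random.gauss(0.0, sigma) for x in instance]
        total += cost_fn(perturbed)
    return total / trials

if __name__ == "__main__":
    # Stage 1: an adversarial instance -- an already-sorted array is the
    # classic worst case for first-element-pivot quicksort.
    worst = list(range(1000))
    for sigma in (0.0, 1.0, 10.0, 100.0):
        avg = smoothed_cost(quicksort_comparisons, worst, sigma)
        print(f"sigma={sigma:6.1f}  avg comparisons ~ {avg:,.0f}")
```

On this toy example the average comparison count should fall as sigma grows, since the perturbation breaks the adversarial sorted order near the pivot. Smoothed analysis asks the stronger question of whether such a drop holds for every starting instance, i.e. it takes the maximum over stage-1 instances of the expectation over stage-2 noise.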
Maximimi at 2019-02-28 11:32:27
Edited by Maximimi at 2019-02-28 11:33:20
