Probabilistic analysis of algorithms
In the analysis of algorithms, probabilistic analysis of algorithms is an approach to estimating
the computational complexity of an algorithm or a computational problem. It starts from an
assumption about a probability distribution over the set of all possible inputs. This assumption is
then used to design an efficient algorithm or to derive the complexity of a known algorithm.
This approach is not the same as that of probabilistic algorithms, but the two may be combined.
For non-probabilistic, more specifically deterministic, algorithms, the most common types of
complexity estimates are the average-case (expected time) complexity and the almost-always
complexity. To obtain the average-case complexity, given an input distribution,
the expected running time of the algorithm is evaluated, whereas for the almost-always complexity
estimate, it is shown that the algorithm admits a given complexity bound that almost surely holds.
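As a concrete illustration of average-case analysis under an assumed input distribution, the Python sketch below (the helper names and the uniform-target model are assumptions for illustration) estimates the expected number of comparisons made by a linear search when the target is equally likely to be any element, and compares the empirical average with the (n + 1)/2 value this input model predicts:

```python
import random

def linear_search(arr, target):
    """Scan arr left to right; return (index, number of comparisons made)."""
    comparisons = 0
    for i, x in enumerate(arr):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

n = 1_000
arr = list(range(n))
trials = 100_000

# Assumed input distribution: the target is equally likely to be any of the n elements.
total = sum(linear_search(arr, random.randrange(n))[1] for _ in range(trials))

print(f"observed average comparisons : {total / trials:.1f}")
print(f"expected under this model    : {(n + 1) / 2:.1f}")   # the worst case is n
```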
In the probabilistic analysis of probabilistic (randomized) algorithms, the distributions or averaging
over all possible choices made in randomized steps are also taken into account, in addition to the
input distribution.
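A minimal sketch of this combined averaging, assuming quicksort with a uniformly random pivot and counting only element-to-pivot comparisons: because the randomness is internal to the algorithm, the expectation is the same for every input order and matches the classical value 2(n+1)H_n - 4n.

```python
import random

def randomized_quicksort(a, counter):
    """Quicksort with a uniformly random pivot; counter[0] counts element-vs-pivot comparisons."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)
    counter[0] += len(a) - 1          # every other element is compared with the pivot once
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return randomized_quicksort(less, counter) + equal + randomized_quicksort(greater, counter)

n, trials = 1_000, 200
data = list(range(n))                  # input order does not matter: the randomness is internal
total = 0
for _ in range(trials):
    counter = [0]
    randomized_quicksort(data, counter)
    total += counter[0]

harmonic = sum(1.0 / k for k in range(1, n + 1))
print(f"observed average comparisons : {total / trials:.0f}")
print(f"theory 2(n+1)H_n - 4n        : {2 * (n + 1) * harmonic - 4 * n:.0f}")
```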
Amortized analysis
In computer science, amortized analysis is a method for analyzing a given algorithm's
complexity, that is, how much of a resource, especially time or memory in the context of computer
programs, it takes to execute. The motivation for amortized analysis is that looking at the worst-case running time per operation can be too pessimistic.
While certain operations for a given algorithm may have a significant cost in resources, other
operations may not be as costly. Amortized analysis considers both the costly and less costly
operations together over the whole series of operations of the algorithm. This may include
accounting for different types of input, length of the input, and other factors that affect its
performance.
The method requires knowledge of which series of operations are possible. This is most
commonly the case with data structures, which have state that persists between operations. The
basic idea is that a worst-case operation can alter the state in such a way that the worst case
cannot occur again for a long time, thus "amortizing" its cost.
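The standard illustration is a dynamic array that doubles its capacity when it fills up: an append that triggers a resize is expensive, but it also resets the state so that the next resize is far away. The Python sketch below (the DynamicArray class and its cost accounting are illustrative assumptions, not a reference implementation) shows that the total cost of n appends stays below 3n, so the amortized cost per append is constant even though a single append can cost about n/2.

```python
class DynamicArray:
    """Append-only array that doubles its capacity when full (illustrative cost model)."""

    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.buffer = [None] * self.capacity

    def append(self, value):
        """Append value and return the cost of this operation (writes + copies)."""
        cost = 1                              # writing the new element
        if self.size == self.capacity:        # costly operation: grow and copy everything
            cost += self.size
            self.capacity *= 2
            new_buffer = [None] * self.capacity
            new_buffer[: self.size] = self.buffer
            self.buffer = new_buffer
        self.buffer[self.size] = value
        self.size += 1
        return cost

n = 1_000_000
arr = DynamicArray()
costs = [arr.append(i) for i in range(n)]

print(f"worst single-append cost  : {max(costs)}")        # about n/2, from the last resize
print(f"total cost of {n} appends : {sum(costs)}")        # stays below 3n
print(f"amortized cost per append : {sum(costs) / n:.2f}")
```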