We say that an algorithm requires *average time
proportional to f(n)* (or that it has average-case complexity
O(f(n))) if there are constants c and n_{0} such that the average time the algorithm requires to
process an input set of size n is no more than c*f(n) time units whenever n ≥ n_{0}.
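As a minimal sketch of this definition, consider linear search with the target equally likely to be at each position (a uniformity assumption not stated in the text; the function names here are illustrative):

```python
def linear_search(items, target):
    """Return (index, comparisons) for a left-to-right scan."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

def average_comparisons(n):
    """Average comparisons over the n inputs of size n in which the
    target sits at each possible position exactly once."""
    items = list(range(n))
    total = sum(linear_search(items, t)[1] for t in items)
    return total / n

# Searching for the item at position i costs i + 1 comparisons, so the
# average is (1 + 2 + ... + n) / n = (n + 1) / 2.  That is no more than
# c*n with c = 1 whenever n >= 1, so the average case is O(n).
for n in (10, 100, 1000):
    assert average_comparisons(n) == (n + 1) / 2
```

Here f(n) = n, and the constants c = 1 and n_{0} = 1 witness the definition.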

This definition is very similar to the one for worst case complexity.

The biggest difference is that we consider the average time to process input sets of size n, instead of the maximum (worst case) time.

But note that we are still looking for an *upper
bound* on the algorithm's behavior.

The average case complexity describes how quickly the average time increases when n increases, just as the worst case complexity describes how quickly the worst case time increases when n increases.

**Question:** Suppose we have an
algorithm with worst case complexity O(n).

True or false: It is possible for that algorithm to have average
case complexity O(n^{2}).

## In the Forum: