1. Definition

1.1.

We say that an algorithm requires average time proportional to f(n) (or that it has average-case complexity O(f(n))) if there are constants c and n0 such that the average time the algorithm requires to process an input set of size n is no more than c*f(n) time units whenever n ≥ n0.

This definition is very similar to the one for worst case complexity.

The biggest difference is that we deal in the average time to process input sets of size n, instead of the maximum (worst case) time.

But note that we are still looking for an upper bound on the algorithm's behavior.

The average case complexity describes how quickly the average time increases when n increases, just as the worst case complexity describes how quickly the worst case time increases when n increases.
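To make the worst-case/average-case distinction concrete, here is a small illustrative sketch (not from the lesson itself) that counts the comparisons sequential search performs. The function name and the choice of example are assumptions for illustration: the worst case touches all n elements, while the average over all n successful searches is about (n + 1)/2 — both O(n), but with different constants.

```python
def search_comparisons(items, target):
    """Return the number of comparisons sequential search makes
    before finding target (illustrative helper, not from the text)."""
    comparisons = 0
    for item in items:
        comparisons += 1
        if item == target:
            break
    return comparisons

n = 1000
items = list(range(n))

# Worst case: the target is the last element, so all n comparisons occur.
worst = search_comparisons(items, n - 1)

# Average case over all n equally likely successful searches:
# (1 + 2 + ... + n) / n = (n + 1) / 2 comparisons.
average = sum(search_comparisons(items, t) for t in items) / n

print(worst)    # 1000
print(average)  # 500.5
```

Note that both quantities grow linearly in n, which is why this algorithm has both worst-case and average-case complexity O(n); the definitions bound different measurements (maximum time vs. mean time), not different growth rates here.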

Question: Suppose we have an algorithm with worst case complexity O(n).

True or false: It is possible for that algorithm to have average case complexity O(n²).
