We know that the worst-case assumption protects us from being overly optimistic. But at times, it will be overly pessimistic.
We might consider an "average-case" approach instead. However, "average-case" analysis is fairly rare, for a couple of reasons:
1. Input Data Assumptions Limit the Algorithm
Average-case analysis makes assumptions about the input data. This means we would be limiting our algorithm to a certain input pattern (e.g., random input). Most algorithms are not restricted to a particular input pattern.
2. Assumptions About the Input Are Difficult
Think about a simple linear search (scan):
Worst-case says we scan n elements
We might think average-case would say we scan (1/2)n elements
However, this ignores failed searches (which scan all n elements). How many searches result in no match? That may not be easy to estimate.
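As a rough illustration (the function name linear_search and the example are my own, not from the notes above), here is a linear search in Python; notice that a failed search always scans every element:

```python
def linear_search(items, target):
    """Scan items left to right; return the index of target, or -1 if not found."""
    for i, value in enumerate(items):
        if value == target:
            return i   # successful search: stops after i + 1 comparisons
    return -1          # failed search: always scans all n elements
```

The (1/2)n figure only holds if every search succeeds and each position is equally likely; each failed search costs a full n comparisons, so the true average depends on how often searches fail, which is exactly the quantity that is hard to estimate.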
We'll next look at a more common technique called "amortized analysis".
An amortized analysis is especially useful when an algorithm has a worst-case cost only on rare occasions.
Consider the algorithm for the "append/add" operation on a dynamic array (resizable array) structure:
Add the element to the next available slot
If the structure is "full", then grow the array by constructing a new fixed array double the size of the current one and copying the existing elements into the new fixed array
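A minimal sketch of this append/grow algorithm in Python, assuming a class named DynamicArray with a starting capacity of 2 (both are illustrative choices, not part of the description above):

```python
class DynamicArray:
    """Resizable array backed by a fixed-capacity Python list."""

    def __init__(self):
        self._capacity = 2                    # size of the underlying fixed array
        self._count = 0                       # number of elements actually stored
        self._data = [None] * self._capacity  # the fixed array itself

    def append(self, value):
        if self._count == self._capacity:     # structure is "full"
            self._grow()                      # rare, slow step
        self._data[self._count] = value       # fast step: fill the next slot
        self._count += 1

    def _grow(self):
        self._capacity *= 2                   # double the size
        new_data = [None] * self._capacity
        for i in range(self._count):          # copy existing elements over
            new_data[i] = self._data[i]
        self._data = new_data
```

Doubling the capacity (rather than growing by a fixed amount) is what keeps the grow step rare enough for the amortized argument that follows.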
Performance of steps:
The first step above is fast -- O(1)
The second step (grow) is slower -- O(n)
A worst-case analysis would give this algorithm a big-O level of O(n).
However, we know that the slower "grow" operation is rare. An amortized analysis spreads the cost of that rare step across the many cheap appends, giving this algorithm a big-O level of O(1).
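To see why, here is a quick count (assuming the same doubling scheme with a starting capacity of 2 as in the sketch above) of how many element copies n appends actually trigger; the total stays below 2n, so the average cost per append is a constant:

```python
def count_copies(n):
    """Count element copies caused by doubling during n appends (capacity starts at 2)."""
    capacity, count, copies = 2, 0, 0
    for _ in range(n):
        if count == capacity:
            copies += count    # growing copies every existing element
            capacity *= 2
        count += 1
    return copies

for n in (1_000, 100_000, 1_000_000):
    print(n, count_copies(n), count_copies(n) / n)  # copies-per-append ratio stays below 2
```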