1. Why Do We Care About Best and Worst Cases?
When analyzing algorithms, we need to know how long an algorithm takes to run.
However, the runtime can vary depending on the input. For example:
- Sometimes, the algorithm finishes very quickly (best case).
- Other times, it might take a lot longer (worst case).
We study both cases, but in most professional and academic contexts we focus on the worst case, because it gives us a guaranteed upper bound on execution time.
2. Example: Best Case vs. Worst Case
Imagine an algorithm with the following logic (in pseudocode):
1. Do some operation.
2. If n == 1: return result.
3. Else: run a loop n times, then return result.
Best Case
- If n = 1:
  - Line 1: takes 1 operation
  - Return statement: takes 1 operation

Total: T(n) = 1 + 1 = 2
Worst Case
- If n > 1:
  - Line 1: 1 operation
  - Line 2 check: 1 operation
  - Loop: n operations
  - Return statement: 1 operation

Total: T(n) = 1 + 1 + n + 1 = 3 + n
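As a sanity check, here is a minimal Python sketch of that pseudocode (the function name and the explicit `ops` counter are ours, added purely for illustration); it counts one unit per operation so the totals 2 and 3 + n fall out directly:

```python
def run(n):
    # Sketch of the pseudocode above; 'ops' counts abstract operations.
    ops = 1                 # line 1: do some operation
    result = 0              # stand-in for whatever the algorithm computes
    if n == 1:              # best case: take the early return
        ops += 1            # the return statement
        # (following the text's count, the check itself isn't tallied here)
        return result, ops  # total: 1 + 1 = 2
    ops += 1                # line 2: the n == 1 check
    for _ in range(n):      # line 3: the loop body runs n times
        ops += 1
    ops += 1                # the final return statement
    return result, ops      # total: 1 + 1 + n + 1 = 3 + n

print(run(1))   # (0, 2)  -> best case, T(n) = 2
print(run(10))  # (0, 13) -> worst case, T(n) = 3 + 10 = 13
```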
3. Why Focus on Worst Case?
If you’re ever asked to “give the time complexity” without any additional detail, the assumption is:
“Give me the worst case time complexity.”
That’s because:
- Best case might only occur for rare, lucky inputs.
- Worst case guarantees your algorithm never takes longer than that bound.
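Linear search is the classic concrete example of this gap (our own illustration, not part of the pseudocode above): the best case needs one comparison, but only the worst case bounds every input:

```python
def linear_search(items, target):
    # Scan left to right; return the index of target, or -1 if absent.
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

data = [7, 3, 9, 1, 5]
print(linear_search(data, 7))   # 0  -> best case: first element, 1 comparison
print(linear_search(data, 42))  # -1 -> worst case: absent, all n comparisons
```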
4. Comparing Algorithm Complexities
Now that we can write complexity as a mathematical function (like T(n) = 3 + n), we can compare different algorithms that solve the same problem.
Example:
- Algorithm 1: T₁(n) = 2n² + 2n + 55
- Algorithm 2: T₂(n) = 999n + 3
The Question
Which algorithm is faster?
We could try:
- n = 1,000
- n = 1,000,000
- n = some very large number
But here’s the key:
Small n values don't tell the whole story.
When n grows very large (tends to infinity), small constants and low-order terms stop mattering; the shape of the growth dominates.
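To make that concrete, here is a quick sketch (our own, simply transcribing T₁ and T₂ into Python) that tabulates both functions at a few sizes:

```python
def t1(n):
    return 2 * n**2 + 2 * n + 55   # Algorithm 1

def t2(n):
    return 999 * n + 3             # Algorithm 2

for n in (10, 100, 1_000, 1_000_000):
    faster = "Algorithm 1" if t1(n) < t2(n) else "Algorithm 2"
    print(f"n={n:>9,}: T1={t1(n):>17,}  T2={t2(n):>13,}  faster: {faster}")
```

Algorithm 1 wins at small n thanks to its small constants, but near n ≈ 498 (where 2n² + 2n + 55 = 999n + 3) the n² term takes over, and Algorithm 2 wins for every larger n.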
5. The Problem with Plugging in Infinity
If we try to just “plug in” infinity:
T₁(n) = 2 × ∞² + 2 × ∞ + 55 → ∞
T₂(n) = 999 × ∞ + 3 → ∞
We just get “infinity” for both, which doesn’t tell us which grows faster.
Why?
Because infinity is not a number, so direct substitution is meaningless.
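What does work is comparing the two functions through the limit of their ratio. A small sketch, assuming SymPy is available (`pip install sympy`):

```python
import sympy as sp

n = sp.symbols("n", positive=True)
T1 = 2 * n**2 + 2 * n + 55
T2 = 999 * n + 3

print(sp.limit(T1, n, sp.oo))      # oo -- each function alone just blows up
print(sp.limit(T2, n, sp.oo))      # oo -- so this comparison is uninformative
print(sp.limit(T2 / T1, n, sp.oo)) # 0  -- the ratio shows T2 grows strictly slower
```

A ratio tending to 0 says T₂ becomes negligible next to T₁, which is exactly the dominant-term comparison the next section spells out.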
6. The Real Way to Compare Functions as n → ∞
Instead of plugging in infinity, we:
- Look at the dominant term (the one that grows fastest as n increases).
- Ignore constants and lower-order terms.
For example:
T₁(n) = 2n² + 2n + 55 → n² dominates for large n
T₂(n) = 999n + 3 → n dominates for large n
Since n² grows faster than n, Algorithm 2 will eventually outperform Algorithm 1 for large enough n, despite its much larger constant factor (999 vs. 2).
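You can watch the lower-order terms fade by dividing each function by its dominant term; the ratios settle toward the leading constants (again a sketch of our own):

```python
def t1(n):
    return 2 * n**2 + 2 * n + 55

def t2(n):
    return 999 * n + 3

for n in (10, 1_000, 1_000_000):
    # As n grows, T1(n)/n² -> 2 and T2(n)/n -> 999:
    # the terms 2n, 55, and 3 contribute less and less.
    print(f"n={n:>9,}:  T1(n)/n² = {t1(n) / n**2:.6f}   T2(n)/n = {t2(n) / n:.6f}")
```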
7. Why Graphs Help (and Why They Can Mislead)
If we graph:
- T₁(n) = 22n²: a parabola that gets steep very quickly.
- T₂(n) = n³: a cubic curve that starts slower but eventually overtakes the quadratic.
The Catch:
You can find a point where one overtakes the other, but graphs only show a limited range.
When analyzing complexity, we want proof for all large n, not just what we see on a graph.
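Rather than trusting a plotted window, you can pin down the crossover exactly; a tiny sketch for the two curves above:

```python
def quad(n):
    return 22 * n**2   # T1: the parabola

def cube(n):
    return n**3        # T2: the cubic

# Find the first integer n where the cubic overtakes the quadratic.
n = 1
while cube(n) <= quad(n):
    n += 1
print(n)  # 23 -- and since n³ / (22n²) = n / 22, the cubic stays above for ALL n > 22
```

That last comment is the point: the algebraic ratio n/22 proves the ordering for every n > 22, which no finite graph can do.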
8. The Takeaway
- Best case: Algorithm runs fastest (rarely the one we use for complexity evaluation).
- Worst case: Algorithm runs slowest (used for guarantees).
- Comparing algorithms: Always compare growth rates as n → ∞.
- Ignore constants and lower-order terms: They matter less and less as n grows.
- Dominant term wins: The fastest-growing term determines the algorithm's Big O class.