Understanding Big-O Notation

1. What is Notation?

Notation is a symbolic way of writing mathematical expressions.

Instead of writing:

“The sum of one and two”

We write:

1 + 2

This symbolic representation (such as +, -, \times, \div, \log, etc.) is called mathematical notation.

2. What is Big-O Notation?

Big-O notation is a way to describe how fast a function grows as its input size increases. It’s written as:

O(g(n))

Where g(n) is an expression like n, n^2, 2^n, etc.

3. Formal Definition

A function T(n) is in O(g(n)) if there are constants C > 0 and n_0 such that:

T(n) \leq C \cdot g(n) for all n \geq n_0

This means: T(n) grows no faster than g(n) times some constant.
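To see the definition in action, here is a minimal Python sketch; the particular choices of T, g, C and n_0 are just example values, not part of the definition itself:

# Minimal sketch: numerically check T(n) <= C * g(n) for all n >= n0.
# T, g, C and n0 are example choices used only for illustration.

def T(n):
    return 3 * n ** 2 + 10 * n   # example function

def g(n):
    return n ** 2                # candidate growth bound

C, n0 = 4, 10                    # witnesses showing T(n) is in O(n^2)

print(all(T(n) <= C * g(n) for n in range(n0, 10_000)))  # True

Here 3n^2 + 10n \leq 4n^2 holds exactly when n \geq 10, so C = 4 and n_0 = 10 work as witnesses.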

4. Big-O is a Set

Think of O(g(n)) as a set of functions. For example:

  • If g(n) = n^2, then O(n^2) contains:
    • n^2
    • 0.5n^2 + 10n
    • -100n^2 + 999n + 1

Any function that doesn’t grow faster than n^2 (up to constant factors) is part of this set.
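To make this concrete, a small sketch (using the three functions listed above) checks that the ratio T(n) / n^2 settles near a constant, which is exactly what membership in O(n^2) means:

# Each T(n) below is taken from the list above. The ratio T(n) / n^2
# settles near the leading coefficient, a constant, so each function
# is bounded by C * n^2 for large n.
funcs = {
    "n^2":                lambda n: n ** 2,
    "0.5n^2 + 10n":       lambda n: 0.5 * n ** 2 + 10 * n,
    "-100n^2 + 999n + 1": lambda n: -100 * n ** 2 + 999 * n + 1,
}

for name, T in funcs.items():
    ratios = [T(n) / n ** 2 for n in (10, 100, 1_000, 10_000)]
    print(name, [round(r, 3) for r in ratios])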

We write:

T(n) \in O(g(n))

or, more commonly,

T(n) = O(g(n))

5. Examples

Function                      Big-O Notation
T(n) = n + \sqrt{n}           O(n)
T(n) = n! + n^2               O(n!)
T(n) = 100n + 5               O(n)
T(n) = n^3 + 7n^2 + 10        O(n^3)
T(n) = 42                     O(1)
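In each row the lower-order terms become negligible as n grows. For instance, for T(n) = n^3 + 7n^2 + 10 the ratio T(n) / n^3 approaches 1, which is a quick way to confirm the O(n^3) entry (a minimal sketch):

# The lower-order terms 7n^2 + 10 become negligible as n grows,
# so T(n) = n^3 + 7n^2 + 10 is O(n^3).
T = lambda n: n ** 3 + 7 * n ** 2 + 10

for n in (10, 100, 1_000, 10_000):
    print(n, T(n) / n ** 3)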

6. Comparing Two Functions

Given:

  • T_1(n) = 5 \cdot 2^n
  • T_2(n) = 4 \cdot \sqrt{n}

We know:

  • O(2^n) grows much faster than O(\sqrt{n})

Hence:

  • T_1(n) = O(2^n)
  • T_2(n) = O(\sqrt{n})
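A minimal sketch evaluating both functions at a few values of n shows how quickly the exponential term dominates:

import math

# T1 grows exponentially, T2 grows like a square root;
# even modest n makes the difference enormous.
T1 = lambda n: 5 * 2 ** n
T2 = lambda n: 4 * math.sqrt(n)

for n in (10, 20, 30, 40):
    print(n, T1(n), round(T2(n), 2))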

7. Constants Don’t Matter

Example:

  • T_1(n) = 999n is O(n)
  • T_2(n) = 5n is O(n)

Although T_2 is smaller by a constant factor, Big-O does not show this: both functions are linear, so both are O(n).

8. Why Ignore Constants?

We assume each operation takes 1 unit time (abstract model), ignoring machine-level details.

That’s why 10n and 0.0001n are both O(n).

We’re interested in how the function behaves as n \to \infty.
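A small sketch (using the example functions 10n and 0.0001n from above) shows that doubling n doubles both; only the constant ratio between them differs, and that ratio is exactly what Big-O ignores:

# Doubling n doubles both functions: the constant factor never changes
# how they *scale*, which is all Big-O cares about.
f = lambda n: 10 * n
g = lambda n: 0.0001 * n

for n in (1_000, 2_000, 4_000):
    print(n, f(n), g(n), f(n) / g(n))   # the ratio is always 100000.0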

9. Multiple Input Parameters

An algorithm may depend on more than one input:

If n is the size of list A, and k is the size of list B, then:

T(n, k) = O(nk)
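A typical case is a nested loop over two independent inputs. The sketch below (the function name pairs is just an illustrative choice) runs its body n \cdot k times:

# Hypothetical example: the outer loop runs n times, the inner loop k times,
# so the body executes n * k times -> T(n, k) = O(nk).
def pairs(A, B):
    result = []
    for a in A:          # n iterations
        for b in B:      # k iterations
            result.append((a, b))
    return result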

10. Big-O Arithmetic Rules

a. Multiply by constant:

C \cdot O(f(n)) = O(f(n))

b. Same order sum:

O(f(n)) + O(f(n)) = O(f(n))

c. Different order sum:

If f(n) grows faster than g(n), then:

O(f(n)) + O(g(n)) = O(f(n))

d. Multiplication:

O(f(n)) \cdot O(g(n)) = O(f(n) \cdot g(n))
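These rules are easiest to see on a concrete function, for example T(n) = 100n + 5 from the table above:

T(n) = 100n + 5
100n = 100 \cdot O(n) = O(n) (rule a: drop the constant 100)
5 = O(1)
O(n) + O(1) = O(n) (rule c: n grows faster than 1)

So T(n) = O(n).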

11. Using Big-O in Real Code

Example:

def foo(arr):
    for x in arr:    # O(n)
        print(x)
    arr.sort()       # O(n log n)

Overall complexity:

O(n) + O(n \log n) = O(n \log n)

12. Growth Orders and Names

Big-O Notation    Name            Performance
O(1)              Constant        Excellent
O(\log n)         Logarithmic     Excellent
O(n)              Linear          Good
O(n \log n)       Linearithmic    Acceptable
O(n^2)            Quadratic       Poor
O(2^n)            Exponential     Bad
O(n!)             Factorial       Very Bad

If your algorithm is worse than O(n \log n), you should try optimizing it.
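To get a feel for these growth rates, here is a minimal sketch that evaluates each order at a few input sizes; the exact numbers are illustrative, only the relative growth matters:

import math

# Rough number of "steps" each growth order implies for a few input sizes.
orders = {
    "O(1)":       lambda n: 1,
    "O(log n)":   lambda n: math.log2(n),
    "O(n)":       lambda n: n,
    "O(n log n)": lambda n: n * math.log2(n),
    "O(n^2)":     lambda n: n ** 2,
    "O(2^n)":     lambda n: 2 ** n,
    "O(n!)":      lambda n: math.factorial(n),
}

for n in (10, 20, 30):
    print(f"n = {n}")
    for name, f in orders.items():
        print(f"  {name:<12} {f(n):.0f}")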