Big Omega

Whereas Big O measures the upper bound (the maximum amount of time something can take), Big Omega (using the symbol Ω) measures the lower bound: the best-case running time.

We say that $f(n)$ is $\Omega(g(n))$ if there are positive constants $c$ and $n_0$ such that:

$$f(n) \geq c \cdot g(n) \quad \text{for all } n \geq n_0$$
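
One way to build intuition for the definition (though it's not a proof) is to check the inequality numerically over a range of $n$. The sketch below is my own illustration; `satisfies_omega`, the sample functions, and the constants are hypothetical choices, not anything from the text above.

```python
def satisfies_omega(f, g, c, n0, upto=1000):
    """Numerically check f(n) >= c * g(n) for n0 <= n <= upto.

    Only a finite-range sanity check, not a proof: Big Omega is a
    statement about every n >= n0.
    """
    return all(f(n) >= c * g(n) for n in range(n0, upto + 1))

# Example: f(n) = n^2 + n is Omega(n^2) with c = 1 and n0 = 1.
print(satisfies_omega(lambda n: n**2 + n, lambda n: n**2, c=1, n0=1))  # True
```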

We say that $f(n)$ is $\Theta(g(n))$ (Theta) if $f(n)$ is $\Omega(g(n))$ and $f(n)$ is also $O(g(n))$. Equivalently, $g(n)$ is $O(f(n))$ and $f(n)$ is $O(g(n))$. Let’s see some examples.

$3 \log n + \log \log n$ is $\Omega(\log n)$: choose $c = 3$ and $n_0 = 2$. For $n \geq 2$ (with base-2 logarithms) we have $\log \log n \geq 0$, so $3 \log n + \log \log n \geq 3 \log n = c \cdot \log n$.
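
A quick numeric sanity check of those constants (assuming base-2 logarithms, and again only a finite-range check rather than a proof) might look like this:

```python
import math

# Check 3*log2(n) + log2(log2(n)) >= 3*log2(n) for 2 <= n <= 1000.
# For n >= 2 we have log2(n) >= 1, hence log2(log2(n)) >= 0,
# so the left-hand side can only be larger.
for n in range(2, 1001):
    assert 3 * math.log2(n) + math.log2(math.log2(n)) >= 3 * math.log2(n)

print("holds for 2 <= n <= 1000")
```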

$\frac{2}{3}n^2 - n \in \Omega(n^2)$ if we take any constant $c$ below $\frac{2}{3}$. If $c$ is slightly smaller than two thirds, then for large enough $n$ the $\frac{2}{3}n^2$ term dominates the $-n$ term, so $\frac{2}{3}n^2 - n$ stays above $c \cdot n^2$.

For example, take $c = \frac{1}{3}$. We claim that:

$$\frac{2}{3}n^2 - n \geq \frac{1}{3}n^2 \quad \text{for all } n \geq 3$$

Multiply both sides by 3:

$$2n^2 - 3n \geq n^2$$

Subtract $n^2$ from both sides:

$$n^2 \geq 3n$$

Divide both sides by $n$:

$$n \geq 3 = n_0$$

So $c = \frac{1}{3}$ and $n_0 = 3$.
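
A short numeric check (my own sketch, using exact fractions to avoid floating-point noise) agrees with the algebra over a finite range:

```python
from fractions import Fraction

# Check (2/3)*n^2 - n >= (1/3)*n^2 for 3 <= n <= 1000.
for n in range(3, 1001):
    assert Fraction(2, 3) * n**2 - n >= Fraction(1, 3) * n**2

print("holds for 3 <= n <= 1000")
```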

Summary

Bet you were expecting some hard-to-understand guide to Big O, huh? Well, this is all it is. You need to memorise (or learn) the hierarchy, take some algorithms, and work out what their Big O notation is.

Big O represents how long an algorithm takes, but sometimes we also care about how much memory it uses (its space complexity).
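
As a small illustration (my own example, not something from the text above), two functions can take the same amount of time but use very different amounts of memory:

```python
def sum_of_squares_constant_space(n):
    """O(n) time, O(1) extra space: keep only a running total."""
    total = 0
    for i in range(1, n + 1):
        total += i * i
    return total

def sum_of_squares_linear_space(n):
    """O(n) time, O(n) extra space: build the whole list of squares first."""
    squares = [i * i for i in range(1, n + 1)]
    return sum(squares)

print(sum_of_squares_constant_space(10))  # 385
print(sum_of_squares_linear_space(10))    # 385
```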

There are other ways of measuring algorithm time complexity, such as Big Omega (a lower bound on the time an algorithm takes) and Big Theta (a tight bound: the algorithm is both $O$ and $\Omega$ of the same function).