## Big O (O(n))

An algorithm is said to run in time T(N) = O(f(N)) if there are positive constants c and n_{0} such that T(N) ≤ c·f(N) for all N ≥ n_{0}.

When we say that T(N) = O(f(N)), we are guaranteeing that the function T(N) grows at a rate no faster than f(N); thus f(N) is an upper bound on T(N).

**Example:**

If *g(x) = x^{2} + x^{1.5} + 10*,

then *g(x) = O(x^{2})*.
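The definition can be checked numerically by exhibiting witness constants. A minimal sketch (the choice c = 12, n_{0} = 1 is one workable pair, not the only one): since x^{1.5} ≤ x^{2} and 10 ≤ 10x^{2} for x ≥ 1, we get g(x) ≤ 12x^{2}.

```python
def g(x):
    return x**2 + x**1.5 + 10

# Witness constants for g(x) = O(x^2): c = 12 and n_0 = 1.
# For x >= 1: x^1.5 <= x^2 and 10 <= 10*x^2, so g(x) <= 12*x^2.
c, n0 = 12, 1
assert all(g(x) <= c * x**2 for x in range(n0, 10_000))
```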

## Omega (Ω(n))

An algorithm is said to run in time T(N) = Ω(f(N)) if there are positive constants c and n_{0} such that T(N) ≥ c·f(N) for all N ≥ n_{0}. Thus f(N) is a lower bound on T(N).

**Example:**

If *g(x) = x^{2} + x^{1.5} + 10*,

then *g(x) = Ω(x^{1.5})*.
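The same witness-constant check works for the lower bound. A minimal sketch (c = 1, n_{0} = 1 is one valid pair): every term of g(x) is non-negative, so g(x) ≥ x^{1.5} for all x ≥ 1.

```python
def g(x):
    return x**2 + x**1.5 + 10

# Witness constants for g(x) = Omega(x^1.5): c = 1 and n_0 = 1.
# g(x) contains the term x^1.5 plus non-negative terms, so g(x) >= x^1.5.
c, n0 = 1, 1
assert all(g(x) >= c * x**1.5 for x in range(n0, 10_000))
```

Note that *g(x) = Ω(x^{2})* also holds; Ω only promises *some* lower bound, not the tightest one.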

## Theta (Θ(n))

An algorithm is said to run in time T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).

**Example:**

If *g(n) = log(n^{100}) = 100·log(n)*,

then *g(n) = Θ(log(n))*.

Note that the 100 is a constant factor and is therefore not included inside the Θ.
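A Θ bound needs witness constants on both sides. A minimal sketch (the pair c₁ = 99, c₂ = 101 with n_{0} = 2 is one illustrative choice, leaving slack for floating-point rounding): since g(n) = 100·log(n), any c₁ < 100 < c₂ works.

```python
import math

def g(n):
    return math.log(n ** 100)  # equals 100 * log(n) by the log power rule

# Theta witnesses: c1 * log(n) <= g(n) <= c2 * log(n) for all n >= n_0.
# c1 = 99 and c2 = 101 bracket the true constant 100 (slack absorbs
# floating-point rounding in math.log).
c1, c2, n0 = 99, 101, 2
assert all(c1 * math.log(n) <= g(n) <= c2 * math.log(n)
           for n in range(n0, 1_000))
```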