# Complexity Analysis

It is very common for beginning computer science students to compare their programs with one another. You may also have noticed that it is common for computer programs to look very similar, especially the simple ones. An interesting question often arises. When two programs solve the same problem but look different, is one program better than the other?

## Big O (O(n))

An algorithm is said to run in time T(N) = O(f(N)) if there are positive constants c and n₀ such that T(N) ≤ c·f(N) whenever N ≥ n₀.

When we say that T(N) = O(f(N)), we are guaranteeing that the function T(N) grows at a rate no faster than f(N); thus f(N) is an upper bound on T(N).

Example:

If g(x) = x^2 + x^1.5 + 10,

then g(x) = O(x^2).
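The definition can be checked numerically. The sketch below picks c = 2 and n₀ = 10 as one illustrative (not unique) choice of witnesses and verifies that g(x) ≤ c·x^2 over a range of inputs:

```python
def g(x):
    return x**2 + x**1.5 + 10

# One valid choice of witnesses among many; any larger c or n0 also works.
c, n0 = 2, 10

# Spot-check the inequality g(x) <= c * x^2 for x >= n0.
for x in range(n0, 1001):
    assert g(x) <= c * x**2
```

The loop is only a spot check, not a proof; the bound holds for all x ≥ 10 because x^1.5 + 10 ≤ x^2 once x ≥ 10.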

## Omega (Ω(n))

An algorithm is said to run in time T(N) = Ω(f(N)) if there are positive constants c and n₀ such that T(N) ≥ c·f(N) whenever N ≥ n₀. Thus f(N) is a lower bound on T(N).

Example:

If g(x) = x^2 + x^1.5 + 10,

then g(x) = Ω(x^1.5).
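The lower-bound witnesses are even easier to find here. A minimal sketch, taking c = 1 and n₀ = 1 (again just one valid choice), confirms g(x) ≥ c·x^1.5:

```python
def g(x):
    return x**2 + x**1.5 + 10

# Witnesses for the Omega bound: c = 1, n0 = 1.
c, n0 = 1, 1

# g(x) = x^2 + x^1.5 + 10 >= x^1.5 trivially, since the extra terms are positive.
for x in range(n0, 1001):
    assert g(x) >= c * x**1.5
```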

## Theta ( Θ(n) )

An algorithm is said to run in time T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).

Example:

If g(n) = log(n^100) = 100·log(n),

then g(n) = Θ(log n).

Note that the 100 is a constant factor and is therefore not included in the Θ.
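The identity log(n^100) = 100·log(n) behind this example can be confirmed directly (the values chosen below are arbitrary sample points):

```python
import math

# log(n^100) equals 100 * log(n) for every n > 1, up to floating-point rounding.
for n in (2, 10, 1000):
    assert math.isclose(math.log(n**100), 100 * math.log(n))

# The ratio g(n) / log(n) is the constant 100, which Theta notation discards.
ratio = math.log(10**100) / math.log(10)
```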

## Typical Growth Rates

| Function | Name        |
| -------- | ----------- |
| c        | Constant    |
| log N    | Logarithmic |
| log^2 N  | Log-squared |
| N        | Linear      |
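To get a feel for how these rates separate, the sketch below tabulates each function (with the constant taken as 1 and base-2 logarithms, both arbitrary choices) for a few values of N:

```python
import math

# Print each growth function side by side for increasing N.
print(f"{'N':>8} {'log N':>10} {'log^2 N':>10} {'N':>10}")
for N in (10, 100, 1000, 10000):
    lg = math.log2(N)
    print(f"{N:>8} {lg:>10.2f} {lg**2:>10.2f} {N:>10}")
```

Even over this small range, the logarithmic column barely moves while the linear column grows a thousandfold, which is why the distinction matters for large inputs.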