In mathematics and science, it's often too difficult to get a precise formula for something, so researchers settle for estimating its growth. For example, in a previous post on the Fibonacci sequence, we found that the Fibonacci numbers grow roughly exponentially; that is, they are close to an exponential function, up to a tiny error term. Often we don't actually care about the exact formula; we just want an idea of how quickly something grows. To make this vague notion precise, we use asymptotic notation, which keeps the long-run growth information of the original function and discards everything else.

Since all we care about is what happens in the long run, the definitions of asymptotic notation only look at behavior for sufficiently large inputs. There are five primary notations: big-O, little-o, big-Omega, little-omega, and big-Theta. Don't be intimidated by the number of definitions. They are all defined very similarly, so once you understand one of them, you can understand them all. For that reason, we will focus on understanding big-O, and merely describe how the others differ from it.

Big-O notation is used to describe upper bounds. Intuitively, the function f(n) is O(g(n)), pronounced “big-O of g(n),” if in the long run f(n) grows no faster than g(n). We’ll briefly take a look at the technical definition, and then explain it in English:

A function f(n) = O(g(n)) if there is a constant C > 0 such that f(n) \le Cg(n) whenever n \ge n_0, for some n_0.

The phrase "whenever n \ge n_0, for some n_0" can be interpreted as "when n gets large enough." The n_0 can be seen as the "cutoff point": for any n past it, f(n) \le Cg(n) holds. Why do we want the constant C to be there? Intuitively, constant multipliers don't affect asymptotic growth. That is, we want to say that an^2 grows faster than bn, no matter how small a is or how large b is. These details only matter if you want to know the precise definition of big-O. If you just want to understand intuitively what big-O means, it means "grows no faster than." Some properties and examples:

  • f(n) = O(f(n)); obviously, no function grows faster than itself
  • af(n) = O(f(n)); this is a consequence of how we designed big-O so that constant multipliers don’t matter
  • n^i = O(n^j) whenever i \le j; a lower power grows no faster than a higher one
  • If f(n) = O(g(n)) and g(n) = O(h(n)), then f(n) = O(h(n)); this can be checked by the definitions, but it also makes sense intuitively—if f(n) grows no faster than g(n), which grows no faster than h(n), then of course f(n) grows no faster than h(n)!
  • If f(n) = O(g(n)), then f(n) + g(n) = O(g(n)); adding a term that grows no faster doesn't change the big-O

The last property is extremely important for polynomials. It basically means that when we take the big-O of a polynomial, only the highest-degree term matters, and we don't care about the constants. This makes finding the big-O of polynomials super easy. For example:

  • n^2 + n + 1 = O(n^2)
  • 3n^{10} + 5n^4 + 9n^{11} + 100000n = O(n^{11})
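
To see how the first example follows from the definition, here is one possible (not unique) choice of constants: for every n \ge 1 we have n \le n^2 and 1 \le n^2, so n^2 + n + 1 \le 3n^2. Taking C = 3 and n_0 = 1 therefore witnesses n^2 + n + 1 = O(n^2).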

This comes in especially handy when estimating running times of algorithms. Let's say we have an algorithm which, given n points in the plane, finds the closest pair of points, and it does so by brute force, comparing every pair of points. There are n(n-1)/2 possible pairs, and each comparison takes some constant amount of time (the exact constant doesn't matter, since it changes with advances in hardware, which is exactly why big-O ignores constants), so in total the algorithm's running time is O(n^2).
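
As a minimal sketch of such a brute-force algorithm (the function name and the example points here are just illustrative, not taken from any particular library), the nested loops below perform exactly n(n-1)/2 distance comparisons:

```python
from math import dist  # Euclidean distance between two points (Python 3.8+)

def closest_pair_brute_force(points):
    """Find the closest pair among a list of (x, y) tuples by checking every pair."""
    n = len(points)
    best_pair, best_distance = None, float("inf")
    # The nested loops visit each unordered pair exactly once,
    # so the body runs n(n-1)/2 times -- hence the O(n^2) running time.
    for i in range(n):
        for j in range(i + 1, n):
            d = dist(points[i], points[j])
            if d < best_distance:
                best_pair, best_distance = (points[i], points[j]), d
    return best_pair, best_distance

print(closest_pair_brute_force([(0, 0), (3, 4), (1, 1)]))  # ((0, 0), (1, 1)), distance ~1.414
```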

Now, the other asymptotic notations are defined similarly. Intuitively, all you need to know is the following:

  • A function f(n) is little-o of g(n), written f(n) = o(g(n)), if f(n) grows strictly slower than g(n). In other words, little-o is to big-O as < is to \le.
  • A function f(n) is big-Omega of g(n), written f(n) = \Omega(g(n)), if f(n) grows at least as fast as g(n). In other words, big-Omega is to big-O as \ge is to \le.
  • A function f(n) is little-omega of g(n), written f(n) = \omega(g(n)), if f(n) grows strictly faster than g(n). In other words, little-omega is to big-O as > is to \le.
  • Finally, a function f(n) is big-Theta of g(n), written f(n) = \Theta(g(n)), if f(n) = O(g(n)) and f(n) = \Omega(g(n)). This means that f(n) grows asymptotically at the same rate as g(n).
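
To make these concrete, here are a few examples (the specific functions are just illustrative): n = o(n^2), since n grows strictly slower than n^2; 2^n = \omega(n^{10}), since an exponential eventually outgrows any polynomial; and n(n-1)/2 = \Theta(n^2), which tells us the brute-force algorithm above is not merely bounded by a quadratic but genuinely quadratic.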

To recap, you should have learned the following from this post:

  • What asymptotic notation is
  • How to compute big-O of polynomials
  • The use of asymptotic notation