Big O Notation

In this section, we'll briefly discuss the topic of Big O Notation. In Computer Science, Big O is a tool of Asymptotic Analysis: it allows us to describe the time complexity of a piece of code.

Why Big O

What is the time complexity of a program? Essentially, it is a bound on how the program's running time grows as the input gets larger. Why is this important? In Computer Programming, the efficiency of a program is of utmost importance: we always want our code to run as fast as possible. If our code runs slowly, an action may take too long to execute! That's why we develop better algorithms.


To describe the time complexity of a program, we say that the code runs in O(f(n)) time, where f(n) is some function of n, and O(f(n)) means the program's running time grows at most as fast as f(n).

When we write a piece of code, we want it to do some task, and often many tasks. We can think of time complexity as the number of basic tasks the code performs, modeled as a mathematical expression in the input size n. For example, if we have a for loop that runs n times, then the code performs n tasks, so its time complexity is O(n).
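To make this counting idea concrete, here is a small sketch (the class and method names are ours, purely for illustration) that literally counts how many "tasks" a single loop over n elements performs:

```java
public class TaskCount {
    // Illustrative helper: counts how many constant-time "tasks"
    // a single loop over n elements performs.
    static int countTasks(int n) {
        int tasks = 0;
        for (int i = 0; i < n; i++) {
            tasks++; // one task per iteration
        }
        return tasks; // always exactly n, so the loop is O(n)
    }

    public static void main(String[] args) {
        System.out.println(countTasks(10)); // prints 10
    }
}
```

Doubling n doubles the count, which is exactly what "linear growth" means.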

Example 1:

What does O(1) mean?

This means that the running time is constant and does not grow with the value of n.

Now, we'll formally explain a few common time complexities, so that by the end, you'll have a basic understanding of how these work.

O(1) - Constant Time:

Now we can talk formally about O(1). This is constant time: no matter the input size, the program always takes the same amount of time to run.

Example: accessing an element in an array.


                             int[] arr = {1, 2, 3, 4, 5, 6};
                             int x = arr[2]; // accessing one element is O(1)


O(n) - Linear Time

O(n) is linear time. This means that the time it takes to complete the tasks is directly proportional to n, the input size. As you increase n, the time will increase at the same rate.

A common example is iterating through an array of size n. As you increase the size, the time grows proportionally, so the time complexity is O(n).


                             int[] arr = new int[n];

                             for (int i = 0; i < n; i++) {
                                 arr[i] = i; // one constant-time task per iteration: O(n)
                             }

O(log n) - Logarithmic Time

Logarithmic time is more difficult to visualize. This is when the time to run the code is directly proportional to the logarithm of the input size. 

A common example of an algorithm with O(log n) time is a Binary Search on an array with n elements.
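As a sketch, here is one common way to write binary search in Java (this particular implementation is an illustration, not taken from the text). Each iteration halves the remaining search range, so at most about log2(n) comparisons are needed:

```java
public class BinarySearchDemo {
    // Binary search on a sorted (ascending) array.
    // The range [lo, hi] is halved each iteration -> O(log n) time.
    static int binarySearch(int[] arr, int target) {
        int lo = 0, hi = arr.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2; // avoids overflow of lo + hi
            if (arr[mid] == target) return mid;
            if (arr[mid] < target) lo = mid + 1; // target is in the right half
            else hi = mid - 1;                   // target is in the left half
        }
        return -1; // not found
    }

    public static void main(String[] args) {
        int[] arr = {1, 3, 5, 7, 9, 11};
        System.out.println(binarySearch(arr, 7)); // prints 3
        System.out.println(binarySearch(arr, 4)); // prints -1
    }
}
```

Note that binary search only works if the array is already sorted.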

O(n^2) - Quadratic Time

Quadratic time is when the time it takes to run the code grows proportionally to the square of the input size. You will encounter this type of complexity with nested loops.

A common example of this is a nested for loop:


                             for (int i = 0; i < n; i++) {
                                 for (int j = 0; j < n; j++) {
                                     // constant-time work here runs n * n times: O(n^2)
                                 }
                             }


Using the same logic, find the time complexity of the following:

                             for (int i = 0; i < n; i++) {
                                 for (int j = 0; j < n; j++) {
                                     for (int k = 0; k < n; k++) {
                                         // constant-time work here
                                     }
                                 }
                             }


Answer: O(n^3)
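To check this answer empirically, a small sketch (the helper names below are ours, for illustration only) can count the iterations of the doubly and triply nested loops:

```java
public class NestedLoopCount {
    // Counts iterations of a doubly nested loop: n * n = n^2.
    static long doubleLoop(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                count++;
        return count;
    }

    // Counts iterations of a triply nested loop: n * n * n = n^3.
    static long tripleLoop(int n) {
        long count = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                for (int k = 0; k < n; k++)
                    count++;
        return count;
    }

    public static void main(String[] args) {
        System.out.println(doubleLoop(10)); // prints 100  (10^2)
        System.out.println(tripleLoop(10)); // prints 1000 (10^3)
    }
}
```

Each extra level of nesting multiplies the total work by another factor of n.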