Big-O Notation

Platini Epic — November 16, 2018

One of the most important goals when working on very large applications is to make sure the application is performant and scalable. In Computer Science, we leverage Big-O notation to analyze the Time & Space complexity of our code.

Thus far, this is the most widely used and proven way to reason about how fast your algorithm runs and how much memory it uses. Knowing the correct algorithm to use in your application, depending on the use case, will ensure your application stays performant and scalable as the amount of input data grows.

Simply put, Big-O notation gives us a rough, worst-case estimate of how an algorithm will perform as the input data grows over time.

For example, the following notation:

  • Time Complexity = O(n^2) 
  • Space Complexity = O(n) 

This indicates that the algorithm will run slower and require more memory as n (the number of data items) gets larger. If we were parsing a CSV file that had 1,000 lines, then n = 1,000.

There is a mathematical analysis used to calculate the Big-O of a given algorithm, which I will go into in detail later. In most cases, though, we don’t need the math to figure out the Big-O of a given algorithm. The simplest case to remember is: if your code uses a single loop that iterates through all of your input data n only once, then the Big-O of your algorithm is O(n). Whenever you have written nested loops over that data: 2 loops = O(n^2) (read as Big-O of n squared), 3 loops = O(n^3), 4 loops = O(n^4), etc.
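To make that concrete, here is a minimal sketch in Python (my own illustration, not taken from any particular library) of a single loop versus a nested loop:

    # O(n): one pass over the data, so the work grows in step with the number of items.
    def sum_items(items):
        total = 0
        for value in items:          # runs n times
            total += value
        return total

    # O(n^2): a loop inside a loop, so roughly n * n comparisons.
    def has_duplicate(items):
        for i in range(len(items)):
            for j in range(len(items)):   # runs n times for each of the n outer passes
                if i != j and items[i] == items[j]:
                    return True
        return False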

It is important to remember that Big-O matters most when working with large input data. The following is a good example that illustrates this.

When it comes to choosing a sorting algorithm: Insertion Sort = O(n^2) and Merge Sort = O(n log n), so in theory Insertion Sort will perform slower than Merge Sort as the input data gets larger. But if your input data is small, or your array is already mostly sorted, Insertion Sort will often be the faster of the two.
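As a rough sketch (in Python, assuming the list fits in memory), here is the classic insertion sort; its worst case is O(n^2), but on an already mostly sorted list the inner loop exits almost immediately, so it finishes in close to O(n) time:

    def insertion_sort(items):
        # Worst case O(n^2); close to O(n) when the input is already mostly sorted.
        for i in range(1, len(items)):
            current = items[i]
            j = i - 1
            while j >= 0 and items[j] > current:
                items[j + 1] = items[j]   # shift larger elements one slot to the right
                j -= 1
            items[j + 1] = current
        return items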

For now, please refer to the table below to quickly grasp the common Big-O classes:

Big-O Notation: Time Complexity from Best to Worst

O(1) = Constant = Best Performance

Example = accessing an array element by its index. The algorithm always takes the same amount of time, regardless of how much data there is.
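A tiny illustration in Python (a hypothetical helper, just to show the idea):

    def get_first(items):
        # O(1): indexing into a list costs the same whether the list
        # holds ten items or ten million.
        return items[0]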

O(log n) = Logarithmic = Great Performance

Example = Binary Search. This algorithm reduces the amount of data with each iteration. If you have 100 items, it takes about 7 steps to find the answer. With 1,000 items, it takes 10 steps. And 1,000,000 items only take 20 steps. This is super fast even for large amounts of data.
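A minimal binary search sketch in Python (assuming the input list is already sorted):

    def binary_search(sorted_items, target):
        # O(log n): each comparison discards half of the remaining items.
        low, high = 0, len(sorted_items) - 1
        while low <= high:
            mid = (low + high) // 2
            if sorted_items[mid] == target:
                return mid                 # found it
            elif sorted_items[mid] < target:
                low = mid + 1              # keep the upper half
            else:
                high = mid - 1             # keep the lower half
        return -1                          # not found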

O(n) = Linear = Good Performance

Example = Sequential Search. The Time Complexity grows linearly. If you have 100 items, this does 100 units of work. Doubling the number of items makes the algorithm take exactly twice as long (200 units of work).
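A simple sequential (linear) search sketch in Python:

    def sequential_search(items, target):
        # O(n): in the worst case every item is examined exactly once.
        for index, value in enumerate(items):
            if value == target:
                return index
        return -1   # not found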

O(n log n) = Linearithmic = Decent Performance

Example = the fastest general-purpose sorting algorithms, such as Merge Sort. This is slightly worse than linear but not too bad.
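Here is a short Merge Sort sketch in Python (a plain, non-optimized version, just to show the shape of the work):

    def merge_sort(items):
        # O(n log n): the list is halved about log n times,
        # and each level of halving merges n items in total.
        if len(items) <= 1:
            return items
        mid = len(items) // 2
        left = merge_sort(items[:mid])
        right = merge_sort(items[mid:])
        merged = []
        i = j = 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i])
                i += 1
            else:
                merged.append(right[j])
                j += 1
        merged.extend(left[i:])    # whatever remains is already sorted
        merged.extend(right[j:])
        return merged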

O(n^2) = Quadratic = A Bit Slow Performance

Example = algorithms using nested loops, such as insertion sort. If you have 100 items, this does 100^2 = 10,000 units of work. Doubling the number of items makes it four times slower (because 2 squared equals 4).

O(n^3) = Cubic = Poor Performance

Example = naive matrix multiplication. If n = 100, this does 100^3 = 1,000,000 units of work. Doubling the input size makes it eight times slower.
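The schoolbook triple-loop matrix multiplication is the usual picture of cubic growth; a sketch in Python for square n × n matrices stored as lists of lists:

    def matrix_multiply(a, b):
        # O(n^3): three nested loops over n, so n * n * n multiplications.
        n = len(a)
        result = [[0] * n for _ in range(n)]
        for i in range(n):
            for j in range(n):
                for k in range(n):
                    result[i][j] += a[i][k] * b[k][j]
        return result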

O(2^n) = Exponential = Very Poor Performance

Example = solving the traveling salesperson problem. Please avoid these kinds of algorithms, but sometimes you have no choice. Adding just one more item to the input doubles the running time.
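Brute-forcing the traveling salesperson problem is too long to show here, but a smaller picture of O(2^n) growth is listing every subset of a set: each additional item doubles the number of subsets. A sketch in Python:

    def all_subsets(items):
        # O(2^n): every item is either in or out of a subset,
        # so there are 2^n subsets in total.
        subsets = [[]]
        for item in items:
            subsets += [subset + [item] for subset in subsets]
        return subsets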

O(n!) = Factorial = Awful Performance

Example = generating every possible ordering (permutation) of a list. Never ending! Even for a modest number of items it can literally take years to finish.
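With n items there are n! orderings to produce, so even n = 20 is far out of reach. A one-line sketch in Python:

    from itertools import permutations

    def all_orderings(items):
        # O(n!): there are n * (n - 1) * ... * 1 possible orderings to produce.
        return list(permutations(items))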
