Time Complexity of Algorithms

When writing algorithms, a programmer must be aware of how much they drain performance, whether that cost is passed on to the user experience or shows up as CPU usage. A friend recently told me about an infinite (but otherwise harmless) loop in their database code; the “damage” was an additional $100k in CPU usage on that machine for the month (because the server was hosted and billed by CPU usage).

Constant Time – O(1)

Increasing the problem size does not change the number of operations.

Examples:

  • Popping from the top of a stack or reading the first element of an array. No matter how large the stack or array is, the operation takes the same number of steps.
  • Looking up a value on an object when the key is known.
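
As a quick sketch in JavaScript (the stack and user object below are invented for illustration), each of these operations takes the same number of steps no matter how much data there is:

  const stack = [3, 7, 42];
  const user = { name: "Ada", role: "admin" };

  stack.pop();            // take the top of the stack: O(1)
  const first = stack[0]; // read the front of the array by index: O(1)
  const role = user.role; // look up a value by key: O(1) on average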

Linear Time – O(n)

The number of operations is proportional to the problem size: the larger the list, the longer the operation takes, growing in direct proportion to n.

Examples:

  • Finding a single item in an unsorted array
  • Searching for a single value in a linked list
  • forEach, which visits every element of an array once
  • slice and indexOf, which are linear in JavaScript
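
For instance, a minimal linear search in JavaScript (the array and target are made up): in the worst case the value is last or missing, so all n elements get checked.

  function linearSearch(arr, target) {
    // Worst case: each of the n elements is checked once => O(n)
    for (let i = 0; i < arr.length; i++) {
      if (arr[i] === target) return i;
    }
    return -1; // not found after n checks
  }

  linearSearch([4, 8, 15, 16, 23, 42], 23); // => 4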

Logarithmic Time – O(log n)

Multiplying the problem size by a constant adds the same number of operations: how many times do I have to divide n by c to get to 1? (The base c only changes the answer by a constant factor, so it is usually written simply as O(log n).)

Examples:

  • Repeatedly dividing the remaining work by a constant is almost always logarithmic.
  • Finding an item in a sorted array using binary search.
  • Finding a name in the phone book: open the book to the middle, determine whether the name falls in the first half or the second, open to the middle of that half, and repeat.
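
A sketch of binary search over a sorted array (the sample data is invented): every comparison discards half of what remains, so about log2(n) comparisons are enough.

  function binarySearch(sorted, target) {
    let lo = 0;
    let hi = sorted.length - 1;
    while (lo <= hi) {
      // Each pass halves the remaining range => about log2(n) passes
      const mid = Math.floor((lo + hi) / 2);
      if (sorted[mid] === target) return mid;
      if (sorted[mid] < target) lo = mid + 1; // discard the lower half
      else hi = mid - 1;                      // discard the upper half
    }
    return -1;
  }

  binarySearch([2, 3, 5, 8, 13, 21, 34], 21); // => 5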

Quadratic Time – O(n²)

The number of operations is proportional to the square of the problem size.

Examples:

  • Two nested for loops, usually
  • Long (schoolbook) multiplication of two n-digit numbers, where every digit of one number meets every digit of the other
  • Comparing every item against every other item, as in bubble sort (quicksort is quadratic only in its worst case)
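
Bubble sort makes the nested-loop pattern concrete; this is a simple sketch, not an optimized implementation. The two loops together perform roughly n * n comparisons in the worst case.

  function bubbleSort(arr) {
    const a = arr.slice(); // copy so the input stays untouched
    for (let i = 0; i < a.length; i++) {
      for (let j = 0; j < a.length - 1 - i; j++) {
        // Nested loops: about n * n comparisons in the worst case => O(n²)
        if (a[j] > a[j + 1]) {
          [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap out-of-order neighbors
        }
      }
    }
    return a;
  }

  bubbleSort([5, 1, 4, 2, 8]); // => [1, 2, 4, 5, 8]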

Exponential Time – O(c^n)

The number of operations is proportional to some constant raised to the power of the problem size: adding one more input multiplies the work by a constant factor (often doubling it), so as the input grows, the time to complete the operation explodes.
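
The textbook illustration is the naive recursive Fibonacci: every call spawns two more calls, so the call tree roughly doubles with each increase in n.

  function fib(n) {
    if (n < 2) return n;
    // Two recursive calls per call => about 2^n calls in total: O(2^n)
    return fib(n - 1) + fib(n - 2);
  }

  fib(10); // => 55 almost instantly; fib(50) would take impractically long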

Determining Complexity

  1. Determine what variable(s) represent problem size (this is n)
  2. Write the number of operations in terms of n
    1. lines of code in series are added
    2. lines of code nested in other function calls or loops are multiplied
  3. Find the leading term and drop coefficients (5n*n becomes n*n)
  4. If adding operations, choose the worst case (linear trumps constant, linear trumps log n, etc.) and this is the overall time complexity.
  5. If multiplying, the time complexity is the product of both (n * log n = n log n)
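
Putting the steps together on a made-up function (the names and data here are purely illustrative):

  function example(list) {        // Step 1: n = list.length
    let sum = 0;                  // O(1)
    for (const x of list) {       // n iterations of constant work => O(n)
      sum += x;
    }
    let pairs = 0;
    for (const x of list) {       // n iterations...
      for (const y of list) {     //   ...of n iterations each
        if (x < y) pairs += 1;    //   => about n * n steps: O(n*n)
      }
    }
    return [sum, pairs];
  }
  // Lines in series add: O(1) + O(n) + O(n*n).
  // Keep the leading term and drop coefficients => O(n*n): quadratic.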