When writing out algorithms in code, a programmer must be aware of their drain on performance, which may be passed on to the user experience or show up as CPU usage. A friend recently told me about an infinite (but harmless) loop in their database code; the “damage” was an additional $100k in CPU usage for that machine for the month (because the server was hosted and billed by CPU usage).

**Constant Time – O(1)**

Increasing the problem size does not change the number of operations.

Examples:

- Pulling from the top of a stack or from the front of an array. No matter how large the stack or array is, the operation takes a single step.
- Looking up a value on an object when the key is known.
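As a small sketch of the examples above (the variable names are illustrative, not from the text):

```javascript
// Constant-time operations: the cost does not depend on the input size.
const stack = [1, 2, 3, 4, 5];
const top = stack[stack.length - 1]; // read the top of a stack: O(1)

const ages = { alice: 30, bob: 25 };
const bobsAge = ages["bob"];         // keyed lookup on an object: O(1) on average

console.log(top, bobsAge);
```

Doubling the size of `stack` or adding more keys to `ages` does not change the number of steps either operation performs.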

**Linear Time – O(n)**

The number of operations is proportional to the problem size: the larger the list, the longer the operation takes, scaled by that `n`.

Examples:

- Finding a single item in an unsorted array
- Searching for a single value in a linked list
- `forEach`
- `slice` and `indexOf` are linear in JavaScript
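A minimal sketch of a linear search (the function name is my own, not from the text): in the worst case, every one of the `n` elements is visited once.

```javascript
// Linear search through an unsorted array: O(n) worst case.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; i++) {
    if (arr[i] === target) return i; // found: return the index
  }
  return -1;                         // not found after n comparisons
}

console.log(linearSearch([7, 3, 9, 1], 9)); // 2
```

If the array doubles in length, the worst-case number of comparisons also doubles.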

**Logarithmic Time – O(log n)**

Multiplying the problem size by a constant *c* adds the same number of operations. How many times do I have to divide *n* by *c* to get to 1?

Examples:

- Repeatedly dividing the problem by a constant is almost always logarithmic.
- Finding an item in a sorted array using a **binary search**.
- Finding a name in the phone book: open to the middle of the book, determine whether the name falls in the first half or the second, then go halfway into that half, and repeat.
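The phone-book idea can be sketched as a binary search over a sorted array: each comparison halves the remaining range, so roughly log₂(n) comparisons are needed.

```javascript
// Binary search on a sorted array: O(log n).
function binarySearch(sorted, target) {
  let lo = 0;
  let hi = sorted.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (sorted[mid] === target) return mid;
    if (sorted[mid] < target) lo = mid + 1; // target is in the upper half
    else hi = mid - 1;                      // target is in the lower half
  }
  return -1;
}

console.log(binarySearch([1, 3, 5, 7, 9, 11], 7)); // 3
```

Doubling the array length adds only one more halving step.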

**Quadratic Time – O(n\*n, or n²)**

The number of operations is proportional to the square of the problem size.

Examples:

- Two nested for loops, usually
- Grade-school multiplication of two *n*-digit numbers (every digit of one is multiplied by every digit of the other)
- Comparing items pair by pair, as in a bubble sort (quicksort is quadratic only in its worst case; on average it is O(n log n))
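A sketch of the nested-loop case, using bubble sort: the inner loop runs inside the outer loop, so the comparison count grows with n², which is up to n·(n−1)/2 pair comparisons.

```javascript
// Bubble sort: two nested loops over the array -> O(n^2) comparisons.
function bubbleSort(arr) {
  const a = [...arr]; // copy so the input stays intact
  for (let i = 0; i < a.length - 1; i++) {
    for (let j = 0; j < a.length - 1 - i; j++) {
      if (a[j] > a[j + 1]) {
        [a[j], a[j + 1]] = [a[j + 1], a[j]]; // swap out-of-order neighbors
      }
    }
  }
  return a;
}

console.log(bubbleSort([5, 1, 4, 2, 8])); // [1, 2, 4, 5, 8]
```

Doubling the array length roughly quadruples the number of comparisons.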

**Exponential Time – O(c^n)**

The number of operations is proportional to some constant raised to the power of the problem size. A typical culprit is any operation where adding one more input doubles the work. As the number of inputs grows, the time to complete the operation grows exponentially.
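The classic sketch of this doubling behavior is the naive recursive Fibonacci (my example, not one named in the text): each call spawns two more calls, so the call count grows roughly like 2^n.

```javascript
// Naive recursive Fibonacci: two recursive calls per level -> exponential time.
function fib(n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2); // adding 1 to n roughly doubles the calls
}

console.log(fib(10)); // 55
```

`fib(30)` is noticeably slow and `fib(50)` is impractical, even though the inputs are tiny numbers.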

**Determining Complexity**

- Determine what variable(s) represent the problem size (this is *n*)
- Write the number of operations in terms of *n*
  - lines of code in series are **added**
  - lines of code nested in other function calls or loops are **multiplied**
- Find the leading term and drop coefficients (instead of 5\**n*\**n*, it's *n*\**n*)
- If adding operations, choose the worst case (linear trumps constant, linear trumps log *n*, etc.) and this is the overall time complexity
- If multiplying, the time complexity is the product of both (*n* \* log *n* = *n* log *n*)
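The steps above can be walked through on a small made-up function (`sumAndPairs` is my own illustration, not from the text):

```javascript
// One loop in series with a pair of nested loops:
// costs in series add, nested costs multiply.
function sumAndPairs(arr) {
  let sum = 0;
  for (const x of arr) sum += x;         // single loop over n items: O(n)

  let pairs = 0;
  for (let i = 0; i < arr.length; i++) { // nested loops multiply:
    for (let j = 0; j < arr.length; j++) {
      pairs++;                           // n * n = O(n^2)
    }
  }
  // In series the costs add: O(n) + O(n^2). Keep the leading term,
  // drop coefficients -> overall O(n^2).
  return { sum, pairs };
}

console.log(sumAndPairs([1, 2, 3])); // { sum: 6, pairs: 9 }
```

Here the quadratic term dominates, so the linear pass is absorbed into the overall O(n²).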