We can write programming-language-agnostic pseudocode and theoretically calculate time complexity for various cases such as best case, worst case and average case.

**Why Analyze Algorithms?**

By analyzing algorithms we can:

- Predict Performance
- Compare Algorithms
- Provide Guarantees
- Prevent Performance Bugs

**What is Asymptotic Analysis?**

The term **asymptote** means a straight line approached by a given curve as one of the variables in the equation of the curve approaches a limit, usually infinity. **Asymptotic analysis** of algorithms is a mathematical method of describing this limiting behavior.

With asymptotic analysis, we are interested in the properties of a function f(n) as n becomes very large. For instance, consider a function f(n) = n^2 + 5n. As n becomes very large, the term 5n becomes insignificant compared to n^2, and f(n) is said to be “asymptotically equivalent to n^2, as n → ∞”.
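As a quick numerical illustration (a sanity check, not a proof), the ratio f(n) / n^2 tends to 1 as n grows:

```python
# Numerical check that f(n) = n^2 + 5n is asymptotically
# equivalent to n^2: the ratio f(n) / n^2 approaches 1.
def f(n):
    return n ** 2 + 5 * n

for n in [10, 1_000, 1_000_000]:
    print(n, f(n) / n ** 2)  # ratio shrinks toward 1
```

At n = 10 the ratio is 1.5, but by n = 1,000,000 it is already 1.000005.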

**Why Asymptotic Analysis?**

It is not practical to determine time complexity by simply running a program, for several reasons: the run itself can be time-consuming, the result depends on the specific machine and input, and the program must first be completely written.

**How To Do Asymptotic Analysis?**

Out of the notations available to denote the limiting behavior, the most popular ones are Big Oh, Big Omega and Big Theta.

Big Oh denotes an upper bound and is used to express a worst-case scenario. Big Omega denotes a lower bound and is used to express a best-case scenario. Similarly, Big Theta denotes a tight bound and is often used to express the average case.

Big Oh (or Big O) is the most common notation that we use because we are mostly worried about the worst case. Asymptotic notations (like Big O) characterize functions according to their growth rates: different functions with the same growth rate may be represented using the same O notation.

Order of growth rate classifications show how an algorithm scales with the input size. The classifications used are: 1, log N, N, N log N, N^2, N^3 and 2^N, where N is the input size. O(1), read as “order of 1”, is the best; performance degrades for larger inputs according to the growth rate classification order (*given above*).

Growth rate classifications are usually referred to as follows:

- 1 – Constant
- log N – Logarithmic
- N – Linear
- N log N – Linearithmic
- N^2 – Quadratic
- N^3 – Cubic
- 2^N – Exponential

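To see how far apart these classifications are, a small sketch can evaluate each one at a sample input size (N = 20 here, an arbitrary choice for illustration):

```python
import math

# Evaluate each growth-rate classification at a sample input size
# to see how quickly the classes diverge from one another.
def growth_values(n):
    return {
        "1": 1,
        "log N": math.log2(n),
        "N": n,
        "N log N": n * math.log2(n),
        "N^2": n ** 2,
        "N^3": n ** 3,
        "2^N": 2 ** n,
    }

for name, value in growth_values(20).items():
    print(f"{name:>8}: {value:,.2f}")
```

Even at N = 20, the exponential class is already above a million operations while the logarithmic class stays below five.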
In general, you can calculate the time complexity of your program as follows:

- Calculate the complexity of individual statements
- Sum them up, dropping constants and lower-order terms. E.g. 5N^2 + 6N + log N becomes O(N^2).
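A quick sketch of why dropping constants and lower-order terms is safe: the ratio of 5N^2 + 6N + log N to its dominant term N^2 settles near the constant 5, which Big O discards.

```python
import math

# Compare 5*N^2 + 6*N + log N against its dominant term N^2;
# the ratio approaches the constant factor 5 as N grows.
def g(n):
    return 5 * n ** 2 + 6 * n + math.log2(n)

for n in [10, 1_000, 100_000]:
    print(n, g(n) / n ** 2)  # ratio settles near 5
```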

A simple way to calculate the time complexity of different statements is as follows:

- Any single statement = O(1)
- Any loop that iterates over N elements = O(N)
- Any nested loop with two loops (both iterating over N elements) = N * N = O(N^2)
- Any nested loop with three loops (each iterating over N elements) = N * N * N = O(N^3)
- If the input data set is halved in each iteration, we consider the complexity to be O(log N).
- E.g. Consider binary search. We search for an element in a sorted list. We compare the search element with the middle element and decide which side to continue on. We repeat this until we find the element or there are no more elements to compare.

- A nested for loop with two loops, one iterating over N elements and the second halving its remaining range in each iteration, has overall complexity = N * log N = O(N log N).
- E.g. for(int i = 0; i < n; i++) for(int j = 1; j < n; j = j * 2).
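The binary search mentioned above can be sketched as follows; each comparison halves the remaining range, which is where the O(log N) behavior comes from:

```python
# Minimal binary search over a sorted list: each comparison
# halves the remaining search range, giving O(log N) time.
def binary_search(items, target):
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if items[mid] == target:
            return mid
        if items[mid] < target:
            lo = mid + 1  # continue on the right half
        else:
            hi = mid - 1  # continue on the left half
    return -1  # no more elements to compare

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # prints 3 (index of 7)
```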

**Note:** For a recursive function, we may also use other approaches such as the substitution method or the Master theorem. These are part of most university courses, but are used less frequently in everyday scenarios.
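As a hypothetical example of the substitution method, consider recursively summing a list. The recurrence is T(n) = T(n - 1) + O(1); substituting repeatedly gives T(n) = T(0) + n * O(1) = O(n), i.e. linear time.

```python
# Recursive sum of a list, walking by index so each call does
# constant work. Recurrence: T(n) = T(n - 1) + O(1) = O(n).
def recursive_sum(items, i=0):
    if i == len(items):  # base case: T(0) = O(1)
        return 0
    return items[i] + recursive_sum(items, i + 1)

print(recursive_sum([1, 2, 3, 4]))  # prints 10
```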
