##### ASYMPTOTIC NOTATIONS

Asymptotic notation is a shorthand way of describing the fastest and slowest possible running times of an algorithm. We are concerned with how the running time of an algorithm increases with the size of the input in the limit, as the input size grows without bound. These notations are important because:

- They give a simple characterization of an algorithm's efficiency.
- They allow various algorithms to be compared.

Usually, an algorithm that is asymptotically more efficient will be the best choice.

**BIG-OH NOTATION (WORST CASE)**

For a given function g(n), we write f(n) = O(g(n)) if and only if there exist constants c > 0 and n_{0} ≥ 0 such that f(n) ≤ c·g(n) for all n ≥ n_{0}; g(n) is then an **asymptotic upper bound** for f(n). Big-oh expresses the upper bound of an algorithm, i.e., a measure of the longest amount of time an algorithm can possibly take to complete.

Graphically, beyond a particular input value n_{0}, c·g(n) always lies above f(n), which indicates the algorithm's upper bound. For example, given **f(n) = 3n+5, g(n) = n**, then **f(n) = O(g(n))**, since 3n+5 ≤ 4n for all n ≥ 5 (so c = 4 and n_{0} = 5 work).
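The definition above can be checked numerically. The following sketch (an illustration, not part of the original text) verifies the witness constants c = 4 and n₀ = 5 for the claim 3n+5 = O(n) over a large range of inputs:

```python
# Numerically checking the Big-O claim f(n) = 3n + 5 = O(n)
# with the assumed witness constants c = 4 and n0 = 5.

def f(n):
    return 3 * n + 5

def g(n):
    return n

c, n0 = 4, 5  # witness constants chosen for this example

# f(n) <= c * g(n) must hold for every n >= n0 (spot-checked up to 10**5)
assert all(f(n) <= c * g(n) for n in range(n0, 10**5))
```

A finite check is of course not a proof, but it is a quick way to catch a wrongly chosen constant (for instance, c = 3 fails for every n).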

**BIG-OMEGA NOTATION**

Ω-notation, pronounced "big-omega", describes the asymptotic lower bound of an algorithm, as it bounds the growth of the running time from below for large enough input sizes. For non-negative functions f(n) and g(n), if there exist an integer n_{0} and a constant c > 0 such that f(n) ≥ c·g(n) for all n ≥ n_{0}, then f(n) = Ω(g(n)). This notation provides the lower bound and describes the best case for given data.

**f(n) = Ω(g(n))**

For example, **f(n) = 3n+2, g(n) = n**.

Then **f(n) ≥ c·g(n)**, taking c = 1:

3n+2 ≥ n for all n ≥ 1; therefore, **f(n) = Ω(g(n))**.
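The same kind of numeric spot-check works for the lower bound. This sketch (an illustration, not from the original text) verifies c = 1 and n₀ = 1 for the claim 3n+2 = Ω(n):

```python
# Numerically checking the Big-Omega claim f(n) = 3n + 2 = Ω(n)
# with the assumed witness constants c = 1 and n0 = 1.

def f(n):
    return 3 * n + 2

c, n0 = 1, 1  # witness constants chosen for this example

# f(n) >= c * n must hold for every n >= n0 (spot-checked up to 10**5)
assert all(f(n) >= c * n for n in range(n0, 10**5))
```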

**THETA NOTATION (Θ)**

Theta notation provides both the lower and the upper bound for a function. For non-negative functions f(n) and g(n), if there exist an integer n_{0} and positive constants c_{1} > 0 and c_{2} > 0 such that c_{1}·g(n) ≤ f(n) ≤ c_{2}·g(n) for all n ≥ n_{0}, then f(n) = Θ(g(n)).

For example, f(n)= 3n+2, g(n)=n

Lower bound: f(n) ≥ c_{1}·g(n) with c_{1} = 1, since 3n+2 ≥ n for all n ≥ 1.

Upper bound: f(n) ≤ c_{2}·g(n) with c_{2} = 4, since 3n+2 ≤ 4n for all n ≥ 2.

Therefore, f(n) is bounded by g(n) both above and below, so f(n) = Θ(g(n)).
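Both bounds can be spot-checked together. This sketch (an illustration, not from the original text) verifies c₁ = 1, c₂ = 4, and n₀ = 2 for the claim 3n+2 = Θ(n):

```python
# Numerically checking the Theta claim f(n) = 3n + 2 = Θ(n)
# with the assumed witness constants c1 = 1, c2 = 4, n0 = 2.

def f(n):
    return 3 * n + 2

c1, c2, n0 = 1, 4, 2  # witness constants chosen for this example

# c1*n <= f(n) <= c2*n must hold for every n >= n0
assert all(c1 * n <= f(n) <= c2 * n for n in range(n0, 10**5))
```

Note that n₀ = 2 matters here: at n = 1 the upper bound fails, since 3·1+2 = 5 > 4·1.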

**SIGNIFICANCE:**

The practical significance of these notations is as follows.

Given any algorithm, Big-O describes the worst-case time, or upper bound, of the algorithm, where the **worst-case complexity** of an algorithm is the function defined by the maximum number of steps taken on any instance of size n. In practice, worst-case complexities are used more often than the other complexities.

**Best-case complexity** is the function defined by the minimum number of steps taken on any instance of size n. **Omega notation (Ω)** gives us the best case, providing a lower bound on the best-case running time of an algorithm.

**Average-case complexity** is the function defined by the average number of steps taken over all instances of size n. It is most informative when the best and worst cases differ; when they are the same, every input of size n takes the same time and the distinction disappears.
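Linear search is a standard example (a hypothetical illustration, not from the original text) where the three cases differ on inputs of the same size: the best case finds the target in 1 comparison, the worst case in n, and the average over all positions is (n+1)/2.

```python
# Linear search with a comparison counter, to show how best, worst,
# and average cases differ on inputs of the same size n.

def linear_search(items, target):
    """Return (index, comparisons); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = list(range(10))                 # n = 10
_, best = linear_search(data, 0)       # best case: first element, 1 step
_, worst = linear_search(data, 9)      # worst case: last element, n steps
avg = sum(linear_search(data, t)[1] for t in data) / len(data)

print(best, worst, avg)  # 1 10 5.5  — i.e. 1, n, and (n+1)/2
```

So a single algorithm has best case Ω(1), worst case O(n), and average case Θ(n), which is why stating which case a bound refers to matters.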