When analysing the complexity of an algorithm in terms of time and space, we can never give an exact number for the time and space it requires. Instead, we express them using standard notations, known as Asymptotic Notations.
When we analyse an algorithm, we generally derive a formula representing the amount of time
it needs to execute: the time the computer takes to run its lines of code, the number of
memory accesses, the number of comparisons, the memory occupied by temporary variables,
and so on. This formula often contains unimportant details that don't really tell us anything
about the running time.
Let us take an example: suppose some algorithm has a time complexity of T(n) = n² + 3n + 4,
which is a quadratic expression. For large values of n, the 3n + 4 part becomes
insignificant compared to the n² part.
For n = 1000, n² will be 1,000,000 while 3n + 4 will be only 3004.
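The numbers above can be checked directly. The sketch below (in Python, purely illustrative) evaluates T(n) = n² + 3n + 4 at n = 1000 and shows how small a fraction of the total the lower-order terms contribute:

```python
# Illustrative check of the text's example: T(n) = n**2 + 3n + 4.
def T(n):
    return n**2 + 3*n + 4

n = 1000
quadratic_part = n**2      # the dominant term
lower_order = 3*n + 4      # the part that becomes insignificant

print(quadratic_part)              # → 1000000
print(lower_order)                 # → 3004
print(lower_order / T(n))          # tiny fraction of the total running time
```

As n grows further, the fraction contributed by 3n + 4 shrinks towards zero, which is exactly why asymptotic analysis discards it.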
Also, when we compare the execution times of two algorithms, the constant coefficients of the
higher-order terms are neglected as well.
An algorithm that takes 200n² time will be faster than another algorithm that takes n³
time, for any value of n larger than 200. Since we're only interested in the asymptotic
behaviour of the growth of the function, the constant factor can be ignored too.
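This crossover can be verified numerically. The snippet below (a sketch; the two cost functions are hypothetical step counts, not real algorithms) compares 200n² with n³ around n = 200:

```python
# Hypothetical step counts for two algorithms, as in the text.
def quadratic_cost(n):
    return 200 * n**2   # large constant factor, but lower order

def cubic_cost(n):
    return n**3         # no constant factor, but higher order

# Since n**3 / (200 * n**2) = n / 200, the cubic algorithm is
# cheaper below n = 200, equal at n = 200, and slower above it.
for n in (100, 200, 201, 1000):
    print(n, quadratic_cost(n), cubic_cost(n))
```

The constant 200 only shifts the crossover point; it cannot change which function eventually wins, which is why constant factors are dropped in asymptotic analysis.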
What is Asymptotic Behaviour?
The word Asymptotic means approaching a value or curve arbitrarily closely (i.e., as some
sort of limit is taken).
Remember studying Limits in high school? This is the same idea.
The only difference is that here we do not have to find the value of an expression as n
approaches some finite number or infinity. Instead, with Asymptotic notations we use the
same model to ignore the constant factors and insignificant parts of an expression, to devise a
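The limit idea can be made concrete in code. This sketch (illustrative, reusing the earlier T(n) = n² + 3n + 4 example) shows that the ratio T(n)/n² approaches 1 as n grows, meaning n² alone captures the asymptotic behaviour:

```python
# As n grows, T(n) / n**2 approaches 1: the lower-order terms vanish
# in the limit, so asymptotically T(n) behaves like n**2.
def T(n):
    return n**2 + 3*n + 4

for n in (10, 100, 1000, 100000):
    print(n, T(n) / n**2)   # ratio gets closer and closer to 1
```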