
Note for Data Structure using C - DS By JNTU Heroes

  • Data Structure using C - DS
  • Note
  • Jawaharlal Nehru Technological University Anantapur (JNTU) College of Engineering (CEP), Pulivendula, Pulivendula, Andhra Pradesh, India - JNTUACEP
  • 5 Topics
  • 261 Offline Downloads
  • Uploaded 2 years ago



Data Structures Through C

UNIT - I: Basic Concepts of Algorithms

Preliminaries of Algorithm:
An algorithm may be defined as a finite sequence of instructions, each of which has a clear meaning and can be performed with a finite amount of effort in a finite length of time. The word "algorithm" originates from the Arabic word "Algorism", which is linked to the name of the Arabic mathematician Al-Khwarizmi. He is considered to be the first algorithm designer, for adding numbers.

Structure and Properties of Algorithm:
An algorithm has the following structure:
1. Input step
2. Assignment step
3. Decision step
4. Repetitive step
5. Output step

An algorithm is endowed with the following properties:
1. Finiteness: An algorithm must terminate after a finite number of steps.
2. Definiteness: The steps of the algorithm must be precisely defined and unambiguously specified.
3. Generality: An algorithm must be generic enough to solve all problems of a particular class.
4. Effectiveness: The operations of the algorithm must be basic enough to be carried out with pencil and paper; they should not be so complex as to warrant writing another algorithm for the operation.
5. Input-Output: The algorithm must have certain initial and precise inputs, and outputs that may be generated both at its intermediate and final steps.

An algorithm does not enforce a language or mode for its expression but only demands adherence to its properties.

Practical Algorithm Design Issues:
1. To save time (time complexity): A program that runs faster is a better program.
2. To save space (space complexity): A program that saves space over a competing program is considerably desirable.

Efficiency of Algorithms:
The performance of an algorithm can be measured on the scales of time and space. The performance of a program is the amount of computer memory and time needed to run it. We use two approaches to determine the performance of a program: one is analytical and the other is experimental. In


performance analysis we use analytical methods, while in performance measurement we conduct experiments.

Time Complexity:
The time complexity of an algorithm or a program is a function of its running time; in other words, it is the amount of computer time it needs to run to completion.

Space Complexity:
The space complexity of an algorithm or program is a function of the space needed by the algorithm or program to run to completion.

The time complexity of an algorithm can be computed either by an empirical or a theoretical approach. The empirical (posteriori) testing approach calls for implementing the complete algorithms and executing them on a computer for various instances of the problem. The times taken by the execution of the programs for the various instances of the problem are noted and compared. The algorithm whose implementation yields the least time is considered the best among the candidate algorithmic solutions.

Analyzing Algorithms:
Suppose M is an algorithm, and suppose n is the size of the input data. Clearly the complexity f(n) of M increases as n increases. It is usually the rate of increase of f(n), compared with some standard functions, that is of interest. The most common computing times are:
O(1), O(log2 n), O(n), O(n log2 n), O(n^2), O(n^3), O(2^n)

Example:

Program Segment A:
    x = x + 2;

Program Segment B:
    for k = 1 to n do
        x = x + 2;
    end;

Program Segment C:
    for j = 1 to n do
        for x = 1 to n do
            x = x + 2;
        end;
    end;

Total Frequency Count of Program Segment A:
    Program Statements        Frequency Count
    x = x + 2;                1
    -------------------------------------------
    Total Frequency Count     1

Total Frequency Count of Program Segment B:
    Program Statements        Frequency Count
    for k = 1 to n do         (n+1)


    x = x + 2;                n
    end;                      n
    -------------------------------------------
    Total Frequency Count     3n+1

Total Frequency Count of Program Segment C:
    Program Statements        Frequency Count
    for j = 1 to n do         (n+1)
    for x = 1 to n do         n(n+1)
    x = x + 2;                n^2
    end;                      n^2
    end;                      n
    -------------------------------------------
    Total Frequency Count     3n^2 + 3n + 1

The total frequency counts of the program segments A, B and C, given by 1, (3n+1) and (3n^2 + 3n + 1) respectively, are expressed as O(1), O(n) and O(n^2). These are referred to as the time complexities of the program segments, since they are indicative of their running times. In a similar manner the space complexity of a program can also be expressed in mathematical notation; it is nothing but the amount of memory the program requires for its execution.

Asymptotic Notations:
Asymptotic notation is often used to describe how the size of the input data affects an algorithm's usage of computational resources. The running time of an algorithm is described as a function of the input size n, for large n.

Big oh (O):
Definition: f(n) = O(g(n)) (read as "f of n is big oh of g of n") if there exist a positive integer n0 and a positive number c such that |f(n)| ≤ c|g(n)| for all n ≥ n0. Here g(n) is an upper bound of the function f(n).

    f(n)                    g(n)
    16n^3 + 45n^2 + 12n     n^3      f(n) = O(n^3)
    34n - 40                n        f(n) = O(n)
    50                      1        f(n) = O(1)

Omega (Ω):
Definition: f(n) = Ω(g(n)) (read as "f of n is omega of g of n") if there exist a positive integer n0 and a positive number c such that |f(n)| ≥ c|g(n)| for all n ≥ n0. Here g(n) is a lower bound of the function f(n).
