Chapter 4: Algorithms

26. (b) Given: 20, 47, 15, 8, 9, 4, 40, 30, 12, 17. Two-way merge sort is used, so the elements are first grouped in pairs:
(20, 47) (15, 8) (9, 4) (40, 30) (12, 17)
First pass (each pair sorted): (20, 47) (8, 15) (4, 9) (30, 40) (12, 17)
Second pass (adjacent runs merged): (8, 15, 20, 47) (4, 9, 30, 40) (12, 17)
After the second pass of the algorithm, the order of the elements is 8, 15, 20, 47, 4, 9, 30, 40, 12, 17.

27. (c) The worst-case complexity of randomized quicksort is O(n^2); for example, when all the elements are equal, every partition is maximally unbalanced and the algorithm takes worst-case time.

28. (d)

29. (b) [Figure omitted: only the node values 15 and 3 are recoverable from the extraction.]

30. (a) The Bellman-Ford algorithm finds single-source shortest distances in a graph (even with negative edge weights), and it is a dynamic programming technique.

31. (b) Merge sort takes O(n log n) time to sort a list of elements. Binary search has time complexity O(log n). Insertion sort's worst-case running time is O(n^2).

32. (c) d(r, u) and d(r, v) are equal when u and v are at the same level; otherwise, d(r, u) is less than d(r, v).

33. (d) A complete undirected graph on n vertices has n(n − 1)/2 edges. We can choose to have (or not have) each of these n(n − 1)/2 edges independently, so the total number of undirected graphs on n vertices is 2^(n(n − 1)/2).

34. (c) Depth-first search uses a stack for storing values; a queue is used by breadth-first search. A heap is used for sorting a set of elements.

(b) [Figure omitted: a binary tree over the values 50, 62, 58, 20, 5, 37, 8, 91, 60, 24; the question number was lost in extraction.] The number of nodes in the left subtree and the right subtree of the root, respectively, is (7, 4). Therefore, option (b) is correct.

Chapter 4.indd 234 4/9/2015 9:55:53 AM
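The two-way (bottom-up) merge sort used in answer 26 can be sketched in Python so the intermediate pass orders can be checked. This is a minimal illustration; the function and variable names are mine, not from the book.

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def two_way_merge_sort_passes(data):
    """Run bottom-up (two-way) merge sort, recording the list after each pass."""
    runs = [[x] for x in data]          # start with runs of size 1
    passes = []
    while len(runs) > 1:
        merged = []
        for k in range(0, len(runs) - 1, 2):
            merged.append(merge(runs[k], runs[k + 1]))
        if len(runs) % 2:               # an odd run out is carried over unmerged
            merged.append(runs[-1])
        runs = merged
        passes.append([x for run in runs for x in run])
    return passes

passes = two_way_merge_sort_passes([20, 47, 15, 8, 9, 4, 40, 30, 12, 17])
# passes[1] reproduces the order after the second pass:
# [8, 15, 20, 47, 4, 9, 30, 40, 12, 17]
```

After the second pass, `passes[1]` matches the order stated in answer 26; the final pass yields the fully sorted list.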

UNIT V: THEORY OF COMPUTATION

LAST SIX YEARS' GATE ANALYSIS

[Figure omitted: bar chart of the number of 1-mark questions, 2-mark questions, and total questions per year, 2010–2015.]

Concepts on which questions were asked in the previous six years:

Year    Concepts
2015    Turing machine, Regular expressions and languages, DFA, Context-free grammar, Recursive language, NDPDA
2014    Regular languages, DFA, RE and REC
2013    Regular languages, DFA, CFL, CFG
2012    Regular languages, DFA and NFA, CFL
2011    Power of DFA, NFA and TM, Regular language, PDA
2010    Regular expression, Recursive language, LR and LL, CFL

Chapter 5.indd 235 4/9/2015 9:57:44 AM


CHAPTER 5 THEORY OF COMPUTATION

Syllabus: Regular languages and finite automata, context-free languages and pushdown automata, recursively enumerable sets and Turing machines, undecidability.

5.1 INTRODUCTION

Theory of computation is a branch of computer science and mathematics that studies whether a problem can be solved by an algorithm on a model of computation, and how efficiently it can be solved. It deals with two types of theories. The first is computability theory, which deals with the extent to which a problem can be solved. The second is complexity theory, which deals with the efficiency of a solution. In this chapter, finite automata, pushdown automata, Turing machines, regular expressions, and various grammars will be discussed.

Automata theory is the study of abstract machines (or, more appropriately, abstract "mathematical" machines or systems) and the computational problems that can be solved using these machines. These abstract machines are called automata.

Computability theory deals with the question of the extent to which a problem is solvable on a computer, whereas complexity theory considers not only whether a problem can be solved at all on a computer, but also how efficiently it can be solved. Two major aspects are considered here: time complexity and space complexity. A basic introduction and the technical terms needed to understand these concepts are described in the following sections.

5.2 FINITE AUTOMATA

Several things must be done in the design and implementation of an automatic machine, but the most important question is: what will be the behaviour of the machine? The theory of computation addresses this question.
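As a concrete preview of the finite automata introduced above, the behaviour of a DFA can be simulated with a transition table. The sketch below is illustrative only, not from the book: the example machine accepts binary strings containing an even number of 0s, and all names are mine.

```python
def run_dfa(transitions, start, accepting, word):
    """Simulate a DFA: follow exactly one transition per input symbol,
    then accept iff the machine halts in an accepting state."""
    state = start
    for symbol in word:
        state = transitions[(state, symbol)]
    return state in accepting

# Example DFA: accepts binary strings with an even number of 0s.
# States: 'even' (even count of 0s seen so far) and 'odd'.
transitions = {
    ('even', '0'): 'odd',  ('even', '1'): 'even',
    ('odd',  '0'): 'even', ('odd',  '1'): 'odd',
}

print(run_dfa(transitions, 'even', {'even'}, '0110'))  # two 0s -> True
print(run_dfa(transitions, 'even', {'even'}, '0'))     # one 0  -> False
```

Note that the machine is fully specified by its transition table, start state, and accepting set; this directly mirrors the formal definition of a DFA developed in the following sections.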
