Time complexity: The amount of time a computer program needs to run to completion.
Space complexity: The amount of memory it needs to run to completion.

BASIC DIFFERENCES BETWEEN SPACE COMPLEXITY AND TIME COMPLEXITY

SPACE COMPLEXITY:
The space complexity of an algorithm is the amount of memory it requires to run to completion. The space needed by a program contains the following components:
1) Instruction space: stores the executable version of the program and is generally fixed.
2) Data space: It contains:
a) Space required by constants and simple variables. This space is fixed.
b) Space needed by fixed-size structured variables such as arrays and structures.
c) Dynamically allocated space. This space is usually variable.
3) Environment stack: needed to store the information required to resume suspended functions. The following data is saved on the stack: the return address, the values of all local variables, and the values of all formal parameters in the function.

TIME COMPLEXITY:
The time complexity of an algorithm is the amount of time it needs to run to completion. To measure the time complexity we can count all operations performed in an algorithm; if we know the time taken by each operation, we can easily compute the total time taken by the algorithm. This time varies from system to system, and our intention is to estimate the execution time of an algorithm irrespective of the computer on which it will be used. Hence we identify a key operation and count how many times that operation is performed until the program completes its execution. The time complexity can then be expressed as a function of the number of key operations performed.

The space and time complexity is usually expressed in the form of a function f(n), where n is the input size for a given instance of the problem being solved. f(n) helps us to predict the rate at which the complexity grows as the size of the input to the problem increases. f(n) also helps us to compare the complexities of two or more algorithms in order to find which is more efficient.

The O-notation
In other words: c is not really important for the description of the running time! To take this circumstance into account, running time complexities are always specified in the so-called O-notation in computer science. One says: the sorting method has running time O(n²). The expression O is also called Landau's symbol.
Mathematically speaking, O(n²) stands for a set of functions, exactly those functions which, "in the long run", do not grow faster than the function n², that is, those functions for which the function n² is an upper bound (apart from a constant factor). To be precise, the following holds true: a function f is an element of the set O(n²) if there are a factor c and an integer number n0 such that for all n equal to or greater than this n0 the following holds: f(n) ≤ c·n². The function n² is then called an asymptotic upper bound for f. Generally, the notation f(n) = O(g(n)) says that the function f is asymptotically bounded from above by the function g.[12]
A function f from O(n²) may grow considerably more slowly than n², so that, mathematically speaking, the quotient f(n)/n² converges to 0 with growing n. An example of this is the function f(n) = n. However, this does not hold for the function f which describes the running time of our sorting method: this method always requires n² comparisons (apart from a constant factor of 1/2). n² is therefore also an asymptotic lower bound for f, and this f behaves in the long run exactly like n².
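The comparison count quoted above can be reproduced by instrumenting a concrete sorting routine and counting its key operation. The text does not list the sorting method's code here, so the sketch below assumes a simple selection sort as a stand-in; the function name and the printed values are purely illustrative.

# Sketch (assumption): selection sort as a stand-in for "our sorting method".
# The key operation counted is the comparison between two array elements.
def selection_sort_count(data):
    a = list(data)
    comparisons = 0
    n = len(a)
    for i in range(n - 1):
        smallest = i
        for j in range(i + 1, n):
            comparisons += 1                 # key operation
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

if __name__ == "__main__":
    import random
    for n in (10, 100, 1000):
        _, count = selection_sort_count(random.sample(range(10 * n), n))
        # The count is exactly n*(n-1)/2 for every input of size n,
        # i.e. roughly n^2/2, matching the claim above.
        print(n, count, n * (n - 1) // 2)

Because the count depends only on n and not on the particular input, the running time can be written as a function f(n) of the input size alone.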
Expressed mathematically: there are factors c1 and c2 and an integer number n0 such that for all n equal to or larger than n0 the following holds: c1·n² ≤ f(n) ≤ c2·n². So f is bounded by n² from above and from below. There is also a notation of its own for the set of these functions: Θ(n²). Figure 2.9 contrasts a function f which is bounded from above by O(g(n)) with a function whose asymptotic behavior is described by Θ(g(n)): the latter lies in a tube around g(n), which results from the two factors c1 and c2.
The time complexity of an algorithm is commonly expressed using big O notation, which excludes coefficients and lower-order terms. When expressed this way, the time complexity is said to be described asymptotically, i.e., as the input size goes to infinity. For example, if the time required by an algorithm on all inputs of size n is at most 5n³ + 3n, the asymptotic time complexity is O(n³).
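As a worked illustration of the two definitions above (the constants here are chosen for illustration and are not given in the text): for the sorting method's comparison count f(n) = n(n-1)/2 we can take c1 = 1/4, c2 = 1/2 and n0 = 2, since (1/4)·n² ≤ n(n-1)/2 ≤ (1/2)·n² for all n ≥ 2, so f(n) is in Θ(n²); and for f(n) = 5n³ + 3n we can take c = 8 and n0 = 1, since 5n³ + 3n ≤ 5n³ + 3n³ = 8n³ for all n ≥ 1, so f(n) is in O(n³). The short check below verifies these inequalities numerically for small n:

# Numerical sanity check of the illustrative constants chosen above.
for n in range(2, 1001):
    f = n * (n - 1) // 2                       # comparison count of the sorting method
    assert 0.25 * n**2 <= f <= 0.5 * n**2      # Theta(n^2) with c1 = 1/4, c2 = 1/2, n0 = 2
for n in range(1, 1001):
    assert 5 * n**3 + 3 * n <= 8 * n**3        # O(n^3) with c = 8, n0 = 1
print("bounds hold for all tested n")

Such a check is of course no proof; the inequalities hold for every n beyond n0 by the simple algebra given above.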
Complexity classes
The concept of polynomial time leads to several complexity classes in computational complexity theory. Some important classes defined using polynomial time are the following:
P: The complexity class of decision problems that can be solved on a deterministic Turing machine in polynomial time.
NP: The complexity class of decision problems that can be solved on a non-deterministic Turing machine in polynomial time.
ZPP: The complexity class of decision problems that can be solved with zero error on a probabilistic Turing machine in polynomial time.
RP: The complexity class of decision problems that can be solved with one-sided error on a probabilistic Turing machine in polynomial time.
BPP: The complexity class of decision problems that can be solved with two-sided error on a probabilistic Turing machine in polynomial time.
BQP: The complexity class of decision problems that can be solved with two-sided error on a quantum Turing machine in polynomial time.
P is the smallest time-complexity class on a deterministic machine which is robust in terms of machine model changes. (For example, a change from a single-tape Turing machine to a multi-tape machine can lead to a quadratic speedup, but any algorithm that runs in polynomial time under one model also does so on the other.) Any given abstract machine will have a complexity class corresponding to the problems which can be solved in polynomial time on that machine.

CS Advanced Theory of Computation
Department of Computer Science, IIU
Spring 2011

Course Structure
Lectures = 1 [HEC Recommended 3]
Labs = 0
Credit hours = 3

Prerequisites
Graduate standing and consent of the instructor. Previous coursework involving proofs and some programming experience are needed.

Course Objective
To gain an understanding of the mathematics that underlies the theory of computation. At the end of the course, the student should be able to formalize mathematical models of computation, use these formalisms to explore the inherent limitations of computation, and describe some major current approaches to investigating feasible computation.

Rationale
The theory of computation is concerned with the theoretical limits of computability. Several mathematical models of computation have been formulated independently, and under any such computational model the existence of well-defined but unsolvable problems can be formally shown. These topics form part of the core of the mathematical foundations of computer science and will provide students and researchers with a sound theoretical view of the most fundamental concepts of computation.
Specifically, this course provides a rigorous introduction to the theoretical foundations of computer science. It deals with a number of interconnected topics and tries to answer the basic questions "What is a computer?", "What is an algorithm?", and "What is computable?". The course examines important theorems and proofs and establishes a number of interesting assertions in order to expose the techniques used in the theory of computation. Note that although this is not a "mathematics" course, it does make significant use of mathematical structures, abstractions, definitions, theorems, proofs, lemmas, corollaries, logical reasoning, inductive proofs, and the like. If such concepts are difficult for you, you will find this course very difficult but rewarding. I invite you to accept the challenge.

Course Outline
Automata theory, formal languages, Turing machines, computability theory and reducibility, computational complexity, determinism, nondeterminism, time hierarchy, space hierarchy, NP-completeness, selected advanced topics.

Class Time
Mondays, 4:30 PM - 7:30 PM

Texts
Primary source:
Michael Sipser, Introduction to the Theory of Computation, PWS Publishing, Boston, 1997.
Secondary sources:
John E. Hopcroft and Jeffrey D. Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, 1979.
John E. Hopcroft, Rajeev Motwani, and Jeffrey D. Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, 2006.
Mikhail J. Atallah and Marina Blanton (Eds.), Algorithms and Theory of Computation Handbook: General Concepts and Techniques, 2nd Edition, CRC Press, New York, 2009.
Thomas Cormen, Charles Leiserson, Ronald Rivest, and Cliff Stein, Introduction to Algorithms, 3rd Edition, MIT Press and McGraw-Hill, 2009.

HEC Recommended Text Books / Reference Books
Michael Sipser, Introduction to the Theory of Computation, PWS Publishing, Boston, 1997.
Christos Papadimitriou, Computational Complexity, Addison-Wesley, 1994.
John Hopcroft and Jeffrey Ullman, Introduction to Automata Theory, Languages, and Computation, Addison-Wesley, 1979.
Tao Jiang, Ming Li, and Bala Ravikumar, Formal Models and Computability, in Handbook of Computer Science, CRC Press, 1996.
T. H. Cormen, et al., Introduction to Algorithms, MIT Press and McGraw-Hill, 1990.
Peter Linz, An Introduction to Formal Languages and Automata, Jones and Bartlett Publishers, 2006.

Ethics
The Honor Code will be strictly enforced in the classroom. It is a violation to represent joint work as your own or to let others use your work; always acknowledge any assistance you received in preparing work that bears your name. You are expected to work independently unless explicitly permitted to collaborate on a particular assignment. It is not a violation to discuss approaches to problems with others; however, it is a violation to use wording or expressions in your assignments that have been written by others without acknowledging the source.

Exams
Midterm: April 4, 2011, 4:30 PM
Final: June 6, 2011, evening session

Points Distribution
Quizzes 15%
Midterm 25%
Final 60%

Grades
A+ 3, A 2, B+ 1, B 1, C+ 3, C 2, D+ 1, D 0, F 4