
Lecture 1

Analysis of Algorithms
By: Raja Wasim Ahmad
Raja.khan@ajman.ac.ae
Data Structures and Algorithms
• Data: Quantities, symbols, or characters on which operations are performed by a computer.
• A data structure deals with the organization (storage) of data in computer memory in a systematic way
• It allows data to be accessed and manipulated efficiently
• Efficient? (in terms of time and memory)
• Algorithm: A finite set of steps (instructions) to solve a problem
• Computer language independent (Pseudocode)
• Example: {S1: Get A, S2: Read B, S3: Sum = A + B, S4: Print Sum} (a runnable C version appears after this list)
• Finiteness: each instruction executes in finite time and the algorithm terminates after a finite number of steps; an infinite loop such as while (true) { a = 1; } violates this
• Unambiguity / Definiteness: every step is clearly defined and has exactly one interpretation
• Input and output: zero or more inputs, at least one output
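A minimal C version of the four-step example above (reading A and B from standard input is an assumption made only to get a runnable program):

#include <stdio.h>

/* S1-S4 from the pseudocode: get A, read B, compute Sum, print Sum. */
int main(void)
{
    int a, b, sum;

    if (scanf("%d %d", &a, &b) != 2)   /* S1: get A, S2: read B */
        return 1;                      /* stop if the input is not two integers */

    sum = a + b;                       /* S3: Sum = A + B */
    printf("Sum = %d\n", sum);         /* S4: print Sum */
    return 0;
}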
Algorithm Analysis
• Determination of the amount of time and space required to execute
an algorithm.
• Why algorithm analysis?
  • Predict the behavior of an algorithm
  • Mission-critical tasks
  • Compare algorithms
  • Optimize existing algorithms
• A priori (analytical) vs. a posteriori (experimental) analysis
Example (A Posteriori / Experimental Method)
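One way to time an algorithm experimentally is sketched below; the linear_sum function and the input size are illustrative assumptions, not part of the lecture:

#include <stdio.h>
#include <time.h>

/* A posteriori (experimental) analysis: run the algorithm on a concrete
   input and measure the elapsed CPU time with clock(). */
static long linear_sum(const int *a, int n)
{
    long sum = 0;
    for (int i = 0; i < n; i++)
        sum += a[i];
    return sum;
}

int main(void)
{
    enum { N = 10000000 };
    static int a[N];                 /* one large test input */
    for (int i = 0; i < N; i++)
        a[i] = i % 100;

    clock_t start = clock();         /* start timing */
    long sum = linear_sum(a, N);
    clock_t end = clock();           /* stop timing */

    printf("sum = %ld, elapsed = %.3f s\n",
           sum, (double)(end - start) / CLOCKS_PER_SEC);
    return 0;
}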
Challenges of experimental (a posteriori) analysis
• Experimental running times of two algorithms are difficult to directly
compare unless the experiments are performed in the same hardware
and software environments.
• Input samples may not cover all the possible inputs and scenarios
• An algorithm must be fully implemented before its running time can be studied experimentally.
• Resource requirements: experiments consume time and computing resources
Asymptotic Notation (A Priori Method)
• To estimate how long the program will run
• Evaluate the relative efficiency of any two algorithms in an
environment which is hardware and software independent.
• No need to implement algorithms.
• Takes into account all possible inputs.
• Growth rate of the function with respect to size of input
• Using asymptotic analysis we can estimate the best-, worst-, and average-case behavior of an algorithm.
Time complexity examples
Example 1 (frequency count): sum the elements of A[] = {12, 4, 5}, N = 3

Sum (A, N)
{
1.   Sum = 0;
2.   for (i = 0; i < N; i++)
     {
3.       Sum = Sum + A[i];
     }
}

Frequency count:
• S1: 1 time
• S2: 1 (i = 0) + (N + 1) (tests of i < N) + N (increments i++) = 2N + 2 times
• S3: N times
• Total: T(N) = S1 + S2 + S3 = 1 + (2N + 2) + N = 3N + 3, which is O(N)

Example 2: T(N) = 3N^2 + 4N + 2 → keep the dominant term 3N^2 → drop the constant factor → O(N^2)
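A runnable C version of Example 1 above, with the frequency counts repeated as comments (the main driver and its output format are illustrative):

#include <stdio.h>

/* Sum of the N array elements; the comments repeat the frequency count:
   T(N) = 1 + (2N + 2) + N = 3N + 3 = O(N). */
int sum(const int a[], int n)
{
    int total = 0;                  /* S1: runs 1 time */
    for (int i = 0; i < n; i++)     /* S2: i = 0 once, i < n N+1 times, i++ N times */
        total = total + a[i];       /* S3: runs N times */
    return total;
}

int main(void)
{
    int a[] = { 12, 4, 5 };         /* the array from the slide, N = 3 */
    printf("Sum = %d\n", sum(a, 3));
    return 0;
}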
More Examples
Tricks
• Drop the non-dominant terms
• Drop the constant terms/factors
• Break the code into fragments and analyze each fragment separately (see the sketch below)
• Example: T(N) = N^3 + 500N + 3456 → drop 500N and 3456 → O(N^3)
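A small C sketch of breaking code into fragments (the fragments function and its loops are illustrative, not from the lecture):

#include <stdio.h>

/* Count each fragment separately, then keep the dominant term. */
void fragments(int n)
{
    long count = 0;

    /* Fragment 1: one loop, about n steps -> O(n) */
    for (int i = 0; i < n; i++)
        count++;

    /* Fragment 2: nested loops, about n * n steps -> O(n^2) */
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            count++;

    /* Total: T(n) = n^2 + n; drop the non-dominant n -> O(n^2) */
    printf("n = %d, steps counted = %ld\n", n, count);
}

int main(void)
{
    fragments(1000);
    return 0;
}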
Fundamental functions in algorithm analysis
Types of time functions
• Constant: O(1)
• Logarithmic: O(log N)
• Linear: O(N)
• Quadratic: O(N^2)
• Cubic: O(N^3)
• Exponential: O(2^N)
1 < log N < N < N log N < N^2 < N^3 < … < 2^N < N^N
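To make the ordering concrete, a short C program (illustrative, not from the slides) can tabulate a few of these functions for increasing N:

#include <stdio.h>
#include <math.h>

/* Print log N, N, N log N, N^2 and 2^N for a few input sizes. */
int main(void)
{
    int sizes[] = { 4, 8, 16, 32, 64 };
    printf("%6s %8s %8s %10s %10s %14s\n",
           "N", "log N", "N", "N log N", "N^2", "2^N");
    for (int k = 0; k < 5; k++) {
        double n = sizes[k];
        printf("%6.0f %8.1f %8.0f %10.0f %10.0f %14.3e\n",
               n, log2(n), n, n * log2(n), n * n, pow(2.0, n));
    }
    return 0;
}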
Asymptotic Notations
• Big O notation
• Upper bound estimation
• Big Omega Notation
• Lower bound estimation
• Big Theta Notation
• Tight (both upper and lower) bound estimation
Big O Notation
Examples
• 8n+5 is O(n)
• f(n) = 8n + 5, g(n) = n: find c and n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀
• 8n + 5 ≤ c·n holds with c = 9 and n₀ = 5
• 5n^4 + 3n^3 + 2n^2 + 4n + 1 is O(n^4). (c = 15, n ≥ n₀ = 1)
• 3 log n + 2 is O(log n). (c = 5, n ≥ n₀ = 2)
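The summary's point that a big-O proof is a chain of inequalities can be made explicit for the first example (a standard argument, written here in LaTeX):

\[
8n + 5 \;\le\; 8n + n \;=\; 9n \qquad \text{for all } n \ge 5,
\]
so taking $c = 9$ and $n_0 = 5$ gives $8n + 5 \le c \cdot n$ for every $n \ge n_0$, hence $8n + 5$ is $O(n)$.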
Big-Omega
• 2n³ − 7n + 1 is Ω(n³). (c = 1, n₀ = 3)
• 3n log n − 2n is Ω(n log n). (c = 1, n₀ = 2)
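A similar chain for the first Ω example (standard reasoning, written in LaTeX):

\[
2n^3 - 7n + 1 \;=\; n^3 + (n^3 - 7n + 1) \;\ge\; n^3 + (9n - 7n + 1) \;>\; n^3 \qquad \text{for all } n \ge 3,
\]
using $n^3 = n \cdot n^2 \ge 9n$ when $n \ge 3$; hence with $c = 1$ and $n_0 = 3$ we have $2n^3 - 7n + 1 \ge c \cdot n^3$, so it is $\Omega(n^3)$.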
Big-Theta
• 2n³ − 7n + 1 is Θ(n³)
• 3n log n + 4n + 5 log n is Θ(n log n). (n ≥ 2)
Summary
• We use functions to describe algorithm runtime: the number of steps as a function of the input size
• Big-Oh/Omega/Theta are used for describing function growth rate
• A proof for big-Oh or big-Omega is basically a chain of inequalities.
• Choose the n₀ and c that make the chain work.
Space Complexity
• The amount of memory space required to solve an instance of the
computational problem as a function of the size of the input.
• Memory required by an algorithm to execute a program and produce
output.
• It is expressed asymptotically in Big-O notation, e.g., O(N log N), O(N), and O(N^2)
Fixed Space Function
• The scalar variables P, Q and R take 1 unit of space each
• S(N) = 3, a constant that does not depend on N
• S(N) = O(1)
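A minimal C sketch of a fixed-space computation (illustrative):

#include <stdio.h>

/* Fixed (constant) space: only the scalars p, q, r are used,
   independent of any input size, so S(N) = 3 units = O(1). */
int main(void)
{
    int p = 10, q = 20, r;   /* three scalar variables -> 3 units */
    r = p + q;
    printf("r = %d\n", r);
    return 0;
}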
Linear Space Complexity
• Variables: i, temp, n, and an array of N elements
• i, temp, n → 1 unit of space each (3 units)
• Array → N × 1 = N units
• S(N) = N + 3 = O(N)
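A C sketch with linear space; the make_squares function is an illustrative example, not from the lecture:

#include <stdio.h>
#include <stdlib.h>

/* Besides the scalars i, temp and n (3 units), an array of n elements
   is allocated, so S(n) = n + 3 = O(n). */
int *make_squares(int n)
{
    int i, temp;
    int *array = malloc((size_t)n * sizeof *array);   /* n units of space */
    if (array == NULL)
        return NULL;
    for (i = 0; i < n; i++) {
        temp = i * i;
        array[i] = temp;
    }
    return array;   /* caller frees */
}

int main(void)
{
    int n = 5;
    int *squares = make_squares(n);
    if (squares != NULL) {
        printf("last square = %d\n", squares[n - 1]);
        free(squares);
    }
    return 0;
}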
Quadratic Space Complexity
• Hint: Use 2D arrays
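Following the hint, a C sketch whose dominant storage is an N × N 2D array (illustrative; N is fixed at 4 only to keep the program small):

#include <stdio.h>

#define N 4   /* illustrative size */

/* An N x N 2D array occupies N * N units, so S(N) = O(N^2). */
int main(void)
{
    int table[N][N];                 /* N * N ints -> quadratic space */

    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            table[i][j] = i * j;     /* a multiplication table */

    printf("table[%d][%d] = %d\n", N - 1, N - 1, table[N - 1][N - 1]);
    return 0;
}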
References:
• https://www.youtube.com/watch?v=Nd0XDYjVHs&list=PLDN4rrl48XKpZkf03iYFl-O29szjTrs_O&index=12
• Text Book