
Assignment No. 1, BCS-5A, 5C, 5D, Spring-23

More on Asymptotic Running Time Analysis
CMPSC 122
Due: 30/9/22
(CLO-2)
I. Asymptotic Running Times and Dominance
In your homework, you looked at some graphs. What did you observe?
When analyzing the running time of algorithms, we really care about what happens in the long run, or what happens when n
is large. Thus…
• We are not terribly concerned with what happens when __________________ (input size is small)
• We are concerned with how quickly _______________________________________ (running time functions grow)
• Details like _______________________________________________ don't matter when n is large (coeffs, lower order terms)
We call this kind of analysis asymptotic analysis or, sometimes, asymptotics.
We say that a function f (x) dominates a function g (x) when…
Examples:
1. Does 23n^2 dominate n + 7?
2. How about n^3/5 vs. 2n^3 + 7?
3. How about 450 ⋅ 3^n vs. 3^n?
4. How about 150n vs. 20n?
(Please note that we are being a little informal with some of this analysis in this course.)
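As a quick numeric sanity check on examples like these (a Python sketch, not part of the assignment): f dominates g exactly when the ratio g(n)/f(n) shrinks toward 0 as n grows.

```python
# Dominance check: f dominates g when g(n)/f(n) -> 0 as n -> infinity.
f = lambda n: 23 * n**2      # candidate dominating function (example 1)
g = lambda n: n + 7          # candidate dominated function

# The ratios shrink toward 0, so 23n^2 dominates n + 7.
ratios = [g(n) / f(n) for n in (10, 100, 1000, 10000)]

# By contrast, 150n vs. 20n: the ratio settles at the constant 7.5,
# so neither dominates the other -- they grow at the same rate.
same_rate = [150 * n / (20 * n) for n in (10, 100, 1000)]
```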
Page 1 of 6
Prepared by D. Hogan for PSU CMPSC 122
II. Key Families of Functions
It turns out that there are a few families of functions that tend to be important. Ranked in order of dominance, they are:
1. n^n
2. n!
3. c^n, for all constants c s.t. c > 1, where if a > b, a^n dominates b^n
4. n^c, for all constants c s.t. c ≥ 2, where if a > b, n^a dominates n^b
5. n lg n
6. n
7. lg n
8. c
A few notes:
• Since all constants grow at the same rate, anything that you might think to call Θ(c) gets simplified to Θ(1).
• In Θ-notation, the functions within each of families 1-2 and 5-8 all reduce to the same thing. Different constant choices in families 3 and 4 yield different Θ-notation expressions.
• This list leaves some matters vague, but covers most of the common functions that arise.
• You should convince yourself of the correctness of this ranking by plotting several of these kinds of functions on the same axes.
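In the same spirit as plotting, tabulating one representative per family at a single size already shows the ranking emerge (an illustrative Python sketch; the choice n = 20 is arbitrary):

```python
import math

# One representative per family, listed from slowest- to fastest-growing.
families = [
    ("c (constant)", lambda n: 1),
    ("lg n",         lambda n: math.log2(n)),
    ("n",            lambda n: n),
    ("n lg n",       lambda n: n * math.log2(n)),
    ("n^c (c = 2)",  lambda n: n**2),
    ("c^n (c = 2)",  lambda n: 2**n),
    ("n!",           lambda n: math.factorial(n)),
    ("n^n",          lambda n: n**n),
]

n = 20
values = [f(n) for _, f in families]
for (name, _), v in zip(families, values):
    print(f"{name:>13}: {v:,.0f}")
```

At n = 20 the values already appear in strictly increasing order, matching the dominance ranking read bottom to top.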
Problem: Simplify each function of n to Θ-notation and then rank the functions by dominance.
3n!
n!/12 + n^3
17 lg n + n^3 + n lg n
100^n + n^200000 + 50^n
n lg n + n^n/2 + 7n
Problem: Simplify each function of n to Θ-notation and then rank the functions by dominance.
n^35 + lg n
lg 3n + 3
n^37
n(n + 1)
2n(1 + 3 lg n)
III. O-Notation
Another kind of asymptotic notation is O-notation. While O-notation was invented before Θ-notation, it is less precise. So,
let's first give a more formal definition for Θ-notation:
We say a function f (x) is Θ(g (x)) when… (for all sufficiently large n…)
Now, let's give a definition for O-notation:
We say a function f (x) is O(g (x)) when…
This asymptotic notation is said to put bounds on the functions. In this sense, Θ gives an asymptotically tight bound, while O gives an asymptotic upper bound. Or, analogously, … (= vs. ≤)
Example: 2n^2 is ______________ and also 2n^2 is ______________. But we could also say 2n^2 is _____________.
Example: n! is O(_________)
Example: 5n lg n is O(_____________), but not O(__________________).
Example: n^2 + 7n + 6 is O(_______________), but not O(___________________)
In some sense, we could think of a Θ-notation bound as being the best O-notation bound possible, and sometimes you'll see
the expression "best big-O bound" as a result.
IV. Example Code Fragments
Having gone a bit further, let's look at some code examples. Once again, we will, in each case, count the exact number of
elementary operations that are done. But we will then also simplify to Θ-notation too. (Intentionally, the first few are review,
and the numbering picks up where we left off before.)
Example 1: for i = 1 to n
{
    a = b + c - i
}
Example 8: for i = 1 to n
{
    a = b + c - i
}
for i = 1 to n
{
    a = b + c - i
}
Example 9: for i = 1 to n
{
    for j = 1 to n
    {
        a = b + c + max(i, j)
    }
}
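One way to check the count for Example 9 is to instrument the loops (a Python sketch with a hypothetical counter): the body runs n times for each of the n outer iterations, i.e., exactly n^2 times, so the fragment is Θ(n^2).

```python
def body_executions(n):
    """Count how many times the innermost statement of Example 9 runs."""
    count = 0
    for i in range(1, n + 1):        # for i = 1 to n
        for j in range(1, n + 1):    # for j = 1 to n
            count += 1               # a = b + c + max(i, j)
    return count
```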
Example 10: for i = 1 to 2n
{
    for j = 1 to n
    {
        a = b + c + max(i, j)
    }
}
Example 11: for i = 0 to n/2
{
    for j = n/2 downto 1
    {
        for k = 1 to n
        {
            a = ib + jc + kd
        }
    }
}
Example 12: for i = 0 to n/2
{
    for j = n/2 downto 1
    {
        for k = 1 to n
        {
            a = ib + jc + kd
        }
    }
    z = z + 2^a
}
Example 13: for i = 0 to n/2
{
    for j = 1 to n
    {
        a = j + 3a
    }
    for j = n downto 1
    {
        b = 50 + j - a
    }
}
Example 14: for i = 1 to n
{
    for j = 1 to n/3
        // 3 elementary operations
    for j = n/3 to 2n/3
        // n elementary operations
    for j = 2n/3 to n
        // 1 elementary operation
}
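Example 14 can be instrumented the same way (a Python sketch; n is assumed to be a multiple of 3 so the loop bounds divide evenly). Per outer iteration, the middle loop alone contributes about (n/3) ⋅ n operations, so the total is roughly n^3/3 and the fragment is Θ(n^3); doubling n should multiply the count by roughly 8.

```python
def ops_example_14(n):
    """Total elementary operations for Example 14 (n a multiple of 3)."""
    count = 0
    for i in range(1, n + 1):                      # for i = 1 to n
        for j in range(1, n // 3 + 1):             # for j = 1 to n/3
            count += 3                             # 3 elementary operations
        for j in range(n // 3, 2 * n // 3 + 1):    # for j = n/3 to 2n/3
            count += n                             # n elementary operations
        for j in range(2 * n // 3, n + 1):         # for j = 2n/3 to n
            count += 1                             # 1 elementary operation
    return count

# For a cubic function, doubling the input multiplies the count by about 8.
ratio = ops_example_14(600) / ops_example_14(300)
```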
Problem: Rank the running time functions for all of the above examples from fastest growing to slowest growing.
V. Back to Binary Search
When we started discussing analysis, we looked at binary search under the assumption n was a power of 2. What if it's not?
Backing up, what did we conclude was the worst-case running time for a binary search when n is a power of 2?
Let's look at a few not-as-nice input sizes, again, in the worst case:
• 33
• 34
• 37
• 63
• 150
• 700
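These worst-case counts can be checked by brute force (a Python sketch, not part of the assignment): run a counting binary search against every possible target, successful or not, and take the maximum number of probes. For each size above, the answer comes out to floor(lg n) + 1.

```python
def probes(arr, x):
    """Binary search for x in sorted arr; return the number of probes made."""
    lo, hi, count = 0, len(arr) - 1, 0
    while lo <= hi:
        count += 1
        mid = (lo + hi) // 2
        if arr[mid] == x:
            return count
        elif arr[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return count  # unsuccessful search

def worst_case(n):
    """Maximum probes over every target, including two guaranteed misses."""
    arr = list(range(n))
    return max(probes(arr, x) for x in range(-1, n + 1))
```

For instance, sizes 33 through 63 all take at most 6 probes (floor(lg n) = 5), while 150 takes 8 and 700 takes 10.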
How does this fit with asymptotic notation?
VI. How This Fits in the Course and Curriculum
Problem: Presumably, you've written code to traverse or walk an array, i.e., print out all of the elements of that array.
Suppose an array has size n. What is the worst-case running time to walk an array and why?
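As a concrete sketch (Python, assuming the walk just does constant work per element): the loop body runs exactly once per element, and there is no early exit, so best and worst case coincide at Θ(n).

```python
def walk(arr):
    """Visit every element of arr exactly once; return the number of visits."""
    visits = 0
    for x in arr:
        _ = x          # constant-time "work" per element (e.g., printing it)
        visits += 1
    return visits
```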
In the remainder of the course, we'll look at more advanced data structures, algorithms on those structures, and sorting
algorithms. In all cases, we'll make a point of analyzing their asymptotic running times. Your final project will carry the
analysis of running times of sorts further and make connections between theory and practice.
Examples of code fragments we have handled here are ones where the math is "nice." In 360, we'll look at sequences and
counting tools that will help you to be able to analyze the running time of "harder" problems.
Finally, as 465 is the main analysis course in the curriculum, we'll look at more formal definitions of O and Θ-notation, as
well as a few other kinds of asymptotic notation there. We'll also encounter many more advanced analysis techniques, and a
running theme will be to analyze, formally, the running time of many new algorithms.