
Time and Space Complexity: Algorithm Analysis Report

REPORT WRITING ON ASSIGNMENT (CA2), MAKAUT
INSTITUTE OF ADVANCE EDUCATION & RESEARCH
(FORMERLY KNOWN AS DARJEELING UNIVERSAL CAMPUS)
Assignment Report on
Topic Name: Time and Space Complexity
Name of Subject: Design and Analysis of Algorithms
Subject Code: FYCYS401
Branch: B.Sc. in Cyber Security
Semester: 4th Semester
Presented By
Name of Student: Arkapriya Das
University Roll Number: 28840423006
University Registration No.: 232882410046
DUC-288
Time and Space Complexity
In the Design and Analysis of Algorithms, understanding the time
complexity and space complexity is crucial for evaluating how
efficient an algorithm is in terms of execution time and memory
usage. These complexities help in comparing different algorithms and
understanding how they will perform as the size of the input grows.
1. Time Complexity
Time complexity refers to the amount of time an algorithm takes to complete
as a function of the size of the input. It provides an estimate of the number of
basic operations (e.g., additions, comparisons, assignments) that will be
performed.
Time complexity is usually expressed using Big O notation (O()), which
describes the upper bound on the time required by the algorithm in the worst
case, in terms of input size n. The time complexity tells us how the runtime
grows with the size of the input.
Common Time Complexities:
1. O(1): Constant time.
   o The algorithm’s execution time is independent of the input size.
   o Example: Accessing an element from an array by index.
2. O(log n): Logarithmic time.
   o The execution time grows logarithmically as the input size increases.
   o Example: Binary search algorithm on a sorted array.
3. O(n): Linear time.
   o The execution time grows linearly with the size of the input.
   o Example: A linear search in an unsorted array.
4. O(n log n): Linearithmic time.
   o The execution time grows faster than linear but slower than quadratic.
   o Example: Merge Sort and Quick Sort (average case).
5. O(n^2): Quadratic time.
   o The execution time grows quadratically with the size of the input.
   o Example: Bubble Sort, Insertion Sort, or any algorithm with nested loops.
6. O(n^3): Cubic time.
   o The execution time grows cubically with the size of the input.
   o Example: Algorithms involving three nested loops, such as matrix
     multiplication using the naïve approach.
7. O(2^n): Exponential time.
   o The execution time grows exponentially with the size of the input.
   o Example: Naive recursive solutions to problems like the Traveling Salesman
     Problem (TSP) or computing the Fibonacci sequence.
8. O(n!): Factorial time.
   o The execution time grows factorially with the input size.
   o Example: Brute-force solutions to the Traveling Salesman Problem
     (TSP) that check all permutations.
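To make two of the growth rates above concrete, here is a short Python sketch (an illustrative example added to this report, not prescribed by the syllabus) contrasting linear search, which is O(n), with binary search on a sorted list, which is O(log n):

```python
def linear_search(items, target):
    """O(n): may examine every element before finding the target."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1  # target not present

def binary_search(sorted_items, target):
    """O(log n): halves the remaining search range on each comparison."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1  # target lies in the upper half
        else:
            hi = mid - 1  # target lies in the lower half
    return -1  # target not present

data = list(range(0, 100, 2))      # sorted even numbers 0, 2, ..., 98
print(linear_search(data, 42))     # 21
print(binary_search(data, 42))     # 21
```

Both functions return the same index, but for a million-element list the linear search may make up to a million comparisons while the binary search makes about twenty.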
Big O Notation Examples:
• O(1): No loops or recursive calls, constant operations.
   o Example: Accessing an element in an array by index.
• O(n): One loop that runs n times.
   o Example: Finding the maximum element in an unsorted list.
• O(n^2): Nested loops, each running n times.
   o Example: Bubble Sort.
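The nested-loop pattern behind O(n^2) can be sketched with a minimal Bubble Sort in Python (added here for illustration):

```python
def bubble_sort(items):
    """O(n^2): two nested loops, each bounded by n."""
    arr = list(items)  # sort a copy, leaving the input unchanged
    n = len(arr)
    for i in range(n - 1):          # outer loop: up to n - 1 passes
        for j in range(n - 1 - i):  # inner loop: compare adjacent pairs
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # swap out-of-order pair
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

Because every element may be compared against nearly every other element, the total number of comparisons is roughly n * (n - 1) / 2, which is O(n^2).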
Worst, Best, and Average Case:
• Worst-case time complexity: The maximum time required for an algorithm,
  assuming the worst possible input.
• Best-case time complexity: The minimum time required for an algorithm,
  assuming the best possible input.
• Average-case time complexity: The expected time required for an algorithm,
  assuming random input.
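These cases can be observed directly by counting comparisons in a linear search (the comparison-counting helper below is a hypothetical addition for illustration, not part of the original report):

```python
def linear_search_count(items, target):
    """Return (index, number of comparisons made) for a linear search."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons  # scanned the whole list without finding target

data = [7, 3, 9, 1, 5]
print(linear_search_count(data, 7))  # best case: target is first -> (0, 1)
print(linear_search_count(data, 5))  # worst case (present): target is last -> (4, 5)
print(linear_search_count(data, 8))  # worst case (absent): -> (-1, 5)
```

The best case costs one comparison, the worst case costs n comparisons, and for a random target position the average is about n / 2, which is still O(n).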
2. Space Complexity
Space complexity refers to the amount of memory or space an algorithm
needs to run, as a function of the size of the input. It describes how the
memory requirements grow with the input size.
Just like time complexity, space complexity is often expressed using Big O
notation, and it accounts for both:
• The auxiliary space (extra space needed by the algorithm).
• The space for the input (which is typically considered fixed or given).
Common Space Complexities:
1. O(1): Constant space.
   o The algorithm uses a fixed amount of space, regardless of the input size.
   o Example: An algorithm that sorts data in-place without needing extra
     storage.
2. O(n): Linear space.
   o The algorithm’s space requirement grows linearly with the size of the
     input: as the input size increases, the memory required increases
     proportionally.
   o Key points: the algorithm stores data structures (such as arrays, lists,
     or hash maps) whose size is proportional to the size of the input. This
     can include storing the entire input itself or generating a new structure
     to hold intermediate results.
   o Example: Storing n elements in an array or list requires O(n) space,
     because the memory required grows linearly with the number of elements.
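The difference between O(1) and O(n) auxiliary space can be illustrated with two ways of reversing a list (an illustrative Python sketch added to this report):

```python
def reverse_in_place(arr):
    """O(1) auxiliary space: swaps elements inside the input list itself."""
    lo, hi = 0, len(arr) - 1
    while lo < hi:
        arr[lo], arr[hi] = arr[hi], arr[lo]
        lo += 1
        hi -= 1
    return arr

def reversed_copy(arr):
    """O(n) auxiliary space: allocates a brand-new list of the same length."""
    return [arr[i] for i in range(len(arr) - 1, -1, -1)]

nums = [1, 2, 3, 4]
print(reversed_copy(nums))     # [4, 3, 2, 1]; nums itself is unchanged
print(reverse_in_place(nums))  # [4, 3, 2, 1]; nums itself is now reversed
```

Both produce the same result, but the in-place version only needs two index variables regardless of the list's size, while the copying version needs memory for n extra elements.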
Other Space Complexities:
While O(1) and O(n) are common, there are other space complexities too:
• O(n^2): The algorithm uses space proportional to the square of the
  input size. Example: 2D arrays (matrices).
• O(log n): The space requirement grows logarithmically with the input
  size. Example: the recursion stack in algorithms like Binary Search (in
  some implementations).
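The O(log n) recursion-stack case can be made concrete with a recursive binary search: each recursive call adds one stack frame, and the number of frames is bounded by about log2(n). The `depth` parameter below is a hypothetical addition used only to expose the recursion depth:

```python
def binary_search_recursive(sorted_items, target, lo=0, hi=None, depth=1):
    """Recursive binary search; recursion depth (stack space) is O(log n)."""
    if hi is None:
        hi = len(sorted_items) - 1
    if lo > hi:
        return -1, depth  # not found; report how deep the recursion went
    mid = (lo + hi) // 2
    if sorted_items[mid] == target:
        return mid, depth
    if sorted_items[mid] < target:
        return binary_search_recursive(sorted_items, target, mid + 1, hi, depth + 1)
    return binary_search_recursive(sorted_items, target, lo, mid - 1, depth + 1)

data = list(range(1024))  # n = 1024, so log2(n) = 10
index, depth = binary_search_recursive(data, 700)
print(index, depth)  # index 700; recursion depth stays at or below 11
```

Even for a list of a thousand elements, the call stack never grows past about eleven frames, which is exactly the O(log n) space behaviour described above.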
THE END