Big O Notation
Big-O notation is a way to describe how an algorithm's running time (or space) grows as the input size grows. It focuses on the overall growth trend, not on exact seconds.
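As a small illustration of "trend, not seconds" (a sketch added here, not part of the exercises below), consider a loop that touches every element once. Doubling the input roughly doubles the work, which is the linear trend O(n):

```python
def count_steps(n):
    """Count the loop iterations for a simple linear scan of size n."""
    steps = 0
    for _ in range(n):  # one iteration per element -> linear growth
        steps += 1
    return steps

print(count_steps(100))  # 100
print(count_steps(200))  # 200 -- doubling n doubles the work: O(n)
```

How long each iteration takes in seconds depends on the machine; the O(n) trend does not.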
When we analyze an algorithm with big-O notation, we usually consider two cases:
1. Worst Case:
It describes the maximum time or space an algorithm could take for an input of size N.
Why it matters:
First, it guarantees that performance won't exceed a certain limit. Second, it is important
for reliability and system design.
2. Best Case:
It describes the minimum time an algorithm could take for an input of size N.
Why it matters:
First, it shows the algorithm's optimal behavior. Second, it can be misleading if relied on alone.
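To make the two cases concrete, here is a small sketch using linear search (chosen as an illustration; it is not one of the exercises below):

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i
    return -1

# Best case: target is the first element -> one comparison, O(1).
print(linear_search([7, 3, 5], 7))  # 0
# Worst case: target is absent (or last) -> all N elements checked, O(N).
print(linear_search([7, 3, 5], 9))  # -1
```

The same function has different best- and worst-case complexities depending on where (or whether) the target appears, which is exactly why relying on the best case alone is misleading.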
Below are some exercises for you to practice. Find the time complexity, for both the
best and worst case, of the following 8 code snippets.