
Complexity Summary

Array Stack
Operation | Best Case Complexity | Worst Case Complexity | Reasoning
init      | O(max_capacity)      | O(max_capacity)       | Limited by the size of the underlying ArrayR
is_full   | O(1)                 | O(1)                  | A simple arithmetic comparison is assumed to be constant
push      | O(1)                 | O(1)                  | Using ArrayR's setter is constant time, and the precondition uses is_full
pop       | O(1)                 | O(1)                  | Just decrements the length, leaving the popped item behind in the array as garbage, so it is constant
peek      | O(1)                 | O(1)                  | Same as pop, but the length is not modified
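As a concrete reference, here is a minimal sketch of the stack above in Python, with a fixed-length Python list standing in for ArrayR (the exception messages and internal names are assumptions, not the exact course code):

    class ArrayStack:
        def __init__(self, max_capacity):
            # O(max_capacity): building the fixed-size array dominates
            self.array = [None] * max_capacity  # stand-in for ArrayR(max_capacity)
            self.length = 0

        def is_full(self):
            # O(1): one arithmetic comparison
            return self.length == len(self.array)

        def push(self, item):
            # O(1): a precondition check plus one setter call
            if self.is_full():
                raise Exception("stack is full")
            self.array[self.length] = item
            self.length += 1

        def pop(self):
            # O(1): only the length is adjusted; the old item is
            # left behind in the array as garbage
            if self.length == 0:
                raise Exception("stack is empty")
            self.length -= 1
            return self.array[self.length]

        def peek(self):
            # O(1): same as pop, but the length is not modified
            if self.length == 0:
                raise Exception("stack is empty")
            return self.array[self.length - 1]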
Array Set
Operation    | Best Case Complexity | Worst Case Complexity        | Reasoning
init         | O(max_capacity)      | O(max_capacity)              | Limited by the size of the underlying ArrayR
clear        | O(1)                 | O(1)                         | Simple assignments are assumed to be constant
len          | O(1)                 | O(1)                         | Return statements are constant
is_empty     | O(1)                 | O(1)                         | Simple arithmetic comparisons are assumed to be constant
is_full      | O(1)                 | O(1)                         | Same as above
contains     | O(comp)              | O(size*comp)                 | Linear search uses a for loop: if the desired item is first, the cost is just O(comp), but if it is at the end we have to iterate through the whole set, giving O(size*comp)
add          | O(comp)              | O(contains()) = O(size*comp) | contains() is the most expensive operation; everything else is constant
remove       | O(comp)              | O(size*comp)                 | Uses the same linear search as contains, so its complexity is similar; everything else (assignment and setting) is constant
union        | O(size of self * size of other * comp) | O(size of self * size of other * comp) | Two for loops, with a comparison from add(), which comes from contains()
intersection | O(size of self * size of other * comp) | O(size of self * size of other * comp) | One for loop over self, each iteration calling contains() on other, which costs up to size-of-other comparisons
difference   | O(size of self * size of other * comp) | O(size of self * size of other * comp) | Same as above, but with "not in" instead of "in"
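A minimal sketch of the linear-search set, again with a plain Python list standing in for ArrayR (names are assumptions); it shows why contains() dominates add() and why union() is quadratic in the set sizes:

    class ArraySet:
        def __init__(self, max_capacity):
            # O(max_capacity): limited by the size of the backing array
            self.array = [None] * max_capacity  # stand-in for ArrayR
            self.size = 0

        def __contains__(self, item):
            # best O(comp): item found at index 0
            # worst O(size*comp): item is last or absent
            for i in range(self.size):
                if self.array[i] == item:
                    return True
            return False

        def add(self, item):
            # dominated by the contains() call hidden in "not in"
            if item not in self:
                self.array[self.size] = item
                self.size += 1

        def union(self, other):
            # two for loops, each add() hiding a linear contains():
            # O(size of self * size of other * comp) overall
            result = ArraySet(self.size + other.size)
            for i in range(self.size):
                result.add(self.array[i])
            for i in range(other.size):
                result.add(other.array[i])
            return result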
Bit Vector Set
Operation    | Best Case Complexity | Worst Case Complexity | Reasoning
init         | O(1)                 | O(1)                  | Integers have an arbitrary size in Python
clear        | O(1)                 | O(1)                  | Basic assignment is constant
len          | O(number of elems)   | O(number of elems)    | elems is not the number of elements in the set! It is the size of the integer in bitwise form (e.g. a set whose single item is 1000 would still need 999 iterations in the for loop)
is_empty     | O(1)                 | O(1)                  | Simple arithmetic comparison is constant
is_full      | O(1)                 | O(1)                  | Always returns False, because arbitrary-sized integers can't be full, hence it's constant
add          | O(1)                 | O(1)                  | A single bit shift plus a bitwise or
remove       | O(1)                 | O(1)                  | A single bit shift plus negation and a bitwise and
union        | O(1)                 | O(1)                  | Uses the bitwise or operation
intersection | O(1)                 | O(1)                  | Uses the bitwise and operation
difference   | O(1)                 | O(1)                  | Uses the bitwise and operation together with negation
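A sketch of the bit vector set, assuming items are positive integers so that item k is stored as bit k - 1 of a single Python int (the method set is trimmed to the operations discussed above):

    class BitVectorSet:
        def __init__(self):
            # O(1): Python integers grow as needed
            self.elems = 0

        def add(self, item):
            # one shift plus a bitwise or sets bit (item - 1)
            self.elems |= 1 << (item - 1)

        def remove(self, item):
            # one shift, a negation and a bitwise and clear the bit
            self.elems &= ~(1 << (item - 1))

        def __contains__(self, item):
            # shift the target bit down to position 0 and test it
            return (self.elems >> (item - 1)) & 1 == 1

        def __len__(self):
            # O(number of bits in elems), not O(elements in the set):
            # a set containing just {1000} still scans about 1000 bits
            count, bits = 0, self.elems
            while bits:
                count += bits & 1
                bits >>= 1
            return count

        def union(self, other):
            result = BitVectorSet()
            result.elems = self.elems | other.elems
            return result

        def intersection(self, other):
            result = BitVectorSet()
            result.elems = self.elems & other.elems
            return result

        def difference(self, other):
            result = BitVectorSet()
            result.elems = self.elems & ~other.elems
            return result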
Linear Queue

Operation | Best Case Complexity | Worst Case Complexity | Reasoning
init      | O(max_capacity)      | O(max_capacity)       | Limited by the size of the underlying ArrayR
clear     | O(1)                 | O(1)                  | Simple reassignment is assumed to be constant
is_full   | O(1)                 | O(1)                  | Comparison is assumed to be constant
append    | O(1)                 | O(1)                  | Everything is a simple assignment and is_full() is constant
serve     | O(1)                 | O(1)                  | Everything is a simple assignment and is_empty() is constant, so it is constant
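A minimal sketch of a linear queue, using front and rear indices that only ever move right, which is what keeps append and serve constant (a plain Python list stands in for ArrayR; names are assumptions):

    class LinearQueue:
        def __init__(self, max_capacity):
            # O(max_capacity): allocating the fixed-size array dominates
            self.array = [None] * max_capacity  # stand-in for ArrayR
            self.front = 0
            self.rear = 0  # index one past the last item

        def is_empty(self):
            return self.front == self.rear

        def is_full(self):
            # O(1): rear only ever moves right, so served slots are wasted
            return self.rear == len(self.array)

        def append(self, item):
            # O(1): a precondition check and a couple of assignments
            if self.is_full():
                raise Exception("queue is full")
            self.array[self.rear] = item
            self.rear += 1

        def serve(self):
            # O(1): the front index just moves right; the old slot is
            # never reused (the 'linear' weakness)
            if self.is_empty():
                raise Exception("queue is empty")
            item = self.array[self.front]
            self.front += 1
            return item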
Circular Queue

Operation | Best Case Complexity | Worst Case Complexity | Reasoning
init      | O(max_capacity)      | O(max_capacity)       | Limited by the size of the underlying ArrayR
clear     | O(1)                 | O(1)                  | Simple reassignment is assumed to be constant
is_full   | O(1)                 | O(1)                  | Comparison is assumed to be constant
append    | O(1)                 | O(1)                  | Everything is a simple assignment and is_full() is constant
serve     | O(1)                 | O(1)                  | Everything is a simple assignment and is_empty() is constant, so it is constant
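The circular queue has the same complexity table; the difference is that the indices wrap around with modulo, so served slots are reused instead of wasted. A minimal sketch under the same assumptions:

    class CircularQueue:
        def __init__(self, max_capacity):
            # O(max_capacity): allocating the fixed-size array dominates
            self.array = [None] * max_capacity  # stand-in for ArrayR
            self.front = 0
            self.length = 0

        def is_empty(self):
            return self.length == 0

        def is_full(self):
            return self.length == len(self.array)

        def append(self, item):
            # O(1): the rear position wraps around with modulo,
            # so served slots can be reused
            if self.is_full():
                raise Exception("queue is full")
            rear = (self.front + self.length) % len(self.array)
            self.array[rear] = item
            self.length += 1

        def serve(self):
            # O(1): front also wraps around instead of walking off the end
            if self.is_empty():
                raise Exception("queue is empty")
            item = self.array[self.front]
            self.front = (self.front + 1) % len(self.array)
            self.length -= 1
            return item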
Array List
Operation              | Best Case Complexity    | Worst Case Complexity   | Reasoning
init                   | O(max_capacity)         | O(max_capacity)         | Limited by the size of the underlying ArrayR
getitem                | O(1)                    | O(1)                    | Getting is a constant time operation
setitem                | O(1)                    | O(1)                    | Setting is a constant time operation
index                  | O(comp)                 | O(size*comp)            | Uses linear search: if the item is in the first position, the for loop doesn't have to run (best case); worst case it has to run through the entire array and compare everything
delete_at_index        | O(1)                    | O(size)                 | Uses a similar linear loop to shuffle items down, but there is no comparison cost
better delete_at_index | O(size - indexToDelete) | O(size - indexToDelete) | If we make use of the given index, we can tighten the upper bound
append                 | O(1)                    | O(size)                 | If the list is not full, it is a simple assignment; if the list is full, the array has to be resized and the data copied over, so the for loop runs size times
remove                 | O(comp + size)          | O(size*comp)            | If the element is at the head of the list, index costs comp and delete_at_index costs size (best case); worst case it is at the end, so the comparison cost is multiplied through the for loop
insert                 | O(size - index)         | O(size)                 | If we have to resize, that incurs a greater cost; if not, we just shuffle everything from the insertion index onward, and the rest is constant reassignment
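The two rows worth seeing in code are append (constant until a full array forces an O(size) resize-and-copy) and delete_at_index (the shuffle loop). A minimal sketch, with a plain Python list standing in for ArrayR and a hypothetical _resize helper:

    class ArrayList:
        def __init__(self, max_capacity):
            # O(max_capacity): limited by the size of the backing array
            self.array = [None] * max_capacity  # stand-in for ArrayR
            self.length = 0

        def _resize(self):
            # copy every item into a larger array: O(size)
            new_array = [None] * max(1, 2 * len(self.array))
            for i in range(self.length):
                new_array[i] = self.array[i]
            self.array = new_array

        def append(self, item):
            # best O(1): just an assignment
            # worst O(size): a full array forces the resize-and-copy first
            if self.length == len(self.array):
                self._resize()
            self.array[self.length] = item
            self.length += 1

        def delete_at_index(self, index):
            # O(size - index): shuffle everything after index one slot left
            item = self.array[index]
            for i in range(index, self.length - 1):
                self.array[i] = self.array[i + 1]
            self.length -= 1
            return item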
Sorted List
Operation               | Best Case Complexity | Worst Case Complexity              | Reasoning
init                    | O(1)                 | O(1)                               | Simple assignment is constant
len                     | O(1)                 | O(1)                               | Return statements are constant
is_empty                | O(1)                 | O(1)                               | Simple comparison is constant
clear                   | O(1)                 | O(1)                               | Simple reassignment is constant
remove                  |                      |                                    |
standard index()        | O(equalComp)         | O(size * (equalComp + lessThanComp)) | Best case stops in the first iteration of the loop, so only the cost of one comparison is considered; worst case it goes through the entire loop and has to do both comparisons (check for the correct value and use the sorted invariant with <)
binary search index()   | O(equalComp)         | O(equalComp * log(size))           | Best case the item is in the middle, so it stops immediately and the only cost is the comparison; worst case it is at the end or not there, so we have to do log(size) 'halvings'
add with linear search  | O(size)              | O(size)                            | index and shuffle have different best/worst cases, but here they complement each other as O(1)/O(size), so it turns out add() has just O(size) as the complexity in either case!
add with binary search  | O(log(size))         | O(size)                            | The best and worst cases of the index and the shuffle don't line up! Splitting their best and worst cases gives these bounds: the best case occurs when the item belongs in the last position (no shuffle) and the worst when it belongs first
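To make the add-with-binary-search trade-off concrete: the index cost is always O(log(size)) halvings, while the shuffle ranges from nothing (item belongs last) to O(size) (item belongs first). A minimal sketch on a plain Python list; the helper name is an assumption, and list.insert performs the shuffle:

    def binary_search_index(sorted_items, item):
        # O(log(size)) halvings; each step costs one comparison
        lo, hi = 0, len(sorted_items)
        while lo < hi:
            mid = (lo + hi) // 2
            if sorted_items[mid] < item:
                lo = mid + 1
            else:
                hi = mid
        return lo  # position where item belongs

    def add(sorted_items, item):
        # index: O(log(size)) regardless of position
        # shuffle: O(1) if item belongs last, O(size) if it belongs first,
        # which is why best and worst case land at O(log(size)) and O(size)
        pos = binary_search_index(sorted_items, item)
        sorted_items.insert(pos, item)  # list.insert does the shuffle

    items = [1, 3, 5]
    add(items, 4)  # items is now [1, 3, 4, 5]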
Linked List
Operation         | Best Case Complexity | Worst Case Complexity | Reasoning
init              | O(1)                 | O(1)                  | Basic assignment is constant
get_node_at_index | O(targetIndex)       | O(targetIndex)        | Everything is constant except for the for loop, which runs targetIndex times
setitem           | O(targetIndex)       | O(targetIndex)        | Because we use get_node_at_index
getitem           | O(targetIndex)       | O(targetIndex)        | Same as above
delete_at_index   | O(targetIndex)       | O(targetIndex)        | All assignments and simple comparisons are constant, except for the call to get_node_at_index
insert            | O(targetIndex)       | O(targetIndex)        | Same reasoning as above
index             | O(comp)              | O(size*comp)          | Everything is constant except for the loop and the comparison: in the best case the loop doesn't even run, so the complexity is just O(comp); in the worst case it has to iterate through the whole list, hence O(size*comp)
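A minimal sketch of the linked list showing why every positional operation is driven by get_node_at_index's walk (class and attribute names are assumptions):

    class Node:
        def __init__(self, item, link=None):
            self.item = item
            self.link = link

    class LinkedList:
        def __init__(self):
            # O(1): basic assignment
            self.head = None
            self.length = 0

        def get_node_at_index(self, index):
            # O(targetIndex): walk the chain one link at a time
            current = self.head
            for _ in range(index):
                current = current.link
            return current

        def insert(self, index, item):
            # O(targetIndex): dominated by get_node_at_index;
            # the relinking itself is constant
            if index == 0:
                self.head = Node(item, self.head)
            else:
                prev = self.get_node_at_index(index - 1)
                prev.link = Node(item, prev.link)
            self.length += 1

        def index(self, item):
            # best O(comp): found at the head; worst O(size*comp)
            current, position = self.head, 0
            while current is not None:
                if current.item == item:
                    return position
                current = current.link
                position += 1
            raise ValueError("item not in list")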
Bubble Sort
Compare adjacent elements and swap them if the one on the left is bigger
Best case (assuming the optimised version): O(size)
Worst case: O(size^2)
The unoptimised form is not incremental; the optimised form is
Always stable if ≥ is used
The maximum number of traversals needed to guarantee a sort is size - 1
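A sketch of the optimised version: the swapped flag is what gives the O(size) best case on already-sorted input, and the outer loop runs at most size - 1 traversals:

    def bubble_sort(items):
        n = len(items)
        for end in range(n - 1, 0, -1):  # at most size - 1 traversals
            swapped = False
            for i in range(end):
                # swap only on strict >, so equal items keep their
                # relative order (stable)
                if items[i] > items[i + 1]:
                    items[i], items[i + 1] = items[i + 1], items[i]
                    swapped = True
            if not swapped:
                break  # a full pass with no swaps: already sorted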
Selection Sort
Traverse the whole unsorted section, find the smallest element and swap it with the leftmost unsorted element
Best case: O(size^2)
Worst case: O(size^2)
Not very incremental
Not stable, because non-consecutive elements can be swapped
The smallest elements are sorted first, on the left
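A sketch; note that the inner scan always runs in full, which is why the best and worst cases are both O(size^2):

    def selection_sort(items):
        n = len(items)
        for start in range(n - 1):
            smallest = start
            for i in range(start + 1, n):  # scan the whole unsorted section
                if items[i] < items[smallest]:
                    smallest = i
            # swap with the leftmost unsorted element; this jump over
            # intermediate items is what breaks stability
            items[start], items[smallest] = items[smallest], items[start]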
Insertion Sort
Split the list into a sorted and an unsorted section, then traverse the unsorted side and insert each element into its correct position in the sorted side
Best case: O(size)
Worst case: O(size^2)
Incremental if new items are appended to the end (i.e. the unsorted side); not incremental if appended to the front
Always stable if ≥ is used
The left half will always be sorted
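A sketch; on already-sorted input the inner while loop never runs, giving the O(size) best case:

    def insertion_sort(items):
        for j in range(1, len(items)):
            key = items[j]  # next item from the unsorted side
            i = j - 1
            # strict >, so equal items keep their order (stable)
            while i >= 0 and items[i] > key:
                items[i + 1] = items[i]  # shift bigger items right
                i -= 1
            items[i + 1] = key  # drop key into its sorted position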