Time complexity
Problem 1
Q1.
T(N)
S(1)
Reason: The second for loop is unreachable, so only the first loop ever runs.
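The graded snippet is not reproduced here; a minimal hypothetical sketch of the shape described (a first loop that runs n times, and a second loop made unreachable) might look like:

```python
# Hypothetical sketch -- the actual assignment code is not shown.
def q1(n):
    steps = 0
    for i in range(n):        # runs n times -> T(n)
        steps += 1
    return steps              # the function always exits here
    for j in range(n):        # unreachable: contributes nothing
        for k in range(n):    # to time or space
            steps += 1
```

Only the first loop executes, so the time is O(n) and the extra space is O(1).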
Q2.
T(1)
S(1)
Reason: It does not depend on the input x (independent of input), so the running time is the same no matter what.
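As a hypothetical illustration (the graded code is not shown), a function whose work is independent of its input:

```python
# Hypothetical sketch: a fixed number of operations regardless of x.
def q2(x):
    a = x + 1      # one addition
    b = a * 2      # one multiplication
    return b - x   # one subtraction -- always 3 operations: T(1), S(1)
```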
Q3.
T(1)
S(1)
The function looks as if it depends on n, but we are considering asymptotic behaviour for large n.
For large n, the running time is constant (1).
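The assignment code is not reproduced here, but one hypothetical shape with exactly this behaviour is a loop whose bound is capped by a constant:

```python
# Hypothetical sketch: mentions n, but does at most 100 steps for any n.
def q3(n):
    steps = 0
    for i in range(min(n, 100)):   # capped at a constant for large n
        steps += 1
    return steps
```

For any n >= 100 the function does exactly 100 steps, so asymptotically T(1).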
Q4.
T(NlogN)
S(logN)
Each recursive call divides n by 3, so there are about log3(n) levels of recursion. Because the for loop does n work at every level, the total time is n log n.
For space, we look at how deep the recursion goes.
At level k the input size is n/3^k, which hits the base case after log3(n) levels, so the space is log n.
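A hypothetical sketch of this pattern (the real code is not shown): each call does a full n-step loop against the original n, then recurses on n // 3, giving log3(n) levels of n work each (n log n time) and log3(n) stack frames (log n space):

```python
def q4(n, n0=None):
    # n0 remembers the original input so every level does n0 loop steps
    if n0 is None:
        n0 = n
    if n <= 1:
        return 0
    steps = 0
    for i in range(n0):            # n work at every recursion level
        steps += 1
    return steps + q4(n // 3, n0)  # recursion is log3(n) levels deep
```

For example, q4(27) does 27 steps on each of 3 levels (27 → 9 → 3), i.e. 81 steps total.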
Q5.
T(n)
S(1)
Time is n: even though the work is halved each time, the total n + n/2 + n/4 + ... is still within a constant multiple (at most 2) of n.
Also, since n itself is only used as the loop condition and is not divided by 2 each time,
the running time is n.
Space is 1 because we push and pop at the same rate, so the stack never grows.
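A hypothetical sketch of the push-and-pop-at-the-same-rate behaviour (assuming a simple counting loop with n as the condition):

```python
def q5(n):
    stack = []
    steps = 0
    i = 0
    while i < n:          # n is the loop condition -> n iterations
        stack.append(i)   # push ...
        stack.pop()       # ... and immediately pop: size never exceeds 1
        i += 1
        steps += 1
    return steps, len(stack)
```

Time grows linearly with n while the stack ends empty, matching T(n), S(1).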
Q6.
T(n)
S(n)
Property of logarithms:
a^(log[a]B) = B
The recursion calls itself twice each time, and each call divides n by 2.
As the levels increase, the total number of calls reaches
2^(log[2]N) * c,
which by the property above simplifies to N * c, i.e. O(N).
The space is n because the queue holds at most n elements.
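Counting calls makes the 2^(log[2]N) = N argument concrete. A hypothetical recurrence of the shape T(n) = 2T(n/2) + c, returning the number of constant-work units:

```python
# Hypothetical sketch: two half-size recursive calls, constant work each.
def q6(n):
    if n <= 1:
        return 1                         # one unit at the leaf
    return 1 + q6(n // 2) + q6(n // 2)   # c work + two calls on n/2
```

For a power of two, q6(n) returns 2n - 1, i.e. Θ(n) total work.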
Problem 2
a)
F (n^4)
Reason: The dominant term is n^4, so T(n) = O(n^4).
b)
C(n)
Reason:
The recurrence with T(n/4) expands to
3n + 3n/4 + 3n/4^2 + ...
which is a geometric series dominated by its first term 3n, so the total is O(n).
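A quick numeric check of that expansion (assuming the recurrence is T(n) = T(n/4) + 3n): the series 3n + 3n/4 + 3n/16 + ... stays between 3n and 4n.

```python
def expand(n):
    # sum the per-level work 3n, 3n/4, 3n/16, ... of T(n) = T(n/4) + 3n
    total = 0.0
    while n >= 1:
        total += 3 * n
        n /= 4
    return total

n = 1_000_000
assert 3 * n <= expand(n) < 4 * n   # dominated by the first term -> O(n)
```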
c)
D (nlogn)
This is paranoid quicksort:
each level of recursion does 3n work in total.
Splitting into two recursive calls does not change the work done, because the two calls together cover the same total input.
There are log n levels of 3n work each, so the total is n log n.
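A one-line sanity check of the "same work per level" claim (assuming, for illustration, an n/3 and 2n/3 split with 3n work per call):

```python
# Splitting 3n work across an n/3 and a 2n/3 recursive call leaves the
# next level doing 3*(n/3) + 3*(2n/3) = 3n -- the same as the level above.
n = 900
level0 = 3 * n
level1 = 3 * (n // 3) + 3 * (2 * n // 3)
assert level0 == level1 == 2700
```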
d)
E (n^2)
Similar to the previous case, but instead of equal work per level, the work shrinks at each level.
The largest term in the expansion is still n^2, so T(n) = O(n^2).
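Numerically, for a shrinking-work recurrence such as T(n) = T(n/2) + n^2 (an assumed example), the expansion n^2 + (n/2)^2 + (n/4)^2 + ... is bounded by (4/3) n^2, so the first term dominates:

```python
def expand_sq(n):
    # per-level work of T(n) = T(n/2) + n^2
    total = 0.0
    while n >= 1:
        total += n * n
        n /= 2
    return total

n = 1 << 20
assert n * n <= expand_sq(n) < (4 / 3) * n * n   # Theta(n^2)
```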
e)
H (n^log[2]5)
Each call makes 5 recursive calls on n/2, so the recurrence has the form
T(n) = 5T(n/2) + c.
As we expand the levels, we see the pattern
c + 5c + 5^2 c + ... + 5^(log[2]n) c, where log[2]5 > 2.
Therefore the largest term is 5^(log[2]n), which by the log property equals n^(log[2]5).
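The identity can be checked numerically; it is the logarithm property quoted in Q6, a^(log[a]B) = B, applied as 5^(log[2]n) = n^(log[2]5):

```python
import math

# 5^(log2 n) and n^(log2 5) are the same quantity written two ways.
n = 1024
lhs = 5 ** math.log2(n)   # 5^(log2 1024) = 5^10
rhs = n ** math.log2(5)   # 1024^(log2 5)
assert math.isclose(lhs, rhs)
assert math.log2(5) > 2   # so n^(log2 5) grows faster than n^2
```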
f)
G (2^n)
It splits into two recursive calls each time.
Even though the input only shrinks to n-1 per call, the constant work per level grows as
c, 2c, ..., 2^n c.
The largest term is 2^n, so T(n) = O(2^n).
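A hypothetical sketch of T(n) = 2T(n-1) + c, counting constant-work units: the levels contribute c + 2c + 4c + ... + 2^(n-1) c = (2^n - 1) c, so 2^n dominates.

```python
# Hypothetical sketch: two recursive calls, input shrinks by only 1.
def count(n):
    if n <= 0:
        return 0
    return 1 + count(n - 1) + count(n - 1)   # c work + two calls on n-1
```

count(n) returns 2^n - 1, i.e. exponential growth.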