Consider the following recurrence relation.
T(n) = { 2                      if n = 0, 1
       { T(floor(2n/3)) + lg(n) if n > 1

Find upper and lower bounds on the value of T(n), using the Master
Theorem. Then try to find a tight bound.

* This is a trick question, because the Master Theorem only applies to
recurrences where the extra work performed outside the recursive calls
takes time \Theta(n^d) for some constant d. Nevertheless, we can use
the Master Theorem to obtain separate upper and lower bounds on the
value of T(n). Consider the following recurrence relations.

L(n) = { 2                  if n = 0, 1
       { L(floor(2n/3)) + 1 if n > 1

U(n) = { 2                  if n = 0, 1
       { U(floor(2n/3)) + n if n > 1
Since 1 <= lg(n) <= n for all n > 1, it is easy to see that for all n,
L(n) <= T(n) <= U(n).
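As a quick sanity check, the three recurrences and the sandwich
L(n) <= T(n) <= U(n) can be transcribed directly into Python (a sketch
only -- the function names are arbitrary and not part of the problem):

```python
import math

# Direct transcriptions of the three recurrences; floor(2n/3) is
# integer division (2 * n) // 3.
def T(n):
    return 2 if n <= 1 else T((2 * n) // 3) + math.log2(n)

def L(n):
    return 2 if n <= 1 else L((2 * n) // 3) + 1

def U(n):
    return 2 if n <= 1 else U((2 * n) // 3) + n

# Since 1 <= lg(n) <= n for all n > 1, the sandwich should hold everywhere.
for n in range(100):
    assert L(n) <= T(n) <= U(n)
```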
Applying the Master Theorem to L(n), we get a = 1, b = 3/2 and d = 0,
so a = b^d and L(n) \in \Theta(n^d log n) = \Theta(log n).
Hence, T(n) \in \Omega(log n).
In this case, it would be easy to prove T(n) >= lg(n) directly by
induction on n >= 1 -- it may seem that the inductive hypothesis is
unnecessary, but it is needed to ensure that T(floor(2n/3)) is not
negative.
{ Go over this proof at the end if you have time left -- but don't
worry about it if you don't. }
Applying the Master Theorem to U(n), we get a = 1, b = 3/2 and d = 1,
so a < b^d and U(n) \in \Theta(n^d) = \Theta(n).
Hence, T(n) \in O(n).
Again, it is trivial to prove T(n) <= 2n by induction on n >= 1 --
making use of the fact that lg(n) <= n for all n >= 1.
{ Go over this proof at the end if you have time left -- but don't
worry about it if you don't. }
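Both concrete bounds -- lg(n) <= T(n) and T(n) <= 2n -- are easy to
spot-check numerically; a minimal sketch, assuming the same direct
transcription of T(n) as the recurrence above:

```python
import math

# Direct transcription of T(n); floor(2n/3) is integer division.
def T(n):
    return 2 if n <= 1 else T((2 * n) // 3) + math.log2(n)

# Spot-check lg(n) <= T(n) <= 2n for a range of n >= 1.
for n in range(1, 200):
    assert math.log2(n) <= T(n) <= 2 * n
```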
Unfortunately, we have reached the limit of what can be learned by
applying the Master Theorem-- there is no way to make the bounds any
tighter.
{ Well, that's not quite true: if we remember that lg(n) is o(n^c) for
any real number c > 0 (from calculus), then we can prove that T(n) is
O(n^c) for any real number c > 0. But that still leaves a gap between
the upper and lower bounds. }
In order to get a tight bound, we need to use something like the
repeated substitution method. We will need to use the following facts
about logarithms-- it is assumed that you are aware of these properties,
and you are allowed to make use of them in your own work. For all real
numbers x, y > 0, and all bases b > 1:
- log_b(x/y) = log_b(x) - log_b(y)
- log_b(xy) = log_b(x) + log_b(y)
- log_b(x^y) = y log_b(x)
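These identities are easy to spot-check numerically; a small Python
sketch using math.log with an explicit base (the sample values b = 3/2,
x = 7, y = 3 are arbitrary):

```python
import math

# Spot-check the three logarithm identities for b = 3/2, x = 7, y = 3.
b, x, y = 1.5, 7.0, 3.0
def log_b(v):
    return math.log(v, b)

assert math.isclose(log_b(x / y), log_b(x) - log_b(y))
assert math.isclose(log_b(x * y), log_b(x) + log_b(y))
assert math.isclose(log_b(x ** y), y * log_b(x))
```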
Repeated substitution for T(n), ignoring the floors (which do not
affect the asymptotic result):
T(n) = T(2n/3) + lg(n)
= T(2(2n/3)/3) + lg(2n/3) + lg(n)
= T((2/3)^2 n) + 2 lg(n) + lg(2/3)
= T((2/3)^3 n) + lg((2/3)^2 n) + 2 lg(n) + lg(2/3)
= T((2/3)^3 n) + 3 lg(n) + 3 lg(2/3)
= T((2/3)^4 n) + lg((2/3)^3 n) + 3 lg(n) + 3 lg(2/3)
= T((2/3)^4 n) + 4 lg(n) + 6 lg(2/3)
after i substitutions,
= T((2/3)^i n) + i lg(n) + (i (i-1) / 2) lg(2/3)
The recursive term "disappears" from the right-hand side when
(2/3)^i n = 1, i.e.,
i = log_{2/3}(1/n) = log_{3/2} n. For this value of i,
T(n) = T(1) + log_{3/2}(n) lg(n)
     + (log_{3/2}(n) (log_{3/2}(n) - 1) / 2) lg(2/3)
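The closed form can be checked against a floor-free version of the
recurrence. The sketch below picks n = (3/2)^k for an integer k, so the
recursion bottoms out at exactly 1 and log_{3/2}(n) = k; T_nofloor is a
hypothetical helper, not part of the original problem:

```python
import math

lg = math.log2

# Floor-free variant of the recurrence; exact when n is a power of 3/2.
def T_nofloor(n):
    return 2.0 if n <= 1.0 else T_nofloor(2.0 * n / 3.0) + lg(n)

k = 10
n = 1.5 ** k          # so log_{3/2}(n) = k exactly

# Closed form: T(1) + k lg(n) + (k (k - 1) / 2) lg(2/3), with T(1) = 2.
closed = 2.0 + k * lg(n) + (k * (k - 1) / 2) * lg(2.0 / 3.0)
assert math.isclose(T_nofloor(n), closed)
```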
Based on this, we expect that T(n) \in \Theta(log^2 n): writing
log_{3/2}(n) = lg(n) / lg(3/2) and lg(2/3) = -lg(3/2), the right-hand
side simplifies to lg^2(n) / (2 lg(3/2)) + lg(n)/2 + 2, whose dominant
term is a constant multiple of lg^2(n).
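The expected \Theta(log^2 n) growth can also be observed empirically:
the ratio T(n) / lg^2(n) should settle near a constant (close to
1 / (2 lg(3/2)) in the floor-free model) as n grows. A sketch, again
using a direct transcription of the original recurrence:

```python
import math

# Direct transcription of T(n), floors included.
def T(n):
    return 2 if n <= 1 else T((2 * n) // 3) + math.log2(n)

# T(2^k) / k^2 should be roughly constant for large k, consistent
# with T(n) in Theta(log^2 n).
ratios = [T(2 ** k) / k ** 2 for k in (10, 15, 20, 25)]
assert all(0.75 < r < 1.0 for r in ratios)
```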