Recurrence relations
Example: Mergesort
Simple linear recurrence
First, let's just consider merging.
…
Divide-and-Conquer
Now, let's consider mergesort, a typical divide-and-conquer algorithm.
…
What is the (worst-case) running-time of this algorithm, for input size n?
T(n) = ??
It's not clear what a closed-form answer is.
But we can easily find a recursive formulation:
T(n) = 2T(n/2) + n; T(1) = 1.
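A minimal sketch of a standard top-down mergesort (my own illustration, not code from these notes), just to show where the two terms of the recurrence come from:

    def merge_sort(xs):
        # Base case: lists of length <= 1 are already sorted.
        if len(xs) <= 1:
            return xs
        mid = len(xs) // 2
        left = merge_sort(xs[:mid])    # T(n/2)
        right = merge_sort(xs[mid:])   # T(n/2)
        return merge(left, right)      # the "+ n" term: merging is linear

    def merge(a, b):
        # Merge two sorted lists, doing about len(a) + len(b) comparisons.
        out, i, j = [], 0, 0
        while i < len(a) and j < len(b):
            if a[i] <= b[j]:
                out.append(a[i]); i += 1
            else:
                out.append(b[j]); j += 1
        return out + a[i:] + b[j:]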
[Draw out tree, assuming n = 16, say.
In blue, write the amount of work done at that node;
the total time for a node is that blue plus all the blue underneath.
Add up all the blue: observe that by rows-of-tree, it's n on each row.
There are log(n) rows, if n a perfect power of 2.
Thus, total n⋅log(n) + #leaves⋅1;
#leaves is 2^height = 2^(lg(n)) = n.
Total work: O(n log n).]
Exercise: show that this big-oh still holds for n not being a perfect
power of 2. Not too hard, if we assume that more input never
takes less
time. Then just bound T(n) with T(nearest-power-of-2-bigger-than-n)
(which is less than 2n).
Now, consider other divide-and-conquer algorithms (ignoring floor/ceiling):
-
mergesort: T_m(n) = 2⋅T_m(n/2) + Θ(n).
Answer from comp210: T_m(n) = O(n log n).
-
binary search: T_bs(n) = T_bs(n/2) + 1.
Answer from comp210: T_bs(n) = O(log n).
(Remember, these are worst-case running times.)
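For concreteness, here is a sketch of binary search on a sorted list (illustrative names, not from the notes); note the single half-size recursive call plus constant extra work, which is exactly T_bs(n) = T_bs(n/2) + 1:

    def binary_search(xs, target, lo=0, hi=None):
        # Search sorted list xs for target; return an index, or None if absent.
        if hi is None:
            hi = len(xs)
        if lo >= hi:
            return None                   # empty range: constant work
        mid = (lo + hi) // 2              # constant work: the "+ 1"
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            return binary_search(xs, target, mid + 1, hi)   # one call on ~n/2 items
        else:
            return binary_search(xs, target, lo, mid)       # one call on ~n/2 items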
Master theorem (a simplified version)
for recurrence relations arising from divide-and-conquer:
For f(n) = a⋅f(n/b) + c⋅n^d,
-
if a < b^d: f(n) is O(n^d)
-
if a = b^d: f(n) is O(n^d log n)
-
if a > b^d: f(n) is O(n^(log_b(a)))
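A tiny helper applying the three cases (my own illustration; the function name is made up), with the two recurrences above as examples:

    from math import log

    def master(a, b, d):
        # Classify f(n) = a*f(n/b) + c*n^d using the simplified master theorem.
        if a < b ** d:
            return f"O(n^{d})"
        elif a == b ** d:
            return f"O(n^{d} log n)"
        else:
            return f"O(n^{log(a, b):.3f})"   # exponent is log_b(a)

    print(master(2, 2, 1))   # mergesort:     2 = 2^1, so O(n^1 log n) = O(n log n)
    print(master(1, 2, 0))   # binary search: 1 = 2^0, so O(n^0 log n) = O(log n)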
Exponential Growth
Does this cover all types of recurrences? No; consider:
breaking an n-digit password.
Brute force:
T(n) = 10⋅T(n-1).
Clearly the closed-form answer is 10^n.
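One way to see that is to unroll the recurrence, assuming T(0) = 1 (one check once no digits remain):
    T(n) = 10⋅T(n-1) = 10^2⋅T(n-2) = … = 10^n⋅T(0) = 10^n.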
What of Fibonacci?
         { fib(n-1) + fib(n-2)   if n ≥ 2
fib(n) ≡ { 1                     if n = 1
         { 0                     if n = 0
[Consider: different initial conditions; running the process backwards.]
-
computing answer by def'n:
Hmm, lots of terms, maybe even 2^n?
-
can memoize / work forward: O(n).
-
Would be nice if we had a closed-form formula
for the answer: O(1) (???).
fib(n) = α^n/√5 - ᾱ^n/√5,
where α = (1+√5)/2, ᾱ = (1-√5)/2.
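A sketch contrasting the three options above (the helper names are mine, not from the notes):

    from functools import lru_cache
    from math import sqrt

    def fib_naive(n):
        # Straight from the definition: roughly exponentially many calls.
        return n if n < 2 else fib_naive(n - 1) + fib_naive(n - 2)

    @lru_cache(maxsize=None)
    def fib_memo(n):
        # Memoized: each value is computed once, so O(n) additions.
        return n if n < 2 else fib_memo(n - 1) + fib_memo(n - 2)

    def fib_closed(n):
        # The closed form above; floating point, so only exact for smallish n.
        alpha = (1 + sqrt(5)) / 2
        alpha_bar = (1 - sqrt(5)) / 2
        return round((alpha ** n - alpha_bar ** n) / sqrt(5))

    assert all(fib_naive(n) == fib_memo(n) == fib_closed(n) for n in range(20))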
[Got to here 2004.apr.13]
But why is this the answer / closed form?
What if we change the initial conditions?
Approach to recurrences of the form f(n)= b⋅f(n-1) + c⋅f(n-2):
-
Guess a sol'n of the form f(n) = α^n; what is α?
Well, we get α^2 - b⋅α - c = 0,
[The "characteristic polynomial"; would be cubic if we had involved f(n-3).]
We can compute the solutions to this, α = … .
-
Sure enough, can check back that these answers satisfy the recurrence,
but what about the initial conditions?
Hmm, a single α^n by itself doesn't satisfy both of them, in general.
-
Realize that sol'ns to the recurrence relation (w/o the initial conditions)
are closed under (a) sums, (b) constant-factors.
If α_1^n and α_2^n are sol'ns,
then so is k⋅α_1^n + j⋅α_2^n, for any j,k ∈ ℝ.
(That is, ``linear combo of two sol'ns is also a sol'n'',
which math people go so far as to say ``the set of solutions
form a vector space''.)
-
Hmm, these j,k we can solve for. (Do so; worked out after this list.)
-
Are we done? Are there more solutions that aren't of the
form α^n?
At this point, we ask the mathematicians, and trust them when
they give us the answer we want: yes, these are all the solutions.
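Worked out for Fibonacci, using the initial conditions fib(0) = 0 and fib(1) = 1 and the roots α = (1+√5)/2, ᾱ = (1-√5)/2:
    n = 0:  k + j = 0
    n = 1:  k⋅α + j⋅ᾱ = 1
The first gives j = -k; substituting into the second, k⋅(α - ᾱ) = k⋅√5 = 1,
so k = 1/√5 and j = -1/√5. That is exactly the closed form quoted earlier.
The same solving step can be done mechanically (a sketch, assuming sympy is available):

    from sympy import sqrt, symbols, solve

    k, j = symbols('k j')
    alpha = (1 + sqrt(5)) / 2
    alpha_bar = (1 - sqrt(5)) / 2
    # Impose fib(0) = 0 and fib(1) = 1 on k*alpha**n + j*alpha_bar**n.
    print(solve([k + j, k * alpha + j * alpha_bar - 1], [k, j]))
    # {j: -sqrt(5)/5, k: sqrt(5)/5}, i.e. k = 1/√5, j = -1/√5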
This technique generalizes:
-
Guess an exponential; for a recurrence reaching back k terms (involving f(n-1), …, f(n-k)), get a polynomial of degree k,
-
Solve for its roots α_1, …, α_k.
-
A linear combination of those α's makes up all the possible answers.
-
Caveat: if α_i occurs with multiplicity m, then you also include
n⋅α_i^n, n^2⋅α_i^n, …, n^(m-1)⋅α_i^n (m solutions in all, counting α_i^n itself).
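A standard illustration of the caveat (an added example, not from the notes): take f(n) = 2⋅f(n-1) - f(n-2). Its characteristic polynomial is α^2 - 2⋅α + 1 = (α - 1)^2, so α = 1 occurs with multiplicity 2, and the general solution is k⋅1^n + j⋅n⋅1^n = k + j⋅n, i.e. the arithmetic progressions.
(Quick check: 2⋅(k + j⋅(n-1)) - (k + j⋅(n-2)) = k + j⋅n, as required.)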