An algorithm that contains a call to itself is called a recursive algorithm.
Usually seen in "Divide and Conquer" type algorithms, the running time of recursive
algorithms can often be described by a recurrence. A recurrence is an equation
or inequality that describes a function in terms of its values on smaller inputs.
There are three methods for solving recurrences to obtain asymptotic bounds on the
recursive algorithms they describe. These methods are as follows.

The substitution method

To solve a recurrence with the substitution method, one follows these two steps:
1. Guess the form of the solution.
2. Use mathematical induction to show that the guess from step 1 works.

The iteration/recursion-tree method

In this method, one repeatedly substitutes t(n) with what it equates to until a pattern is discovered; the expansion ends with t(constant), since the input of the function gets smaller with every step.
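As a sketch of the iteration method, take the (illustrative, not from the text) recurrence t(n) = 2t(n/2) + n with t(1) = 1. Repeated substitution gives t(n) = 2^k · t(n/2^k) + k·n, and the expansion stops when n/2^k = 1, i.e. k = lg n, yielding the closed form t(n) = n·lg n + n. The check below evaluates both forms for powers of 2:

```python
def t(n):
    """Evaluate the recurrence t(n) = 2*t(n//2) + n, t(1) = 1, directly
    (n assumed to be a power of 2)."""
    if n == 1:
        return 1
    return 2 * t(n // 2) + n

def closed_form(n):
    """Closed form discovered by iterating: n*lg n + n (n a power of 2)."""
    k = n.bit_length() - 1  # lg n when n is a power of 2
    return n * k + n
```

For example, t(8) = 2·t(4) + 8 = 2·12 + 8 = 32, which matches 8·lg 8 + 8 = 32.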
Alternatively, one can draw a recursion tree, which is very similar to writing out the iterations.
Normally this method is used to produce a good guess for the substitution method, but if you are careful with the details when doing the iteration or drawing the tree, you don't need to prove your answer with substitution.

The master method

Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let t(n) be defined by the recurrence t(n) = a·t(n/b) + f(n), with t(n) = Θ(1) for sufficiently small n. Then t(n) can be bounded asymptotically as follows:
1. If f(n) = O(n^{log_b a − ε}) for some constant ε > 0, then t(n) = Θ(n^{log_b a}).
2. If f(n) = Θ(n^{log_b a}), then t(n) = Θ(n^{log_b a} lg n).
3. If f(n) = Ω(n^{log_b a + ε}) for some constant ε > 0, and if a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then t(n) = Θ(f(n)).
Note that not all recurrences can be solved using the master method. For recurrences that fit none of the above cases, you have to use either the iteration/recursion-tree method or the substitution method.
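In the common special case f(n) = Θ(n^k) with k ≥ 0, the three cases reduce to comparing k with log_b a; the regularity condition in case 3 then holds automatically, since a·f(n/b) = (a/b^k)·n^k with a/b^k < 1 exactly when k > log_b a. A small sketch of that comparison (the helper name and its string output are illustrative, not from the text):

```python
import math

def master_case(a, b, k):
    """Classify t(n) = a*t(n/b) + Θ(n^k) by the master method.

    Returns (case number, asymptotic bound as a string).
    Assumes a >= 1, b > 1, k >= 0; only handles f(n) = Θ(n^k).
    """
    log_b_a = math.log(a, b)
    if math.isclose(k, log_b_a):
        # Case 2: f(n) matches n^{log_b a}, so an extra lg n factor appears.
        return 2, f"Θ(n^{k} lg n)"
    if k < log_b_a:
        # Case 1: the leaves of the recursion tree dominate.
        return 1, f"Θ(n^{{log_{b} {a}}})"
    # Case 3: the root dominates; regularity holds since a/b**k < 1.
    return 3, f"Θ(n^{k})"
```

For merge sort (a = 2, b = 2, k = 1), log_2 2 = 1 = k, so case 2 gives Θ(n lg n); for binary search (a = 1, b = 2, k = 0), case 2 gives Θ(lg n).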
