Recursion & Recursive Algorithms in Python: Definition & Examples

Instructor: Apostolos Ntokos

Apostolos has a master's degree in Computer Science, and his academic interests are algorithms, machine learning, and game theory.

In this lesson, we will learn about recursive algorithms and their benefits and challenges. We will explain these ideas using the Divide and Conquer technique, a typical example of the recursive approach. We will see how to code recursive functions in Python, how to form recurrence relations, how to apply the Master Theorem, and how to write a function that performs a recursive search of data (binary search).

Recursive Algorithms - Divide and Conquer

A recursive algorithm is an algorithm which calls itself with a smaller problem. More generally, if a problem can be solved utilizing solutions to smaller versions of the same problem and the smaller versions reduce to easily solvable cases, then one can use a recursive algorithm to solve that problem.

When we design and implement a recursive algorithm, we use the Divide and Conquer algorithmic technique. Like greedy and dynamic programming, Divide and Conquer is one of the most important and widely used algorithmic paradigms. The general method of designing Recursive Algorithms is the following:

  • Divide: Divide the given problem into 'smaller' and 'simpler' sub-problems of the same type.
  • Conquer: Recursively solve these sub-problems; sub-problems that are small enough are solved directly, usually in constant time.
  • Combine: Combine the answers of the sub-problems to produce the solution of the initial problem (see the merge sort sketch after this list).
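
To make these three steps concrete, here is a minimal merge sort sketch. Merge sort is not covered further in this lesson; it is shown here only because it follows the Divide, Conquer, and Combine pattern very directly:

def merge_sort(A):
    # Base case: a list with 0 or 1 elements is already sorted.
    if len(A) <= 1:
        return A
    # Divide: split the list into two halves.
    mid = len(A) // 2
    # Conquer: recursively sort each half.
    left = merge_sort(A[:mid])
    right = merge_sort(A[mid:])
    # Combine: merge the two sorted halves into one sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]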

Each recursive algorithm must have:

  • Base case: A case for which the answer is known (and can be expressed without recursion).
  • General (recursive) case: The case for which the solution is expressed in terms of a smaller version of itself.

Next, we implement, as an example, a simple recursive algorithm for computing the k-th power of 2.

def kPowerOf2(k):
    if k == 0:
        # Base case: 2**0 is 1.
        return 1
    else:
        # General case: 2**k = 2 * 2**(k - 1).
        return 2 * kPowerOf2(k - 1)

print(kPowerOf2(5))

Here, the base case is the case where k is equal to 0, otherwise we are in the general case.

Each time kPowerOf2() is called, Python instantiates a data structure to represent that particular call to the function. This data structure is called a stack frame. When the function returns, the return value is left on top of the stack for the calling function to access.
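
To make the stack frames visible, here is a hypothetical variant of kPowerOf2() with an extra depth parameter and print calls added purely for illustration; it prints a line when each frame is created and another when it returns:

def kPowerOf2_traced(k, depth=0):
    indent = '  ' * depth
    print(indent + 'call kPowerOf2_traced(' + str(k) + ')')  # a new stack frame is created here
    if k == 0:
        result = 1
    else:
        result = 2 * kPowerOf2_traced(k - 1, depth + 1)
    print(indent + 'return ' + str(result))  # this frame is about to be popped off the stack
    return result

print(kPowerOf2_traced(3))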

Recurrence Relations

Recurrence relations give us a powerful tool for calculating the time complexity of recursive algorithms. It is very important to compute the time complexity of a recursive algorithm because, as we will see, it can sometimes be exponential, which means that we have to change our implementation. But how do we form a recurrence relation?
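
Before answering that, here is a quick sketch of the exponential case mentioned above. A naive recursive implementation of the Fibonacci numbers (not part of this lesson's code, used only as an illustration) calls itself twice per step, giving the recurrence T(n) = T(n-1) + T(n-2) + O(1), whose solution grows exponentially with n:

def fib(n):
    # Naive recursion: every call spawns two more calls,
    # so the number of calls grows exponentially with n.
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55

Binary search, in contrast, leads to a much better recurrence, which we form next.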

Let's use, as an example, the Binary Search algorithm. The idea of binary search is to exploit the fact that the array is sorted in order to reduce the time complexity of searching from O(n) to O(log n).

Let's implement the binary search in the Python programming language.

def binary_search(A, l, h, k):
    if h >= l:
        # Use integer division so that mid is a valid index.
        mid = l + (h - l) // 2
        if A[mid] == k:
            return 'k is found'
        elif A[mid] > k:
            # k is smaller than the middle element, so search the left half.
            return binary_search(A, l, mid - 1, k)
        else:
            # k is larger than the middle element, so search the right half.
            return binary_search(A, mid + 1, h, k)
    else:
        # The search range is empty: k is not in the array.
        return 'k is not found'

As we can see, the binary search function takes 4 parameters (A, l, h, k):

  • A: the array
  • l: the index of the leftmost element of the search range
  • h: the index of the rightmost element of the search range
  • k: the element we search for

An empty search range is one base case of the binary search algorithm: when the condition h >= l fails, there is nothing left to search, so k is not in the array. The binary search algorithm also has a second kind of base case: if we find the element we are looking for at the middle position, we are done, and there is no need for further recursion.
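
For example, assuming the binary_search() function above, a call on a small sorted list might look like this (the sample list and search keys are arbitrary):

A = [2, 5, 8, 12, 16, 23, 38, 56, 72, 91]
print(binary_search(A, 0, len(A) - 1, 23))  # 'k is found'
print(binary_search(A, 0, len(A) - 1, 40))  # 'k is not found'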

Now, let's form the recurrence relation. In each step, the algorithm:

  • compares the input element k with the middle element of the array; if the values match, the algorithm reports that k is found. O(1)
  • otherwise, if k is less than the middle element, the algorithm recurs on the left half of the array, else it recurs on the right half. So, in each step, we discard half of the array. T(n/2)

The algorithm solves the problem by recursively solving one sub-problem of size n/2. So, the recurrence relation is: T(n) = T(n/2) + O(1).

Master Theorem

In order to solve a recurrence relation we can apply the Master Theorem.

Figure 1 outlines the definition of the Master Theorem.


Figure 1: Master Theorem


The Master Theorem applies to recurrence relations of the form T(n) = aT(n/b) + f(n). Parameter a denotes the number of sub-problems, parameter b is the factor by which the input size shrinks (so each sub-problem has size n/b), and f(n) is the time taken to create the sub-problems and combine their solutions.

Now let's apply the Master Theorem to the recurrence relation of the Binary Search.

T(n) = T(n/2) + O(1), so:

  • a = 1, b = 2, so n^(log_b a) = n^(log_2 1) = n^0 = 1.
  • f(n) = O(1) = Theta(n^0) = Theta(1), which matches Theta(n^(log_b a)).

So, the Master Theorem gives T(n) = Theta(n^0 log n) = Theta(log n).

The time complexity of binary search is O(log n), which is faster than the O(n) time complexity of linear search. In this case, the Divide and Conquer technique helped us solve a problem more efficiently in terms of time complexity.
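
For contrast, here is a minimal linear search sketch (not part of the lesson's code, shown only to highlight the difference). It examines the elements one by one, so in the worst case it looks at all n of them:

def linear_search(A, k):
    # Worst case: every element is examined, hence O(n).
    for x in A:
        if x == k:
            return 'k is found'
    return 'k is not found'

print(linear_search([2, 5, 8, 12, 16], 12))  # 'k is found'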

Note: If we cannot use the Master Theorem, we can solve the recurrence relation using a recursion tree or mathematical induction.
