recursion – Determining complexity for recursive functions (Big O notation)

The time complexity, in Big O notation, for each function:

```c
int recursiveFun1(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun1(n - 1);
}
```

This function is called recursively n times before reaching the base case, so it is `O(n)`, often called linear.

```c
int recursiveFun2(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun2(n - 5);
}
```

Each call subtracts 5 from n, so the function is called about n/5 times before reaching the base case. Since `O(n/5) = O(n)`, this is still `O(n)`, i.e. linear.

```c
int recursiveFun3(int n)
{
    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun3(n / 5);
}
```

This function divides n by 5 on every recursive call, so it recurses log(n) base 5 times before reaching the base case; hence it is `O(log(n))` (base 5), often called logarithmic. Big O notation and complexity analysis most often use base 2, but the base does not matter, since logarithms with different bases differ only by a constant factor.

```c
void recursiveFun4(int n, int m, int o)
{
    if (n <= 0)
    {
        printf("%d, %d\n", m, o);
    }
    else
    {
        recursiveFun4(n - 1, m + 1, o);
        recursiveFun4(n - 1, m, o + 1);
    }
}
```

Here it is `O(2^n)`, or exponential, since each call spawns two further calls until the recursion is n levels deep.

```c
int recursiveFun5(int n)
{
    int i;
    for (i = 0; i < n; i += 2) {
        // do something
    }

    if (n <= 0)
        return 1;
    else
        return 1 + recursiveFun5(n - 5);
}
```

Here the `for` loop runs n/2 times, since `i` increases by 2 on every iteration, and the recursion runs n/5 times. Because the loop executes on every recursive call, the total work is roughly

(n/5) * (n/2) = n^2/10.

Since Big O describes the asymptotic worst-case upper bound, we are only interested in the largest term and drop the constant factor, so this is `O(n^2)`.

Good luck on your midterms 😉

For the case where `n <= 0`, `T(n) = O(1)`. Therefore, the time complexity depends on the case `n > 0`, which we consider below.

1.

```
T(n) = a + T(n - 1)
```

where a is some constant.

By induction:

```
T(n) = n * a + T(0) = n * a + b = O(n)
```

where a, b are some constants.

2.

```
T(n) = a + T(n - 5)
```

where a is some constant.

By induction:

```
T(n) = ceil(n / 5) * a + T(k) = ceil(n / 5) * a + b = O(n)
```

where a, b are some constants and k <= 0.

3.

```
T(n) = a + T(n / 5)
```

where a is some constant.

By induction:

```
T(n) = a * log5(n) + T(0) = a * log5(n) + b = O(log n)
```

where a, b are some constants.

4.

```
T(n) = a + 2 * T(n - 1)
```

where a is some constant.

By induction:

```
T(n) = a + 2a + 4a + ... + 2^(n-1) * a + T(0) * 2^n
     = a * 2^n - a + b * 2^n
     = (a + b) * 2^n - a
     = O(2^n)
```

where a, b are some constants.

5.

```
T(n) = n / 2 + T(n - 5)
```

where the n / 2 term counts the loop iterations in each call.

Rewrite `n = 5q + r` where q and r are integers and r = 0, 1, 2, 3, 4.

```
T(5q + r) = (5q + r) / 2 + T(5 * (q - 1) + r)
```

We have `q = (n - r) / 5`, and since r < 5, we can treat r as a constant, so `q = O(n)`.

By induction:

```
T(n) = T(5q + r)
     = (5q + r) / 2 + (5 * (q - 1) + r) / 2 + ... + r / 2 + T(r)
     = 5 / 2 * (q + (q - 1) + ... + 1) + 1 / 2 * (q + 1) * r + T(r)
     = 5 / 4 * (q + 1) * q + 1 / 2 * (q + 1) * r + T(r)
     = 5 / 4 * q^2 + 5 / 4 * q + 1 / 2 * q * r + 1 / 2 * r + T(r)
```

Since r < 5, T(r) is bounded, so we can find some constant b such that `b >= T(r)`:

```
T(n) = T(5q + r)
     = 5 / 4 * q^2 + (5 / 4 + 1 / 2 * r) * q + 1 / 2 * r + b
     = 5 / 4 * O(n^2) + (5 / 4 + 1 / 2 * r) * O(n) + 1 / 2 * r + b
     = O(n^2)
```

One of the best ways I find for approximating the complexity of a recursive algorithm is drawing the recursion tree. Once you have the recursion tree:

```
Complexity = length of tree from root node to leaf node * number of leaf nodes
```

1. The first function will have a length of `n` and `1` leaf node, so the complexity will be `n * 1 = n`.

2. The second function will have a length of `n/5` and again `1` leaf node, so the complexity will be `n/5 * 1 = n/5`, which approximates to `n`.

3. For the third function, since `n` is divided by 5 on every recursive call, the length of the recursion tree will be `log(n)` (base 5), and the number of leaf nodes is again `1`, so the complexity will be `log(n) (base 5) * 1 = log(n) (base 5)`.

4. For the fourth function, since every node has two child nodes, the number of leaf nodes will be `2^n` and the length of the recursion tree will be `n`, so the complexity will be `(2^n) * n`. But since `n` is insignificant compared to `2^n`, it can be ignored, and the complexity is simply `(2^n)`.

5. For the fifth function, two elements introduce the complexity: the recursive nature of the function and the `for` loop inside each call. By the calculation above, the recursion contributes `~ n` and the loop contributes `n`, so the total complexity is `n * n`.

Note: This is a quick and dirty way of calculating complexity (nothing official!). Would love to hear feedback on this. Thanks.