# Analysis of Algorithms

The usual treatment of analysis of time/space complexities starts with "how to begin the analysis" of certain categories of algorithmic constructs (loops, recursion, etc.).

We have already mentioned analyzing simple sequencing and looping.

Review this.

• Describe how much time sequential code segments take.
• Describe how much time looping code segments take.
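For review, this bookkeeping can be made concrete by counting "basic operations" instead of wall-clock time. The following sketch (class and method names invented for illustration) contrasts a constant-time sequence with a linear loop:

```java
// Hypothetical illustration: counting basic operations instead of wall time.
public class StepCount {
    // Sequential code: a fixed number of operations, so T(n) = c (constant).
    static long sequential(int n) {
        return 3;  // e.g., three assignments, regardless of n
    }

    // A simple loop: the body runs n times, so T(n) = c*n (linear).
    static long loop(int n) {
        long steps = 0;
        for (int i = 0; i < n; i++) {
            steps++;  // one basic operation per iteration
        }
        return steps;
    }

    public static void main(String[] args) {
        System.out.println(sequential(1000));  // constant: 3
        System.out.println(loop(1000));        // linear: 1000
    }
}
```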

Also clarify in your mind what the notation T(n) as the time measure signifies. Note why you might subscript T with "A" for the algorithm or "f" for the particular function computed by the algorithm, as in T_A(n).

Recursive code requires a more "sophisticated" analysis.

A typical, although deceptively intricate, example is the Fibonacci function:

```
int fib(int n)
{
    if (n == 1 || n == 2) return 1;
    else                  return fib(n-1) + fib(n-2);
}
```

After identifying the "base" case(s) and the inductive step, the following recurrence relation is realized:

```
T(n) = a                          if n == 1 or n == 2
T(n) = T(n-1) + T(n-2) + h(n)     if n > 2
```
where h(n) is the cost to perform (in this case) the additions and other "clean up" work.

Why is "h" a function of "n"? - because could depend on size of f(n-1), etc.

To simplify the problem, assume h is constant:

```
T(n) = a                          if n == 1 or n == 2
T(n) = T(n-1) + T(n-2) + b        if n > 2
```

This will be shown to be of order O(2^n).
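Before deriving this, the blow-up can be observed directly. This sketch instruments fib with a call counter (the counter is added purely for illustration):

```java
// Sketch: count the number of calls fib(n) makes, to see the exponential growth.
public class FibCalls {
    static long calls;  // total calls made so far

    static int fib(int n) {
        calls++;
        if (n == 1 || n == 2) return 1;
        else                  return fib(n - 1) + fib(n - 2);
    }

    static long countCalls(int n) {
        calls = 0;
        fib(n);
        return calls;
    }

    public static void main(String[] args) {
        // The call count satisfies C(n) = C(n-1) + C(n-2) + 1 -- the same
        // shape as T(n) -- so it grows like the Fibonacci numbers themselves.
        for (int n = 5; n <= 25; n += 5)
            System.out.println(n + ": " + countCalls(n));
    }
}
```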

As a preliminary to the complicated problem of determining the exact time complexity of the above, first solve the simpler (and slightly larger) recurrence obtained by replacing T(n-2) with T(n-1):

```
T(n) = T(n-1) + T(n-1) + b = 2*T(n-1) + b

     = 2*( 2*T(n-2) + b ) + b

     = 2^2 * T(n-2)  + (1 + 2) * b
     = 2^3 * T(n-3)  + (1 + 2 + 2^2) * b

     :

     = 2^k * T(n-k)  + (1 + 2 + ... + 2^(k-1)) * b
     = 2^k * T(n-k)  + (2^k - 1) * b
```

For k = (n-2) we have

```
T(n) = 2^(n-2)*T(2) + (2^(n-2) - 1)*b
     = 2^(n-2)*a    + (2^(n-2) - 1)*b
     = 2^(n-2)*(a + b) - b

     = O(2^n)
```
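The closed form T(n) = 2^(n-2)\*(a + b) - b can be checked numerically against the recurrence itself; the constants a and b below are arbitrary values chosen just for the test:

```java
// Numeric check of the simplified recurrence T(n) = 2*T(n-1) + b, T(2) = a,
// against the unrolled closed form T(n) = 2^(n-2)*(a + b) - b.
public class UnrollCheck {
    static final long A = 5, B = 3;  // arbitrary constants a and b

    static long tRec(int n) {        // direct evaluation of the recurrence
        if (n == 2) return A;
        return 2 * tRec(n - 1) + B;
    }

    static long tClosed(int n) {     // the claimed closed form
        return (1L << (n - 2)) * (A + B) - B;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 20; n++)
            System.out.println(n + ": " + tRec(n) + " == " + tClosed(n));
    }
}
```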

To consider the original Fibonacci problem,

```
T(n) = T(n-1) + T(n-2) + b        n > 2
```

we use the concept of "characteristic equation".

Given

```
D0*A(n) = D1*A(n-1) + D2*A(n-2) + D3*A(n-3) + ... + Dr*A(n-r)
```

Rewrite as:

```
C0*A(n) + C1*A(n-1) + C2*A(n-2) + C3*A(n-3) + ... + Cr*A(n-r) = 0
```

where C0 = D0 and Ci = -Di for i >= 1.

The characteristic equation of this is:

```
C0*x^r + C1*x^(r-1) + C2*x^(r-2) + C3*x^(r-3) + ... + Cr*x^0 = 0
```

Then, if x0 is a solution to the characteristic equation,

```
K * x0^n
```

is part of the solution to the recurrence, for an arbitrary constant K.
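As a small illustration (the recurrence and base cases here are invented for the example), consider A(n) = 3\*A(n-1) - 2\*A(n-2). Its characteristic equation x^2 - 3x + 2 = 0 has roots x = 1 and x = 2, so A(n) = K1\*1^n + K2\*2^n, with K1 and K2 fixed by the base cases:

```java
// Hypothetical example: A(n) = 3*A(n-1) - 2*A(n-2) has characteristic
// equation x^2 - 3x + 2 = 0, roots 1 and 2, so A(n) = K1*1^n + K2*2^n.
public class CharEqDemo {
    static long aRec(int n) {        // A(1) = 1, A(2) = 3, chosen arbitrarily
        if (n == 1) return 1;
        if (n == 2) return 3;
        return 3 * aRec(n - 1) - 2 * aRec(n - 2);
    }

    static long aClosed(int n) {     // K1 = -1, K2 = 1 match those base cases
        return -1 + (1L << n);       // -1*1^n + 1*2^n
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 15; n++)
            System.out.println(n + ": " + aRec(n) + " == " + aClosed(n));
    }
}
```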

Turning back to the original, first ignoring the 'b', we have

```
T(n) = T(n-1) + T(n-2)        n > 2
```

Which has coefficients

```
C0 = 1
C1 = C2 = -1
```

The characteristic equation is:

```
1*x^2 - 1*x - 1 = 0
```

Which has the solution:

```
x = ( -(-1) +/- sqrt( 1 - 4*1*(-1) ) ) / ( 2*(1) )
```

Or

```
x1 = (1+sqrt(5))/2        x2 = (1-sqrt(5))/2
```

are the "characteristic roots". Any linear combination of them is a solution to the recurrence relation:

```
K1*x1^n + K2*x2^n

  = K1*[(1+sqrt(5))/2]^n + K2*[(1-sqrt(5))/2]^n
```

Which can be seen to be exponential (i.e., the sum of two exponential functions). To find the particular solution, restore the constant value 'b':

```
T(n) - T(n-1) - T(n-2) = b
```

Assuming that the particular solution will be some form of constant, K:

```
T(n) - T(n-1) - T(n-2) = b
   K -      K -      K = b
                   - K = b
                     K = -b
```

Actually, we don't need to solve for K, since b also represents an arbitrary constant.

So finally, we have:

```
T(n) = K1*[(1+sqrt(5))/2]^n + K2*[(1-sqrt(5))/2]^n + K3
```
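As a numeric check of the dominant term, the ratio T(n+1)/T(n), computed directly from the recurrence, should approach (1+sqrt(5))/2 ≈ 1.618, since the x1 term swamps the others. The constants a and b below are chosen arbitrarily:

```java
// Sketch: the solution is dominated by the [(1+sqrt(5))/2]^n term, so the
// ratio T(n+1)/T(n) should approach the golden ratio phi ~ 1.618.
public class GoldenCheck {
    static final double A = 1, B = 1;  // arbitrary base-case and step costs

    static double t(int n) {           // T(n) = T(n-1) + T(n-2) + b
        if (n == 1 || n == 2) return A;
        return t(n - 1) + t(n - 2) + B;
    }

    public static void main(String[] args) {
        double phi = (1 + Math.sqrt(5)) / 2;
        double ratio = t(31) / t(30);
        System.out.println("phi = " + phi + ", T(31)/T(30) = " + ratio);
    }
}
```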

As a more straightforward example, consider the analysis of the time complexity of "Binary Search". A Java-esque coding looks like:

```
static int find(String x, String[] L) { return dof(x, L, 0, L.length-1); }

static int dof(String x, String[] L, int left, int right)
{
    if (left > right) return -1;
    int mid = (left + right) / 2;
    if (L[mid].equals(x)) return mid;
    else if (L[mid].compareTo(x) > 0) return dof(x, L, left, mid-1);
    else                              return dof(x, L, mid+1, right);
}
```
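A brief usage sketch (the sample data is invented); note that the array must already be in sorted order for the search to be correct:

```java
// Usage sketch of find/dof; the input array must be sorted.
public class FindDemo {
    static int find(String x, String[] L) { return dof(x, L, 0, L.length - 1); }

    static int dof(String x, String[] L, int left, int right) {
        if (left > right) return -1;            // empty range: not found
        int mid = (left + right) / 2;
        if (L[mid].equals(x)) return mid;
        else if (L[mid].compareTo(x) > 0) return dof(x, L, left, mid - 1);
        else                              return dof(x, L, mid + 1, right);
    }

    public static void main(String[] args) {
        String[] names = { "ada", "bob", "eve", "mia", "zoe" };  // sorted input
        System.out.println(find("eve", names));   // prints 2
        System.out.println(find("sam", names));   // prints -1
    }
}
```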

The analysis begins with delimiting the base and inductive parts of the definition.

```
T(n) = a                             n = 1
T(n) = b + T(n/2)                    otherwise [worst case]

T(n) = b + T(n/2)
     = b + [ b + T(n/4) ]  = 2b + T(n/4)
     = 2b + [ b + T(n/8) ] = 3b + T(n/8)

     :

     = k*b + T(n/2^k)
```

if k = log2 n then

```
= log2(n)*b + T(1) = b*log2(n) + a
```

To summarize: in the worst case, and taking n to be a power of 2, the time complexity of the function dof, and hence of the function find, behaves like the function b*log2(n) + a.

This is of order O(log n).
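The log2(n) behavior can be seen empirically. This sketch (an instrumented copy of dof over an int array, not the original code) counts the recursive calls made when the key is absent, which comes out to about log2(n) + 1:

```java
// Sketch: count the recursive calls an instrumented dof makes when the key
// is smaller than every element, and compare with log2(n) + 1.
public class DofSteps {
    static int calls;

    static int dof(int[] L, int x, int left, int right) {
        calls++;
        if (left > right) return -1;
        int mid = (left + right) / 2;
        if (L[mid] == x) return mid;
        else if (L[mid] > x) return dof(L, x, left, mid - 1);
        else                 return dof(L, x, mid + 1, right);
    }

    static int absentKeyCalls(int n) {
        int[] L = new int[n];
        for (int i = 0; i < n; i++) L[i] = 2 * i;   // sorted even numbers
        calls = 0;
        dof(L, -1, 0, n - 1);                       // -1 is never present
        return calls;
    }

    public static void main(String[] args) {
        for (int n = 1; n <= 1024; n *= 2)
            System.out.println(n + ": " + absentKeyCalls(n));
    }
}
```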

## Discussion of the above algorithm

• Consider impact of carrying along all the parameters on each recursive call.

• Describe changes that could be made to lessen this impact.

Write code to accomplish this.

• Describe considerations which must be made to make these changes "thread-safe".

Write code to accomplish this.

• Change the above algorithm into an iterative version.