Analysis Of Algorithms
Outline:
- Analysis of an algorithm
- Time complexity
- Types of time functions/Order of algorithm
- Space complexity
In this lesson, you will learn about the theoretical aspects of algorithms. In the previous lesson, you learned what an algorithm is.
Whenever you design an algorithm, you should keep two parameters in mind: time and space. Time complexity means how much time an algorithm takes to execute, whereas space complexity means how much space or memory is required for the variables and instructions used in the algorithm. Time complexity and space complexity are the two most important parameters for analysing the efficiency of an algorithm, and they should be independent of the machine.
As I discussed earlier, several different algorithms are often available for a single problem. Sometimes it becomes a complicated task to select the best algorithm in terms of time and space.
Time Complexity:
Time complexity is defined as the amount of time taken by an algorithm. It is written as a function of the length or size of the input, n, and it describes the efficiency of the algorithm in terms of the number of operations. As the number of operations or instructions increases, the computational time increases as well (of course!). Generally, the fewer the operations, the faster the algorithm. In this section, I am going to explain how to tell which algorithm is faster than another by deriving mathematical expressions for different algorithms.
Note:
Time complexity usually describes the worst-case run time. This will be explained in detail in later lessons.
Example 1:
Take the algorithm for adding two numbers.
Start
Declare variables: num1, num2, result
result = num1 + num2
Display result
Stop
Count the number of operations the computer should perform.
The total number of steps/operations = 3
Time taken by each step = 1 unit of time
Time taken by all steps = 3 units of time
Time complexity = O(3)
Here O(3) stands for the order of 3; since 3 is a constant, this is the same as constant time, O(1). Big O notation will be explained in detail in the upcoming lecture.
I assume each step takes one unit of time to execute. In real applications, this is not the case: every statement takes a different amount of time to execute, and execution time depends on many other factors as well.
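The algorithm above can be written as the minimal C sketch below (the sample values 7 and 5 are assumed for illustration). No matter what values num1 and num2 hold, the same small, fixed number of steps executes, which is why this is a constant time function.

#include <stdio.h>

int main(void) {
    int num1 = 7, num2 = 5;      /* declare variables (assumed sample values) */
    int result = num1 + num2;    /* add the two numbers: 1 step */
    printf("%d\n", result);      /* display the result: 1 step */
    return 0;
}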
Evaluate Time Complexity (Examples) | Order of Algorithm | Types of Time Functions and Order of Growth:
We write time complexity like this: O(n). This is called big O notation, which is an asymptotic notation. We will discuss it in the upcoming lecture. The common time complexity functions and their examples are given below.
Constant: (any constant number)
An example is a statement that simply adds two numbers. I have already discussed the algorithm for adding two numbers; it is an example of a constant time function.
Linear: N
An example is a single loop.
Example 1: Let's take an algorithm that uses a 'for loop'.
Start
For numbers 0 to n
Print “hello world”
End
Another way of writing the same algorithm:
Start
For (i = 0, i<n, i++)...... 'n+1’ times
Print “hello world” ……'n’ times
End
Both algorithms are the same, having the same number of instructions or operations. The second algorithm uses for loop syntax. A for loop is a high-level programming language construct (part of the programming language; its syntax may vary from one language to another), but its behaviour remains essentially the same in any programming language.
Let's begin. How many steps/operations or instructions are there in the above algorithm?
The for loop test runs 'n+1' times and the print statement repeats 'n' times.
The total number of steps/operations = (n+1) + n = 2n+1
Time taken by all steps = 2n+1 units of time
Time complexity = O(n)
I dropped the constant +1 and the constant factor 2 because the order remains the same. It means this algorithm processes the input in about n units of time.
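As a rough illustration, the same single loop might be written in C as the sketch below (n = 10 is an assumed, hard-coded input size); the loop test runs n+1 times and the print statement runs n times, so the work grows linearly with n.

#include <stdio.h>

int main(void) {
    int n = 10;                      /* assumed input size */
    for (int i = 0; i < n; i++) {    /* loop test runs n+1 times */
        printf("hello world\n");     /* body runs n times */
    }
    return 0;
}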
Example 2: Let's take an algorithm that uses a 'while loop'.
We want to investigate how many times the loop iterates, and hence the time taken by the program.
Start
Declare variables i = 0, n ... executes 1 time
While ( i < n ) ….. executes 'n+1’ times
Print “hello world” …. executes 'n' times
i++ …. executes 'n' times
End
The total number of steps/operations = 3n+2
Time taken by all steps = 3n+2 units of time
Time complexity = O(n)
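Here is a comparable C sketch of the while-loop version, again with an assumed, hard-coded n:

#include <stdio.h>

int main(void) {
    int n = 10;                      /* assumed input size */
    int i = 0;                       /* executes 1 time */
    while (i < n) {                  /* test executes n+1 times */
        printf("hello world\n");     /* executes n times */
        i++;                         /* executes n times */
    }
    return 0;
}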
Logarithmic: log N
An example is binary search (divide in half). Here the for/while loop has a logarithmic number of iterations.
Example 1:
Start
Declare variables i,n ….. executes 1 time
For (i = 1, i<n, i = i*2) ….
Print “hello world”…..executes log2 (n) times
End
I have drawn a table. With the help of this table, you can understand how the loop repeats and how the value of 'i' changes.
i            | i < n   | i = i*2                | Explanation
i = 1        | i < n   | i = 1*2 = 2            | Simple multiplication
i = 2        | i < n   | i = 2*2 = 4            | In each iteration, variable i is multiplied by 2
i = 4 = 2^2  | i < n   | i = 4*2 = 8 = 2^2 * 2  | We can generalize: i = 2^2 * 2
i = 8 = 2^3  | i < n   | i = 8*2 = 16 = 2^3 * 2 |
i = 16 = 2^4 | i < n   | i = 16*2 = 2^4 * 2     |
i = 2^k      | 2^k < n | i = 2^k * 2            | Suppose the loop repeats 'k' times; this gives the general term
We can write 2^k <= n, since the loop runs only while this condition holds.
Apply log2 on both sides:
k = log2 (n)
The loop iterates log2 (n) times.
The total number of steps/operations = log2 (n)
Time taken by all steps = log2 (n) units of time
Time complexity = O(log2 (n))
Note: most of the time log2 (n) is a fractional value. Round it up to the next whole number to get the actual number of iterations.
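To see the logarithmic growth concretely, here is a rough C sketch of the loop above with an added counter (n = 100 is an assumed sample size); for n = 100 the body runs 7 times, which matches log2 (100) rounded up.

#include <stdio.h>

int main(void) {
    int n = 100;                          /* assumed input size */
    int iterations = 0;                   /* counts how often the body runs */
    for (int i = 1; i < n; i = i * 2) {   /* i doubles each pass: 1, 2, 4, 8, ... */
        printf("hello world\n");
        iterations++;
    }
    /* For n = 100, this prints 7, i.e. log2(100) rounded up. */
    printf("body executed %d times\n", iterations);
    return 0;
}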
Example 2:
Start
Declare variables i = 1, n ... executes 1 time
While ( i < n )
Print “hello world”…. executes log2 (n) times
i = i*2 …. executes log2 (n) times
End
All calculations are the same for both loops.
The total number of steps/operations = log2 (n)
Time taken by all steps = log2 (n) units of time
Time complexity = O(log2 (n))
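A matching C sketch of the while-loop form, under the same assumption that i starts at 1 (starting at 0 would never terminate, since 0*2 stays 0):

#include <stdio.h>

int main(void) {
    int n = 100;                     /* assumed input size */
    int i = 1;                       /* must start at 1, or i*2 would stay 0 */
    while (i < n) {                  /* loop condition, checked each pass */
        printf("hello world\n");     /* runs about log2(n) times */
        i = i * 2;                   /* doubles i each pass */
    }
    return 0;
}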
Quadratic: N^2
Example is double (nested) loops.
Now we again use the 'for loop' algorithm, this time as a loop within a loop. Here is an analogy for nested loops: consider 10 boxes, and in each box there are 10 balls. How do you count them?
- First, you count the total number of boxes.
- Then you open each box and count the balls in it.
- Each box contains 10 balls.
- So 10 boxes contain 10*10 balls.
The same strategy applies to nested loops. In other words, you can say that there are two nested linear loops.
Start
For (i = 0, i<n, i++)...... 'n+1’ times
For (j = 0, j<n, j++) …… 'n' times per outer iteration
Print “hello world” ……'n*n’ times
End
The total number of steps/operations = (n+1)*n
Time taken by all steps = (n+1)*n units of time
f(n) = (n+1)*n = n^2 + n
Time complexity = O(n^2)
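A minimal C sketch of the nested loops is shown below (n = 10 is an assumed input size); the inner print statement runs n*n times, which is why the time function is quadratic.

#include <stdio.h>

int main(void) {
    int n = 10;                              /* assumed input size */
    for (int i = 0; i < n; i++) {            /* outer test runs n+1 times */
        for (int j = 0; j < n; j++) {        /* inner loop runs n times per outer pass */
            printf("hello world\n");         /* body runs n*n times in total */
        }
    }
    return 0;
}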
Space Complexity:
Space complexity is also a measure of the efficiency of an algorithm, this time in terms of memory. One algorithm is more space-efficient than another if it occupies less memory.
S(p) = C + Sp
Where
S(p) = space requirements
C = the fixed (constant) part: the space taken up by the instructions and simple variables; it does not depend on the input.
Sp = the variable part: space that depends on the instance characteristics (for example, the input size), so it varies from one input and one algorithm to another.
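As a rough illustration of the two parts of S(p), here is a C sketch that sums n numbers (the value n = 5 and the sample data are assumptions): the code and the simple variables form the fixed part C, while the n-element array is the variable part Sp that grows with the input.

#include <stdio.h>
#include <stdlib.h>

/* Fixed part C: the instructions plus the simple variables (n, i, sum, arr pointer).
   Variable part Sp: the n-element array, which grows with the input size,
   so S(p) = C + Sp is roughly a constant plus n. */
int main(void) {
    int n = 5;                              /* assumed input size */
    int *arr = malloc(n * sizeof *arr);     /* variable part: n integers */
    if (arr == NULL) return 1;

    long sum = 0;
    for (int i = 0; i < n; i++) {
        arr[i] = i + 1;                     /* assumed sample data 1..n */
        sum += arr[i];
    }
    printf("sum = %ld\n", sum);

    free(arr);
    return 0;
}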
Quiz:
1. How many important parameters are there to analyse an algorithm?
- One
- Two
- Three
2. Time complexity means the amount of time taken by an algorithm; it defines the efficiency of the algorithm.
- True
- False
3. Space complexity determines the efficiency of an algorithm in terms of ______.
- Space
- Area
- Length
- None
4. An example of a quadratic time function is ______.
- Single loop
- Nested loops
- Double loops
- None
5. A single loop is an example of a ______ time function.
- Cubic
- Quadratic
- Linear
- Non linear
6. The formula for space complexity is S(p) = C + Sp
- True
- False
7. The time complexity of the loop below is:
Start
For (i = 0, i<n, i = i+2)
Print “hello world”
End
- Linear
- Quadratic
- Cubic