
Algorithms Sample Questions | Set 3 | Time Order Analysis


Question 1: What is the asymptotic boundary of T(n)?

 T(n) =  \sum_{i=2}^{n} log_{i}n = log_{2}n + log_{3}n + \ldots + log_{n}n

  1. θ( n*log(n) )
  2. θ( n^2 )
  3. θ( n )
  4. θ( n*log2(n) )
  5. θ( n^2*log2(n) )

Answer: 3
Explanation: To find appropriate upper and lower bounds, a natural first step is to expand the sigma notation into individual terms and look for patterns. The patterns suggest acceptable upper and lower bounds, and combining those bounds leads to the solution.

The following observations help in specifying these bounds:

  • For any k greater than √n, each logkn is at most log√nn = 2, while at least lognn = 1. In mathematical terms:

    1. A hint on UPPER boundary, for k ≥ √n:

       \forall k \geq \sqrt{n},  log_{k}n \leq log_{\sqrt{n}}n = 2

       \Rightarrow \sum_{i=[\sqrt{n}]+1}^{ n } log_{i}n \leq \sum_{i=[\sqrt{n}]+1}^{ n } 2

    2. A hint on LOWER boundary, for k ≥ √n:

       \forall k \geq \sqrt{n},  log_{k}n \geq log_{n}n = 1

       \Rightarrow \sum_{i=[\sqrt{n}] + 1}^{ n } log_{i}n \geq \sum_{i=[\sqrt{n}] + 1}^{ n } 1

  • Besides that, as the base of a logarithm increases, its value decreases; so none of the terms in the expansion of the sigma can exceed the first term, log2n, nor be less than the last one, lognn. In other words:

    1. Another hint on UPPER boundary, but this time for k ≤ √n:

       \forall k \leq \sqrt{n},  log_{k}n \leq log_{2}n

       \Rightarrow  \sum_{i=2}^{  [\sqrt{n}] } log_{i}n \leq  \sum_{i=2}^{  [\sqrt{n}] } log_{2}n

    2. Another hint on LOWER boundary, but this time for k ≤ √n:

       \forall k \leq \sqrt{n},  log_{k}n \geq log_{\sqrt{n}}n = 2

       \Rightarrow \sum_{i=2}^{  [\sqrt{n}] } log_{i}n \geq  \sum_{i=2}^{  [\sqrt{n}] } 2

Following these hints gives:

  1. Upper boundary:

     \sum_{i=2}^{n} log_{i}n  =  \sum_{i=2}^{  [\sqrt{n}] } log_{i}n +  \sum_{i=[\sqrt{n}]+1}^{n} log_{i}n

     \leq  \sum_{i=2}^{  [\sqrt{n}] } log_{2}n +  \sum_{i=[\sqrt{n}]+1}^{n} 2

     =  ([\sqrt{n}] - 1) * log_{2}n +  (n - [\sqrt{n}]) * 2

     \approx  2 * n + \sqrt{n} * (log_{2}n - 2) - log_{2}n

     \Rightarrow T(n) \in O( 2 * n + \sqrt{n} * (log_{2}n - 2) - log_{2}n ) = O( n )

  2. Lower boundary:

     \sum_{i=2}^{n} log_{i}n  =  \sum_{i=2}^{  [\sqrt{n}] } log_{i}n +  \sum_{i=[\sqrt{n}]+1}^{n} log_{i}n

     \geq  \sum_{i=2}^{  [\sqrt{n}] } 2 +  \sum_{i=[\sqrt{n}]+1}^{n} 1

     =  ([\sqrt{n}] - 1) * 2 +  (n - [\sqrt{n}]) * 1

     \approx  n  + \sqrt{n} - 2

     \Rightarrow T(n) \in \Omega(n + \sqrt{n} - 2) = \Omega( n )

The two derivations show that the growth of T(n) can neither exceed O(n) nor fall below Ω(n); therefore, the asymptotic complexity order of T(n) is:

 T(n) \in \Theta(n)


Question 2: What is the running time order of the given program?

C PROGRAM: Input n of type integer
  for(i= 2; i<n; i=i+1)
   for(j = 1; j < n; j= j * i)
    // A line of code of Θ(1)
  1. θ( n )
  2. θ( n*log(n) )
  3. θ( n^2 )
  4. θ( n*log2(log(n)) )
  5. θ( n^2*log2(n) )

Answer: 1

Explanation: The running time of each line is indicated below separately:

  1. The first code line, t1(n), is:
      for(i= 2; i<n; i=i+1) // it runs (n – 2) times, so the time complexity of this line is of θ(n)
  2. The second code line, t2(n), is:
       for(j = 1; j < n; j= j * i) // log2n + log3n + … + logn-1n = Σlogin ∈ Θ( n ), according to Question 1 above
  3. The third code line, t3(n), is:
         // A code line of Θ(1) :: inside both loops, so its time order is the same as that of the previous line, Θ( n )

The total time complexity T(n) of the program is the sum of the per-line complexities ti(n), i = 1..3, as follows:

 T(n) \in \Theta( n ) + \Theta( n ) + \Theta( n ) = \Theta( n )


Question 3: The following recurrence equation T(n) is given. For how many of the proposed functions gi(n), i = 1..5, does choosing f(n) = gi(n) yield T(n) ∈ θ(f(n))?

 T(n) = 8 * T(\frac{n}{2}) + f(n)
 g_{1}(n) = n^{4}, g_{2}(n) = n^{3.01}, g_{3}(n) = n^{3},
 g_{4}(n) =  n^{3} * log(n), g_{5}(n) = n^{3} * log^{-1}(n)

  1. 1
  2. 2
  3. 3
  4. 4
  5. 5

Answer: 2
Explanation: Master theorem and its extension can be of great help to easily tackle this problem. The general form of master theorem can be expressed as:

 T(n) = a* T( \frac{n}{b}) + f(n), \forall a\geq 1, b > 1

To apply the master theorem, one must determine which of its cases the given “a”, “b”, and “f(n)” satisfy. The three cases of the master theorem and their conditions are:

  • case 1: This case applies when the recursion tree is leaf-heavy (the work to split/recombine a problem is dwarfed by the subproblems):

       \exists \epsilon>0, f(n) \in O( n^{log_{b}{a} - \epsilon} ) \Rightarrow  T(n)  \in   \theta (n^{log_{b}{a}})

  • case 2: This case occurs when the work to split/recombine a problem is comparable to the subproblems:

     f(n) \in \theta( n^{log_{b}{a}} * (log(n))^{k}) \Rightarrow  T(n)  \in \theta( n^{log_{b}{a}} *(log(n))^{k+1} )
     \forall k \geq 0

  • case 3: This case takes place when the recursion tree is root-heavy (the work to split/recombine a problem dominates the subproblems). It additionally requires the regularity condition a*f(n/b) ≤ c*f(n) for some constant c < 1:

      \exists \epsilon>0, f(n) \in \Omega( n^{log_{b}{a} + \epsilon} ) \Rightarrow  T(n)  \in   \theta (f(n))

The generalized second case of the master theorem, the so-called advanced master theorem, handles all values of k. It says:

  •  \forall k > -1,

     f(n) \in \theta( n^{log_{b}{a}} * (log(n))^{k}) \Rightarrow  T(n)  \in \theta( n^{log_{b}{a}} *(log(n))^{k+1} )

  •  k = -1,

     f(n) \in \theta( n^{log_{b}{a}} * (log(n))^{-1}) \Rightarrow  T(n)  \in \theta( n^{log_{b}{a}} *log(log(n)) )

  •  \forall k < -1,

     f(n) \in \theta( n^{log_{b}{a}} * (log(n))^{k}) \Rightarrow  T(n)  \in \theta( n^{log_{b}{a}} )

This question falls under the third case of the master theorem, where T(n) is of Θ( f(n) ). Here n^(log_b a) = n^(log_2 8) = n^3, so to have T(n) ∈ θ(f(n)) there must be a polynomial difference “epsilon” between n^3 and f(n). The functions g1(n) and g2(n) meet the conditions of the third case, with “epsilon” values of 1 and 0.01, respectively. The remaining candidates g3(n), g4(n), and g5(n) differ from n^3 only by logarithmic factors, which is not a polynomial difference; they fall under the (extended) second case, which yields T(n) with an extra logarithmic factor, i.e. T(n) ∉ Θ(f(n)).


Question 4: Which option gives a correct asymptotic analysis of this multi-input program, given prior knowledge about the relative growth of the inputs, namely m ∈ Θ(n)?

 C PROGRAM: inputs m and n of type integer 
  for(i= 1; i<= n; i=i+1)
   for(j = 1; j <= m; j= j * 2)
    for(k = 1; k <= j; k= k+1)
     // A code line of Θ(1)
  1. θ( n * m*(m+1)/2 )
  2. θ( n*m + n*log2(m) )
  3. θ( m^3 )
  4. θ( n^2 )
  5. θ( n^2*log2(n) )

Answer: 4

Explanation: To compute the time complexity of program based on inputs n and m, T(n, m), the first step is to obtain the running time of each line, ti(n, m), as indicated below:

  1. for(i= 1; i<= n; i=i+1) // It runs n times

     t_{1}(n, m) \in \Theta(n)

  2. for(j = 1; j <= m; j= j * 2) // iterates log2(m) times per pass, and it sits inside the outer loop, which runs it n times

     t_{2}(n, m) \in \Theta ( n * log(m) )

  3. for(k = 1; k <= j; k= k+1) // Across one pass of the middle loop, it runs 1 + 2 + 4 + … + 2^⌊log(m)⌋ times in total

      1 + 2 + 4 + \ldots + 2^{log(m)} = 2 * 2^{log(m)} - 1 \approx 2*m - 1

    This loop also sits inside the outermost “for” loop, which itself iterates n times:

     t_{3}(n, m) \in \Theta ( m * n )

  4. // A line of code of Θ(1) :: the same as the previous line, Θ( m*n )

     t_{4}(n, m) \in \Theta ( m * n )

The total running time order of this program is:

 T(n, m) = \sum_{i=1}^{4} t_{i}(n, m) \Rightarrow

 T(n, m) \in  \Theta(n) + \Theta(n* log(m)) + \Theta(n*m) + \Theta(n*m) = \Theta(n*m)

It can be simplified further according to the given prior knowledge that m ∈ Θ(n) (equivalently, n ∈ Θ(m)):

 T(n, m) \in \Theta(n^{2}) or \Theta(m^{2})


Question 5: There is a vector of integer numbers, called V[], which is of length “N”.
For a specific problem (program), it is given that \sum_{i=1}^{N} |V[i]| = P.
What is the time complexity of the following code snippet? [Needless to say, P is also an integer number]

Tmp = -1;
  For r= 1 to N
   For S = 1 to V[r]
    Tmp = Tmp + 20;
  1. O( N + 2*N*P )
  2. O( N * P )
  3. O( N^2 )
  4. O( P^2 )
  5. O( 2*P + N )

Answer: 5
Explanation: The number of times each line executes, and the resulting complexity, is indicated below:

Tmp = -1; // θ(1)
  For r= 1 to N // N times; so it is of θ(N)
   For S = 1 to V[r] // Runs at most |V[r]| times, so the total across all r is O( \sum_{r=1}^{N} |V[r]| ) = O(P)
    Tmp = Tmp + 20; // The same as previous line, O( P )

In order to find the time complexity of the given program, there are three facts to keep in mind:

  1. Each V[r] can take any integer value, including zero or negative ones. In a language like C, a non-positive V[r] simply means the inner loop body never executes; and in any case, the analysis here concerns the algorithm itself, not the loop semantics of a particular language. The only guaranteed information is what the question provides, so the prudent move is to work with the absolute values |V[r]| and use the O() notation. Otherwise, all that can be said for certain is that the program runs at least as long as the first loop alone, i.e. Ω(N).
  2. Although the running time of this program might not appear to require two variables, there is no further information with which to compare P and N; so the asymptotic complexity depends on the values of both, i.e. the complexity function is T(N, P) rather than T(N).
  3. The O() notation defines a looser boundary than the tight boundary specified by the θ() notation; therefore, the sum of θ() and O() terms is of O() type. In this problem, the total time complexity of the program, which is the sum of all the per-line complexities, θ(1) + θ(N) + O( P ) + O( P ), belongs to the set O( 2*P + N + 1 ) = O( 2*P + N ).

Considering all the factors mentioned above, the asymptotic complexity is T(N, P) ∈ O(2*P + N). Coefficients do not matter at the final step of asymptotic analysis, since O(2*P + N) and O(P + N) denote the same set of complexity functions; so the answer can equally be written as O(P + N).

Source:

  1. A compilation of Iran university exams (with a bit of summarization, modification, and also translation)
