## Algorithms

##### Sorting

- Bubble Sort
- Insertion Sort
- Quick Sort
- Merge Sort
- Selection Sort
- Bucket Sort

**Solved Example: 9972-01**

Let P be a quicksort program to sort numbers in ascending order using the first element as the pivot. Let $t_1$ and $t_2$ be the number of comparisons made by P for the inputs [1 2 3 4 5] and [4 1 5 3 2] respectively. Which one of the following holds?

A. $t_1$ = 5

B. $t_1$ < $t_2$

C. $t_1$ > $t_2$

D. $t_1$ = $t_2$

Correct Answer: **C**

**Solved Example: 9972-02**

Assume that a mergesort algorithm in the worst case takes 30 seconds for an input of size 64.

Which of the following most closely approximates the maximum input size of a problem that can be solved in 6 minutes?

A. 256

B. 512

C. 1024

D. 2048

Correct Answer: **B**
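The scaling works out as follows, assuming the worst-case running time is approximately $c \, n \log_2 n$ for some constant $c$:

$$30 = c \cdot 64 \log_2 64 = 384c \quad\Rightarrow\quad c = \frac{30}{384} \text{ s}$$

With a budget of 6 minutes $= 360$ s, we need $n \log_2 n = \frac{360}{c} = 4608$. Since $512 \log_2 512 = 512 \times 9 = 4608$, the answer is $n = 512$.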

**Solved Example: 9972-03**

An array of 25 distinct elements is to be sorted using quick sort. Assume that the pivot element is chosen uniformly at random. The probability that the pivot element gets placed in the worst possible location in the first round of partitioning (rounded off to 2 decimal places) is:

A. 0.02

B. 0.04

C. 0.08

D. 0.16

Correct Answer: **C**
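The reasoning: after the first round of partitioning, the pivot lands in its final sorted position, and the worst possible location is either extreme end of the array (i.e., the pivot was the smallest or the largest of the 25 distinct elements). Since the pivot is chosen uniformly at random,

$$\Pr = \frac{2}{25} = 0.08$$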

**Solved Example: 9972-04**

Consider the following array: \[23, 32, 45, 69, 72, 73, 89, 97\]. Which of the following algorithms uses the least number of comparisons (among the array elements) to sort the above array in ascending order?

A. Insertion sort

B. Selection sort

C. Quicksort using the last element as pivot

D. Merge sort

Correct Answer: **A**

##### Searching

A searching algorithm is a series of instructions that retrieves information stored within some data structure, or computed in the search space of a problem domain.

**Search algorithms:**

- Linear search
  - Step I: Start with the leftmost element of the array and compare x with each element one by one.
  - Step II: If x matches an element, return its index.
  - Step III: If x does not match any element, return -1.
- Binary search: Can be performed only on a sorted list. Uses the divide and conquer technique to search the list.
  - Step I: The search element is compared with the middle element of the list.
  - Step II: If the search item < middle element of the list, the search is restricted to the first half of the list.
  - Step III: If the search item > middle element of the list, the search is restricted to the second half of the list.
  - Step IV: If the search item = middle element of the list, the search is complete.
- Jump search
- Interpolation search
- Exponential search
- Ternary search

**Solved Example: 9240-01**

Finding the location of an element with a given value is called:

A. Traversal

B. Search

C. Sort

D. None of the options

Correct Answer: **B**

**Solved Example: 9240-02**

Consider a hash table with 100 slots. Collisions are resolved using chaining. Assuming simple uniform hashing, what is the probability that the first 3 slots are unfilled after the first 3 insertions?

A. $\dfrac{97^3}{100^3}$

B. $\dfrac{99 \times 98 \times 97}{100^3}$

C. $\dfrac{97 \times 96 \times 95}{100^3}$

D. $\dfrac{97 \times 96 \times 95}{3! \times 100^3}$

Correct Answer: **A**
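The reasoning: with simple uniform hashing, each insertion independently lands in any of the 100 slots with equal probability, so each of the 3 insertions avoids the first 3 slots with probability $\frac{97}{100}$. The three events are independent:

$$\left(\frac{97}{100}\right)^3 = \frac{97^3}{100^3}$$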

**Solved Example: 9240-03**

A _______ is an ordered collection of finite, homogeneous data elements.

A. Linked List

B. Graph

C. Tree

D. Hash Table

Correct Answer: **A**

##### Complexity

The complexity of an algorithm is a way to classify how efficient an algorithm is, compared to alternative ones. The focus is on how execution time increases with the data set to be processed.

We are usually interested in the worst-case complexity: the maximum number of operations that might be performed for a given problem size.

**Classes of Complexity**

- Constant time (C): The time necessary to perform the algorithm does not change in response to the size of the problem.
- Linear time (n): The time grows linearly with the size (n) of the problem.
- Quadratic time (n$^2$): The time grows quadratically with the size (n) of the problem.

**Worst case complexities of common Sorting algorithms:**

- Selection Sort: (n$^2$)
- Insertion Sort: (n$^2$)
- Merge Sort: (n $\log$ n)
- Quick Sort: (n$^2$)

**Asymptotic Notations**

Following are the commonly used asymptotic notations to calculate the running time complexity of an algorithm.
- O notation
- $\Omega$ notation
- $\Theta$ notation

**Solved Example: 9239-01**

There are n unsorted arrays: A$_1$, A$_2$ , ..., A$_n$. Assume that n is odd. Each of A$_1$, A$_2$, ..., A$_n$ contains n distinct elements. There are no common elements between any two arrays. The worst-case time complexity of computing the median of the medians of A$_1$, A$_2$, ..., A$_n$ is:

A. O(n)

B. O(n log n)

C. O(n$^2$)

D. $\Omega$(n$^2$ log n)

Correct Answer: **C**
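The answer follows from counting the work per array, assuming the linear-time selection algorithm is used for each median: finding the median of one unsorted array of $n$ elements takes $O(n)$, so the $n$ arrays together take

$$n \times O(n) = O(n^2),$$

and finding the median of the resulting $n$ medians adds only another $O(n)$, leaving the total at $O(n^2)$.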

**Solved Example: 9239-02**

Consider the following C function.

```c
int fun(int n) {
    int i, j;
    for (i = 1; i <= n; i++) {
        for (j = 1; j < n; j += i) {
            printf("%d %d", i, j);
        }
    }
}
```

Time complexity of fun in terms of Θ notation is

A. $\Theta (n \sqrt{n})$

B. $\Theta (n^2)$

C. $\Theta (n \log n)$

D. $\Theta (n^2 log n)$

Correct Answer: **C**
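The answer can be derived by counting inner-loop iterations: for a fixed $i$, the inner loop runs about $n/i$ times, so the total work over all values of $i$ is the harmonic sum

$$\sum_{i=1}^{n} \frac{n}{i} = n \sum_{i=1}^{n} \frac{1}{i} = n \cdot H_n = \Theta(n \log n).$$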

**Solved Example: 9239-03**

Let $f(n) = n$ and $g(n) = n^{(1 + \sin n)}$, where n is a positive integer. Which of the following statements is/are correct?

I. $f(n) = O(g(n))$

II. $f(n) = \Omega(g(n))$

A. Only I

B. Only II

C. Both I and II

D. Neither I nor II

Correct Answer: **D**

**Solved Example: 9239-04**

What is the complexity of the following code?

```c
sum = 0;
for (i = 1; i <= n; i *= 2)
    for (j = 1; j <= n; j++)
        sum++;
```

A. O(n$^2$)

B. O(n log n)

C. O(n)

D. O(n log n log n)

Correct Answer: **B**

##### Big-O

**Solved Example: 9152-01**

What is the space complexity of insertion sort? (Consider input size and additional storage)

A. O(n)

B. O(n$^2$)

C. O(n log n)

D. O(1)

Correct Answer: **A**