
Quiz 2018/10/26 (First Midterm Exam)


Algorithms First Examination

(2018/10/26)

(Note that if you design an algorithm, you must give pseudocode to represent it. You may
add comments after your pseudocode to clarify your presentation.)

1. Chapter 2 (10%). We can express insertion sort as a recursive procedure as follows. In order
to sort A[1 .. n], we recursively sort A[1 .. n−1] and then insert A[n] into the sorted array A[1 ..
n−1]. Write recursive pseudocode to implement insertion sort. Write a recurrence for
the worst-case running time of this recursive version of insertion sort and give the time
complexity of the recursive algorithm.

Solution:
(5%) Recursive pseudocode:
INSERTION-SORT(A, n)
    if n > 1
        INSERTION-SORT(A, n − 1)
        key ← A[n]
        i ← n − 1
        while i > 0 and A[i] > key
            A[i + 1] ← A[i]
            i ← i − 1
        A[i + 1] ← key
(3%) Recurrence for the worst-case running time (the input is sorted in decreasing order):
T(n) = Θ(1)              if n = 1
T(n) = T(n − 1) + O(n)   if n > 1
(2%) Time complexity:
T(n) = Θ(n²)
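As a concrete check, the recursive pseudocode above can be sketched in Python (0-indexed; the function name and in-place style are illustrative choices, not part of the exam solution):

```python
def insertion_sort(A, n=None):
    """Recursively sort A[0..n-1]: sort the first n-1 elements, then insert A[n-1]."""
    if n is None:
        n = len(A)
    if n > 1:
        insertion_sort(A, n - 1)      # T(n-1): sort the prefix
        key = A[n - 1]
        i = n - 2
        while i >= 0 and A[i] > key:  # O(n) worst case: shift larger elements right
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key                # insert A[n-1] into the sorted prefix
    return A
```

On a decreasing input the while loop shifts all n − 1 prefix elements, which is exactly where the O(n) term of the recurrence comes from.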

2. Chapter 3 (5%). Let f(n) and g(n) be asymptotically positive functions. Prove or
disprove the following conjecture: f(n) = O(g(n)) ⟹ g(n) = Θ(f(n)).

Answer:
This conjecture is false. We give a counterexample as follows.
Let f(n) = n and g(n) = n². Then n = O(n²). However, n² ≠ Θ(n).
So, f(n) = O(g(n)) ⇏ g(n) = Θ(f(n)).
Grading criteria:
A correct counterexample, or a correct proof from the definition with a correct conclusion: 5 points.
A wrong counterexample: 0 points.
Proof from the definition: −2 points for each error.
Proof from the definition with a conceptual error: 0 points.

3. Chapter 3 (5%). Prove or disprove the following: Is 2^(n+1) = O(2^n)? Is 2^(2n) = O(2^n)?

Answer:
Claim: 2^(n+1) = O(2^n).
We must find a positive constant c such that 2^(n+1) ≤ c · 2^n for all sufficiently large n.
Since 2^(n+1) = 2 · 2^n, any constant c ≥ 2 gives 2 · 2^n ≤ c · 2^n.
Therefore, 2^(n+1) = O(2^n).

Claim: 2^(2n) ≠ O(2^n).
Suppose, for contradiction, that 2^(2n) = O(2^n).
Then there exists a positive constant c such that 2^(2n) ≤ c · 2^n for all sufficiently large n.
Dividing both sides by 2^n, we get 2^n ≤ c.
But 2^n is not bounded by any constant c, a contradiction.
Therefore, 2^(2n) ≠ O(2^n).

Grading criteria:
−1 point for each error.
Conceptual error: −2 points on the 2^(n+1) = O(2^n) part, −3 points on the 2^(2n) = O(2^n) part.
At most 2 points deducted on the 2^(n+1) = O(2^n) part and at most 3 points on the 2^(2n) = O(2^n) part.

4. (10%). Write a recursive algorithm to implement the Tower of Hanoi game. The Tower of
Hanoi is a very famous game. In this game, there are three pegs and N disks stacked
one over the other in decreasing size. The objective of the game is to move the disks, one at a
time, from the first peg to the last peg, subject to one condition: a bigger disk may never be
placed on top of a smaller disk. What is the time complexity of your algorithm?

Ans:

Tower-of-Hanoi:
MoveTower(disk, source, dest, aux):
    IF disk == 1 THEN:
        move disk from source to dest
    ELSE:
        MoveTower(disk − 1, source, aux, dest)
        move disk from source to dest
        MoveTower(disk − 1, aux, dest, source)
    END IF

T(n) = 2T(n − 1) + 1
T(n) = 2(2T(n − 2) + 1) + 1 = 2²T(n − 2) + 2 + 1
T(n) = 2^k T(n − k) + 2^(k−1) + 2^(k−2) + … + 1
Taking k = n − 1: T(n) = 2^(n−1) T(1) + 2^(n−1) − 1, so T(n) = O(2^n)
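A minimal Python sketch of the pseudocode above; it records the moves in a list so the 2^n − 1 move count can be checked (the function and parameter names follow the pseudocode):

```python
def move_tower(disk, source, dest, aux, moves):
    """Move `disk` disks from `source` to `dest` using `aux`; append each move to `moves`."""
    if disk == 1:
        moves.append((source, dest))
    else:
        move_tower(disk - 1, source, aux, dest, moves)  # clear the top disk-1 disks onto aux
        moves.append((source, dest))                    # move the largest disk
        move_tower(disk - 1, aux, dest, source, moves)  # put the disk-1 disks back on top
    return moves
```

Usage: move_tower(3, 'A', 'C', 'B', []) returns a list of 2³ − 1 = 7 moves, matching the recurrence T(n) = 2T(n − 1) + 1.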
Pseudocode: 5%
Analysis of time complexity: 5%

5. Chapter 4 (5%). Find an asymptotic solution of the following recurrence. Express your
answer using Θ-notation, and give a brief justification. 𝑇(𝑛) = 𝑙𝑜𝑔 𝑛 + 𝑇(√𝑛)

Solution (other methods are also acceptable):
(Only the answer: 0%) T(n) = Θ(log n)
(5%)
(1) Expanding T(n) by repeatedly substituting the recurrence into itself, we get:

T(n) = log n + log √n + log √√n + log √√√n + …
     = log n + (1/2) log n + (1/4) log n + (1/8) log n + …
     = log n · (1 + 1/2 + 1/4 + 1/8 + …)
     = Θ(log n)
(2) Change of variables:
Let n = 2^m, so log n = log 2^m = m, and define S(m) = T(2^m) = T(n).
Then T(√n) = T(2^(m/2)) = S(m/2), and the recurrence becomes
S(m) = S(m/2) + m.
By the master theorem (case 3), S(m) = Θ(m).
Hence T(n) = T(2^m) = S(m) = Θ(m) = Θ(log n).
6. Chapter 4 (5%). Use a recursion tree to give an asymptotically tight solution to the
recurrence T(n) = T(n − a) + T(a) + cn, where a ≥ 1 and c > 0 are constants.

Ans:

The recursion tree is a path of depth about n/a: the node at level i costs c(n − ia), and each level also spawns a T(a) leaf of cost Θ(a), accounted for by the (n/a)·ca term. Summing over all levels:

T(n) = Σ_{i=0}^{n/a} c(n − ia) + (n/a)·ca
     = Σ_{i=0}^{n/a} cn − Σ_{i=0}^{n/a} cia + (n/a)·ca
     = cn²/a − (cn²/2a + cn/2) + cn
     = Θ(n²)

Recursion tree: 2%
Analysis of time complexity: 3%

7. Chapter 6 (10%). What is the running time of HEAPSORT on an array A of length n that is
already sorted in increasing order? What about decreasing order?

Solution:
(5%) If A is sorted in increasing order, Build-Max-Heap does its maximum work, since it must
reorder the array toward decreasing order: each call Max-Heapify(A, i) may sift the element
all the way down, taking O(log n) time, so Build-Max-Heap takes O(n log n) time by this
bound (a tighter analysis shows it is Θ(n) even in this case). Each of the n − 1 calls
Max-Heapify(A, 1) in the sorting phase also takes at most O(log n) time, hence the
running time of Heapsort is O(n log n).
(5%) If A is sorted in decreasing order, it is already a max-heap, so Max-Heapify(A, i) runs in
O(1) time for every i. Therefore Build-Max-Heap takes Θ(n) time for its ⌊n/2⌋ calls to
Max-Heapify. However, each of the n − 1 calls Max-Heapify(A, 1) in the sorting phase
still takes at most O(log n) time. Hence the running time of Heapsort is again O(n log n).

8. Chapter 7 (5%). Compare the quick-sort and merge-sort algorithms in terms of their time
and space complexity. Which is better in terms of time complexity? Which is better in terms
of space complexity?

Ans:

Quicksort runs in O(n²) time in the worst case and O(n lg n) time in the average case.
The space used by quicksort, however, depends on the version: the in-place version needs
only the recursion stack, i.e., O(lg n) auxiliary space on average (O(n) in the worst case).

On the other hand, since the recurrence for merge sort is T(n) = 2T(n/2) + n, its time
complexity is O(n lg n); its space complexity is O(n), because the merge step has to store
the elements in a temporary array.

We can conclude that in-place quicksort uses less space than merge sort, while the
not-in-place version uses asymptotically the same space as merge sort. As for time,
quicksort matches merge sort in the average case and is worse (O(n²)) in the worst case.

Nevertheless, quicksort is typically much faster in practice than other Θ(n log n)
algorithms, because its inner loop can be implemented efficiently on most architectures,
and on most real-world data, design choices can make the probability of requiring
quadratic time very small.

Time complexity: quick-sort (1%), merge-sort (1%)


Space complexity: quick-sort(1%), merge-sort (1%)
Conclusion: 1%

9. Chapter 7 (10%). Please implement an in-place partition algorithm for Quicksort. For an
array A[p .. r], the in-place partition algorithm returns an index q such that each element of
A[p .. q-1] is less than or equal to A[q] and each element of A[q+1 .. r] is greater than A[q].

Answer
function PARTITION(A, p, r)
    pivotValue := A[r]                 // use the rightmost element as the pivot
    storeIndex := p
    for i from p to r − 1
        if A[i] ≤ pivotValue
            swap(A[storeIndex], A[i])  // move values no larger than the pivot
                                       // to the front part of the subarray
            storeIndex := storeIndex + 1
    swap(A[r], A[storeIndex])          // move the pivot to the boundary, so that
                                       // A[p .. storeIndex − 1] ≤ pivotValue
                                       // and A[storeIndex + 1 .. r] > pivotValue
    q := storeIndex
    return q
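The pseudocode above translates directly to Python (0-indexed, so p and r are the first and last indices of the subarray; this is the standard Lomuto partition scheme):

```python
def partition(A, p, r):
    """Partition A[p..r] in place around the pivot A[r]; return the pivot's final index q."""
    pivot_value = A[r]
    store_index = p
    for i in range(p, r):
        if A[i] <= pivot_value:
            # Move values no larger than the pivot to the front of the subarray.
            A[store_index], A[i] = A[i], A[store_index]
            store_index += 1
    # Put the pivot on the boundary between the two regions.
    A[r], A[store_index] = A[store_index], A[r]
    return store_index
```

Usage: q = partition(A, 0, len(A) - 1) leaves every element of A[0..q−1] ≤ A[q] and every element of A[q+1..] > A[q], using only O(1) extra space.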

Grading criteria:
Correct concept and completely correct pseudocode: 10 points.
No notion of in-place partitioning: 0 points.
−2 points for each wrong index.
−2 points for each extra or missing line of code.

10. Chapter 8 (10%). It is known that Ω(n log n) is a lower bound for the sorting problem. However,
we have seen algorithms like counting sort and radix sort which can sort n items in O(n) time.
Is there a contradiction? If not, why not? Explain.

Ans:

There is no contradiction.
Comparison sorts such as insertion sort, merge sort, heapsort, and quicksort determine the
sorted order only by comparing elements, so any such algorithm needs Ω(n log n) comparisons
in the worst case.

Other sorting algorithms such as counting sort, radix sort, and bucket sort determine the
sorted order by counting rather than by comparing. Moreover, the inputs of these algorithms
must satisfy certain restrictions.

For counting sort:
Input: array A[1..n] of n integers, each with a value in [0, k].
Running time: O(n + k).
If k = O(n), the running time is linear; k = O(n) is the input restriction.

For radix sort:
Input: array A[1..n] of n integers, each with d digits, where each digit has a value in [0, k].
Running time: O(d(n + k)).
If d = O(1) and k = O(n), the running time is linear; d = O(1) and k = O(n) is the input
restriction.

For bucket sort:
Input: array A[1..n] of n elements, each drawn uniformly at random from the interval
[0, 1) — this is the input restriction.
Running time: Θ(n + X), where X is the total number of comparisons made inside the buckets.
Worst-case running time: Θ(n²).
Average-case running time: Θ(n).
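As a concrete illustration of the counting-sort case above, a minimal Python sketch (values assumed to be integers in [0, k]; the function name is illustrative):

```python
def counting_sort(A, k):
    """Stable counting sort of integers in [0, k]; runs in O(n + k) time."""
    count = [0] * (k + 1)
    for x in A:                    # count occurrences of each value
        count[x] += 1
    for v in range(1, k + 1):      # prefix sums: count[v] = number of elements <= v
        count[v] += count[v - 1]
    out = [0] * len(A)
    for x in reversed(A):          # traverse backwards to keep the sort stable
        count[x] -= 1
        out[count[x]] = x
    return out
```

With k = O(n) this runs in linear time, matching the input restriction stated above; stability is what lets radix sort use it as a subroutine.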

Grading criteria:
No statement of why there is no contradiction: no points.
Stating only one of the two points (counting/radix/bucket sort order elements by counting, OR the inputs of these sorts are restricted): 6 points.
Stating both that they sort by counting and that their inputs are restricted: 10 points.

11. Chapter 8 (10%). Show how to sort n integers in the range 0 to n³ − 1 in O(n) time.

Solution:
Treat each number as a 3-digit number in radix n; each digit ranges from 0 to n − 1.
Sort these 3-digit numbers with radix sort. There are 3 calls to counting sort, each
taking O(n + n) = O(n) time, so the total time is O(n).
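The solution above can be sketched in Python. Each pass below distributes the numbers into n buckets by one base-n digit, playing the role of a stable counting sort (the function name is illustrative):

```python
def sort_cubed_range(A):
    """Sort n integers in [0, n**3 - 1] in O(n) time: radix sort on 3 base-n digits."""
    n = len(A)
    if n <= 1:
        return list(A)
    for d in range(3):                            # least-significant digit first
        buckets = [[] for _ in range(n)]
        for x in A:
            buckets[(x // n ** d) % n].append(x)  # stable distribution by digit d
        A = [x for bucket in buckets for x in bucket]
    return A
```

Each of the 3 passes touches every element once and scans n buckets, so the total time is O(3 · n) = O(n).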

12. Chapter 9 (10%). Your uncle Peter owns 25 horses, and he wants to find the three
fastest horses among them. Unfortunately, Peter does not have any timing device. All he
has is a racecourse that allows five horses to race at a time. In particular, if any five
horses are placed on the racecourse for a trial, we learn the relative speed of these five
horses. You want to help him minimize the number of trials needed to determine the best
three horses. Can you design a racing schedule that requires at most 8 trials? How about
7 trials?

Ans:

8 trials (stating an 8-trial schedule: 5 points):
1. Divide the 25 horses into five groups of five.
2. Race each group once (trials 1–5); this ranks the horses within each group.
3. Race the five group winners (trial 6). The winner of this trial is the fastest horse
   overall; remove it from its group.
4. For trial 7, race the remaining four horses of trial 6 together with the 2nd-fastest
   horse from the overall winner's group. The winner of this trial is the 2nd-fastest
   horse overall; remove it from its group.
5. Repeat step 4 as trial 8 to obtain the 3rd-fastest horse overall. Done.

7 trials (stating a 7-trial schedule: 10 points):
1. Divide the 25 horses into five groups of five.
2. Race each group once (trials 1–5).
3. Race the five group winners (trial 6). The winner of this trial is the fastest horse
   overall. The horses that finished 4th and 5th in trial 6 can be eliminated, along with
   their entire groups, since at least three horses are faster than each of them.
4. Only five candidates remain for 2nd and 3rd place: the 2nd- and 3rd-place horses of
   trial 6, the 2nd- and 3rd-fastest horses from the overall winner's group, and the
   2nd-fastest horse from the group of trial 6's runner-up. Race these five (trial 7);
   the top two finishers are the 2nd- and 3rd-fastest horses overall. Done.

13. Chapter 9 (10%). In this problem, we want to find the k numbers closest to the
median. However, we are not required to sort the k numbers when we output them. For
example, suppose the array A is A = ⟨2.3, 0.1, 2.1, 1.6, 0.5, 3.7, 2.2⟩. Then the
median of the array is 2.1, and the 3 numbers closest to the median are 2.1, 2.2, and
2.3. Design an O(n)-time algorithm to accomplish this task. Note that your algorithm
must work for any value of k.

Ans:

1. Let m = SELECT(A, ⌈n/2⌉).  // the median of A, O(n)
2. Compute B[i] = |A[i] − m| for each element of A.  // O(n)
3. Let x = SELECT(B, k + 1).  // O(n)
4. Using the partition procedure of quicksort with pivot x on B, collect the elements of
   A whose distance to m is smaller than x; these are the k numbers closest to the
   median.  // O(n)
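The four steps above can be sketched in Python. A randomized quickselect stands in for the worst-case linear-time SELECT the solution assumes, and step 3 here takes the k-th smallest distance with a <= test instead of the (k+1)-th with a strict <, which selects the same elements when the distances are distinct:

```python
import random

def select(A, k):
    """Return the k-th smallest element of A (1-indexed); expected O(n) quickselect."""
    pivot = random.choice(A)
    lo = [x for x in A if x < pivot]
    eq = [x for x in A if x == pivot]
    if k <= len(lo):
        return select(lo, k)
    if k <= len(lo) + len(eq):
        return pivot
    return select([x for x in A if x > pivot], k - len(lo) - len(eq))

def k_closest_to_median(A, k):
    m = select(A, (len(A) + 1) // 2)              # step 1: the median
    dist = [abs(a - m) for a in A]                # step 2: distance of each element to m
    x = select(dist, k)                           # step 3: k-th smallest distance
    out = [a for a, d in zip(A, dist) if d <= x]  # step 4: keep distances <= x
    return out[:k]                                # trim ties on the boundary distance
```

Usage: k_closest_to_median([2.3, 0.1, 2.1, 1.6, 0.5, 3.7, 2.2], 3) returns the elements {2.1, 2.2, 2.3} in input order; the output is not sorted, as the problem allows.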

Grading criteria:
Step 1 only: 3 points.
Steps 1 and 2: 6 points.
Fully correct: 10 points.
Missing time-complexity analysis: −2 points (no deduction if the answer is entirely wrong).
