# Algorithm Design Examples

These notes introduce several classic divide-and-conquer algorithms. Binary search is a highly efficient search method, but the data being searched must be sorted (for example, in non-decreasing order). Merge sort and quicksort are also covered; quicksort is an improvement over bubble sort.

1. Divide and Conquer
   - A general paradigm for algorithm design; inspired by emperors and colonizers.
   - Three-step process:
     1. Divide the problem into smaller problems.
     2. Conquer by solving these problems.
     3. Combine these results together.
   - Examples: binary search, merge sort, quicksort, matrix multiplication, selection, convex hulls.

   Subhash Suri, UC Santa Barbara

2. Binary Search
   - Search for x in a sorted array A.

     Binary-Search(A, p, q, x)
     1. if p > q return -1
     2. r = (p + q) / 2
     3. if x = A[r] return r
     4. else if x < A[r] return Binary-Search(A, p, r - 1, x)
     5. else return Binary-Search(A, r + 1, q, x)

   - The initial call is Binary-Search(A, 1, n, x).
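As a sketch, the pseudocode above translates to Python almost line for line. The only change is 0-based indexing, so the initial call becomes `binary_search(A, 0, len(A) - 1, x)`:

```python
def binary_search(a, p, q, x):
    """Return an index r with a[r] == x in the sorted slice a[p..q]
    (inclusive, 0-based), or -1 if x is absent."""
    if p > q:
        return -1
    r = (p + q) // 2
    if x == a[r]:
        return r
    elif x < a[r]:
        return binary_search(a, p, r - 1, x)   # recurse on left half
    else:
        return binary_search(a, r + 1, q, x)   # recurse on right half
```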

3. Binary Search
   - Let T(n) denote the worst-case time to binary search in an array of length n.
   - The recurrence is T(n) = T(n/2) + O(1).
   - This solves to T(n) = O(log n).

4. Merge Sort
   - Sort an unordered array of numbers A.

     Merge-Sort(A, p, q)
     1. if p ≥ q return A
     2. r = (p + q) / 2
     3. Merge-Sort(A, p, r)
     4. Merge-Sort(A, r + 1, q)
     5. Merge(A, p, r, q)

   - The initial call is Merge-Sort(A, 1, n).
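A minimal Python sketch of the same recursion (0-based indices). The slide does not spell out the Merge step, so the `merge` helper below is a standard linear-time reconstruction, merging the two sorted runs A[p..r] and A[r+1..q]:

```python
def merge_sort(a, p, q):
    """Sort a[p..q] (inclusive, 0-based) in place."""
    if p >= q:
        return
    r = (p + q) // 2
    merge_sort(a, p, r)
    merge_sort(a, r + 1, q)
    merge(a, p, r, q)

def merge(a, p, r, q):
    """Merge the sorted runs a[p..r] and a[r+1..q] in O(q - p) time."""
    left, right = a[p:r + 1], a[r + 1:q + 1]
    i = j = 0
    for k in range(p, q + 1):
        # take from the left run while it has the smaller front element
        if j >= len(right) or (i < len(left) and left[i] <= right[j]):
            a[k] = left[i]; i += 1
        else:
            a[k] = right[j]; j += 1
```

The initial call is `merge_sort(A, 0, len(A) - 1)`.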

5. Merge Sort
   - Let T(n) denote the worst-case time to merge sort an array of length n.
   - The recurrence is T(n) = 2T(n/2) + O(n).
   - This solves to T(n) = O(n log n).

6. Merge Sort: Illustration

   Divide:  5 2 4 6 1 3 2 6
            5 2 4 6 | 1 3 2 6
            5 2 | 4 6 | 1 3 | 2 6
            5 | 2 | 4 | 6 | 1 | 3 | 2 | 6
   Merge:   2 5 | 4 6 | 1 3 | 2 6
            2 4 5 6 | 1 2 3 6
            1 2 2 3 4 5 6 6

7. Multiplying Numbers
   - We want to multiply two n-bit numbers. Cost is the number of elementary bit steps.
   - The grade school method has Θ(n²) cost: each bit of one number is multiplied against every bit of the other, producing n partial products that are shifted and summed.
   - n² multiplies, n²/2 additions, plus some carries.
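For concreteness, here is a sketch of the grade school method on base-10 digit lists (decimal rather than binary, but the Θ(n²) digit-operation count is the same). The name `grade_school_multiply` and the least-significant-digit-first representation are choices made for this illustration:

```python
def grade_school_multiply(x_digits, y_digits):
    """Multiply two numbers given as lists of base-10 digits,
    least-significant digit first; Θ(n²) digit operations."""
    n, m = len(x_digits), len(y_digits)
    result = [0] * (n + m)
    for i, xd in enumerate(x_digits):
        carry = 0
        for j, yd in enumerate(y_digits):          # one row of partial products
            total = result[i + j] + xd * yd + carry
            result[i + j] = total % 10
            carry = total // 10
        result[i + m] += carry                     # leftover carry of this row
    return result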

8. Why Bother?
   - Doesn't hardware provide multiply? It is fast, optimized, and free. So, why bother?
   - True for numbers that fit in one computer word. But what if numbers are very large?
   - Cryptography (encryption, digital signatures) uses big-number "keys," typically 256 to 1024 bits long!
   - n² multiplication is too slow for such large numbers.
   - Karatsuba's (1962) divide-and-conquer scheme multiplies two n-bit numbers in O(n^1.59) steps.

9. Karatsuba's Algorithm
   - Let X and Y be two n-bit numbers. Write

       X = ab    Y = cd

     where a, b, c, d are n/2-bit numbers. (Assume n = 2^k.)
   - Then

       XY = (a·2^(n/2) + b)(c·2^(n/2) + d)
          = ac·2^n + (ad + bc)·2^(n/2) + bd

10. An Example
   - X = 4729, Y = 1326. (Here the split is in base 10, so 10^(n/2) plays the role of 2^(n/2).)
   - a = 47, b = 29; c = 13, d = 26.
   - ac = 47 × 13 = 611
   - ad = 47 × 26 = 1222
   - bc = 29 × 13 = 377
   - bd = 29 × 26 = 754
   - XY = ac·10^4 + (ad + bc)·10^2 + bd = 6110000 + 159900 + 754
   - XY = 6270654

11. Karatsuba's Algorithm
   - This is D&C: solve 4 subproblems, each of size n/2; then perform O(n) shifts to multiply the terms by 2^n and 2^(n/2).
   - We can write the recurrence as T(n) = 4T(n/2) + O(n).
   - But this solves to T(n) = O(n²)!

12. Karatsuba's Algorithm
   - XY = ac·2^n + (ad + bc)·2^(n/2) + bd.
   - Note that (a − b)(c − d) = (ac + bd) − (ad + bc).
   - Solve 3 subproblems: ac, bd, (a − b)(c − d).
   - We can get all the terms needed for XY by addition and subtraction!
   - The recurrence for this algorithm is T(n) = 3T(n/2) + O(n) = O(n^(log_2 3)).
   - The complexity is O(n^(log_2 3)) ≈ O(n^1.59).
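A runnable sketch of the three-multiplication scheme, with two deviations from the slides labeled up front: it splits in base 10 on Python integers rather than on bits, and it uses the equivalent identity (a + b)(c + d) = ac + bd + (ad + bc) instead of (a − b)(c − d), which avoids negative intermediate values:

```python
def karatsuba(x, y):
    """Multiply nonnegative integers using 3 half-size products
    (Karatsuba's O(n^1.59) scheme, base-10 split for readability)."""
    if x < 10 or y < 10:
        return x * y                      # single-digit base case
    half = max(len(str(x)), len(str(y))) // 2
    base = 10 ** half
    a, b = divmod(x, base)                # x = a*base + b
    c, d = divmod(y, base)                # y = c*base + d
    ac = karatsuba(a, c)
    bd = karatsuba(b, d)
    # (a+b)(c+d) = ac + bd + (ad+bc), so one extra product recovers ad+bc
    ad_plus_bc = karatsuba(a + b, c + d) - ac - bd
    return ac * base * base + ad_plus_bc * base + bd
```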

13. Recurrence Solving: Review
   - T(n) = 2T(n/2) + cn, with T(1) = 1.
   - By term expansion:

       T(n) = 2T(n/2) + cn
            = 2(2T(n/2²) + cn/2) + cn
            = 2²T(n/2²) + 2cn
            = 2²(2T(n/2³) + cn/2²) + 2cn
            = 2³T(n/2³) + 3cn
            ...
            = 2^i T(n/2^i) + icn

   - Set i = log_2 n. Use T(1) = 1.
   - We get T(n) = n + cn log n = O(n log n).
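The expansion can be sanity-checked numerically by evaluating the recurrence directly and comparing against the closed form T(n) = n + cn log₂ n at powers of two (the helper `t` below is just for this check, not part of the slides):

```python
from math import log2

def t(n, c):
    """Evaluate T(n) = 2T(n/2) + cn with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * t(n // 2, c) + c * n

# closed form from the term expansion: T(n) = n + c*n*log2(n)
for n in (2, 8, 64, 1024):
    assert t(n, 3) == n + 3 * n * log2(n)
```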

14. The Tree View
   - T(n) = 2T(n/2) + cn, with T(1) = 1.

     Level 0:  cn                         total: cn
     Level 1:  cn/2, cn/2                 total: 2(cn/2) = cn
     Level 2:  4 nodes of cn/4            total: 4(cn/4) = cn
     Level 3:  8 nodes of cn/8            total: 8(cn/8) = cn
     ...

   - # leaves = n; # levels = log n.
   - Work per level is O(n), so the total is O(n log n).

15. Solving By Induction
   - Recurrence: T(n) = 2T(n/2) + cn.
   - Base case: T(1) = 1.
   - Claim: T(n) = cn log n + cn.

       T(n) = 2T(n/2) + cn
            = 2(c(n/2) log(n/2) + c(n/2)) + cn
            = cn(log n − 1 + 1) + cn
            = cn log n + cn

16. More Examples
   - T(n) = 4T(n/2) + cn, T(1) = 1.

     | Level | Subproblem size | Work                  |
     |-------|-----------------|-----------------------|
     | 0     | n               | cn                    |
     | 1     | n/2             | 4(cn/2) = 2cn         |
     | 2     | n/4             | 16(cn/4) = 4cn        |
     | 3     | n/8             | 64(cn/8) = 8cn        |
     | i     | n/2^i           | 4^i(cn/2^i) = 2^i cn  |

17. More Examples
   - The work at level i is 4^i(cn/2^i) = 2^i cn.
   - The expansion stops when n/2^i = 1, i.e., i = log n.
   - The recurrence solves to T(n) = O(n²).

18. By Term Expansion

       T(n) = 4T(n/2) + cn
            = 4²T(n/2²) + 2cn + cn
            = 4³T(n/2³) + 2²cn + 2cn + cn
            ...
            = 4^i T(n/2^i) + cn(2^(i−1) + 2^(i−2) + ... + 2 + 1)
            = 4^i T(n/2^i) + (2^i − 1)cn

   - Terminates when 2^i = n, or i = log n.
   - 4^i = 2^i × 2^i = n × n = n².
   - T(n) = n² + (n − 1)cn = O(n²).
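As with the earlier recurrence, the closed form can be checked numerically at powers of two (again, `t` is a throwaway helper for this check):

```python
def t(n, c):
    """Evaluate T(n) = 4T(n/2) + cn with T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 4 * t(n // 2, c) + c * n

# from the expansion: T(n) = n^2 + c*n*(n - 1)
for n in (2, 16, 256):
    assert t(n, 5) == n * n + 5 * n * (n - 1)
```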

19. More Examples
   - T(n) = 2T(n/4) + √n, T(1) = 1.

       T(n) = 2T(n/4) + √n
            = 2(2T(n/4²) + √(n/4)) + √n
            = 2²T(n/4²) + 2√n
            = 2²(2T(n/4³) + √(n/4²)) + 2√n
            = 2³T(n/4³) + 3√n
            ...
            = 2^i T(n/4^i) + i√n

20. More Examples
   - Terminates when 4^i = n, or when i = log_4 n = (log_2 n)/(log_2 4) = (1/2) log_2 n.

       T(n) = 2^((1/2) log n) + √n · log_4 n
            = √n + √n · log_4 n
            = √n (log_4 n + 1)
            = O(√n log n)

21. Master Method

       T(n) = aT(n/b) + f(n)

     Level 0:  1 node of size n         cost: f(n)
     Level 1:  a nodes of size n/b      cost: a·f(n/b)
     Level 2:  a² nodes of size n/b²    cost: a²·f(n/b²)
     ...
     Level i:  a^i nodes of size n/b^i  cost: a^i·f(n/b^i)

22. Master Method
   - The number of children multiplies by a factor of a at each level.
   - The number of leaves is a^(log_b n) = n^(log_b a). Verify by taking logarithms on both sides.

23. Master Method
   - From the recursion tree, we get

       T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{log_b n − 1} a^i f(n/b^i)

   - Let f(n) = Θ(n^p log^k n), where p, k ≥ 0.
   - Important: a ≥ 1 and b > 1 are constants.
   - Case I: p < log_b a. n^(log_b a) grows faster than f(n).

       T(n) = Θ(n^(log_b a))

24. Master Method
   - From the recursion tree, we get

       T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{log_b n − 1} a^i f(n/b^i)

   - Let f(n) = Θ(n^p log^k n), where p, k ≥ 0.
   - Case II: p = log_b a. Both terms have the same growth rate.

       T(n) = Θ(n^(log_b a) log^(k+1) n)

25. Master Method
   - From the recursion tree, we get

       T(n) = Θ(n^(log_b a)) + Σ_{i=0}^{log_b n − 1} a^i f(n/b^i)

   - Let f(n) = Θ(n^p log^k n), where p, k ≥ 0.
   - Case III: p > log_b a. n^(log_b a) grows slower than f(n).

       T(n) = Θ(f(n))
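The case analysis above is mechanical enough to package as a small helper. The function `master` and its output strings are purely illustrative (they are not from the slides or any library); it assumes f(n) = Θ(n^p log^k n) with p, k ≥ 0, as required by this form of the theorem:

```python
from math import log

def master(a, b, p, k):
    """Classify T(n) = a*T(n/b) + Θ(n^p log^k n) by the three cases;
    assumes a >= 1, b > 1, p >= 0, k >= 0. Returns a string for Θ(T(n))."""
    e = log(a, b)                        # critical exponent log_b a
    if abs(p - e) < 1e-9:                # Case II: equal growth rates
        return f"n^{p} log^{k + 1} n"
    if p < e:                            # Case I: the leaves dominate
        return f"n^{e}"
    return f"n^{p} log^{k} n"            # Case III: the root dominates

# Merge sort, T(n) = 2T(n/2) + Θ(n): Case II gives Θ(n log n)
assert master(2, 2, 1, 0) == "n^1 log^1 n"
# Binary search, T(n) = T(n/2) + Θ(1): Case II gives Θ(log n)
assert master(1, 2, 0, 0) == "n^0 log^1 n"
```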

26. Applying Master Method
   - Merge Sort: T(n) = 2T(n/2) + Θ(n).
     a = b = 2, p = 1, and k = 0. So log_b a = 1, and p = log_b a.
     Case II applies, giving us T(n) = Θ(n log n).
   - Binary Search: T(n) = T(n/2) + Θ(1).
     a = 1, b = 2, p = 0, and k = 0. So log_b a = 0, and p = log_b a.
     Case II applies, giving us T(n) = Θ(log n).

27. Applying Master Method
   - T(n) = 2T(n/2) + Θ(n log n).
     a = b = 2, p = 1, and k = 1. p = 1 = log_b a, and Case II applies.

       T(n) = Θ(n log² n)

   - T(n) = 7T(n/2) + Θ(n²).
     a = 7, b = 2, p = 2, and log_b a = log_2 7 > 2. Case I applies, and we get

       T(n) = Θ(n^(log_2 7))

28. Applying Master Method
   - T(n) = 4T(n/2) + Θ(n²√n).
     a = 4, b = 2, p = 2.5, and k = 0. So log_b a = 2, and p > log_b a.
     Case III applies, giving us

       T(n) = Θ(n²√n)

   - T(n) = 2T(n/2) + Θ(n / log n).
     a = 2, b = 2, p = 1. But k = −1, and so the Master Method (in this form) does not apply!

29. Matrix Multiplication
   - Multiply two n × n matrices: C = A × B.
   - Standard method: C_ij = Σ_{k=1}^{n} A_ik × B_kj.
   - This takes O(n) time per element of C, for a total cost of O(n³) to compute C.
   - This method, known since Gauss's time, seems hard to improve.
   - A very surprising discovery by Strassen (1969) broke the n³ asymptotic barrier.
   - The method is divide and conquer, with a clever choice of submatrices to multiply.
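The standard method is the familiar triple loop; a minimal sketch on lists of lists (the name `mat_mult` is our own):

```python
def mat_mult(A, B):
    """Standard Θ(n³) matrix product C = A × B for n×n lists of lists:
    C[i][j] = sum over k of A[i][k] * B[k][j]."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):         # O(n) work per output element
                C[i][j] += A[i][k] * B[k][j]
    return C
```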
