Dynamic Programming
CS3381 Des & Anal of Alg (2001-2002 SemA)
City Univ of HK / Dept of CS / Helena Wong
http://www.cs.cityu.edu.hk/~helena

Dynamic Programming
Coming up:
- Assembly-line scheduling
- Matrix-chain multiplication
- Elements of dynamic programming
- Longest common subsequence
- Optimal binary search trees
(Chap 15)

First Dynamic Programming Example: Assembly-line Scheduling

Assembly-line Scheduling
[Figure: a factory with two assembly lines. A chassis enters line i with entry time ei, passes through stations Si,1, Si,2, ..., Si,n (station Si,j takes assembly time ai,j), and the completed auto exits with exit time xi. After station j the chassis either stays on the same line at no extra cost or moves to station j+1 of the other line, incurring transfer time ti,j.]

Assembly-line Scheduling
Problem: to minimize the total processing time for one auto.

Assembly-line Scheduling
To find the fastest way, we could inspect the speeds of all possible ways. But there are 2^n possible ways, so checking them all takes Ω(2^n) time. Instead, solve it by dynamic programming.

Assembly-line Scheduling (Step 1)
Step 1. Characterize the structure of an optimal solution (the fastest way through the factory).
The fastest way through the factory is the faster of:
- the fastest way to reach (and get through) S1,n, followed by the exit time x1, or
- the fastest way to reach (and get through) S2,n, followed by the exit time x2.

Assembly-line Scheduling (Step 1)
Similarly, for j > 1, the fastest way to reach station S1,j is the faster of:
- the fastest way to reach S1,j-1, followed by the assembly time a1,j, or
- the fastest way to reach S2,j-1, followed by the transfer time t2,j-1 and then a1,j.
(The case for station S2,j is symmetric.)

Assembly-line Scheduling (Step 2)
Step 2. Recursively define the value of an optimal solution (the fastest way through the factory) in terms of the optimal solutions to subproblems.
Let f* denote the fastest time through the factory, and let fi[j] denote the fastest time to reach and get through station Si,j. Then

f* = min( f1[n] + x1, f2[n] + x2 )

f1[j] = e1 + a1,1                                           if j = 1
        min( f1[j-1] + a1,j, f2[j-1] + t2,j-1 + a1,j )      if j > 1

f2[j] = e2 + a2,1                                           if j = 1
        min( f2[j-1] + a2,j, f1[j-1] + t1,j-1 + a2,j )      if j > 1
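The recurrence can be typed in almost verbatim as recursive code. Below is a naive Python sketch (an illustration, not from the slides; indices are 0-based, so station j of the slides is index j-1, and the tiny numeric instance at the bottom is made up). It computes the right answer, but it re-solves the same f values over and over as the recursion branches.

def f(i, j, e, a, t, x):
    # Fastest time to reach and get through station j (0-based) of line i,
    # transcribed directly from the recurrence for f1[j] / f2[j].
    other = 1 - i
    if j == 0:
        return e[i] + a[i][0]
    return min(f(i, j - 1, e, a, t, x) + a[i][j],
               f(other, j - 1, e, a, t, x) + t[other][j - 1] + a[i][j])

def fastest_time(e, a, t, x):
    n = len(a[0])
    return min(f(0, n - 1, e, a, t, x) + x[0],
               f(1, n - 1, e, a, t, x) + x[1])

# Tiny made-up instance with 2 stations per line:
e = [2, 4]; x = [3, 2]
a = [[7, 9], [8, 5]]      # a[i][j]: assembly time at station j of line i
t = [[2], [2]]            # t[i][j]: transfer time after station j of line i
print(fastest_time(e, a, t, x))   # -> 18
# Note: f(0, j) and f(1, j) are recomputed many times, so the running time
# grows exponentially with n.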
Most of the subproblems are visited more than once. Any clever method to handle them?

Assembly-line Scheduling (Step 3)
Step 3. Compute the value of an optimal solution in a bottom-up fashion.
Let li[j] be the line number, 1 or 2, whose station j-1 is used in the fastest way through Si,j. E.g. l1[5] = 2 means that the fastest way to reach station 5 of line 1 passes through station 4 of line 2. Let l* be the line whose station n is used in the overall fastest way; e.g. l* = 2 means that the fastest way involves station n of line 2.

FASTEST-WAY(a, t, e, x, n)
  f1[1] ← e1 + a1,1
  f2[1] ← e2 + a2,1
  for j ← 2 to n
      do if f1[j-1] + a1,j ≤ f2[j-1] + t2,j-1 + a1,j
            then f1[j] ← f1[j-1] + a1,j
                 l1[j] ← 1
            else f1[j] ← f2[j-1] + t2,j-1 + a1,j
                 l1[j] ← 2
         if f2[j-1] + a2,j ≤ f1[j-1] + t1,j-1 + a2,j
            then f2[j] ← f2[j-1] + a2,j
                 l2[j] ← 2
            else f2[j] ← f1[j-1] + t1,j-1 + a2,j
                 l2[j] ← 1
  if f1[n] + x1 ≤ f2[n] + x2
     then f* ← f1[n] + x1
          l* ← 1
     else f* ← f2[n] + x2
          l* ← 2

Assembly-line Scheduling (Step 3)
What we are doing is:
- starting at the bottom level, continuously filling in the tables (f1[], f2[] and l1[], l2[]), and
- looking up the tables to compute new results for the next higher level.
Exercise: what is the complexity of this algorithm?

Assembly-line Scheduling (Step 4)
Step 4. Construct an optimal solution from computed information.

PRINT-STATIONS()
  i ← l*
  print "line " i ", station " n
  for j ← n downto 2
      do i ← li[j]
         print "line " i ", station " j-1

Sample output:
line 1, station 6
line 2, station 5
line 2, station 4
line 1, station 3
line 2, station 2
line 1, station 1

Assembly-line Scheduling
What we have done is indeed dynamic programming. Now it is time to test your memory:
Step ? Characterize the structure of an optimal solution. E.g. study the structure of the fastest way through the factory.
Step ? Recursively define the value of an optimal solution.
Step ? Compute the value of an optimal solution in a bottom-up fashion.
Step ? Construct an optimal solution from computed information.
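Putting the four steps together, a runnable Python sketch of FASTEST-WAY and PRINT-STATIONS might look as follows (an illustration, not the slides' code; indices are 0-based, and the numeric instance at the bottom is an assumed test case chosen to reproduce the sample station listing above).

def fastest_way(a, t, e, x):
    # Bottom-up assembly-line scheduling.
    # a[i][j]: time at station j of line i; t[i][j]: transfer time after station j of line i;
    # e[i], x[i]: entry/exit times.  All indices 0-based.
    n = len(a[0])
    f = [[0] * n for _ in range(2)]     # f[i][j]: fastest time through S_{i,j}
    l = [[0] * n for _ in range(2)]     # l[i][j]: line used at station j-1 on that fastest way
    f[0][0], f[1][0] = e[0] + a[0][0], e[1] + a[1][0]
    for j in range(1, n):
        for i in (0, 1):
            other = 1 - i
            stay  = f[i][j - 1] + a[i][j]
            cross = f[other][j - 1] + t[other][j - 1] + a[i][j]
            f[i][j], l[i][j] = (stay, i) if stay <= cross else (cross, other)
    fstar, lstar = min((f[0][n - 1] + x[0], 0), (f[1][n - 1] + x[1], 1))
    # Reconstruct the stations used, last to first, as PRINT-STATIONS does.
    path, i = [(lstar + 1, n)], lstar   # report 1-based line/station numbers
    for j in range(n - 1, 0, -1):
        i = l[i][j]
        path.append((i + 1, j))
    return fstar, path

# Assumed test instance (the textbook's Figure 15.2 numbers):
a = [[7, 9, 3, 4, 8, 4], [8, 5, 6, 4, 5, 7]]
t = [[2, 3, 1, 3, 4], [2, 1, 2, 2, 1]]
e, x = [2, 4], [3, 2]
fstar, path = fastest_way(a, t, e, x)
print(fstar)                            # -> 38
for line, station in path:
    print(f"line {line}, station {station}")
# line 1, station 6 / line 2, station 5 / ... / line 1, station 1, as above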
Second Dynamic Programming Example: Matrix-Chain Multiplication

Matrix-Chain Multiplication
[Figure: a chain of matrices to be multiplied.]
Can we freely multiply matrices of any different shapes? No. They must be compatible: to multiply A and B, the number of columns of A must be equal to the number of rows of B.

Matrix-Chain Multiplication
We can also multiply them in different orders (parenthesizations). However, we must consider the different efficiencies of the different parenthesizations.
[Worked example from the slide: for a three-matrix chain, one parenthesization costs 5x3x4 = 60 scalar multiplications for its first product plus 4x3x2 = 24 for the second (total 84), while the other costs 4x5x2 = 40 plus 5x3x2 = 30 (total 70).]

Matrix-Chain Multiplication
Number of scalar multiplications to compute AB
  = no. of rows of A * no. of columns of A (= no. of rows of B) * no. of columns of B.
Size of the resultant matrix: no. of rows = no. of rows of A; no. of columns = no. of columns of B.
Another example: A is 10 x 100, B is 100 x 5, C is 5 x 50.
(AB): no. of multiplications = 10 * 100 * 5 = 5000
(BC): no. of multiplications = 100 * 5 * 50 = 25000
((AB)C): no. of multiplications = 5000 + 10 * 5 * 50 = 7500
(A(BC)): no. of multiplications = 25000 + 10 * 100 * 50 = 75000

Matrix-Chain Multiplication
So before multiplying the 3 matrices, we had better decide the optimal parenthesization. But how? Shall we check all the parenthesizations and find the optimal one? If we are multiplying 100 matrices, there are a lot of candidates. Let's see how many parenthesizations we'd need to check.

Matrix-Chain Multiplication
Suppose we need to compute A1A2A3…An. Our question is: how many parenthesizations do we need to check? At the top level we may split the product in n-1 ways:
1st way: A1(A2…An), but there are different methods to compute (A2…An).
2nd way: (A1A2)(A3…An), but there are different methods to compute (A3…An).
...
(n-1)th way: (A1…An-1)(An), but there are different methods to compute (A1…An-1).
If we split in the kth way, the no. of methods for (A1..An) = (no. of methods for A1..Ak) x (no. of methods for Ak+1..An).
The total no. of methods is the sum over all n-1 ways. Let P(n) = no. of alternative parenthesizations of n matrices. Then

P(n) = 1                              if n = 1
       Σ k=1..n-1  P(k) P(n-k)        if n > 1

Matrix-Chain Multiplication
Hence,
P(1) = 1
P(2) = P(1)*P(1) = 1
P(3) = P(1)*P(2) + P(2)*P(1) = 2
P(4) = P(1)*P(3) + P(2)*P(2) + P(3)*P(1) = 5
P(5) = P(1)*P(4) + P(2)*P(3) + P(3)*P(2) + P(4)*P(1) = 14
P(6) = P(1)*P(5) + P(2)*P(4) + P(3)*P(3) + P(4)*P(2) + P(5)*P(1) = 42
This function increases very fast, Ω(2^n), so it is not feasible to check all possible parenthesizations. We'll solve the problem using dynamic programming.
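The recurrence for P(n) is easy to evaluate directly. The short sketch below (Python, illustrative only) reproduces the values listed above and shows how quickly they grow; these are in fact the Catalan numbers.

from functools import lru_cache

@lru_cache(maxsize=None)
def P(n):
    # Number of alternative parenthesizations of a chain of n matrices.
    if n == 1:
        return 1
    return sum(P(k) * P(n - k) for k in range(1, n))

print([P(n) for n in range(1, 11)])
# [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862]  -- exponential growth
print(P(20))   # 1767263190: checking every parenthesization is hopeless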
Matrix-Chain Multiplication (Step 1)
Step 1. The structure of an optimal parenthesization.
For Ai Ai+1 … Aj, let ( Ai…Ak )( Ak+1…Aj ) be the optimal parenthesization. Then the parenthesization of Ai…Ak within it must be an optimal parenthesization for Ai..Ak, and the parenthesization of Ak+1…Aj must be an optimal parenthesization for Ak+1..Aj.

Matrix-Chain Multiplication (Step 2)
Step 2. A recursive solution.
Define m[i,j] as the minimum number of scalar multiplications needed to compute Ai…Aj, and let ps-1 be the number of columns of As-1 (= number of rows of As). Then

m[i,j] = 0                                                  if i = j
         min i≤k<j { m[i,k] + m[k+1,j] + pi-1 pk pj }       if i < j

Each m[i,j] problem can then be solved by making use of the solutions of smaller problems. Indeed there are not many distinct problems: one for each choice of i and j satisfying 1 ≤ i ≤ j ≤ n. But their solutions are visited many times. => Any clever method?

Matrix-Chain Multiplication (Step 3)
Step 3. Compute the value of an optimal solution bottom-up.
Example (rows x columns): A1: 30x35, A2: 35x15, A3: 15x5, A4: 5x10, A5: 10x20, A6: 20x25.
For chain length = 1: m[i,i] = 0.
For chain length = 2:
m[1,2] = m[1,1] + m[2,2] + 30x35x15 = 15750
m[2,3] = m[2,2] + m[3,3] + 35x15x5 = 2625
For chain length = 3:
m[1,3] = min( m[1,1] + m[2,3] + 30x35x5, m[1,2] + m[3,3] + 30x15x5 ) = min(7875, 18000) = 7875
m[2,4] = min( m[2,2] + m[3,4] + 35x15x10, m[2,3] + m[4,4] + 35x5x10 ) = min(6000, 4375) = 4375
... and so on for longer chains. The completed m table (m[i,j] for 1 ≤ i ≤ j ≤ 6):
m[1,1..6] = 0, 15750, 7875, 9375, 11875, 15125
m[2,2..6] =     0, 2625, 4375, 7125, 10500
m[3,3..6] =         0, 750, 2500, 5375
m[4,4..6] =             0, 1000, 3500
m[5,5..6] =                 0, 5000
m[6,6]    =                     0

Matrix-Chain Multiplication (Step 3)
Running time: O(n^3).
Let s[i,j] be the value of k at which the optimal split occurs. For the example above:
s[1,2..6] = 1, 1, 3, 3, 3
s[2,3..6] =    2, 3, 3, 3
s[3,4..6] =       3, 3, 3
s[4,5..6] =          4, 5
s[5,6]    =             5

MATRIX-CHAIN-ORDER(p)
  for i ← 1 to n
      do m[i,i] ← 0
  for len ← 2 to n                  // len is the chain length
      do for i ← 1 to n - len + 1
             do j ← i + len - 1
                m[i,j] ← ∞
                for k ← i to j-1
                    do q ← m[i,k] + m[k+1,j] + pi-1 pk pj
                       if q < m[i,j]
                          then m[i,j] ← q
                               s[i,j] ← k

Matrix-Chain Multiplication (Step 4)
Step 4. Constructing an optimal solution.
A recursive printing procedure. Recall that s[i,j] = the value of k at which the optimal split occurs.

PRINT-OPTIMAL-PARENS(i, j)
  if i = j
     then print "A"i
     else print "("
          PRINT-OPTIMAL-PARENS(i, s[i,j])
          PRINT-OPTIMAL-PARENS(s[i,j]+1, j)
          print ")"

Using the s table above, the sample output for the six example matrices is ((A1(A2A3))((A4A5)A6)).
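A runnable transcription of MATRIX-CHAIN-ORDER and PRINT-OPTIMAL-PARENS might look like this (a sketch, not the book's code; it uses 0-based Python indexing instead of the slides' 1-based tables). Running it on the dimension sequence of the six example matrices reproduces m[1,6] = 15125 and the parenthesization shown above.

import math

def matrix_chain_order(p):
    # p[i-1] x p[i] is the shape of matrix A_i; returns the m and s tables.
    n = len(p) - 1                       # number of matrices
    m = [[0] * n for _ in range(n)]      # m[i][j]: min #scalar mults for A_{i+1}..A_{j+1}
    s = [[0] * n for _ in range(n)]      # s[i][j]: index of the optimal split
    for length in range(2, n + 1):       # chain length
        for i in range(0, n - length + 1):
            j = i + length - 1
            m[i][j] = math.inf
            for k in range(i, j):
                q = m[i][k] + m[k + 1][j] + p[i] * p[k + 1] * p[j + 1]
                if q < m[i][j]:
                    m[i][j], s[i][j] = q, k
    return m, s

def optimal_parens(s, i, j):
    # Build the parenthesization string instead of printing piece by piece.
    if i == j:
        return f"A{i + 1}"
    return "(" + optimal_parens(s, i, s[i][j]) + optimal_parens(s, s[i][j] + 1, j) + ")"

p = [30, 35, 15, 5, 10, 20, 25]          # A1: 30x35, A2: 35x15, ..., A6: 20x25
m, s = matrix_chain_order(p)
print(m[0][5])                           # -> 15125, i.e. m[1,6]
print(optimal_parens(s, 0, 5))           # -> ((A1(A2A3))((A4A5)A6))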
Elements of Dynamic Programming: 2 Key Ingredients & Memoization

Elements of Dynamic Programming
Comparison:
- Divide & Conquer: break the problem up into smaller problems.
- Dynamic Programming: solve all smaller problems, but only reuse the optimal subproblem solutions.
But when should we apply dynamic programming? It takes an optimization problem to a solution via a bottom-up, table-based approach, and there are 2 key ingredients to look for: Optimal Substructure and Overlapping Subproblems.

Elements of Dynamic Programming
2 key ingredients of dynamic programming:
- Optimal Substructure: if an optimal solution to a problem consists of optimal solutions to its subproblems, then the problem exhibits optimal substructure.
- Overlapping Subproblems: if a recursive algorithm revisits the same subproblems over and over again, we say that the subproblems are overlapping.
If both ingredients are present, it is a good clue that dynamic programming might apply.

Elements of Dynamic Programming
Memoization is a variation of dynamic programming.
- It uses a top-down (recursive) strategy: after computing the solution to a subproblem, store it in the table; subsequent calls simply do a table lookup.
- (Note the spelling: memoization, not memorization.)

MEMOIZED-MATRIX-CHAIN(p)
  for i ← 1 to n
      do for j ← i to n
             do m[i,j] ← ∞
  return LOOKUP-CHAIN(1, n)

LOOKUP-CHAIN(i, j)
  if m[i,j] < ∞
     then return m[i,j]
  if i = j
     then m[i,j] ← 0
     else for k ← i to j-1
              do q ← LOOKUP-CHAIN(i,k) + LOOKUP-CHAIN(k+1,j) + pi-1 pk pj
                 if q < m[i,j]
                    then m[i,j] ← q
  return m[i,j]

Elements of Dynamic Programming
When does memoization outperform bottom-up dynamic programming? In the cases where some subproblems need not be solved at all. E.g. in the assembly line, if some stations' delivery paths are blocked, then memoization is faster, since it never visits the unreachable subproblems. Otherwise, bottom-up dynamic programming is usually more efficient:
- no overhead for recursion, and
- less overhead for maintaining the table.

Third Dynamic Programming Example: Longest Common Subsequence

Longest Common Subsequence
Suppose we have 2 strings:
S1 = ACCGGTCGAGTGCGCGGAAGCCGGCCGAA
S2 = GTCGTTCGGAATGCCGTTGCTCTGTAAT
The longest common subsequence is GTCGTCGGAAGCCGGCCGAA.
The Longest Common Subsequence problem: given 2 sequences X = <x1,x2,…,xm> and Y = <y1,y2,…,yn>, find a maximum-length common subsequence (LCS) of X and Y.

Longest Common Subsequence
Optimal substructure of LCS:
Let Z = <z1,z2,..,zk> be any LCS of X = <x1,x2,…,xm> and Y = <y1,y2,…,yn>, and define the prefix notation Xr = <x1,x2,..,xr>, Yr = <y1,y2,..,yr>, Zr = <z1,z2,..,zr>. Then:
- if xm = yn, then Zk-1 is an LCS of the pair Xm-1 and Yn-1;
- if xm ≠ yn, then Z is an LCS of the pair Xm-1 and Yn, or Z is an LCS of the pair Xm and Yn-1, or Z is an LCS of both pairs.
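This case analysis already yields a working top-down solution. Below is a memoized Python sketch (an illustration, not the book's procedure) in the spirit of LOOKUP-CHAIN; the recursive formula it implements is made explicit next.

from functools import lru_cache

def lcs_length(X, Y):
    # Length of a longest common subsequence, top-down with memoization.
    # Follows the optimal-substructure cases above: compare the last symbols
    # of the two prefixes and recurse on the shorter prefixes.
    @lru_cache(maxsize=None)
    def c(i, j):                       # LCS length of X[:i] and Y[:j]
        if i == 0 or j == 0:
            return 0
        if X[i - 1] == Y[j - 1]:
            return c(i - 1, j - 1) + 1
        return max(c(i, j - 1), c(i - 1, j))
    return c(len(X), len(Y))

S1 = "ACCGGTCGAGTGCGCGGAAGCCGGCCGAA"
S2 = "GTCGTTCGGAATGCCGTTGCTCTGTAAT"
print(lcs_length(S1, S2))   # -> 20, the length of GTCGTCGGAAGCCGGCCGAA above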
We can work out a recursive formula for the problem. Let c[i,j] = the length of an LCS of Xi and Yj. Then

c[i,j] = 0                               if i = 0 or j = 0
         c[i-1,j-1] + 1                  if i, j > 0 and xi = yj
         max( c[i,j-1], c[i-1,j] )       if i, j > 0 and xi ≠ yj

Note that in the case xi = yj we skip the computation of the subproblems c[i,j-1] and c[i-1,j].

Longest Common Subsequence
To finish the solution by dynamic programming? You may like to try it out yourselves, or read Chap 15.4.

Fourth Dynamic Programming Example: Optimal Binary Search Trees

Optimal binary search trees
A binary search tree for dictionary lookup:
Let K = <k1,k2,..,kn> be n words (distinct keys) in sorted order, and let d0,d1,..,dn be n+1 "dummy keys" representing values not in K.
Let pi = the probability of searching for ki, and qi = the probability of a search corresponding to di.

  i :   0     1     2     3     4     5
  pi:         0.15  0.10  0.05  0.10  0.20
  qi:   0.05  0.10  0.05  0.05  0.05  0.10

Define the cost of a search as the number of nodes examined in the search (the depth of the key reached, plus 1).
[Figure: two binary search trees over k1..k5 with the dummy keys d0..d5 as leaves. One tree has expected search cost 2.80; the other, the optimal binary search tree, has expected cost 2.75.]

Optimal binary search trees
The Optimal Binary Search Tree problem: for a given set of probabilities of the keys and dummy keys, construct a binary search tree whose expected search cost is smallest (an optimal binary search tree).
Optimal substructure: in an optimal binary search tree with kr as the root, the left subtree and the right subtree are also optimal binary search trees.

Optimal binary search trees
We can work out a recursive formula for the problem. Let e[i,j] = the expected cost of searching an optimal binary search tree containing ki,..,kj. When j = i-1 there are no actual keys; the search ends at the dummy key di-1, so e[i,i-1] = qi-1. Then

e[i,j] = qi-1                                                                      if j = i-1
         min i≤r≤j { e[i,r-1] + e[r+1,j] + Σ s=i..j ps + Σ s=i-1..j qs }           if i ≤ j

Optimal binary search trees
To finish the solution by dynamic programming? You may like to try it out yourselves, or read Chap 15.5.

Dynamic Programming Summary
- Assembly-line scheduling
- Matrix-chain multiplication
- Elements of dynamic programming: Optimal Substructure, Overlapping Subproblems, Memoization
- Longest common subsequence
- Optimal binary search trees
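As a final worked sketch, the e[i,j] recurrence for optimal binary search trees can be filled in bottom-up much like the m table. The Python below is an illustration under assumptions (not the book's pseudocode): it keeps a w table for the probability sums in the recurrence, uses the probability table from the optimal-BST slides, and reports the 2.75 expected cost quoted there. Chap 15.5 adds a root table so the tree itself can be reconstructed.

import math

def optimal_bst_cost(p, q):
    # Expected search cost of an optimal BST.
    # p[1..n]: key probabilities (p[0] unused); q[0..n]: dummy-key probabilities.
    n = len(p) - 1
    # e[i][j]: expected cost for keys k_i..k_j; w[i][j]: total probability of that subtree,
    # i.e. the sum of p_i..p_j and q_{i-1}..q_j appearing in the recurrence.
    e = [[0.0] * (n + 1) for _ in range(n + 2)]
    w = [[0.0] * (n + 1) for _ in range(n + 2)]
    for i in range(1, n + 2):                 # empty subtrees: only the dummy key d_{i-1}
        e[i][i - 1] = w[i][i - 1] = q[i - 1]
    for length in range(1, n + 1):
        for i in range(1, n - length + 2):
            j = i + length - 1
            w[i][j] = w[i][j - 1] + p[j] + q[j]
            e[i][j] = math.inf
            for r in range(i, j + 1):         # try each k_r as the root
                cost = e[i][r - 1] + e[r + 1][j] + w[i][j]
                if cost < e[i][j]:
                    e[i][j] = cost
    return e[1][n]

p = [0.0, 0.15, 0.10, 0.05, 0.10, 0.20]       # p_1..p_5 from the table above
q = [0.05, 0.10, 0.05, 0.05, 0.05, 0.10]      # q_0..q_5
print(optimal_bst_cost(p, q))                 # -> about 2.75, the optimal tree's expected cost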