4. Nontraditional Optimization Algorithms

Scientists have tried to mimic nature throughout history:

Nature                          Manmade
Crane (bird)                    Crane (machine)
Fish                            Submarine
Bird                            Aircraft
Brain processes                 Microprocessor
Biological neural networks      Artificial neural networks
Reproduction process            Genetic algorithms

The nontraditional optimization algorithms covered here are genetic algorithms, neural networks, ant algorithms, and simulated annealing.

4.1 Genetic Algorithms

4.1.(a) Notion of Genetic Algorithms

In the genetic-algorithm picture, an individual (a "human") is described by chromosomes coded as binary strings such as 1 0 1 1 1 0 0 1; a sperm and an egg each carry 23 chromosomes, and a fetus is formed by combining the male (sperm) and female (egg) genetic material. In binary terms, the two parent strings are cut at a crossover point and their tails are exchanged:

Parent 1:   1 0 1 | 1 1 0 0 1
Parent 2:   0 1 1 | 0 0 1 1 0

Offspring:  1 0 1 0 0 1 1 0   and   0 1 1 1 1 0 0 1

4.1.(b) Some Basic Facts

Powerful, wealthy, smart, good-looking, educated, or caring people get more dating opportunities, but the process is still random. (Probability of selection is proportional to fitness.)

A kid may be more mother-like or father-like, but that too is random. (Crossover point.)

Approximately 10% of couples do not have kids, either because they opt not to have them or because they cannot have them biologically. This also is random.

The population can be maintained at a constant level by perfect family planning; this is done by limiting each family to 2 kids.

The evolutionary process can be expedited by improving the variety of the gene pool; this is done via mutation.

Mutation process:   1 0 1 1 1 0 0 1   ->   1 0 0 1 1 0 0 1   (a randomly chosen bit is flipped)

Genetic algorithms are usually applied to maximization problems. To minimize f(x) (with f(x) > 0) using a GA, consider instead the maximization of 1 / (1 + f(x)).
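As a minimal illustration of this transformation (the function and values below are my own, not from the slides):

import math

def ga_fitness(f, x):
    # fitness to maximize when the underlying goal is to minimize a positive f
    return 1.0 / (1.0 + f(x))

print(ga_fitness(lambda x: (x - 3) ** 2, 3.0))    # 1.0  (the minimizer gets the highest fitness)
print(ga_fitness(lambda x: (x - 3) ** 2, 10.0))   # 0.02 (a poor candidate gets a low fitness)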
4.1.(c) Example

Maximize f(x,y) = y^1.3 + 10 e^(-xy) + sin(x - y) + 3 in the region R = [0,14] x [0,7]. (The slides plot R in the x-y plane with the initial points marked on it.)

Initial population and its binary coding: x takes the values 0, 2, ..., 14 and is coded by x/2 in the first 3 bits; y takes the values 0, 1, ..., 7 and is coded directly in the last 3 bits.

(0,5)  = 000101
(2,7)  = 001111
(6,1)  = 011001
(10,4) = 101100
(12,0) = 110000
(8,6)  = 100110
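A small decoding and evaluation sketch (my own code, not from the slides). The tabulated objective values are reproduced if sin(x - y) is evaluated with its argument in degrees, which appears to be the convention used in this example:

import math

def decode(s):
    # first 3 bits code x/2 (x in {0, 2, ..., 14}), last 3 bits code y (0..7)
    return 2 * int(s[:3], 2), int(s[3:], 2)

def f(x, y):
    # objective of Example 4.1.(c); the angle (x - y) is taken in degrees
    return y ** 1.3 + 10 * math.exp(-x * y) + math.sin(math.radians(x - y)) + 3

for s in ["000101", "001111", "011001", "101100", "110000", "100110"]:
    x, y = decode(s)
    print(s, (x, y), round(f(x, y), 2))
# prints 21.02, 15.46, 4.11, 9.17, 13.21 and 13.31, matching the f(x,y) column of the table below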
Iteration 1. Fitness and selection probabilities of the initial population:

String   (x,y)    f(x,y)   Prob.   Cum. prob.
000101   (0,5)    21.02    0.276   0.276
001111   (2,7)    15.46    0.203   0.479
011001   (6,1)     4.11    0.054   0.533
101100   (10,4)    9.17    0.120   0.653
110000   (12,0)   13.21    0.173   0.826
100110   (8,6)    13.31    0.174   1.000

Each probability is the string's fitness divided by the total fitness of the population, e.g. 21.02 / (21.02 + 15.46 + 4.11 + 9.17 + 13.21 + 13.31) = 0.276.

Selection (roulette wheel): for each slot in the mating pool a random number is drawn and the first string whose cumulative probability exceeds it is copied.

Random number   Selected string
0.269           000101
0.923           100110
0.117           000101
0.366           001111
0.804           110000
0.589           101100

Crossover: the selected strings are paired, and for each pair a further random number fixes the crossover site (0.000-0.179: after bit 1, 0.180-0.359: after bit 2, 0.360-0.539: after bit 3, 0.540-0.719: after bit 4, 0.720-0.899: after bit 5).

000101 x 100110, random number 0.707 (site 4):  000110, 100101
000101 x 001111, random number 0.508 (site 3):  000111, 001101
110000 x 101100, random number 0.240 (site 2):  111100, 100000

Mutation: each bit of each string draws its own random number p and is flipped if p > 0.95. After mutation the new population is

000110  001101  000111  001101  111100  110000

Go to iteration 2.

Iteration 2. Evaluating the new population:

String   (x,y)    f(x,y)
000110   (0,6)    23.17
001101   (2,5)    11.05
000111   (0,7)    25.43
001101   (2,5)    11.05
111100   (14,4)    9.24
110000   (12,0)   13.21

Previous average = 12.71, new average = 15.52; previous maximum = 21.02, new maximum = 25.43. Both the average and the best objective value have improved (the slides also plot the previous and the new population over R), and the process continues in the same way.
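The generation step above can be sketched in a few lines (the function names are mine; the random draws are passed in explicitly so that, with the numbers from the example, the same mating pool and the first pair of offspring are reproduced):

import random

def roulette_select(population, fitness, r):
    # return the first string whose cumulative selection probability exceeds r
    total = sum(fitness)
    cum = 0.0
    for s, fit in zip(population, fitness):
        cum += fit / total
        if r <= cum:
            return s
    return population[-1]

def crossover(a, b, site):
    # single-point crossover after bit position `site`
    return a[:site] + b[site:], b[:site] + a[site:]

def mutate(s, rng, threshold=0.95):
    # flip each bit whose random draw exceeds the threshold, as in the example
    return "".join(('1' if c == '0' else '0') if rng.random() > threshold else c for c in s)

pop   = ["000101", "001111", "011001", "101100", "110000", "100110"]
fit   = [21.02, 15.46, 4.11, 9.17, 13.21, 13.31]
draws = [0.269, 0.923, 0.117, 0.366, 0.804, 0.589]   # selection draws from the example
pool  = [roulette_select(pop, fit, r) for r in draws]
print(pool)                            # ['000101', '100110', '000101', '001111', '110000', '101100']
print(crossover(pool[0], pool[1], 4))  # ('000110', '100101')
print(mutate('100101', random.Random(1)))   # flips any bit whose draw exceeds 0.95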
4.2 Neural Networks

(a) Biological Neural Networks

A biological neuron consists of a nucleus (cell body), dendrites, an axon, and synapses; the synaptic gaps change while information is being stored. A neuron is a multi-input, single-output object: the nucleus produces a signal and passes it along the axon when it is excited by the signals received from other neurons, and if the signal is large enough to pass through a synapse, the dendrites carry it to the adjacent neurons.

(b) Architecture of Artificial Neural Networks

The subsections are:
(i) Simple model of a neuron
(ii) Neuron transfer function (characteristics)
(iii) Weights between two neurons
(iv) The complete model

(i) Simple Model of a Neuron

The artificial neuron mirrors its biological counterpart: inputs arrive through the "synapses", are summed in the "cell body", and the result is sent out along the "axon" and "dendrites" to other neurons.

(ii) Transfer Function of a Neuron

The summed input is passed through a transfer function, which may be linear, threshold, or sigmoid.

(iii) Weights between Two Neurons

The signal attenuation between two neurons is modeled by a multiplier m with 0 <= m <= 1; more generally, m may be assigned real values.

(iv) Complete Model

Neurons are arranged in an input layer, one or more hidden layers, and an output layer. A neuron receiving inputs u1, u2, ..., un through weights m1, m2, ..., mn produces the output

v = f(Σ mi ui + θ),

where θ is the threshold level and f is a linear, threshold, or sigmoid function. The weights m are changed while information is being stored.
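A minimal sketch of this neuron model with a sigmoid transfer function (the input and weight values are illustrative only):

import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(u, m, theta, f=sigmoid):
    # v = f(sum_i m_i * u_i + theta)
    return f(sum(mi * ui for mi, ui in zip(m, u)) + theta)

print(neuron(u=[0.5, 1.0, 0.2], m=[0.8, 0.3, 0.9], theta=-0.5))   # about 0.59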
(c) Special Types of Networks

Feed-forward networks: signals travel only forward, from the input layer through the hidden layer(s) to the output layer.
Feedback networks: the same layered structure, but some signals are fed back from later layers to earlier ones.

(d) Pattern Recognition

The slides develop a worked example in which 3 x 3 binary-pixel images of the digits "zero" and "one" are to be recognized. The pixels of each row are presented as inputs (X11 X12 X13, X21 X22 X23, X31 X32 X33), and truth tables list the desired output (o/p) for the possible row patterns, with entries 0, 1, or 0/1 where either output is acceptable; the tables are built up row by row for the two digits.
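As a hand-made illustration of the kind of mapping such a network realizes (the templates and weights below are mine, not taken from the lecture), a single threshold neuron whose weights are the pixel-wise difference of the two 3 x 3 templates already separates an idealized "one" from an idealized "zero":

ZERO = [1, 1, 1,
        1, 0, 1,
        1, 1, 1]        # ring of lit pixels
ONE  = [0, 1, 0,
        0, 1, 0,
        0, 1, 0]        # lit centre column

W = [o - z for o, z in zip(ONE, ZERO)]   # template-difference weights

def classify(pixels, w=W, theta=0.0):
    # threshold transfer function: output 1 ("one") if the weighted sum exceeds theta
    return 1 if sum(wi * pi for wi, pi in zip(w, pixels)) > theta else 0

print(classify(ONE), classify(ZERO))     # 1 0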
(e) System Identification and Modeling

Typical applications include:
Modeling plantations (agriculture)
Modeling earthquake dynamics (structures)
Modeling channel disturbances (communication)
Modeling crash resistance of automobiles (automotive engineering)

Supervised learning: the error between the expected and the computed output is minimized by adjusting the weights, and a supervisor decides the rule that is applied to adjust them.

Unsupervised learning: the error is again minimized by adjusting the weights (as in supervised learning), but the adjustment rule is built in.

Example: predicting movement from ground acceleration.

Ground accel.    Movement
10 sin 1000t     2.5
30 sin 1000t     3
10 sin 4000t     3
50 sin 3000t     ?

Scaling the amplitude by 10 -> 1 and the frequency by 1000 -> 1 gives the training samples (x, y) -> z: (1, 1) -> 2.5, (3, 1) -> 3, (1, 4) -> 3, and the network must predict the movement for the scaled query (5, 3). The slides then trace, step by step, how the weights of a small network are adjusted toward these targets before the prediction is made.
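A minimal supervised-learning sketch on the scaled data above: a single linear neuron z ≈ w1 x + w2 y + b trained by gradient descent on the squared error. This is my own illustration, not the network or update rule traced in the lecture:

samples = [((1, 1), 2.5), ((3, 1), 3.0), ((1, 4), 3.0)]   # scaled (accel., freq.) -> movement
w1 = w2 = b = 0.0
lr = 0.02
for _ in range(5000):
    for (x, y), z in samples:
        pred = w1 * x + w2 * y + b
        err = pred - z              # supervisor: compare computed with expected output
        w1 -= lr * err * x          # adjust weights to reduce the error
        w2 -= lr * err * y
        b  -= lr * err
print(round(w1, 3), round(w2, 3), round(b, 3))                   # roughly 0.25, 0.167, 2.083
print("prediction for (5, 3):", round(w1 * 5 + w2 * 3 + b, 2))   # close to 3.8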
4.3 Ant Algorithms

(a) Why are the ants special?

Strength: relative to its size the black ant is the strongest animal; it can carry about 50 times its own weight.

Pain of sting:
1.0 - Sweat bee: light, ephemeral, almost fruity; a tiny spark has singed a single hair on your arm.
4.0+ - Bullet ant: pure, intense, brilliant pain; like fire-walking over flaming charcoal with a 3-inch rusty nail in your heel.

Queen's ability to preserve sperm: a queen can keep sperm stored in her body for nearly 18-20 years.

Ant colony: ants are social insects, and their behavior is geared to the survival of the colony rather than that of individuals. The colony's structure of organization is excellent, even though the individuals are very simple.

How do they use pheromones? Ants deposit pheromones while moving from the nest to food sources and vice versa, and this activity creates a pheromone trail. Ants smell the pheromones and, when choosing a path, prefer the path with the higher pheromone density.

(b) Artificial Ants

Similarities with real ants:
A colony of cooperating ants
Deposition of pheromone while moving
Shortest-path searching and local moves
A stochastic and myopic state-transition policy

Differences from real ants:
An artificial ant has an internal state (the memory of its past actions).
The amount of pheromone deposited is proportional to the quality of the solution obtained.
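For reference, the state-transition policy of the standard Ant System is usually written as below; this is the textbook formulation, not necessarily the exact rule behind the simplified "ant decision table" used in the next example. The probability that ant k at city i moves to an unvisited city j is

p_{ij}^{k} = \frac{[\tau_{ij}]^{\alpha}\,[\eta_{ij}]^{\beta}}{\sum_{l \in N_i^k} [\tau_{il}]^{\alpha}\,[\eta_{il}]^{\beta}}, \qquad \eta_{ij} = \frac{1}{d_{ij}},

where \tau_{ij} is the pheromone on edge (i, j), \eta_{ij} is the heuristic visibility (the inverse of the edge length d_{ij}), \alpha and \beta weight the two factors, and N_i^k is the set of cities ant k has not yet visited.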
(c) Traveling Salesman Problem

The example uses five cities, numbered 1 to 5, joined by edges whose lengths are marked on the graph in the slides. An "ant decision table" stores, for each city, the cumulative probabilities of moving to each of the other cities; initially every move is equally likely, so each row reads 0.25, 0.50, 0.75, 1.00.

An ant builds a tour by repeatedly drawing a random number in [0, 1) and moving, from the current city's row of the table, to the first not-yet-visited city whose cumulative probability exceeds the draw.

First ant (random numbers 0.883, 0.137, 0.780, 0.641): the completed tour has total length 36. The pheromone update then adds 1/36 to the table entries of the edges that were used, the rows staying normalized; the first row, for example, changes from 0.25, 0.25, 0.25, 0.25 to 0.28, 0.24, 0.24, 0.24.

Second ant (random numbers 0.446, 0.977, 0.301): tour of total length 22, so 1/22 is added to the entries of its edges. Third ant (random numbers 0.521, 0.841, ...): tour of total length 25, and 1/25 is added. The following iterations produce tours of lengths 22 and 25 again, each followed by the corresponding update.

Because a shorter tour deposits more pheromone on each of its edges, the decision table gradually shifts toward the best tour found. After 7 iterations, and then after another 10, the entries polarize: the probabilities on the edges of the shortest tour found (length 22) approach 1 while many of the alternatives fall to nearly 0, so the colony essentially converges on that tour.
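A compact sketch of this procedure (the distance matrix below is made up, since the slide's edge lengths are not individually legible here; the mechanics follow the walkthrough: choose the next city with probability proportional to pheromone, then reward the edges of the finished tour with 1/length):

import random

D = [[0, 8, 5, 7, 3],
     [8, 0, 4, 6, 9],
     [5, 4, 0, 3, 5],
     [7, 6, 3, 0, 8],
     [3, 9, 5, 8, 0]]                 # illustrative symmetric distances
n = len(D)
tau = [[1.0 if i != j else 0.0 for j in range(n)] for i in range(n)]   # uniform pheromone

def next_city(current, unvisited, rng):
    # roulette-wheel choice among unvisited cities, weighted by pheromone
    weights = [tau[current][j] for j in unvisited]
    r, cum = rng.random() * sum(weights), 0.0
    for j, w in zip(unvisited, weights):
        cum += w
        if r <= cum:
            return j
    return unvisited[-1]

def build_tour(rng, start=0):
    tour, unvisited = [start], [c for c in range(n) if c != start]
    while unvisited:
        c = next_city(tour[-1], unvisited, rng)
        unvisited.remove(c)
        tour.append(c)
    return tour

rng, best = random.Random(0), None
for _ in range(20):
    tour = build_tour(rng)
    edges = list(zip(tour, tour[1:] + [tour[0]]))
    length = sum(D[a][b] for a, b in edges)
    for a, b in edges:                 # deposit proportional to tour quality
        tau[a][b] += 1.0 / length
        tau[b][a] += 1.0 / length
    if best is None or length < best[0]:
        best = (length, tour)
print("best tour found:", best)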
(d) Other Applications

Job-shop scheduling problem: given M machines and a sequence of J jobs, assign operations and time intervals to the machines so that the completion time is minimum.

Quadratic assignment problem: assign N facilities to N locations so that the cost of the assignment is minimized.

Graph coloring problem: color a graph with the minimum number of colors.

Sequential ordering problem.

Vehicle routing problem: find the minimum-cost vehicle routes such that
(i) every customer is visited exactly once, by exactly one vehicle,
(ii) the total demand on each route does not exceed the vehicle capacity,
(iii) the total length of each tour does not exceed a certain bound, and
(iv) every vehicle comes back to the depot.

(e) Conclusion

As in the case of neural networks, genetic algorithms, and interior-point algorithms, this algorithm may not take you to the optimum. Instead it takes you to a very good solution at a very low cost (number of iterations, computational time, etc.).