Optimization Algorithms: Genetic, Brute Force, Simulated Annealing

Brute force approach: Calculating the distance of every possible route and then selecting the
shortest one
Combinatorial optimization: finding the optimal solution from many possibilities
Computational intractability: When no efficient algorithms exist to solve a problem
Convergence: When there is no significant improvement in the fitness of the population from
one generation to the next
Fitness Function: Used to determine how close a given solution is to the optimum solution of a
problem
Heuristic: A quick, efficient and practically feasible solution, usually an approximation
Mating pool: The solutions selected for crossover; a sample of tours taken from the population,
usually those with higher fitness values
Mutation: Deliberate changes introduced when copying genetic information from one generation
to the next
Mutation rate: Determines how many chromosomes should be mutated in one generation
Offspring: New solutions generated by combining the genetic information of the parents
Optimization: Modifying some aspect of a function in order for it to work more efficiently or use
fewer resources
Population: All the current solutions
Premature convergence: Convergence that occurs too soon, before the process can reach the
global optimum
Ranking: Ranking is simply putting the solutions in order of their fitness
Roulette wheel selection: The better-performing solutions have a better chance of being selected
(as opposed to random selection)
Selection strategy: The method used to choose which solutions enter the mating pool (e.g.
roulette wheel, tournament or truncation selection)
Termination condition: determining when a GA run will end
Tour: A path that visits all the nodes (and returns to the starting point)
Tournament selection: Tournament selection involves running several "tournaments" among a
few individuals chosen at random from the population. The winner of each tournament is
selected for crossover.
Truncation selection: To eliminate a fixed percentage of the less fit individual solutions
Steepest ascent hill climbing: Always moving to the best neighboring solution (exploitation).
This finds a local maximum.
Simulated annealing: Using probability and accepting random moves, including worse ones, at
first (exploration), so that the maximum value anywhere in the search space can still be reached,
and then increasingly picking only the better ones. Pick a random solution, then pick a random
neighbor of that solution even if it isn't better than the last solution. Sometimes picking a
worse solution at first leads you to a group of better solutions. This allows you to escape local
maxima and search for the global maximum (sketched in code below).
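
To make the contrast between the two approaches above concrete, here is a minimal Python sketch
(not part of the original notes); the toy function f and all parameter values are illustrative
assumptions:

import math
import random

def f(x):
    # Toy landscape with several local maxima.
    return math.sin(x) + 0.3 * math.sin(3 * x)

def steepest_ascent(x, step=0.1, iters=1000):
    # Exploitation only: always move to the best neighbor, stop at a local maximum.
    for _ in range(iters):
        best = max((x - step, x, x + step), key=f)
        if best == x:
            break
        x = best
    return x

def simulated_annealing(x, temp=1.0, cooling=0.995, iters=5000):
    # Exploration: sometimes accept a worse neighbor, with a probability that shrinks
    # as the temperature falls, so the search can escape local maxima.
    for _ in range(iters):
        neighbor = x + random.uniform(-0.5, 0.5)
        delta = f(neighbor) - f(x)
        if delta > 0 or random.random() < math.exp(delta / temp):
            x = neighbor
        temp *= cooling
    return x

start = random.uniform(-5, 5)
print("hill climbing:      ", f(steepest_ascent(start)))
print("simulated annealing:", f(simulated_annealing(start)))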
1. A tour is a path that visits all the nodes and gets back to the starting point at the end.
2. Heuristics are quick, possible solutions to the problem. In the travelling salesman
problem these would be potential tours.
3. Convergence is when the average fitness of the population doesn't change significantly from
one generation to the next. The mutation rate affects this: with a low mutation rate the
solutions would all become very similar quickly and convergence would be faster, whereas a
high mutation rate adds diversity, keeping the solutions varied and delaying convergence.
4. A.
5. Roulette wheel selection is a process in which the more fit solutions have a bigger
chance of being selected by a pointer on a roulette wheel. Stochastic universal sampling
also gives the more fit solutions a bigger probability, but it uses multiple equally spaced
pointers, so solutions at different selection points may have more of a chance of being
selected than in roulette wheel selection. This leads to all the solutions being chosen at
the same time, so stochastic universal sampling is quicker than roulette wheel selection,
where multiple spins are needed to select all the solutions. Stochastic universal sampling
also often leads to the selection of less fit solutions than roulette wheel selection, as
they fall under multiple selection points, and it is therefore considered less optimal.
(Both methods are sketched in code below.)
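
A minimal Python sketch of the two selection methods compared in answer 5 (illustrative only;
the helper names and the example population are assumptions, not from the notes):

import random

def roulette_wheel(population, n):
    # One independent "spin" per selection: probability proportional to fitness.
    total = sum(fit for _, fit in population)
    chosen = []
    for _ in range(n):
        pick = random.uniform(0, total)
        running = 0.0
        for individual, fit in population:
            running += fit
            if running >= pick:
                chosen.append(individual)
                break
    return chosen

def stochastic_universal_sampling(population, n):
    # One spin with n equally spaced pointers: the whole mating pool is chosen at once,
    # so selection pressure is spread more evenly than in roulette wheel selection.
    total = sum(fit for _, fit in population)
    spacing = total / n
    start = random.uniform(0, spacing)
    pointers = [start + i * spacing for i in range(n)]
    chosen, running, i = [], 0.0, 0
    for individual, fit in population:
        running += fit
        while i < n and pointers[i] <= running:
            chosen.append(individual)
            i += 1
    return chosen

population = [("tour_a", 8.0), ("tour_b", 4.0), ("tour_c", 2.0), ("tour_d", 1.0)]
print(roulette_wheel(population, 4))
print(stochastic_universal_sampling(population, 4))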
6. Genetic algorithms are designed to find heuristics for combinatorial optimization
problems like the travelling salesman problem. One advantage of genetic algorithms is
that they are relatively fast and have lower memory usage than certain approaches like
the brute force approach. In the brute force approach, every single solution is tested
in order to find the best one, and this can take an extremely long time and use up a lot
of memory for computationally intractable problems, as they have millions of solutions.
However, the brute force approach may be better for certain combinatorial optimization
problems where there are not as many solutions to test, as it is superior to genetic
algorithms in the sense that it will find the best possible solution, whereas genetic
algorithms will only find a good solution.
Another advantage of genetic algorithms is that they usually find a solution quite close
to the global optimum, provided that the mutation rate is high enough to prevent premature
convergence, i.e. reaching the termination condition too early. An approach like hill
climbing, which relies on exploitation, will often only find a local maximum, as it only
finds the best solution in the vicinity of its starting point. Hill climbing is quicker
than genetic algorithms and may prove better for problems where the search space is small;
however, this is not the case with the travelling salesman problem.
Approaches like simulated annealing, where exploration is used and there is a degree of
randomness in the program, are similar to genetic algorithms. However, genetic algorithms
may find more optimal solutions, as there is a process of optimizing the most fit solution
after convergence is achieved. This makes the solution more efficient and less
resource-intensive. This is not done in simulated annealing, where the most fit solution
is found and kept unchanged afterwards.
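
To tie the comparison in answer 6 back to the glossary terms, here is a minimal Python sketch
(not part of the original notes) contrasting the brute force approach with a small genetic
algorithm on a tiny travelling salesman instance. The city coordinates, parameter values and
helper names are assumptions; truncation selection with a fixed generation count stands in for
the fuller selection strategies and termination conditions described above:

import itertools
import math
import random

cities = [(0, 0), (1, 5), (5, 2), (6, 6), (8, 3), (2, 7), (7, 1)]

def tour_length(tour):
    # Total distance of a tour that returns to its starting city.
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def brute_force():
    # Test every possible route; guaranteed optimal but factorial in the number of cities.
    return min(itertools.permutations(range(len(cities))), key=tour_length)

def fitness(tour):
    # Shorter tours get higher fitness.
    return 1.0 / tour_length(tour)

def crossover(a, b):
    # Copy a random slice from one parent, then fill the rest in the order of the other.
    i, j = sorted(random.sample(range(len(a)), 2))
    child = list(a[i:j])
    child += [c for c in b if c not in child]
    return tuple(child)

def mutate(tour, rate=0.1):
    # Swap two cities with probability given by the mutation rate.
    tour = list(tour)
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]
    return tuple(tour)

def genetic_algorithm(pop_size=50, generations=200):
    population = [tuple(random.sample(range(len(cities)), len(cities)))
                  for _ in range(pop_size)]
    for _ in range(generations):  # termination condition: fixed number of generations
        ranked = sorted(population, key=fitness, reverse=True)
        mating_pool = ranked[:pop_size // 2]  # truncation selection: drop the less fit half
        population = [mutate(crossover(*random.sample(mating_pool, 2)))
                      for _ in range(pop_size)]
    return max(population, key=fitness)

print("brute force:", tour_length(brute_force()))
print("genetic:    ", tour_length(genetic_algorithm()))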