
Evolutionary Algorithms

COMP1008
Modern Search Techniques and Evolutionary Algorithms
Dr Warren G Jackson
Warren.Jackson2@nottingham.ac.uk
Previous Weeks on Search Methods
▪ Search Space & Search Tree
▪ Blind Search
▪ Heuristic search
▪ Game playing
▪ Evolutionary algorithms
Today’s Topics
▪ Tree Search vs. Modern Search Techniques
▪ Evolutionary Algorithms
▪ Genetic Algorithm
▪ GA Examples
Tree Search vs. Modern Search Techniques

Tree Search
▪ Nodes in the tree represent partial solutions.
▪ Nodes connected via branches, defined by operators.
▪ Leaf nodes represent solutions to the problem.
▪ Systematically search the search tree: BFS/DFS/heuristics.
[Figure: search tree for the 8-puzzle. Each node is a board state labelled with an evaluation of the form g + h (e.g. 1+4, 0+3, 3+0). One annotated branch shows a partial solution reached by moving the blank space right and then down; extending it with one further move down gives a complete solution.]
Tree Search vs. Modern Search Techniques

Modern Search Techniques
$f(\mathbf{x}) = 10d + \sum_{i=1}^{d}\left(x_i^2 - 10\cos(2\pi x_i)\right)$
Define a search space of complete solutions.
Solutions represented by an encoding.
Search the search space using different operators.
Image source: https://www.sfu.ca/~ssurjano/rastr.html
Tree Search vs. Modern Search Techniques

Modern Search Techniques
$f(\mathbf{x}) = 418.9829\,d - \sum_{i=1}^{d} x_i \sin\left(\sqrt{|x_i|}\right)$
▪ Start with an initial complete solution.
▪ Solution ≠ optimal solution
▪ Iteratively try and find a better solution by using the operators.
▪ Termination criterion
  ▪ Time
  ▪ Iterations/generations
Image source: https://www.sfu.ca/~ssurjano/schwef.html
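As a concrete illustration of a "search space of complete solutions", the two benchmark functions above (Rastrigin and Schwefel) can be evaluated directly for any candidate solution vector. A minimal Python sketch, not part of the slides; the function names are my own:

```python
import math

def rastrigin(x):
    """Rastrigin: f(x) = 10d + sum(x_i^2 - 10*cos(2*pi*x_i))."""
    d = len(x)
    return 10 * d + sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) for xi in x)

def schwefel(x):
    """Schwefel: f(x) = 418.9829d - sum(x_i * sin(sqrt(|x_i|)))."""
    d = len(x)
    return 418.9829 * d - sum(xi * math.sin(math.sqrt(abs(xi))) for xi in x)

print(rastrigin([0.0, 0.0]))   # 0.0 at the global optimum
print(schwefel([24.0]))        # ~442.57, matching the worked example later on
```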
Fundamentals of Modern Search Techniques
Local Search and Evolutionary Algorithms
(Advanced techniques covered in COMP2001/2011)
Terminology and Issues

Modern Search Techniques
$f(\mathbf{x}) = -\left|\sin(x_1)\cos(x_2)\exp\left(\left|1 - \frac{\sqrt{x_1^2 + x_2^2}}{\pi}\right|\right)\right|$
▪ Objective function
▪ Minimisation/maximisation
▪ Objective value
▪ Local optimum
▪ Global optimum
▪ Escaping sub-optimal regions
Image source: https://www.sfu.ca/~ssurjano/holder.html
Modern Search Techniques

Local Search Algorithms
▪ Single initial complete solution.
▪ Neighbourhood operators (heuristics) used to define the search space of solutions.
▪ Search stops after a termination criterion and returns the best solution found.
Modern Search Techniques

Local Search Algorithms
▪ Can easily become stuck in local optima.
▪ Requires sophisticated methods to escape these locally optimal regions.
▪ Simulated Annealing, Great Deluge, Tabu Search…
More on this topic in COMP2001/2011
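To make the local-search idea concrete, here is a minimal sketch of a simple bitstring hill climber (my own illustration, not an algorithm from the slides). It only ever accepts improving neighbours, which is exactly how such a method gets stuck in a local optimum and motivates Simulated Annealing, Great Deluge and Tabu Search:

```python
import random

def hill_climb(evaluate, length=8, max_iters=1000):
    """Minimise `evaluate` over bitstrings using single bit-flip neighbours."""
    current = [random.randint(0, 1) for _ in range(length)]
    current_cost = evaluate(current)
    for _ in range(max_iters):
        neighbour = current[:]
        i = random.randrange(length)
        neighbour[i] = 1 - neighbour[i]      # flip one bit (neighbourhood operator)
        cost = evaluate(neighbour)
        if cost < current_cost:              # accept only improvements
            current, current_cost = neighbour, cost
        # otherwise keep the current solution; once no single flip improves it,
        # the search is stuck in a local optimum
    return current, current_cost

# Example: minimise the number of 1-bits (a trivial objective for illustration)
best, cost = hill_climb(lambda bits: sum(bits))
print(best, cost)
```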
Modern Search Techniques

Evolutionary Algorithms
▪ Population of initial complete solutions.
▪ Genetic operators used to define the search space of solutions.
▪ Search stops after a termination criterion and returns the best solution found.
Modern Search Techniques

Evolutionary Algorithms
▪ A population of solutions is evolved over time to (hopefully) converge on good-quality solutions.
▪ GA/MA/PSO/…
GAs are the topic of today's lecture.
Evolutionary Algorithms
▪ Inspired by Darwin's theory of biological evolution.
▪ Genetic operators evolve the solutions over multiple generations.
▪ Solutions with better costs are regarded as most fit.
▪ The idea of "survival of the fittest" guides the search towards finding good-quality solutions.
Genetic Algorithms
▪ Evolve a population of solutions over multiple generations, exploiting the phenomenon of survival of the fittest.
▪ Analogous to the real world, offspring are formed by:
  ▪ Selecting parents
  ▪ Crossover of genes (XY chromosomes) 🧬
  ▪ Random mutations
▪ Repeat this process over many generations, replacing the old population with the new.
Standard Genetic Algorithms
▪ Evolve a population of solutions over multiple generations, exploiting the phenomenon of survival of the fittest.
▪ Analogous to the real world, offspring are formed by:
  ▪ Selecting parents
  ▪ Crossover of genes (XY chromosomes) 🧬
  ▪ Random mutations
▪ Generate an equal number of offspring to use in the next generation.
[Flowchart: Initial Population (n=0) → Population @ Gen 'n' → Selection → Crossover → Mutation → Offspring; while n < m, increment n and feed the offspring back in as the next population; once n reaches m, return the final population.]
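The flowchart above can be read as the following minimal Python sketch of a generational GA. The helper names (`select`, `crossover`, `mutate`, `evaluate`) are placeholders for the operators described on the next slides, not a fixed API:

```python
def genetic_algorithm(init_population, evaluate, select, crossover, mutate,
                      m_generations=100):
    """Minimal generational GA sketch following the slide's flowchart."""
    population = init_population()                        # Initial Population (n = 0)
    for n in range(m_generations):                        # repeat while n < m
        offspring = []
        while len(offspring) < len(population):
            parent_a = select(population, evaluate)       # Selection
            parent_b = select(population, evaluate)       # Selection
            child_a, child_b = crossover(parent_a, parent_b)   # Crossover
            offspring.append(mutate(child_a))             # Mutation
            offspring.append(mutate(child_b))             # Mutation
        population = offspring                            # generational replacement
    return population                                     # return the final population
```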
Genetic Algorithms
First things first – solution encoding / representation.
Genetic Algorithms

Encoding/Representation
▪ Depending on the problem to be solved, we need a logical representation that encodes a complete solution.
▪ This encoding is modified through the EA process to find new solutions.
▪ Bitstring for benchmark function optimisation, knapsack problem, satisfiability, …
For $x \in \mathbb{N},\ x \in [0, 255]$:
▪ $x_1$ = 01110100 (116)
▪ $x_2$ = 10110101 (181)
▪ ⋮
▪ $x_d$ = 01010011 (83)
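A quick sketch of how such a bitstring maps to an integer value and back (my own illustration of the encoding idea; 8 bits give the [0, 255] range used above):

```python
def decode(bits):
    """Interpret a bitstring (most significant bit first) as an integer."""
    return int(bits, 2)

def encode(value, n_bits=8):
    """Encode an integer in [0, 2**n_bits - 1] as a fixed-width bitstring."""
    return format(value, f"0{n_bits}b")

assert decode("01110100") == 116   # x1 from the slide
assert decode("10110101") == 181   # x2 from the slide
assert encode(83) == "01010011"    # x_d from the slide
```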
Genetic Algorithms

Encoding/Representation
▪ Depending on the problem to be solved, we need a logical representation that encodes a complete solution.
▪ This encoding is modified through the EA process to find new solutions.
▪ Forms of permutation encoding for TSP/VRP
  ▪ TSP: Route = ABDEC
  ▪ VRP: Route_A = ABC, …, Route_Z = XYZ
[Diagrams: a TSP tour over cities A–E, and a VRP with several routes leaving a depot 🏠.]
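As an illustration of the permutation encoding, a TSP candidate solution can be stored simply as an ordering of city labels, with its cost computed from a distance lookup. The coordinates below are made up purely for the example:

```python
import math

# Hypothetical 2-D coordinates for the five cities in the slide's TSP sketch.
coords = {"A": (0, 0), "B": (1, 2), "C": (3, 0), "D": (2, 3), "E": (4, 2)}

def tour_length(route):
    """Total length of a closed tour given as a permutation of city labels."""
    total = 0.0
    for i in range(len(route)):
        x1, y1 = coords[route[i]]
        x2, y2 = coords[route[(i + 1) % len(route)]]   # wrap back to the start
        total += math.hypot(x2 - x1, y2 - y1)
    return total

print(tour_length("ABDEC"))   # cost of the route used on the slide
```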
Genetic Algorithms

Genetic Operators
▪ Parent Selection
▪ Crossover
▪ Mutation
Applying these operators one after the other results in two offspring.
Genetic Algorithms – The Metaphor

GA Component          Metaphor
Solutions             Chromosomes/Individuals
Decision variables    Number of genes
Evaluation function   Fitness
Problem solving       Evolution
Genetic Algorithms

Parent Selection
Genetic Algorithms – Parent Selection

Parent Selection
▪ Roulette Wheel Selection
  ▪ Each individual assigned a "slice" of the roulette wheel.
  ▪ Size of each slice proportional to their fitness value.
  ▪ Randomly chosen by "spinning" the wheel.
[Diagram: a roulette wheel with slices for P1, P2, P3 and P4; P2 is chosen.]
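A minimal sketch of fitness-proportionate (roulette wheel) selection, assuming a maximisation setting where a larger fitness means a larger slice; for minimisation the fitness values would first have to be transformed. The fitness numbers are made up for the demo:

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness."""
    total = sum(fitnesses)
    spin = random.uniform(0, total)          # "spin" the wheel
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness                   # each slice is `fitness` wide
        if spin <= running:
            return individual
    return population[-1]                    # guard against floating-point rounding

population = ["P1", "P2", "P3", "P4"]
fitnesses = [4.0, 8.0, 2.0, 6.0]             # illustrative fitness values
print(roulette_wheel_select(population, fitnesses))
```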
Genetic Algorithms – Parent Selection

Parent Selection
▪ Tournament Selection
  ▪ Best of `n` randomly chosen distinct parents.
  ▪ `n` is the tournament size.
  ▪ Example `n` = 3
[Diagram: three of the four individuals P1–P4 are sampled with fitness values 8, 5 and 7; the fittest of the three (P2 in the example) is chosen.]
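A minimal sketch of tournament selection under the assumption of a minimisation problem (lower objective value means fitter), which matches the worked example later where best(P1, P3) returns P3:

```python
import math
import random

def tournament_select(population, evaluate, n=3):
    """Return the best of `n` distinct, randomly chosen individuals (minimisation)."""
    contestants = random.sample(population, n)    # distinct parents
    return min(contestants, key=evaluate)         # fittest = lowest objective value

# Demo with the 5-bit individuals from GA Example 1, minimising the 1-D Schwefel function.
def schwefel_1d(bits):
    x = int(bits, 2)
    return 418.9829 - x * math.sin(math.sqrt(abs(x)))

population = ["11000", "01011", "10011", "00000"]
print(tournament_select(population, schwefel_1d, n=2))
```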
Genetic Algorithms

Crossover
Genetic Algorithms – Crossover

Crossover Operators
▪ Generates a pair of offspring solutions.
▪ Crossover rate $P_c \in [0, 1]$
  ▪ large, ≥ 70% (0.7)
▪ Controls the probability of applying XO (otherwise the solutions pass through unchanged to the mutation step).
[Diagram: when $random < P_c$, Parent A and Parent B are crossed over to produce Offspring I and Offspring II; when $P_c \le random$, the parents pass through as the offspring.]
Genetic Algorithms – Crossover

Crossover Operators – 1PTX
▪ 1-Point Crossover (1PTX)
  ▪ Take a point/pivot in both solutions.
  ▪ Copy the first part from P1 and the second part from P2 to create C1.
  ▪ Vice versa for C2.
Example (Parent A = 00110011, Parent B = 10101010, crossover point after bit 3):
  p1: 001 | 10011
  p2: 101 | 01010
  c1: 001 | 01010  → Offspring I  = 00101010
  c2: 101 | 10011  → Offspring II = 10110011
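A minimal sketch of 1PTX on bitstrings; the crossover point is drawn at random in the function, whereas the slide's example fixes it after bit 3:

```python
import random

def one_point_crossover(p1, p2):
    """1PTX: swap the tails of two equal-length bitstrings at a random point."""
    point = random.randint(1, len(p1) - 1)   # pivot strictly inside the string
    c1 = p1[:point] + p2[point:]
    c2 = p2[:point] + p1[point:]
    return c1, c2

# Reproducing the slide's example with the point fixed after bit 3:
p1, p2, point = "00110011", "10101010", 3
print(p1[:point] + p2[point:])   # 00101010 (Offspring I)
print(p2[:point] + p1[point:])   # 10110011 (Offspring II)
```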
Genetic Algorithms – Crossover

Crossover Operators – UXO
▪ Uniform Crossover (UXO)
  ▪ Exchanges information between solutions at a bit-by-bit level.
  ▪ A randomised template is used to determine which bits to exchange.
  ▪ UXO template: 0 means swap, 1 means keep.
Example (Parent A = 00110011, Parent B = 10101010, template = 11001001):
  p1: 00110011
  p2: 10101010
  c1: 00100011  → Offspring I
  c2: 10111010  → Offspring II
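A minimal sketch of UXO using the slide's convention that a 0 in the template swaps the bits at that position and a 1 keeps them:

```python
def uniform_crossover(p1, p2, template):
    """UXO: for each position, 1 in the template keeps the bits, 0 swaps them."""
    c1, c2 = [], []
    for b1, b2, t in zip(p1, p2, template):
        if t == "1":                 # keep
            c1.append(b1)
            c2.append(b2)
        else:                        # swap
            c1.append(b2)
            c2.append(b1)
    return "".join(c1), "".join(c2)

print(uniform_crossover("00110011", "10101010", "11001001"))
# ('00100011', '10111010'), matching Offspring I and II on the slide
```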
Genetic Algorithms

Mutation
Genetic Algorithms

Mutation Operators
▪ Changes some genes in the solution.
▪ Mutation rate $P_m$
  ▪ small, ≤ 10%
  ▪ Probability to mutate a solution
▪ Modify a single gene
▪ Used to prevent premature convergence
Example (flipping one bit of each offspring, counting positions from the left):
  Offspring I:  00100011 → mutate bit 4 → 00110011
  Offspring II: 10111010 → mutate bit 2 → 11111010
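A minimal sketch of bit-flip mutation, applied with probability $P_m$ and flipping one randomly chosen gene. This is one reasonable reading of the slide; other GAs instead flip each bit independently with a small probability:

```python
import random

def mutate(bits, p_m=0.1):
    """With probability p_m, flip a single randomly chosen bit of the solution."""
    if random.random() >= p_m:
        return bits                          # most offspring pass through unchanged
    i = random.randrange(len(bits))          # pick one gene to modify
    flipped = "1" if bits[i] == "0" else "0"
    return bits[:i] + flipped + bits[i + 1:]

random.seed(0)
print(mutate("00100011", p_m=1.0))   # force a mutation for the demo
```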
Genetic Algorithms

Generational Replacement?
Genetic Algorithms

Generational Replacement
▪ Many different strategies…
▪ In this module, we only need to understand the fundamentals of GAs.
▪ Replace with the new population.
[Diagram: Population @ Gen 0 → Selection → Crossover → Mutation → Offspring → Population @ Gen 1 → Population @ Gen 2 → … → Population @ Gen 'm' → Final Population (to return).]
Genetic Algorithms
Example 1 – Function Optimisation
GA Example 1 (Function Optimisation)
▪ Example of using a GA to solve the 1-dimensional Schwefel function over the domain [0, 32).
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0
▪ Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$
▪ $0 \le x < 32$
▪ Can represent solutions using binary encoding with 5 bits.
▪ Randomise initial population (size = 4) as:
  ▪ P1 = 11000 (24)
  ▪ P2 = 01011 (11)
  ▪ P3 = 10011 (19)
  ▪ P4 = 00000 (0)
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0
▪ Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$

Individual   Solution   x    f(x)
P1           11000      24   442.566
P2           01011      11   420.898
P3           10011      19   436.808
P4           00000      0    418.983
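These fitness values can be reproduced by decoding each 5-bit solution and evaluating the 1-D Schwefel objective; a small sketch (my own check, not part of the slides):

```python
import math

def schwefel_1d(bits):
    """Decode a 5-bit solution and evaluate the 1-D Schwefel objective."""
    x = int(bits, 2)
    return 418.9829 - x * math.sin(math.sqrt(abs(x)))

for name, bits in [("P1", "11000"), ("P2", "01011"), ("P3", "10011"), ("P4", "00000")]:
    print(name, bits, int(bits, 2), round(schwefel_1d(bits), 3))
# P1 11000 24 442.566
# P2 01011 11 420.898
# P3 10011 19 436.808
# P4 00000 0 418.983
```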
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0
▪ Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$
▪ Tournament Selection (size = 2)
  ▪ PA = best(P1, P3) = P3
  ▪ PB = best(P1, P4) = P4
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0
▪ Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$
▪ UXO with P3, P4 and template 01011
  ▪ C1 = 00011 = 3;  f(C1) = 416.022
  ▪ C2 = 10000 = 16; f(C2) = 431.092
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0
▪ Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$
▪ UXO with P2, P4 and template 10010
  ▪ C3 = 00010 = 2; f(C3) = 417.007
  ▪ C4 = 01001 = 9; f(C4) = 417.713
GA Example 1 (Function Optimisation)
▪ Crossover rate = 1.0; Mutation rate = 0.0
▪ Minimise: $418.9829 - x\sin(\sqrt{x})$
▪ New population:

Individual   Solution   x    f(x)
P1           00011      3    416.022
P2           10000      16   431.092
P3           00010      2    417.007
P4           01001      9    417.713
GA Example 2 (TSP)
▪ Travelling Salesman Problem (TSP)
  ▪ Need to visit all locations
  ▪ Minimise distance travelled
▪ Example is actually the tram stops in Nottingham
  ▪ (assumes one can travel directly from one to the other)
▪ Permutation representation
  ▪ Each value represents a unique location
GA Example 2 (TSP)
▪ Let's imagine a simpler example…
▪ Solution is some permutation of:
  ▪ ABCDEFGHIJKL (12 locations)
▪ Crossover
  ▪ Requires a repair operator
  ▪ P1: ABCDEF | GHIJKL
  ▪ P2: ADJCEF | GHKLIB
  ▪ C1: ABCDEF | GHKLIB
  ▪ C2: ADJCEF | GHIJKL
  ▪ Not in scope of this module…
GA Example 2 (TSP)
▪ Let's imagine a simpler example…
▪ Solution is some permutation of:
  ▪ ABCDEFGHIJKL (12 locations)
▪ Mutation
  ▪ Cannot just mutate a single location, as this would create a duplicate.
  ▪ In this case, we need to swap two locations instead.
  ▪ C1: ABCDEFGHIJKL → C1': ACBDEFGHIJKL
  ▪ C2: ADJCEFGHKLIB → C2': ADJCEFGKHLIB
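A minimal sketch of swap mutation for a permutation-encoded solution, keeping every location unique. The positions are chosen at random here; the slide's examples happen to swap B/C and H/K:

```python
import random

def swap_mutation(route):
    """Swap two randomly chosen positions in a permutation-encoded route."""
    route = list(route)
    i, j = random.sample(range(len(route)), 2)   # two distinct positions
    route[i], route[j] = route[j], route[i]      # swapping keeps the permutation valid
    return "".join(route)

print(swap_mutation("ABCDEFGHIJKL"))   # e.g. ACBDEFGHIJKL when the swap picks B and C
```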
Online Animation of GA solving TSP
[Screenshots from the animation: cost 6.37 at generation 88; cost 6.34 at generation 1100.]
Key Points
▪ EAs are completely different to tree search.
▪ EAs evolve a population of solutions.
▪ EAs require different solution representations to encode the solution to the problem being solved, but the approach is general and powerful.
▪ Genetic operators and their functions:
  ▪ Parent selection
  ▪ Crossover
  ▪ Mutation
Module Activities
Coming up:

Date                             Activity
Monday 27th March                No more computing sessions
Monday-Friday (w/c 27th March)   Student Evaluation of Module (SEM) feedback
Friday 31st March, 2-4pm         Lecture – Bayes Rules