A Novel Meta-Heuristic Optimization Algorithm: Current Search
Anusorn SAKULIN and Deacha PUANGDOWNREONG*
Department of Electrical Engineering,
Faculty of Engineering, South-East Asia University
19/1 Petchakasem Rd., Nongkhaem, Bangkok, THAILAND
*Corresponding author: deachap@sau.ac.th, http://www.sau.ac.th
Abstract: - Inspired by an electric current flowing through electric networks, a novel meta-heuristic optimization algorithm named the Current Search (CS) is proposed in this article. The proposed CS algorithm is based on the intelligent behavior of an electric current flowing through open and short circuits. To demonstrate its effectiveness and robustness, the proposed CS algorithm is tested against five well-known benchmark continuous multivariable test functions collected by Ali et al. The results obtained by the proposed CS are compared with those obtained by popular search techniques widely used to solve optimization problems, i.e., the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and Tabu Search (TS). The results show that the proposed CS outperforms the other algorithms and obtains superior solutions within a reasonable search time.
Key-Words: - Current Search, Genetic Algorithm, Particle Swarm Optimization, Tabu Search
1 Introduction
Over the past five decades, many heuristic algorithms have been developed to solve combinatorial and numeric optimization problems [1]. In the literature, several intelligent search techniques, e.g., Evolutionary Programming (EP) [2], Tabu Search (TS) [3], Simulated Annealing (SA) [4], the Genetic Algorithm (GA) [5], Ant Colony Optimization (ACO) [6], Hit-and-Run (HNR) [7], Hide-and-Seek (HNS) [8], Particle Swarm Optimization (PSO) [9], Harmony Search (HS) [10], Bacterial Foraging Optimization (BFO) [11], the Shuffled Frog Leaping Algorithm (SFLA) [12], Bee Colony Optimization (BCO) [13], Key Cutting Search (KCS) [14], and Hunting Search (HuS) [15], have been proposed. These algorithms can be classified into different groups depending on the nature of the criteria being considered, such as population-based (EP, GA, ACO, PSO, BFO, BCO, and HuS), neighborhood-based (TS), iterative-based (SFLA), stochastic (KCS, HNR, and HNS), and deterministic (SA). Among them, GA, PSO, and TS are the most popular intelligent search techniques and are widely used to solve optimization and engineering problems.
In this article, the Current Search (CS), a powerful and efficient meta-heuristic optimization search technique, is proposed. The CS algorithm is inspired by the electric current flowing through electric circuits. The proposed CS algorithm is coded and tested against five benchmark continuous multi-dimensional test functions collected by Ali et al. [16]. The obtained results are compared with those obtained by GA, PSO, and TS. This article consists of five sections: the CS algorithm is described in Section 2; the benchmark continuous multi-dimensional test functions used in this article are given in Section 3; the performance evaluation of the CS compared with the GA, PSO, and TS algorithms against the five benchmark multivariable test functions is illustrated in Section 4; and conclusions are provided in Section 5.
2 Current Search Algorithm
Based on the principle of the current divider in electric circuit theory [17], the electric current flows through all branches connected in parallel, as can be seen in Fig. 1. Each branch contains a resistor R of a different resistance obstructing the current. Assume that $0 < R_1 < R_2 < \cdots < R_N$. In the fundamentals of circuit theory [17], Kirchhoff's current law (KCL) states that the algebraic sum of the currents entering a node is zero. In other words, the sum of the currents entering a node is equal to the sum of the currents leaving the node. This means that, in Fig. 1, the sum of the currents in all branches is equal to the total current supplied by the current source, as expressed in (1), where $i_T$ is the total current and $i_j$ is the current in the $j$-th branch.
$\sum_{j=1}^{N} i_j = i_T$    (1)
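As a quick numerical illustration of (1) and the current-divider rule, the following short Python sketch (a hypothetical example, not part of the original article; the source current and resistance values are assumed) computes the branch currents of a parallel network and verifies that they sum to the total current:

```python
# Current divider: in a parallel network, branch j with resistance R_j
# carries i_j = i_T * (1/R_j) / sum_k(1/R_k); the smaller the resistance,
# the larger the current, mirroring how the CS favors better directions.

i_T = 10.0                       # total source current (A), assumed value
R = [1.0, 2.0, 5.0, 10.0]        # branch resistances, 0 < R1 < R2 < ... < RN

G = [1.0 / r for r in R]         # branch conductances
i = [i_T * g / sum(G) for g in G]

print(i)                         # branch currents, largest through R1
assert abs(sum(i) - i_T) < 1e-9  # Kirchhoff's current law, Eq. (1)
```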
The behavior of electric current is like a tide that always flows to lower places. The less the resistance of a branch, the more current flows through it (see Fig. 1, where the thickness of the arrows represents the current quantity). Referring to Fig. 1, in the case of a short circuit, the branch resistance is zero and the branch acts as a conductor, while, in the case of an open circuit, the branch resistance is infinite and the branch acts as an insulator. The Current Search (CS) algorithm is inspired by this concept. All branches represent the feasible solutions in the search space. Local entrapment occurs when the current hits an open-circuit connection. The optimum solution found is the branch possessing the optimum resistance.
Fig. 1 The behavior of electric current.
The CS algorithm is described step-by-step as follows.
Step 1. Initialize the search space Ω, the iteration counters $k = j = 1$, the maximum allowance of solution cycling $j_{max}$, the number of initial solutions (feasible directions of the currents in the network) $N$, the number of neighborhood members $n$, the search radius $\rho$, and the sets $\Psi = \Gamma = \Xi = \emptyset$.
Step 2. Uniformly randomize the initial solutions $X_i$, $i = 1, \ldots, N$, within Ω.
Step 3. Evaluate the objective function $f(X_i)$ of all $X_i$. Rank $X_i$, $i = 1, \ldots, N$, such that $f(X_1) < \cdots < f(X_N)$, then store the ranked $X_i$ into $\Psi$.
Step 4. Let $x_0 = X_k$ be the selected initial solution.
Step 5. Uniformly randomize the neighborhood members $x_i$, $i = 1, \ldots, n$, around $x_0$ within the radius $\rho$.
Step 6. Evaluate the objective function $f(x_i)$ of all $x_i$. The solution giving the minimum objective function is set as $x'$.
Step 7. If $f(x') < f(x_0)$, keep $x_0$ in the set $\Gamma$, set $x_0 = x'$, set $j = 1$, and return to Step 5. Otherwise, update $j = j + 1$.
Step 8. If $j < j_{max}$, return to Step 5. Otherwise, keep $x_0$ in the set $\Xi$ and update $k = k + 1$.
Step 9. Terminate the search process when the termination criteria are satisfied; the optimum solution found is $x_0$. Otherwise, return to Step 4.
The diagram in Fig. 2 reveals the search process of the proposed CS algorithm.
Fig. 2 The diagram of the proposed CS algorithm.
In Step 1, the search space Ω is defined as the feasible boundary where the electric current can flow. The maximum allowance of solution cycling $j_{max}$ implies the local entrapment occurring in the selected direction. The number of initial solutions $N$ is set as the number of feasible directions of the electric currents in the network. The number of neighborhood members $n$ is provided as the number of sub-directions of the electric currents in the selected direction, and the search radius $\rho$ is given as the sub-search space where the electric current can flow in the selected direction.
In Steps 2-3, the uniform random approach is conducted to generate the feasible directions of the electric currents. These directions are ranked by the objective function to arrange the significance of the directions from most to least.
In Steps 4-7, once the most significant direction of the current is selected, the search process consecutively looks for the optimum solution along that direction within the sub-search space where the electric current can flow. Each feasible solution is evaluated via the objective function until the optimum solution is found.
In Steps 8-9, the local entrapment in the selected direction is identified via the maximum allowance of solution cycling. If entrapment occurs, the second, the third, and subsequent significant directions ranked in Steps 2-3 are consecutively employed, until the optimum solution is found or the termination criteria are met.
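To make Steps 1-9 concrete, the following minimal Python sketch follows the procedure above under simplified termination criteria (it exhausts the N ranked directions rather than also checking f ≤ f_max). It is an illustrative reading of the steps, not the authors' original MATLAB code; all function names and parameter defaults are assumptions.

```python
import random

def current_search(f, bounds, N=10, n=10, rho=0.1, j_max=10, max_iter=100):
    """Minimal sketch of the Current Search (Steps 1-9).

    f      : objective function taking a list of coordinates
    bounds : list of (low, high) pairs defining the search space
    """
    rand_point = lambda: [random.uniform(lo, hi) for lo, hi in bounds]

    # Steps 1-3: random initial solutions (current directions), ranked by f
    X = sorted((rand_point() for _ in range(N)), key=f)
    trapped = []                         # set Xi of locally trapped solutions

    for k in range(N):                   # Step 4: pick the k-th direction
        x0, j, iteration = X[k], 1, 0
        while j < j_max and iteration < max_iter:
            iteration += 1
            # Step 5: n neighborhood members around x0 within radius rho
            neigh = [[xi + random.uniform(-rho, rho) for xi in x0]
                     for _ in range(n)]
            x_best = min(neigh, key=f)   # Step 6: elite neighbor x'
            if f(x_best) < f(x0):        # Step 7: move and reset cycling
                x0, j = x_best, 1
            else:                        # Step 8: count solution cycling
                j += 1
        trapped.append(x0)               # direction locally entrapped

    return min(trapped, key=f)           # Step 9: best solution found

# Usage example on a two-variable function, with assumed parameter values:
if __name__ == "__main__":
    import math
    bf = lambda x: (x[0]**2 + 2*x[1]**2 - 0.3*math.cos(3*math.pi*x[0])
                    - 0.4*math.cos(4*math.pi*x[1]) + 0.7)
    print(current_search(bf, [(-1, 1), (-1, 1)], N=50, n=50, rho=0.005))
```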
3 Benchmark Functions
In this section, the five well-known benchmark continuous multivariable test functions collected by Ali et al. [16] are described as follows.

(i) The Bohachevsky function (BF) is expressed in (2). The global minimum is located at $x' = (0, 0)$ with $f(x') = 0$. Let $f_{max} = 1\times10^{-6}$ be the maximum allowance of the global solution found. The Bohachevsky surface is depicted in Fig. 3.

$\min_x f(x) = x_1^2 + 2x_2^2 - 0.3\cos(3\pi x_1) - 0.4\cos(4\pi x_2) + 0.7$,
subject to $-1 \le x_1, x_2 \le 1$    (2)

Fig. 3 Bohachevsky surface.

(ii) The Rastrigin function (RF) is expressed in (3). The global minimum is located at $x' = (0, 0)$ with $f(x') = 0$. Let $f_{max} = 1\times10^{-6}$ be the maximum allowance of the global solution found. The Rastrigin surface is depicted in Fig. 4.

$\min_x f(x) = x_1^2 + x_2^2 - 10\cos(2\pi x_1) - 10\cos(2\pi x_2) + 20$,
subject to $-5 \le x_1, x_2 \le 5$    (3)

Fig. 4 Rastrigin surface.

(iii) The Shekel's Fox-Holes function (SF) is expressed in (4). It is the fifth function of De Jong's test suite. The global minimum is located at $x' = (-32, -32)$ with $f(x') \approx 0.998$. Let $f_{max} = 0.9990$ be the maximum allowance of the global solution found. The Shekel's Fox-Holes surface is depicted in Fig. 5.

$\min_x f(x) = \left[ \dfrac{1}{500} + \sum_{j=1}^{25} \dfrac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right]^{-1}$,
subject to $-50 \le x_1, x_2 \le 50$    (4)

where $a_{ij} = \begin{pmatrix} -32 & -16 & 0 & 16 & 32 & -32 & \cdots & 0 & 16 & 32 \\ -32 & -32 & -32 & -32 & -32 & -16 & \cdots & 32 & 32 & 32 \end{pmatrix}$

Fig. 5 Shekel's Fox-Holes surface.

(iv) The Schwefel function (SchF) is expressed in (5). The global minimum is located at $x' = (420.9687, 420.9687)$ with $f(x') = 0$. Let $f_{max} = 1\times10^{-6}$ be the maximum allowance of the global solution found. The Schwefel surface is depicted in Fig. 6.

$\min_x f(x) = 418.9829\,n - \sum_{i=1}^{n} x_i \sin\left(\sqrt{|x_i|}\right)$, $n = 2$,
subject to $-500 \le x_1, x_2 \le 500$    (5)

Fig. 6 Schwefel surface.

(v) The Shubert function (ShuF) is expressed in (6). The global minima are located at 18 different locations with $f(x') = -186.7309$. Let $f_{max} = -186.73$ be the maximum allowance of the global solution found. The Shubert surface is depicted in Fig. 7.

$\min_x f(x) = \left( \sum_{i=1}^{5} i \cos[(i+1)x_1 + i] \right) \left( \sum_{i=1}^{5} i \cos[(i+1)x_2 + i] \right)$,
subject to $-10 \le x_1, x_2 \le 10$    (6)

Fig. 7 Shubert surface.
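For reference, the five benchmarks can be written directly from (2)-(6). The following Python sketch gives assumed implementations (the function and variable names are illustrative, not code from the article):

```python
import math

def bohachevsky(x):   # Eq. (2), global minimum f(0, 0) = 0
    return (x[0]**2 + 2*x[1]**2 - 0.3*math.cos(3*math.pi*x[0])
            - 0.4*math.cos(4*math.pi*x[1]) + 0.7)

def rastrigin(x):     # Eq. (3), global minimum f(0, 0) = 0
    return (x[0]**2 + x[1]**2 - 10*math.cos(2*math.pi*x[0])
            - 10*math.cos(2*math.pi*x[1]) + 20)

# 5x5 grid of fox holes at coordinates {-32, -16, 0, 16, 32}^2, giving a_ij
A = [(a, b) for b in (-32, -16, 0, 16, 32) for a in (-32, -16, 0, 16, 32)]

def shekel_foxholes(x):  # Eq. (4), global minimum ~0.998 at (-32, -32)
    s = 1.0 / 500
    for j, (a1, a2) in enumerate(A, start=1):
        s += 1.0 / (j + (x[0] - a1)**6 + (x[1] - a2)**6)
    return 1.0 / s

def schwefel(x):      # Eq. (5) with n = 2, minimum at (420.9687, 420.9687)
    n = len(x)
    return 418.9829*n - sum(xi*math.sin(math.sqrt(abs(xi))) for xi in x)

def shubert(x):       # Eq. (6), 18 global minima with f = -186.7309
    s1 = sum(i*math.cos((i+1)*x[0] + i) for i in range(1, 6))
    s2 = sum(i*math.cos((i+1)*x[1] + i) for i in range(1, 6))
    return s1 * s2
```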
4 Performance Evaluation

4.1 CS performance tests
In this section, the CS is tested against the five benchmark continuous multivariable test functions described in Section 3 to demonstrate its effectiveness and robustness. The CS algorithm was coded in MATLAB running on an Intel Core2 Duo 2.0 GHz computer with 3 GB of DDR RAM. The CS parameters were reasonably preset for each test function as summarized in Table 1, where N is the number of initial solutions, n is the number of neighborhood members, and ρ is the search radius. A maximum of 100 search iterations and f ≤ f_max were set as the termination criteria. The tests were conducted over 100 trial runs against each test function to obtain the percentage of successful detections of the global minimum (%Success). Table 2 summarizes the sets of parameter values that achieved 100 %Success for each test function over the 100 trial runs.

Table 1 CS parameter values for test functions.
Function | Parameter | Tested values
BF   | N | 10, 20, 30, 40, 50, 60, 70
     | n | 10, 50, 100, 150, 200, 250, 300
     | ρ | 0.005, 0.0075, 0.01, 0.015, 0.02, 0.025, 0.03
RF   | N | 100, 150, 200, 250, 300, 350, 400
     | n | 50, 100, 150, 200, 250, 300, 350
     | ρ | 0.001, 0.0025, 0.005, 0.0075, 0.01, 0.0125, 0.015
SF   | N | 50, 60, 70, 80, 90, 100, 110
     | n | 20, 30, 40, 50, 60, 70, 80
     | ρ | 0.07, 0.09, 0.11, 0.13, 0.15, 0.17, 0.19
SchF | N | 100, 200, 300, 400, 500, 600, 700
     | n | 100, 200, 300, 400, 500, 600, 700
     | ρ | 0.5, 1.0, 1.25, 1.50, 1.75, 2.0, 2.50
ShuF | N | 50, 75, 100, 125, 150, 175, 200
     | n | 20, 40, 60, 80, 100, 120, 140
     | ρ | 0.01, 0.02, 0.03, 0.04, 0.05, 0.06, 0.07

Table 2 Results of CS performance tests.
Entry | N       | n       | ρ            | %Success
BF   | 50–70   | 50–300  | 0.005–0.01   | 100
RF   | 300–400 | 250–350 | 0.005–0.0075 | 100
SF   | 80–100  | 40–80   | 0.11–0.19    | 100
SchF | 500–700 | 500–700 | 1.0–1.5      | 100
ShuF | 125–200 | 80–140  | 0.01–0.05    | 100

Referring to Table 2, the global minima of all test functions can be found with 100 %Success. It can be noticed that the proposed CS algorithm is efficient and robust for the given ranges of search parameter values. As an example, some movements and convergence rates of the cost function obtained by the CS over the Bohachevsky surface are depicted in Fig. 8. The convergence curves of the other test functions are omitted because they have a form similar to that of the Bohachevsky function shown in Fig. 8. The results in Table 2 provide recommendations for users to set the search parameters of the CS appropriately. However, the proposed CS is still problem-dependent; understanding the problem and selecting appropriate parameters are essential for successful applications.
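As an illustration of the evaluation protocol behind Table 2, a %Success measurement over repeated independent trials might look like the following Python sketch (the trial count, threshold handling, and names are assumptions for the example; `current_search` and `bohachevsky` refer to the earlier sketches):

```python
def percent_success(f, bounds, f_max, trials=100, **cs_params):
    """Run repeated independent CS trials and count how often the
    best cost reaches the success threshold f_max (as in Table 2)."""
    hits = 0
    for _ in range(trials):
        best = current_search(f, bounds, **cs_params)
        if f(best) <= f_max:          # success criterion of Section 4.1
            hits += 1
    return 100.0 * hits / trials

# e.g., %Success of the CS on the Bohachevsky function with values
# inside the ranges of Table 2:
# percent_success(bohachevsky, [(-1, 1), (-1, 1)], f_max=1e-6,
#                 N=50, n=50, rho=0.005)
```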
(a) Movements of the CS.
(b) Convergences of cost function.
Fig. 8 Some results of the CS over BF.
4.2 Performance comparison
To compare the proposed CS algorithm with GA, PSO, and TS, the CS and the other algorithms were coded in MATLAB running on the same Intel Core2 Duo 2.0 GHz computer with 3 GB of DDR RAM.
In GA, $n$ is the population size; single-point uniform crossover with a rate of 0.95, a random selection mechanism, and Gaussian mutation with a rate of 0.1 are used. In PSO, $n$ is the swarm size. The velocity vector $v$ and the solution $x$ are updated as expressed in (7) and (8), respectively, where $\omega$ is the additional inertia weight, which varies linearly from 0.9 to 0.7 with the iteration; the learning factors $\phi_1$ and $\phi_2$ are both set to 2; and the upper and lower bounds of $v$ are set as $(v_{min}, v_{max}) = (x_{min}, x_{max})$. In TS, $n$ is the number of neighborhood members, $\rho$ is the search radius, and a uniform random search mechanism is used; the aspiration criterion (back-tracking mechanism) is used to escape local entrapments. In CS, $n$ is the number of neighborhood members, $\rho$ is the search radius, and a uniform random search mechanism is conducted.

$v(t+1) = \omega v(t) + \phi_1\,\mathrm{rand}(0,1)\{p_{best}(t) - x(t)\} + \phi_2\,\mathrm{rand}(0,1)\{g_{best}(t) - x(t)\}$    (7)

$x(t+1) = x(t) + v(t+1)$    (8)
For a fair comparison, the parameter values of each algorithm are set as summarized in Table 3. A maximum of 100 search iterations and f ≤ f_max are set as the termination criteria.

Table 3 Parameter values of algorithms.
Entry | Parameters | GA      | PSO     | TS      | CS
BF   | N | -       | -       | -       | 50
     | n | 2,500   | 2,500   | 2,500   | 50
     | ρ | -       | -       | 0.005   | 0.005
RF   | N | -       | -       | -       | 400
     | n | 100,000 | 100,000 | 100,000 | 250
     | ρ | -       | -       | 0.0075  | 0.0075
SF   | N | -       | -       | -       | 80
     | n | 3,200   | 3,200   | 3,200   | 40
     | ρ | -       | -       | 0.15    | 0.15
SchF | N | -       | -       | -       | 500
     | n | 350,000 | 350,000 | 350,000 | 700
     | ρ | -       | -       | 1.00    | 1.00
ShuF | N | -       | -       | -       | 125
     | n | 10,000  | 10,000  | 10,000  | 80
     | ρ | -       | -       | 0.01    | 0.01

The performance comparison tests of the candidate algorithms were conducted over 100 trial runs to obtain the cost functions found, the minimum solutions found, the average iterations (generations or search rounds) used, and the average search time consumed. The results are summarized in Tables 4-7, respectively. It was found that the proposed CS outperforms the other algorithms.
Table 4 Cost function.
Entry | Cost | GA         | PSO        | TS         | CS
BF   | Min  | 2.83×10^-7 | 2.92×10^-9 | 3.26×10^-7 | 5.63×10^-9
     | Max  | 2.73×10^-4 | 2.22×10^-5 | 1.68×10^-4 | 7.39×10^-7
     | Ave  | 9.82×10^-5 | 2.95×10^-6 | 3.23×10^-5 | 2.02×10^-7
     | Std. | 8.97×10^-5 | 5.01×10^-6 | 3.86×10^-5 | 2.04×10^-7
RF   | Min  | 1.87×10^-4 | 5.36×10^-9 | 1.63×10^-7 | 2.98×10^-9
     | Max  | 4.33×10^-3 | 1.79×10^-5 | 1.41×10^-4 | 9.84×10^-7
     | Ave  | 1.69×10^-3 | 4.95×10^-6 | 3.86×10^-5 | 4.08×10^-7
     | Std. | 1.33×10^-3 | 5.82×10^-6 | 4.18×10^-5 | 2.99×10^-7
SF   | Min  | 0.9980     | 0.9980     | 0.9980     | 0.9980
     | Max  | 0.9999     | 0.9992     | 0.9994     | 0.9980
     | Ave  | 0.9986     | 0.9983     | 0.9981     | 0.9980
     | Std. | 6.23×10^-4 | 3.65×10^-4 | 3.67×10^-4 | 1.98×10^-11
SchF | Min  | 2.37×10^-3 | 2.85×10^-7 | 1.08×10^-5 | 4.48×10^-8
     | Max  | 1.54×10^-1 | 2.11×10^-4 | 236.8771   | 5.97×10^-7
     | Ave  | 6.24×10^-2 | 4.92×10^-5 | 100.6727   | 4.81×10^-7
     | Std. | 4.47×10^-2 | 5.16×10^-5 | 79.4509    | 2.62×10^-7
ShuF | Min  | -186.7288  | -186.7309  | -186.7305  | -186.7309
     | Max  | -186.6488  | -186.5735  | -186.5504  | -186.7307
     | Ave  | -186.6969  | -186.7102  | -186.6988  | -186.7309
     | Std. | 2.31×10^-2 | 3.60×10^-2 | 4.91×10^-2 | 4.16×10^-5
Table 5 Minimum solutions.
Entry |    | GA          | PSO         | TS          | CS
BF   | x1 | -1.31×10^-4 | -1.90×10^-7 | 5.82×10^-5  | 1.82×10^-5
     | x2 | 3.38×10^-5  | -9.33×10^-6 | 9.10×10^-5  | -5.24×10^-6
RF   | x1 | 6.60×10^-4  | -3.71×10^-7 | 1.82×10^-6  | 3.87×10^-6
     | x2 | 7.11×10^-4  | -5.19×10^-6 | -2.86×10^-5 | -2.43×10^-7
SF   | x1 | -32.0065    | -31.9644    | -32.0621    | -31.9784
     | x2 | -31.9193    | -32.0375    | -31.9671    | -31.9792
SchF | x1 | 421.1043    | 420.9694    | 421.0139    | 420.9689
     | x2 | 420.9899    | 420.9674    | 421.3989    | 420.9693
ShuF | x1 | 5.4838      | -1.4250     | 4.8584      | -7.7083
     | x2 | -1.4254     | -0.8003     | -0.1116     | -7.0834
Table 6 Average iteration.
Entry | GA    | PSO   | TS    | CS
BF   | 98.70 | 75.90 | 97.04 | 1.70
RF   | 100   | 80.85 | 96.59 | 34.25
SF   | 9.05  | 2.50  | 8.60  | 21.60
SchF | 100   | 98.40 | 100   | 45.65
ShuF | 100   | 85.90 | 90.65 | 11.95
Table 7 Average search time (sec.).
Entry | GA         | PSO    | TS       | CS
BF   | 9.1088     | 0.1914 | 0.1609   | 0.0094
RF   | 1,598.6415 | 8.0497 | 37.8574  | 0.5577
SF   | 4.1294     | 1.0314 | 2.4265   | 7.7621
SchF | 9,544.6173 | 169.19 | 995.8310 | 15.2636
ShuF | 91.0349    | 1.6936 | 1.68     | 0.1671
5 Conclusion
In this article, a novel meta-heuristic optimization algorithm named the Current Search (CS) has been proposed. It is inspired by an electric current flowing through electric networks. Its effectiveness and robustness have been demonstrated on well-known benchmark test functions, and the results obtained by the CS have been compared with those obtained by GA, PSO, and TS. From the results, it can be concluded that the proposed CS is superior to the other algorithms. Moreover, recommendations for setting the CS parameters have been given. As future work, the CS still needs to be applied to more complex and real-world problems, both discrete and continuous, such as engineering problems.
References:
[1] D.T. Pham and D. Karaboga, Intelligent Optimisation Techniques, Springer, London, 2000.
[2] L.J. Fogel, A.J. Owens, and M.J. Walsh, Artificial Intelligence through Simulated Evolution, John Wiley, 1966.
[3] F. Glover and M. Laguna, Tabu Search, Kluwer Academic Publishers, 1997.
[4] S. Kirkpatrick, C.D. Gelatt, and M.P. Vecchi, Optimization by Simulated Annealing, Science, Vol. 220, No. 4598, 1983, pp. 671-680.
[5] D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley Publishers, 1989.
[6] M. Dorigo, Optimization, Learning and Natural Algorithms, PhD thesis, Politecnico di Milano, Italy, 1992.
[7] Z.B. Zabinsky, D.L. Graesser, M.E. Tuttle, and G.I. Kim, Global Optimization of Composite Laminates Using Improving Hit and Run, in: C. Floudas and P. Pardalos (eds.), Recent Advances in Global Optimization, Princeton University Press, 1992, pp. 343-368.
[8] H.E. Romeijn and R.L. Smith, Simulated Annealing for Constrained Global Optimization, Journal of Global Optimization, Vol. 5, 1994, pp. 101-126.
[9] J. Kennedy and R. Eberhart, Particle Swarm Optimization, Proceedings of the IEEE International Conference on Neural Networks, Vol. 4, 1995, pp. 1942-1948.
[10] Z.W. Geem, J.H. Kim, and G.V. Loganathan, A New Heuristic Optimization Algorithm: Harmony Search, Simulation, Vol. 76, No. 2, 2001, pp. 60-68.
[11] K.M. Passino, Biomimicry of Bacterial Foraging for Distributed Optimization and Control, IEEE Control Systems Magazine, Vol. 22, 2002, pp. 52-67.
[12] M.M. Eusuff and K.E. Lansey, Optimization of Water Distribution Network Design Using the Shuffled Frog Leaping Algorithm, Journal of Water Resources Planning and Management, Vol. 129, No. 3, 2003, pp. 210-225.
[13] D.T. Pham, A. Ghanbarzadeh, E. Koç, S. Otri, S. Rahim, and M. Zaidi, The Bees Algorithm - A Novel Tool for Complex Optimisation Problems, Proceedings of the IPROMS 2006 Conference, 2006, pp. 454-461.
[14] J. Qin, A New Optimization Algorithm and Its Application - Key Cutting Algorithm, Grey Systems and Intelligent Services, 2009, pp. 1537-1541.
[15] R. Oftadeh, M.J. Mahjoob, and M. Shariatpanahi, A Novel Meta-Heuristic Optimization Algorithm Inspired by Group Hunting of Animals: Hunting Search, Computers and Mathematics with Applications, Vol. 60, 2010, pp. 2087-2098.
[16] M.M. Ali, C. Khompatraporn, and Z.B. Zabinsky, A Numerical Evaluation of Several Stochastic Algorithms on Selected Continuous Global Optimization Test Problems, Journal of Global Optimization, Vol. 31, 2005, pp. 635-672.
[17] C.K. Alexander and M.N.O. Sadiku, Fundamentals of Electric Circuits, McGraw-Hill, 2004.