Chapter 1
Introduction
1.1 Overview
Human nature is such that people have always pursued the best quality of life attainable with the available resources. Design is a tool for achieving this effectively and efficiently, and design tools help satisfy the drive to improve every aspect of life. The ever-increasing demands placed on engineers have prompted them to look for rigorous methods of decision-making, such as optimization methods, to design and produce products more economically and efficiently. Optimization techniques, having reached a degree of maturity over the past several years, are being used across a wide spectrum of industries, including the aerospace, automotive, chemical, electrical, and manufacturing industries.
With rapidly advancing computer technology over the past few years, computers have become more powerful, and this increase in computational capability has enabled tremendous advances in all disciplines of engineering. Computers have become an integral part of engineering design, manufacturing, and analysis. Optimization methods, coupled with modern tools of computer-aided design, are also being used to enhance the creative process of conceptual and detailed design of engineering systems.
Recent technological advances have led many researchers to rely on hardware efficiency to solve complex mathematical problems. In the field of numerical optimization, work is constantly being performed to find methods that solve optimal design problems more efficiently. However, many efficient numerical optimization algorithms make use of first- and second-order derivative information to reach the optimum, which requires the objective function and constraints to be expressed as closed-form mathematical equations. Such methods converge in few iterations, but each iteration carries the cost of computing derivatives. Moreover, in many real-world problems the objective function is nothing but a black box, where only the output corresponding to given inputs is known. In those cases derivative information must be obtained through approximate methods, further increasing the computational cost.
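For illustration, consider how derivative information can be approximated for a black-box function by forward differences. The following is a minimal sketch in Python; the two-variable objective and the step size h are illustrative choices, not part of any particular method discussed here.

    import numpy as np

    def fd_gradient(f, x, h=1.0e-6):
        """Approximate the gradient of a black-box function f at x by
        forward differences: one extra f-evaluation per variable."""
        x = np.asarray(x, dtype=float)
        fx = f(x)
        grad = np.empty_like(x)
        for i in range(x.size):
            xp = x.copy()
            xp[i] += h                    # perturb one design variable
            grad[i] = (f(xp) - fx) / h    # forward-difference quotient
        return grad

    # Illustrative black-box objective: only input-output pairs are used.
    f = lambda x: x[0]**2 + 3.0 * x[1]**2
    print(fd_gradient(f, [1.0, 2.0]))     # approximately [2.0, 12.0]

For an n-variable problem this costs n + 1 function evaluations per gradient, which is precisely the extra computational cost referred to above.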
A second issue is that many of these methods are suited to unimodal design problems, whereas many real-life problems are multimodal in nature. While much work has been done to develop algorithms that handle such problems, many of them rely on the power of the computational hardware to find a solution efficiently. There is a definite need for more elegant mathematical approaches that simplify both the type of calculations being performed and the manner in which a design space is searched for a solution.
Thus, the objective of this work was to develop an efficient math-based optimization algorithm that locates the global optimum of multimodal problems without requiring derivative information. To provide a better understanding of the research issue and the approach proposed in this thesis, an overview of optimization terminology and optimization methods is given in the remainder of this chapter.
1.2 Optimization
1.2.1 Introduction
Optimization is the act of obtaining the best result under given circumstances. In the most general terms, optimization theory is a body of mathematical results and numerical methods for finding and identifying the best candidate solutions from a collection of alternatives without having to explicitly enumerate and evaluate every possible alternative [1]. In the design, construction, and maintenance of any engineering system, engineers have to make many technological and managerial decisions at several stages. The ultimate goal of all such decisions is either to minimize the effort required or to maximize the desired benefit. Since the effort required or the benefit desired in any practical situation can be expressed as a function of certain decision variables, optimization can be defined as the process of finding the conditions that give the maximum or minimum value of a function. It can be seen from Figure 1.1 [2] that if a point X* corresponds to a minimum value of a function f(X), the same point also corresponds to the maximum value of the negated function, −f(X). Thus, without loss of generality, optimization can be taken to mean minimization, since the maximum of a function can be found by seeking the minimum of the negative of the same function. In optimization, the primary objective of the designer is termed the objective function, and the variables of the problem over which the designer has control are termed design variables. An optimization problem can be unconstrained, or its design space can be constrained. Problems whose only constraints are side constraints, which identify the feasible or acceptable range of each variable, are termed box-constrained and are commonly grouped with the unconstrained problems. In general there are two types of design constraints: equality constraints, where some function of the design variables must conform to a fixed value, and inequality constraints, where some function of the design variables may not be greater (or less) than a prescribed value.
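As a minimal illustration of the equivalence between maximization and minimization noted above, the following Python sketch (using SciPy; the quadratic function is an arbitrary example) maximizes f(x) = −(x − 2)² by minimizing its negative:

    from scipy.optimize import minimize_scalar

    # Maximize f(x) = -(x - 2)^2 by minimizing its negative, -f(x).
    f = lambda x: -(x - 2.0)**2
    res = minimize_scalar(lambda x: -f(x))
    print(res.x)      # ~2.0: the maximizer of f is the minimizer of -f
    print(-res.fun)   # ~0.0: the maximum value of f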
1.2.2 Problem Formulation
An optimization problem can be stated as follows.

Minimize:

    $f(X_1, X_2, \ldots, X_n)$    1.1

Subject to constraints:

    $g_j(X_1, X_2, \ldots, X_n) \le 0, \quad j = 1, 2, \ldots, m$    1.2

    $l_j(X_1, X_2, \ldots, X_n) = 0, \quad j = 1, 2, \ldots, p$    1.3

Side constraints:

    $X_i^l \le X_i \le X_i^u, \quad i = 1, 2, \ldots, n$    1.4
where X is an n-dimensional vector called the design vector, f(X) is termed the objective function, and $g_j(X)$ and $l_j(X)$ are known as the inequality and equality constraints, respectively. The problem stated above is called a constrained optimization problem. In this work, problems will be expressed in this standard form; even in cases where maximization is desired, the negative of the function will be minimized. If F denotes the feasible design space, that is, the portion of the design space where all the constraints are satisfied, then a point X* is a global minimum if and only if f(X*) ≤ f(X) for all X belonging to F.
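To make the standard form concrete, the following sketch states a small constrained problem and solves it with SciPy's SLSQP routine. The objective, constraints, bounds, and starting point are all illustrative; note that SciPy expects inequality constraints in the form fun(X) ≥ 0, so the sign of g is flipped relative to the standard form above.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative objective: f(X) = (X1 - 1)^2 + (X2 - 2)^2
    f = lambda X: (X[0] - 1.0)**2 + (X[1] - 2.0)**2

    constraints = [
        # Inequality constraint g(X) = X1 + X2 - 2 <= 0,
        # rewritten as -(X1 + X2 - 2) >= 0 for SciPy.
        {"type": "ineq", "fun": lambda X: -(X[0] + X[1] - 2.0)},
        # Equality constraint l(X) = X1 - X2 = 0.
        {"type": "eq", "fun": lambda X: X[0] - X[1]},
    ]
    bounds = [(0.0, 3.0), (0.0, 3.0)]   # side constraints Xi^l <= Xi <= Xi^u

    res = minimize(f, x0=np.array([0.5, 0.5]), method="SLSQP",
                   bounds=bounds, constraints=constraints)
    print(res.x)   # approximately [1.0, 1.0]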
1.2.3 Optimization Algorithms
There are many traditional methods available for unconstrained as well as
constrained problems. Powell’s method, the conjugate direction method of
Fletcher and Reeves, and the Variable Metric method are few examples for
unconstrained problems while, for constrained problems we have Sequential
Linear Programming (SLP) method, the Method of Feasible Direction, Sequential
Quadratic Programming (SQP) method etc. All these optimization algorithms
work on the idea of iterative search in a direction where the objective function is
decreasing. The design variables are updated using the Up-date formula [3].
    $X^q = X^{q-1} + \alpha^* S^q$    1.5

where X is the design variable vector, $S^q$ is the search direction, $\alpha^*$ is a scalar multiplier determining the amount of change in the design variables, and q is the iteration number.
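The following minimal steepest-descent sketch in Python shows how the update formula drives such an iterative search. The choice $S^q = -\nabla f$ and the simple step-halving line search for $\alpha^*$ are illustrative; practical methods use more sophisticated directions and step-size rules.

    import numpy as np

    def steepest_descent(f, grad, x0, iters=50):
        """Iterative search using the update formula
        X^q = X^(q-1) + alpha* S^q, with S^q = -grad f(X)."""
        x = np.asarray(x0, dtype=float)
        for _ in range(iters):
            s = -grad(x)                  # search direction S^q
            alpha = 1.0
            # Halve the step until it actually decreases f.
            while f(x + alpha * s) >= f(x) and alpha > 1e-12:
                alpha *= 0.5
            x = x + alpha * s             # update formula 1.5
        return x

    f = lambda x: x[0]**2 + 3.0 * x[1]**2
    grad = lambda x: np.array([2.0 * x[0], 6.0 * x[1]])
    print(steepest_descent(f, grad, [2.0, 1.0]))   # converges towards [0, 0]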
Traditionally, non-linear programming methods were developed to find local minima. The methods stated above work well if the problem is unimodal; however, if the problem is multimodal, the probability of getting trapped in a local minimum increases drastically, and there is no general mathematical criterion that can prove that a particular local minimum is also the global minimum. Research on global optimization has been going on for decades [4,5]. Global optimization methods also take local minima into consideration: some analytical and numerical methods tend to converge directly to a global minimum, while others set out to find the local minima and select the global minimum from among them. The latter approach gives the designer more insight into the feasible design space and allows alternative solutions to be considered in case the global minimum is unstable, as suggested by sensitivity analysis [6].
The multistart approach [4] is a very popular stochastic method that tries to find all local minima by starting a local optimization procedure from a set of random points uniformly distributed over the feasible design space. In general, the method has two phases: a global phase, in which the starting points are sampled, and a local phase, in which a local search is run from each of them. Used in its original form, multistart can be quite inefficient, since the local search procedure is executed many times and the same local minimum may be reached repeatedly. There are various other stochastic methods, viz. Pure Random Search [7,8], Controlled Random Search [9,10], Simulated Annealing [11,12,13], and Genetic Algorithms [14,15,16], which rely on the power of computational hardware to find a solution efficiently. Several variants of multistart have been proposed to improve its efficiency, such as Random Tunneling [17] and Domain Elimination [18,19].
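The two phases of the basic multistart method can be sketched as follows, in Python with SciPy; the two-minimum test function, the number of starting points, and the clustering tolerance are illustrative choices.

    import numpy as np
    from scipy.optimize import minimize

    # Illustrative multimodal objective with two local minima of
    # different depth, near x1 = -1 and x1 = +1.
    f = lambda x: (x[0]**2 - 1.0)**2 + 0.5 * x[0] + x[1]**2

    rng = np.random.default_rng(0)
    minima = []
    for _ in range(20):                             # global phase
        x0 = rng.uniform(-2.0, 2.0, size=2)         # uniform random start
        res = minimize(f, x0)                       # local phase
        # Record only distinct local minima (within a tolerance);
        # the same minimum is typically reached many times.
        if not any(np.linalg.norm(res.x - m) < 1e-3 for m in minima):
            minima.append(res.x)

    best = min(minima, key=f)   # select the global minimum among them
    print(len(minima), best)    # typically 2 distinct minima; best near [-1, 0]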
1.3 Organization of the Thesis
This section will be added as the work proceeds towards completion.
References:
1. Reklaitis, G.V., Ravindran, A. and Ragsdell, K.M. (1983), Engineering Optimization: Methods and Applications, John Wiley and Sons, New York, NY.
2. Rao, S.S. (1998), Engineering Optimization: Theory and Practice, Third Edition, New Age International (P) Ltd., New Delhi.
3. Vanderplaats, G.N. (1984), Numerical Optimization Techniques for Engineering Design with Applications, McGraw-Hill Inc., New York, NY.
4. Dixon, L.C.W. and Szego, G.P. (eds.) (1975), Towards Global Optimization, North Holland, Amsterdam.
5. Dixon, L.C.W. and Szego, G.P. (eds.) (1978), Towards Global Optimization 2, North Holland, Amsterdam.
6. Saigal, Sunil and Mukherjee, Subrata (1990), Sensitivity Analysis and Optimization with Numerical Methods, presented at the Winter Annual Meeting of the American Society of Mechanical Engineers, Dallas, Texas, Nov. 25-30, 1990.
7. Brooks, S.H. (1958), "A Discussion of Random Search for Seeking Maxima," Operations Research, Vol. 6, 244-251.
8. Anderssen, R.S. (1972), "Global Optimization," in Anderssen, R.S., Jennings, L.S. and Ryan, D.M. (eds.), Optimization, University of Queensland Press, 1-15.
9. Price, W.L. (1978), "A Controlled Random Search Procedure for Global Optimization," in Dixon, L.C.W. and Szego, G.P. (eds.), Towards Global Optimization 2, North Holland, Amsterdam.
10. Price, W.L. (1983), "Global Optimization by Controlled Random Search," JOTA, Vol. 40, No. 3, 333-348.
11. Kirkpatrick, S., Gelatt, C.D. and Vecchi, M.P. (1983), "Optimization by Simulated Annealing," Science, Vol. 220, 671-680.
12. Corana, A., Marchesi, M., Martini, C. and Ridella, S. (1987), "Minimizing Multimodal Functions of Continuous Variables with the Simulated Annealing Algorithm," ACM Transactions on Mathematical Software, Vol. 13, 262-280.
13. Dekkers, A. and Aarts, E. (1991), "Global Optimization and Simulated Annealing," Mathematical Programming, Vol. 50, 367-393.
14. Holland, J.H. (1975), Adaptation in Natural and Artificial Systems, University of Michigan Press, Ann Arbor, Michigan.
15. Goldberg, D.E. (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley.
16. Hussain, M.F. and Al-Sultan, K.S. (1997), "Hybrid Genetic Algorithm for Non-Convex Function Optimization," Journal of Global Optimization, Vol. 11, 313-324.
17. Lucidi, S. and Piccioni, M. (1989), "Random Tunneling by Means of Acceptance-Rejection Sampling for Global Optimization," JOTA, Vol. 62, No. 2, 255-277.
18. Elwakeil, O.A. and Arora, J.S. (1996), "Global Optimization of Structural Systems Using Two New Methods," Structural Optimization, Vol. 12, 1-10.
19. Elwakeil, O.A. and Arora, J.S. (1996), "Two Algorithms for Global Optimization of General NLP Problems," International Journal for Numerical Methods in Engineering, Vol. 39, 3305-3325.