Introduction
What is optimization, and why?
Optimization is the act of obtaining the best result under given circumstances. Optimization can be
defined as the process of finding the conditions that give the maximum or minimum value of a
function.
If a point x* corresponds to the minimum value of function f(x), the same point also corresponds to
the maximum value of the negative of the function, - f (x). Thus, without loss of generality,
optimization can be taken to mean minimization since the maximum of a function can be found by
seeking the minimum of the negative of the same function. There is no single method available for
solving all optimization problems efficiently. Hence a number of optimization methods have been
developed for solving different types of optimization problems.
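To make the minimization/maximization duality concrete, the following short Python sketch (the function f and the search grid are hypothetical illustrations, not from the notes) finds the maximum of f by minimizing -f:

```python
# Maximizing f is the same as minimizing -f.
# Hypothetical example: f(x) = -(x - 2)**2 + 3 has its maximum at x = 2.
def minimize(fun, candidates):
    """Return the candidate with the smallest function value (grid search)."""
    return min(candidates, key=fun)

f = lambda x: -(x - 2) ** 2 + 3
grid = [i / 100 for i in range(0, 401)]    # grid on [0, 4], step 0.01
x_max = minimize(lambda x: -f(x), grid)    # maximize f via the minimum of -f
print(x_max, f(x_max))                     # -> 2.0 3.0
```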
Design Vector
Any engineering system or component is defined by a set of quantities; the design is sensitive to the
changes in the values of these quantities. Some of these variables which largely affect the design of
an engineering system or component are viewed as variables during the design process. In general,
certain quantities are usually fixed at the outset and these are called pre-assigned parameters. All
the other quantities are treated as variables in the design process and are called the design variables
or decision variables. The design variables are collectively represented as a design vector X = (x1, x2, ..., xn).
Constraints
In many practical problems, the design variables cannot be chosen arbitrarily; rather, they have to
satisfy certain specified functional and other requirements. The restrictions that must be satisfied to
produce an acceptable design are collectively called design constraints. For illustration, consider an
optimization problem with only inequality constraints gj(X) <= 0, j = 1, 2, ..., m. The set of values of X that satisfy the equation gj(X) = 0 forms a hypersurface in the design space and is called a constraint surface. Points lying on the hypersurface satisfy the constraint gj(X) critically, points lying in the region where gj(X) > 0 are infeasible or unacceptable, and points lying in the region where gj(X) < 0 are feasible or acceptable. The collection of all the constraint surfaces gj(X) = 0, j = 1, 2, ..., m, which separates the feasible region from the infeasible region, is called the composite constraint surface.
The next figure shows a hypothetical two-dimensional design space where the infeasible region is
indicated by hatched lines. A design point that lies on one or more than one constraint surface is
called a bound point, and the associated constraint is called an active constraint. Design points that
do not lie on any constraint surface are known as free points.
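These feasibility notions can be illustrated with a short Python sketch (the two constraints g1 and g2 below are hypothetical examples, not from the notes):

```python
def classify_point(X, constraints, tol=1e-9):
    """Given inequality constraints gj(X) <= 0, report feasibility and
    which constraints are active (gj(X) = 0, i.e. the point is bound)."""
    vals = [g(X) for g in constraints]
    if any(v > tol for v in vals):
        return "infeasible", []
    active = [j for j, v in enumerate(vals) if abs(v) <= tol]
    return ("bound" if active else "free"), active

# Two hypothetical constraints on a 2-D design space:
g = [lambda X: X[0] + X[1] - 4,       # g1: x1 + x2 <= 4
     lambda X: -X[0]]                 # g2: x1 >= 0
print(classify_point([2.0, 2.0], g))  # on g1           -> ('bound', [0])
print(classify_point([1.0, 1.0], g))  # strictly inside -> ('free', [])
print(classify_point([5.0, 0.0], g))  # violates g1     -> ('infeasible', [])
```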
Objective Function
The conventional design procedures aim at finding an acceptable or adequate design which merely
satisfies the functional and other requirements of the problem. In general, there will be more than
one acceptable design, and the purpose of optimization is to choose the best one of the many
acceptable designs available. Thus a criterion has to be chosen for comparing the different
alternative acceptable designs and for selecting the best one. The criterion with respect to which the design is optimized, when expressed as a function of the design variables, is known as the criterion, merit, or objective function. The choice of objective function is governed by the nature of the
problem; the objective function for minimization is generally taken as weight in aircraft and
aerospace structural design problems. In civil engineering structural designs, the objective is usually
taken as the minimization of cost. The maximization of mechanical efficiency is the obvious choice of
an objective in mechanical engineering systems design. The selection of the objective function can
be one of the most important decisions in the whole optimum design process.
Variable Bounds
Generally, a maximum and a minimum limit are set on the value of each design variable. This ensures that the feasible region is bounded.
With these elements, an optimization problem is often formulated as:
Minimize f(X), where X = (x1, x2, ..., xn)
subject to gj(X) <= 0, j = 1, 2, ..., m
hk(X) = 0, k = 1, 2, ..., p
xi(L) <= xi <= xi(U), i = 1, 2, ..., n
Optimality Criteria:
(i) Local optimal point: A point or solution x* is said to be a local optimal point, if there exists no
point in the neighbourhood of x* which is better than x*. In the parlance of minimization problems,
a point x* is a locally minimal point if no point in the neighbourhood has a function value smaller
than f(x*).
(ii) Global optimal point: A point or solution x** is said to be a global optimal point, if there exists no
point in the entire search space which is better than the point x**. Similarly, a point x** is a global
minimal point if no point in the entire search space has a function value smaller than f(x**).
(iii) Inflection point: A point x* is said to be an inflection point if the function value increases locally as x increases past x* and decreases locally as x decreases past x*, or vice versa; for example, f(x) = x^3 has an inflection point at x* = 0.
Certain characteristics of the underlying objective function can be exploited to check whether a
point is either a local minimum or a global minimum, or an inflection point.
Necessary Condition: If x* is a minimum, a maximum, or an inflection point of f and f is differentiable there, then the first derivative vanishes: f'(x*) = 0. The condition is only necessary; a point with f'(x*) = 0 (a stationary point) may be any of the three.
Sufficient Condition: Suppose f'(x*) = 0, and let n be the order of the first nonzero higher-order derivative at x*, so that f''(x*) = ... = f^(n-1)(x*) = 0 and f^(n)(x*) != 0. Then:
(i) if n is even and f^(n)(x*) > 0, x* is a local minimum;
(ii) if n is even and f^(n)(x*) < 0, x* is a local maximum;
(iii) if n is odd, x* is an inflection point.
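The sufficient condition translates directly into a small classifier. The Python sketch below (illustrative, not from the notes) takes the list of higher-order derivative values f''(x*), f'''(x*), ... at a stationary point and applies the rule:

```python
def classify_stationary(derivs):
    """Classify a stationary point x* (where f'(x*) = 0) from the values
    f''(x*), f'''(x*), ... .  With n the order of the first nonzero
    derivative: n even and positive -> local minimum, n even and
    negative -> local maximum, n odd -> inflection point."""
    for order, d in enumerate(derivs, start=2):
        if d != 0:
            if order % 2 == 1:
                return "inflection point"
            return "local minimum" if d > 0 else "local maximum"
    return "inconclusive"

# f(x) = x**4 at x* = 0: f'' = f''' = 0, f'''' = 24 -> minimum
print(classify_stationary([0, 0, 24]))   # -> local minimum
# f(x) = x**3 at x* = 0: f'' = 0, f''' = 6 -> inflection
print(classify_stationary([0, 6]))       # -> inflection point
```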
Algorithm
MULTI-VARIABLE OPTIMIZATION
The objective function depends on more than one design variable.
Optimality Criteria:
Same as in the single-variable case: minimum, maximum, and inflection (saddle) points exist, and the conditions are as follows.
Necessary Condition: If f(X) has an extreme point (maximum or minimum) at X = X* and if the first partial derivatives of f(X) exist at X*, then they all vanish there:
∂f/∂x1 (X*) = ∂f/∂x2 (X*) = ... = ∂f/∂xn (X*) = 0
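The necessary condition can be checked numerically. The following illustrative Python sketch (the quadratic f and the candidate point are hypothetical) approximates the gradient by central finite differences and confirms that it vanishes at the minimum:

```python
def gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    g = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

# f(x1, x2) = (x1 - 1)**2 + (x2 + 2)**2 has its minimum at X* = (1, -2)
f = lambda X: (X[0] - 1) ** 2 + (X[1] + 2) ** 2
print(gradient(f, [1.0, -2.0]))   # each component ~0 (finite-difference error)
```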
Sufficient Condition: A sufficient condition for a stationary point X* to be an extreme point is that the matrix of second partial derivatives (the Hessian matrix) of f(X) evaluated at X* is (i) positive definite when X* is a relative/local minimum point, and (ii) negative definite when X* is a relative/local maximum point. If the Hessian is indefinite, X* is a saddle point; if it is only semi-definite, higher-order terms are needed to classify the point.
Hessian Matrix
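The Hessian matrix collects the second partial derivatives, H[i][j] = ∂²f/∂xi∂xj, evaluated at X*. As an illustrative Python sketch (the objective function below is hypothetical, not from the notes), it can be approximated by central finite differences and, for a two-variable problem, classified by Sylvester's criterion on the leading principal minors:

```python
def hessian(f, x, h=1e-4):
    """Approximate the Hessian H[i][j] = d2f/(dxi dxj) at x by
    central finite differences."""
    n = len(x)
    H = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            xpp = list(x); xpp[i] += h; xpp[j] += h
            xpm = list(x); xpm[i] += h; xpm[j] -= h
            xmp = list(x); xmp[i] -= h; xmp[j] += h
            xmm = list(x); xmm[i] -= h; xmm[j] -= h
            H[i][j] = (f(xpp) - f(xpm) - f(xmp) + f(xmm)) / (4 * h * h)
    return H

def classify_2x2(H):
    """Sylvester's criterion for a 2 x 2 Hessian: the signs of the
    leading principal minors decide definiteness."""
    d1 = H[0][0]
    d2 = H[0][0] * H[1][1] - H[0][1] * H[1][0]
    if d1 > 0 and d2 > 0:
        return "local minimum (positive definite)"
    if d1 < 0 and d2 > 0:
        return "local maximum (negative definite)"
    if d2 < 0:
        return "saddle point (indefinite)"
    return "inconclusive (semi-definite)"

# Hypothetical objective with a minimum at the origin:
f = lambda X: X[0] ** 2 + X[0] * X[1] + X[1] ** 2
H = hessian(f, [0.0, 0.0])          # ~ [[2, 1], [1, 2]]
print(classify_2x2(H))              # -> local minimum (positive definite)
```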
Algorithm
CONSTRAINED OPTIMIZATION
The Kuhn-Tucker conditions are the most generalised conditions for finding optimal (minimum)
points in a mathematical optimisation problem.
Note:
- Inequality constraints can be changed to equality constraints by adding slack variables where necessary.
- If the objective function and the constraints are all linear and the variables satisfy non-negativity conditions, the NLP reduces to a Linear Programming (LP) problem. A very famous algorithm for solving LP problems is the Simplex Method.
Simplex Method
Pdf provided
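As an illustrative sketch of the Simplex Method (this implementation and the example LP are teaching aids, not taken from the provided PDF; production code would use a dedicated solver such as scipy.optimize.linprog), here is a minimal tableau version for problems of the form maximize c·x subject to Ax <= b, x >= 0 with b >= 0:

```python
def simplex(c, A, b):
    """Maximize c.x subject to A x <= b, x >= 0 (all b[i] >= 0),
    using the standard tableau Simplex Method."""
    m, n = len(A), len(c)
    # Tableau rows: [A | I (slacks) | b]; bottom row: reduced costs [-c | 0 | 0].
    T = [A[i] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-ci for ci in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))            # slack variables start basic
    while True:
        # Entering variable: most negative reduced cost.
        piv_col = min(range(n + m), key=lambda j: T[-1][j])
        if T[-1][piv_col] >= -1e-9:
            break                            # all reduced costs >= 0: optimal
        # Ratio test picks the leaving row.
        ratios = [(T[i][-1] / T[i][piv_col], i)
                  for i in range(m) if T[i][piv_col] > 1e-9]
        if not ratios:
            raise ValueError("unbounded LP")
        _, piv_row = min(ratios)
        basis[piv_row] = piv_col
        # Pivot: normalise the row, eliminate the column everywhere else.
        p = T[piv_row][piv_col]
        T[piv_row] = [v / p for v in T[piv_row]]
        for i in range(m + 1):
            if i != piv_row and abs(T[i][piv_col]) > 1e-12:
                factor = T[i][piv_col]
                T[i] = [v - factor * w for v, w in zip(T[i], T[piv_row])]
    x = [0.0] * n
    for i, bv in enumerate(basis):
        if bv < n:
            x[bv] = T[i][-1]
    return x, T[-1][-1]                      # optimum point, optimum value

# Classic example: maximize 3*x1 + 5*x2
# subject to x1 <= 4, 2*x2 <= 12, 3*x1 + 2*x2 <= 18
x, z = simplex([3.0, 5.0], [[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]],
               [4.0, 12.0, 18.0])
print(x, z)   # -> [2.0, 6.0] 36.0
```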