
Optimization in Matlab
Kevin Carlberg
Stanford University

July 28, 2009

Outline

1. Overview
2. Optimization Toolbox
3. Genetic Algorithm and Direct Search Toolbox
4. Function handles
5. GUI
6. Homework


Overview
Matlab has two toolboxes that contain optimization algorithms discussed in this class:

Optimization Toolbox
- Unconstrained nonlinear
- Constrained nonlinear
- Simple convex: LP, QP
- Least squares
- Binary integer programming
- Multiobjective

Genetic Algorithm and Direct Search Toolbox: general optimization problems
- Direct search algorithms (directional): generalized pattern search and mesh adaptive search
- Genetic algorithm
- Simulated annealing and threshold acceptance

Problem types and algorithms


Continuous
- Convex, constrained (simple)
  - LP: linprog
  - QP: quadprog
- Nonlinear
  - Unconstrained: fminunc, fminsearch
  - Constrained: fmincon, fminbnd, fseminf
- Least-squares (specialized problem type): min_x ||F(x)||^2
  - F(x) linear, constrained: lsqnonneg, lsqlin
  - F(x) nonlinear: lsqnonlin, lsqcurvefit
- Multiobjective: fgoalattain, fminimax

Discrete
- Linear, binary integer programming: bintprog
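For the simple convex problem types, the solver calls are one-liners. A minimal sketch for a small QP (the matrices below are illustrative choices, not from the slides):

    % minimize 0.5*x'*H*x + f'*x  subject to  A*x <= b
    H = [2 0; 0 2];  f = [-2; -5];    % quadratic and linear terms
    A = [1 1];  b = 3;                % one linear inequality constraint
    x = quadprog(H, f, A, b);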

Continuous, nonlinear algorithms

We will focus only on the following algorithms for continuous, nonlinear problems:
- Unconstrained: fminunc, fminsearch
- Constrained: fmincon


Nonlinear, unconstrained algorithms


fminunc: a gradient-based algorithm with two modes
- Large-scale: a subspace trust-region method (see pp. 76-77 of Nocedal and Wright). It can take a user-supplied Hessian or approximate it using finite differences (with a specified sparsity pattern).
- Medium-scale: a cubic line-search method. It uses quasi-Newton updates of the Hessian (recall that quasi-Newton updates give dense matrices, which are impractical for large-scale problems).

fminsearch: a derivative-free method based on the Nelder-Mead simplex
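A minimal sketch of calling both unconstrained solvers on a simple convex test function (the handle and starting point below are illustrative choices, not from the slides):

    bowlVec = @(x) x(1)^2 + (x(2)-2)^2;    % simple convex test function
    x0 = [3; -1];                          % arbitrary starting point

    % gradient-based; derivatives approximated by finite differences
    [x1, fval1] = fminunc(bowlVec, x0);

    % derivative-free Nelder-Mead simplex
    [x2, fval2] = fminsearch(bowlVec, x0);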


Nonlinear constrained algorithm: fmincon

fmincon: a gradient-based framework with three algorithms
- Trust-region reflective: a subspace trust-region method
- Active set: a sequential quadratic programming (SQP) method. The Hessian of the Lagrangian is updated using BFGS.
- Interior point: a log-barrier penalty term is used for the inequality constraints, and the problem is reduced to having only equality constraints
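A minimal sketch of a constrained call (the objective, constraint, and starting point below are illustrative, not from the slides):

    % minimize the bowl function subject to the linear inequality x1 + x2 <= 1
    bowlVec = @(x) x(1)^2 + (x(2)-2)^2;
    x0 = [0; 0];
    A = [1 1];  b = 1;                     % A*x <= b
    [xstar, fval] = fmincon(bowlVec, x0, A, b);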


Genetic Algorithm and Direct Search Toolbox: algorithms

Algorithms in this toolbox can be used to solve general problems. All algorithms are derivative-free methods.
- Direct search: patternsearch
- Genetic algorithm: ga
- Simulated annealing/threshold acceptance: simulannealbnd, threshacceptbnd
- Genetic algorithm for multiobjective optimization: gamultiobj
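A minimal sketch of the two most commonly used solvers (the objective below is an illustrative choice):

    fun = @(x) x(1)^2 + (x(2)-2)^2;    % illustrative objective

    % direct search starting from an initial point
    xps = patternsearch(fun, [0 0]);

    % genetic algorithm: only the number of variables is required
    xga = ga(fun, 2);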


Function handles
Function handle: a MATLAB value that provides a means of calling a function indirectly
- Function handles can be passed in calls to other functions
- Function handles can be stored in data structures for later use
- The optimization and genetic algorithm toolboxes make extensive use of function handles

Example: Creating a handle to an anonymous function


    bowl = @(x,y) x^2 + (y-2)^2;
    ezsurf(bowl)
Figure: surface plot of x^2 + (y-2)^2
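Since handles are ordinary MATLAB values, they can also be stored in data structures and passed to other functions; a small sketch:

    handles = {bowl, @sin};    % store handles in a cell array
    h = handles{1};            % retrieve a handle...
    h(1, 2)                    % ...and call it: bowl(1,2) = 1^2 + (2-2)^2 = 1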


Example: creating a handle to a named function


At the command line, type
    edit bowlNamed;
In an editor, create an m-file containing
    function f = bowlNamed(x,y)
    f = x^2 + (y-2)^2;
At the command line, type
    bowlhandle = @bowlNamed;
    ezsurf(bowlhandle)
Figure: surface plot of x^2 + (y-2)^2 (same surface as before)


Function handles for optimization


For the optimization toolbox, only one vector-valued input argument should be used.

Example: creating a handle to an anonymous function with one vector-valued input variable
    bowlVec = @(x) x(1)^2 + (x(2)-2)^2;

Example: creating a handle to a named function with two scalar-valued input variables
    bowlVecNamed = @(x) bowlNamed(x(1),x(2));

Note: ezsurf cannot accept handles with vector-valued arguments (stick with the examples on the previous pages for plotting).
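A minimal sketch of passing such a handle to a solver (the starting point is an arbitrary illustrative choice):

    x0 = [3; -1];
    [xstar, fval] = fminunc(bowlVec, x0)
    % xstar should be close to [0; 2], the minimizer of the bowl function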

Supplying gradients

- It may be desirable to analytically specify the gradient of the function
- To do this, the named function must return two outputs: the function value and the gradient


Complete example: creating a handle to a named function, plotting it, and specifying the handle for optimization
At the command line, type
    edit bowlNamed;
In the editor, create an m-file containing
    function [f,g] = bowlNamed(x,y)
    f = x^2 + (y-2)^2;
    g(1) = 2*x;
    g(2) = 2*(y-2);
At the command line, type
    bowlhandle = @bowlNamed;
    ezsurf(bowlhandle);
    bowlhandleOpt = @(x) bowlNamed(x(1),x(2))

bowlhandleOpt can now be used as the argument to a Matlab optimization function with supplied gradients
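A minimal sketch of telling fminunc that the gradient is supplied (the 'GradObj' option name follows the 2009-era toolbox; the starting point is arbitrary):

    options = optimset('GradObj','on');    % the objective returns [f,g]
    x0 = [3; -1];
    [xstar, fval] = fminunc(bowlhandleOpt, x0, options)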

GUI
- The optimization toolbox includes a graphical user interface (GUI) that is easy to use
- To activate, simply type optimtool at the command line


GUI options

We would like to track the progress of the optimizer:
- Under Options, set Level of display: iterative
- Under Plot functions, check: Function value
- When ga is used, check Best fitness and Expectation to track the fitness of the best member of the population and the average fitness of the population
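The same settings can be made programmatically when calling the solvers from the command line; a sketch using the 2009-era option names (treat these names as assumptions to check against your toolbox version):

    % iterative display and a function-value plot for fminunc/fmincon
    opts = optimset('Display','iter', 'PlotFcns', @optimplotfval);

    % best-fitness and expectation plots for ga
    gaopts = gaoptimset('PlotFcns', {@gaplotbestf, @gaplotexpectation});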


Homework

For the functions on the following pages, do the following:
1. Create a function that takes in two scalar-valued arguments and outputs both the function value and the gradient
2. Create a handle for this function and use ezsurf to plot the function
3. Create an optimization-ready handle for this function and solve from different starting points using:
   - fminunc, medium scale, derivatives approximated by the solver
   - fminunc, medium scale, gradient supplied
   - fminsearch
   - ga
4. Compare the algorithms on the following measures (one way to collect these numbers is sketched below):
   - Robustness: ability to find a global optimum and dependence of performance on the initial guess
   - Efficiency: how many function evaluations were required?
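A sketch of one way to run the comparison and record function-evaluation counts (the handle, starting point, and option names are illustrative and follow the 2009-era toolboxes):

    f = @(x) bowlNamed(x(1),x(2));         % replace with the handle for each problem
    x0 = [1; 1];                           % repeat for several starting points

    % fminunc, medium scale, derivatives approximated by the solver
    opts1 = optimset('LargeScale','off');
    [x1, f1, flag1, out1] = fminunc(f, x0, opts1);

    % fminunc, medium scale, gradient supplied
    opts2 = optimset('LargeScale','off', 'GradObj','on');
    [x2, f2, flag2, out2] = fminunc(f, x0, opts2);

    % fminsearch and ga
    [x3, f3, flag3, out3] = fminsearch(f, x0);
    [x4, f4, flag4, out4] = ga(f, 2);

    % function evaluations used by each solver (note ga uses the field name funccount)
    [out1.funcCount, out2.funcCount, out3.funcCount, out4.funccount]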

Problem 1
Consider a convex function with constant Hessian
f(x_1, x_2) = 4x_1^2 + 7(x_2 - 4)^2 - 4x_1 + 3x_2
Figure: Surface and contour plot

Also, find the analytical solution to this problem.
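A sketch of a two-output m-file for this problem, following the bowlNamed pattern from the earlier slides (the function and gradient assume the reconstruction of the formula above):

    function [f,g] = prob1(x1,x2)
    % convex quadratic with constant Hessian
    f = 4*x1^2 + 7*(x2-4)^2 - 4*x1 + 3*x2;
    g(1) = 8*x1 - 4;          % df/dx1
    g(2) = 14*(x2-4) + 3;     % df/dx2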



Problem 2
Consider the Rosenbrock function, a non-convex problem that is difficult to minimize. The global minimum is located at (x_1, x_2) = (1, 1)
f(x_1, x_2) = (1 - x_1)^2 + 100(x_2 - x_1^2)^2
Figure: Surface and contour plot
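A sketch of a two-output m-file for this problem, with the gradient worked out from the formula above (the name rosenNamed is an illustrative choice following the bowlNamed pattern):

    function [f,g] = rosenNamed(x1,x2)
    % Rosenbrock function
    f = (1 - x1)^2 + 100*(x2 - x1^2)^2;
    g(1) = -2*(1 - x1) - 400*x1*(x2 - x1^2);    % df/dx1
    g(2) = 200*(x2 - x1^2);                     % df/dx2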



Problem 3
Consider Rastrigin's function, an all-around nasty function. The global minimum is located at (x_1, x_2) = (0, 0)
f(x_1, x_2) = 20 + x_1^2 + x_2^2 - 10(cos 2πx_1 + cos 2πx_2)
Figure: Surface and contour plot
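A one-line optimization-ready handle for this function, assuming the 2π form written above (its many local minima make it a natural test case for ga and patternsearch):

    ras = @(x) 20 + x(1)^2 + x(2)^2 - 10*(cos(2*pi*x(1)) + cos(2*pi*x(2)));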

