
Global And Non-Smooth Optimization (GANSO) Maple toolbox 

Centre for Informatics and Applied Optimization
Copyright University of Ballarat, 2006
Australia
 

j.ugon@ballarat.edu.au, gbeliako@gmail.com 

 

Installation instructions 

To use this worksheet you need to download the GANSO libraries from http://www.ganso.com.au/libs/gansomapledlls.zip and extract the contents of this archive (using a utility like WinZip) into Maple's bin directory (typically c:\Program Files\Maple 9\bin) or any other directory on the PATH. This way Maple can locate the GANSO libraries and use them properly.
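Once extracted, you can check from within Maple that the wrapper library is visible. A minimal sketch (the path below is only an example -- adjust it to your installation; the FileTools package is assumed to be available in your Maple version):

> # verify that the GANSO wrapper DLL is where Maple will look for it
> FileTools[Exists]("c:/Program Files/Maple 9/bin/mwrap_ganso.dll");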

Introduction 

This worksheet illustrates how the GANSO package can be used with Maple (under MS Windows).

GANSO is a programming library that implements several methods of global and nonsmooth nonlinear optimization. The GANSO library is specifically designed for optimization problems with very complex, nondifferentiable objective functions. It is not designed for solving linear, quadratic or integer programming problems, for which many specialised methods are available. However, when the objective function is nonsmooth or has many local extrema, such specialised methods are inadequate, and GANSO provides a computationally efficient way to tackle these complicated problems. Please see the user manual supplied with this worksheet.


This worksheet is provided as a template for the user's own objective functions and constraints. It shows how to declare the objective function and call the appropriate optimization methods.
 

It should be used in conjunction with the supplied DLLs (gansodll.dll, mwrap_ganso.dll) and the user manual (gansomaple.pdf).


The typical calling sequence is as follows:
 

1. Execute the define_external procedures in the next subsection;

2. declare your objective function (and also set the bounds and constraints if required);

3. call the relevant optimization routines with the correct arguments (see the sketch below).
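A minimal sketch of steps 2 and 3, assuming the define_external declarations of the next subsection have already been executed (the names g, lo, up, xstart and fval are illustrative, not part of GANSO):

> g := proc(n, x) evalhf((x[1]-1)^2 + x[2]^2) end proc:  # step 2: a simple test objective
> lo := Vector(1..2, datatype=float[8], [-1e30, -1e30]): # values beyond 1e20 mean no box constraints
> up := Vector(1..2, datatype=float[8], [1e30, 1e30]):
> xstart := Vector(1..2, datatype=float[8], [5, 5]):     # starting point; overwritten with the minimizer
> fval := 0.0:                                           # allocates space for the minimum value
> Mindfbm0(2, xstart, fval, g, lo, up, 100);             # step 3: local DFBM search
> print(xstart);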
 

Please change the GANSOPATH line below to the location of this package on your computer! 

Initialization: Declarations of GANSO optimization procedures found in the DLL 

> restart;
 

Execute the commands below to declare the external subroutines from the GANSO library. 

> GANSOPATH:="mwrap_ganso.dll":
  # if you need to provide full path, use GANSOPATH:="c:/work/ganso/maple/mwrap_ganso.dll":
Mindfbm0 := define_external('MWRAP_MinimizeDFBM_0','MAPLE', LIB=GANSOPATH):
Mindfbm := define_external('MWRAP_MinimizeDFBM','MAPLE', LIB=GANSOPATH):
Minecam0 := define_external('MWRAP_MinimizeECAM_0','MAPLE', LIB=GANSOPATH):
Minecam := define_external('MWRAP_MinimizeECAM','MAPLE', LIB=GANSOPATH):
Minecamdfbm0 := define_external('MWRAP_MinimizeECAMDFBM_0','MAPLE', LIB=GANSOPATH):
Minecamdfbm := define_external('MWRAP_MinimizeECAMDFBM','MAPLE', LIB=GANSOPATH):
Mindfbmecam0 := define_external('MWRAP_MinimizeDFBMECAM_0','MAPLE', LIB=GANSOPATH):
Mindfbmecam := define_external('MWRAP_MinimizeDFBMECAM','MAPLE', LIB=GANSOPATH):
Mindso0 := define_external('MWRAP_MinimizeDSO_0','MAPLE', LIB=GANSOPATH):
Mindso := define_external('MWRAP_MinimizeDSO','MAPLE', LIB=GANSOPATH):
Minecamdso0 := define_external('MWRAP_MinimizeECAMDSO_0','MAPLE', LIB=GANSOPATH):
Minecamdso := define_external('MWRAP_MinimizeECAMDSO','MAPLE', LIB=GANSOPATH):
Miniterativedso0 := define_external('MWRAP_MinimizeIterativeDSO_0','MAPLE', LIB=GANSOPATH):
Miniterativedso := define_external('MWRAP_MinimizeIterativeDSO','MAPLE', LIB=GANSOPATH):
Minrandomstart0 := define_external('MWRAP_MinimizeRandomStart_0','MAPLE', LIB=GANSOPATH):
Minrandomstart := define_external('MWRAP_MinimizeRandomStart','MAPLE', LIB=GANSOPATH):

Mindfbm0HF := define_external('MWRAP_HFMinimizeDFBM_0','MAPLE', LIB=GANSOPATH):
MindfbmHF := define_external('MWRAP_HFMinimizeDFBM','MAPLE', LIB=GANSOPATH):
Minecam0HF := define_external('MWRAP_HFMinimizeECAM_0','MAPLE', LIB=GANSOPATH):
MinecamHF := define_external('MWRAP_HFMinimizeECAM','MAPLE', LIB=GANSOPATH):
Minecamdfbm0HF := define_external('MWRAP_HFMinimizeECAMDFBM_0','MAPLE', LIB=GANSOPATH):
MinecamdfbmHF := define_external('MWRAP_HFMinimizeECAMDFBM','MAPLE', LIB=GANSOPATH):
Mindfbmecam0HF := define_external('MWRAP_HFMinimizeDFBMECAM_0','MAPLE', LIB=GANSOPATH):
MindfbmecamHF := define_external('MWRAP_HFMinimizeDFBMECAM','MAPLE', LIB=GANSOPATH):
Mindso0HF := define_external('MWRAP_HFMinimizeDSO_0','MAPLE', LIB=GANSOPATH):
MindsoHF := define_external('MWRAP_HFMinimizeDSO','MAPLE', LIB=GANSOPATH):
Minecamdso0HF := define_external('MWRAP_HFMinimizeECAMDSO_0','MAPLE', LIB=GANSOPATH):
MinecamdsoHF := define_external('MWRAP_HFMinimizeECAMDSO','MAPLE', LIB=GANSOPATH):
Miniterativedso0HF := define_external('MWRAP_HFMinimizeIterativeDSO_0','MAPLE', LIB=GANSOPATH):
MiniterativedsoHF := define_external('MWRAP_HFMinimizeIterativeDSO','MAPLE', LIB=GANSOPATH):
Minrandomstart0HF := define_external('MWRAP_HFMinimizeRandomStart_0','MAPLE', LIB=GANSOPATH):
MinrandomstartHF := define_external('MWRAP_HFMinimizeRandomStart','MAPLE', LIB=GANSOPATH):
 

 

Section 1: Declaration of the objective function and constraints: general but slow 

We solve the problem: minimize f(x), subject to a system of linear constraints Ax <= b and box constraints lo_i <= x_i <= up_i. 
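In standard notation, the problem of this section is

\[ \min_{x \in \mathbb{R}^n} f(x) \quad \text{subject to} \quad A x \le b, \qquad lo_i \le x_i \le up_i, \; i = 1, \dots, n. \]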

Declaration of the objective function f(x) 

Declare f as a procedure. The objective function f takes two arguments: n (the dimension) and x (a vector of size n). It returns a real value. (In this example f does not depend on x[3].) 

> ff := proc( n, xx )
local s;
 s:= evalhf(xx[1]^2+(xx[2]-1)^2)+1.;
# print(n,xx,s); # you can use this to trace execution
 s  #this last value is what is returned
end proc;
 

proc (n, xx) local s; s := evalhf(xx[1]^2+(xx[2]-1)^2)+1.; s end proc
 

> # test the objective function: declare some vector x and evaluate ff(x)
x:=Vector(1..3,[1,2,3]):
ff(3,x);
 

3.
 

 

Declaration of box constraints and auxiliary vectors 

Constraints and bounds, the dimension of x, and declarations of x0 and other auxiliary vectors. 

> Lo:=Vector(1..3,datatype=float[8],[-10e30,-10e30,-1e30]):  # large values (greater than 1e20) mean no box constraints
Up:=Vector(1..3,datatype=float[8],[2e30,2e30,2e30]):
dim:=3; iter:=100: val:=1.0:
x0:=Vector(1..3,datatype=float[8],[1,1,1]);      # initial point for local minimization
basic:=Vector(1..3,datatype=integer[4],[0,1,2]): # these are needed for constrained optimization with
# linear equality constraints. These values refer to "basic" variables.
       # if basic is null, the basic variables will be calculated automatically. They are 0-based. See the GANSO manual.
 

3 

Vector[column](%id = 145592580)
 

 

Executing GANSO methods 

> print(`Executing DFBM method from the starting point x0. The value at the minimum is`);
Mindfbm0(dim,x0,val,ff,Lo,Up,iter);
print(`The minimizer is `,x0);
 

`Executing DFBM method from the starting point x0. The value at the minimum is` 

1. 

`The minimizer is `, Vector[column](%id = 145592580)
 

> print(`Setting up the equality constraints. The matrix of constraints and the right hand side are`);
Eq:=Matrix(1..2,1..3,datatype=float[8],[[1,-1,0],[1,0,-1]]);
RHSE:=Vector(1..2,datatype=float[8],[0,0]);
print(`Executing DFBM. The solution is `);
Mindfbm(3,x0,val,ff,2,0,Eq,Lo,RHSE,Lo,Lo,Up,basic,100);
print(x0);
 

`Setting up the equality constraints. The matrix of constraints and the right hand side are` 

Matrix(%id = 220412088) 

Vector[column](%id = 145594460) 

`Executing DFBM. The solution is ` 

1.50000000000000000 

Vector[column](%id = 145592580)
 

Executing the global optimization method ECAM with the given box constraints. ECAM also takes an estimate of the Lipschitz constant as an argument (5.0 in the call below). 

> print(`Setting up the box constraints`);
Lo:=Vector(1..3,datatype=float[8],[-1,-1,-1]):
Up:=Vector(1..3,datatype=float[8],[2,2,2]):
x0:=Vector(1..3,datatype=float[8],[1,1,1]):
val:=2.0:
print(`Executing ECAM method with only box constraints. The value at the minimum is`);
Minecam0(3,x0,val,ff,5.0,Lo,Up,100,0);
print(x0);
 

`Setting up the box constraints` 

`Executing ECAM method with only box constraints. The value at the minimum is` 

1. 

Vector[column](%id = 145594620)
 

> print(`Executing DSO method. The value at the minimum is`);
Mindso0(3,x0,val,ff,Lo,Up,3,3);
print(x0);
 

`Executing DSO method. The value at the minimum is` 

1. 

Vector[column](%id = 145594620)
 

> print(`Executing Iterative DSO method. The value at the minimum is`);
Miniterativedso0(3,x0,val,ff,Lo,Up,3,3,4);
print(x0);
 

`Executing Iterative DSO method. The value at the minimum is` 

1. 

Vector[column](%id = 145594620)
 

> print(`Executing combination ECAM and DSO methods. The value at the minimum is`);
Minecamdso0(3,x0,val,ff,50.0,Lo,Up,10,0);
print(x0);
 

`Executing combination ECAM and DSO methods. The value at the minimum is` 

1. 

Vector[column](%id = 145594620)
 

> print(`Executing Random Start method. The value at the minimum is`);
Minrandomstart0(3,x0,val,ff,Lo,Up,10);
print(x0);
 

`Executing Random Start method. The value at the minimum is` 

1. 

Vector[column](%id = 145594620)
 

> print(`Executing combination ECAM and DFBM methods. The value at the minimum is`);
Minecamdfbm0(3,x0,val,ff,50.0,Lo,Up,10);
print(x0);
 

`Executing combination ECAM and DFBM methods. The value at the minimum is` 

1.00004013500000011 

Vector[column](%id = 145594620)
 

> print(`Executing combination DFBM and ECAM methods. The value at the minimum is`);
Mindfbmecam0(3,x0,val,ff,Lo,Up,1000,10,2);
print(x0);
 

`Executing combination DFBM and ECAM methods. The value at the minimum is` 

1.00004013500000011 

Vector[column](%id = 145594620)
 

Section 2: Example -- both general and fast methods 

Declaration of another objective function. We use two declarations: one for the general methods (the objective takes the dimension n and a vector x) and one for the fast (HF) methods (the objective takes the coordinates as separate scalar arguments). 

Rastrigin's test function 

> f_fast := proc(  x1, x2)  # user's objective function, using fast method
       local s;
       s:= evalf(x1^2+x2^2-cos(18*x1)-cos(18*x2));
       s  #this last value is what is returned
   end proc;
 

proc (x1, x2) local s; s := evalf(x1^2+x2^2-cos(18*x1)-cos(18*x2)); s end proc
 

> f_gen := proc(n,  x)  #user's objective function, general method
       local s;
       s:= evalf(x[1]^2+x[2]^2-cos(18*x[1])-cos(18*x[2]));
       s  #this last value is what is returned
   end proc;
 

proc (n, x) local s; s := evalf(x[1]^2+x[2]^2-cos(18*x[1])-cos(18*x[2])); s end proc
 

Plot the graph of this function below to see that it has many local minimizers. 

> plot3d(f_gen(2,[t1,t2]),t1=-1..1,t2=-1..1, axes=boxed);
 

[3D plot of f_gen on [-1,1] x [-1,1], showing many local minima]
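For reference, the global minimum of this test function can be computed by hand: each term x_i^2 - cos(18 x_i) is at least -1, with equality only at x_i = 0, so

\[ \min f = f(0,0) = -2. \]

This is the value the global methods below should recover; a purely local search started away from the origin can instead get trapped in one of the many local minima, as the last two runs of this section show.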
 


Declaration of bounds 

We now declare the upper and lower bounds, the desired number of iterations, an estimate of the Lipschitz constant (required by ECAM), and the vector x0 which will receive the solution. 

> # defining vectors
   dimension:=2;
   Lo:=Vector(1..2,datatype=float[8],[-1,-1]):
   Up:=Vector(1..2,datatype=float[8],[1,1]):
   iter:=1000: val:=1.0: # val is needed to allocate space for the returned value
   LipConst:=40.0:
   x0:=Vector(1..2,datatype=float[8],[1,1]);
 

2 

Vector[column](%id = 285774792)
 

Execution of GANSO methods 

We first execute the ECAM global optimization method. 

> # executing ECAM
Minecam0(dimension,x0,val,f_gen,LipConst,Lo,Up,iter,0);
   print(x0); #the ECAM solution
   
 

-1.99936322100000008 

Vector[column](%id = 285774792)
 

> # executing ECAM using fast method
Minecam0HF(dimension,x0,val,f_fast,LipConst,Lo,Up,iter,0);
   print(x0); #the ECAM solution
 

-1.99999999961593566 

Vector[column](%id = 285774792)
 

Setting iter to a negative number of iterations has the effect of not using the local optimization method at the final stage. 

> Minecam0(dimension,x0,val,f_gen,LipConst,Lo,Up,-2000,0);
   print(x0); #the ECAM solution, without improving it using DFBM
 

-1.99995198200000002 

Vector[column](%id = 285774792)
 

We call DFBM 20 times from random starting points. 

> Minrandomstart0HF(dimension,x0,val,f_fast,Lo,Up,200);
   print(x0); #the random start solution
   
 

-2. 

Vector[column](%id = 285774792)
 

Another global optimization method: DSO 

> Mindso0(dimension,x0,val,f_gen,Lo,Up,3,3);
   print(x0); #the DSO method solution
 

-2. 

Vector[column](%id = 285774792)
 

> # try local search from a starting vector (0.5, 0.5), no constraints
   Lo:=Vector(1..2,datatype=float[8],[-1e20,-1e20]):
   Up:=Vector(1..2,datatype=float[8],[1e20,1e20]):
   x0:=Vector(1..2,datatype=float[8],[0.5,0.5]):
Mindfbm0(dimension,x0,val,f_gen,Lo,Up,iter);
   print(x0); #the solution
 

-1.75780121300000004 

Vector[column](%id = 350157564)
 

> Mindfbm0HF(dimension,x0,val,f_fast,Lo,Up,iter);
   print(x0); #the DFBM (fast method) solution
 

-1.75780130305915794 

Vector[column](%id = 350157564)
 

Section 3: Example -- using the fast method 

Objective function, constraints and auxiliary vectors 

> # global variables: vector c
c:=array(1..10,[-6.089, -17.164, -34.054, -5.9514, -24.721, -14.986, -24.1,
         -10.708, -26.662, -22.179]):
 

This example illustrates the use of constraints. Refer to the GANSO manual for the description of this optimization problem. 
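In compact form, the objective coded below is

\[ f(x) = \sum_{j=1}^{10} x_j \left( c_j + \ln \left| \frac{x_j}{\sum_{i=1}^{10} x_i} \right| \right), \]

with the absolute value under the logarithm keeping f computable even at infeasible points with negative components, as the comments in the code note.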

> f := proc( x1,x2,x3,x4,x5,x6,x7,x8,x9,x10 ) #user's objective function
       local s,t,i;
       t:=x1+x2+x3+x4+x5+x6+x7+x8+x9+x10;
       #  note abs under the logarithm, to ensure that s is computable
       #  even for incorrect (negative) arguments
   s:=x1*(c[1]+ln(abs(x1/t)))+
x2*(c[2]+ln(abs(x2/t)))+
x3*(c[3]+ln(abs(x3/t)))+
x4*(c[4]+ln(abs(x4/t)))+
x5*(c[5]+ln(abs(x5/t)))+
x6*(c[6]+ln(abs(x6/t)))+
x7*(c[7]+ln(abs(x7/t)))+
x8*(c[8]+ln(abs(x8/t)))+
x9*(c[9]+ln(abs(x9/t)))+
x10*(c[10]+ln(abs(x10/t)));
   end proc:
 

Now define the matrices and vectors of constraints 

> # defining vectors;
   dimension:=10; eps:=.0001:
   Lo:=Vector(1..10,datatype=float[8]):
   Up:=Vector(1..10,datatype=float[8]):
   x0:=Vector(1..10,datatype=float[8]):
# set up the bounds and initial approximation x0
   for i from 1 to dimension do
       Lo[i]:=eps;
       Up[i]:=1e20;
       x0[i]:=-1;
      od:
   iter:=1000: val:=1.0: # this is needed to allocate space for val

   EQ:=Matrix(1..3,1..dimension,datatype=float[8],
   [[1,2,2,0,0, 1,0,0,0,1],
   [0,0,0,1,2, 1,1,0,0,0],
   [0,0,1,0,0, 0,1,1,2,1]]):
   RHSE:=Vector(1..3,datatype=float[8],[2,1,1]):
   IEQ:=Matrix(1..1,1..dimension,datatype=float[8]):
   RHSI:=Vector(1..1,datatype=float[8]):
   basic:=Vector(1..10,datatype=integer[4],[0,0,0,0,0,0,0,0,0,0]):
# we need to declare all matrices/vectors, even if not used
print(`Equality constraints `,EQ,RHSE);
 

10 

`Equality constraints `, Matrix(%id = 224378388), Vector[column](%id = 204702272)
 

Executing GANSO methods 

We start with a DFBM local search from an initial point x0 outside the feasible domain. 

> # executing
   MindfbmHF(dimension,x0,val,f,3,0,EQ, IEQ, RHSE, RHSI,Lo,Up,basic,iter);
   print(x0); #the  solution
 

-47.7611447479775606 

Vector[column](%id = 204702232)
 

Here we manually set the desired basic variables, 10 - 3 = 7 of them, the ones that are linearly independent. 

>    basic:=Vector(1..10,datatype=integer[4],[2,5,6,7,8,9,10,0,0,0]):
   for i from 1 to dimension do x0[i]:=-1: od: # clear x0

   MindfbmHF(dimension,x0,val,f,3,0,EQ, IEQ, RHSE, RHSI,Lo,Up,basic,iter);
   print(x0); #the  solution
 

-47.7611447496708550 

Vector[column](%id = 204702232)
 

Now a modified problem: add one inequality constraint x_3+x_7 <= 0.2  

>    for i from 1 to dimension do
     x0[i]:=-1:
     IEQ[1,i]:=0;
   od:
   IEQ[1,3]:=1: IEQ[1,7]:=1: RHSI[1]:=0.2:
print(`Inequality constraints `,IEQ,RHSI);
   MindfbmHF(dimension,x0,val,f,3,1,EQ, IEQ, RHSE, RHSI,Lo,Up,basic,iter);
   print(x0); #the  solution
 

`Inequality constraints `, Matrix(%id = 224901492), Vector[column](%id = 224156440) 

-46.6929707940928794 

Vector[column](%id = 204702232)
 

> # set up the bounds and initial approximation x0
   for i from 1 to dimension do
       Lo[i]:=eps;
       Up[i]:=1;
       x0[i]:=-1;
      od:
#MindsoHF(dimension,x0,val,f,3,0,EQ, IEQ, RHSE, RHSI,Lo,Up,basic,50.0,1,1);
MinecamHF(dimension,x0,val,f,3,1,50.0,EQ, IEQ, RHSE, RHSI,Lo,Up,basic,1000,1);
print(x0);
 

-46.6929701129447494 

Vector[column](%id = 204702232)
 

 

References 

The methods implemented in GANSO are based on the theory of Abstract Convexity, outlined in 

  Rubinov, A. Abstract Convexity and Global Optimization, Kluwer, Dordrecht, 2000. 

 

Specific references are: 

Bagirov, A., (1999). Derivative-free methods for unconstrained nonsmooth optimization and its numerical analysis, Journal Investigacao Operacional, 19, 75.
Bagirov, A., (2002). A method for minimization of quasidifferentiable functions, Optimization Methods and Software, 17, 31.
Bagirov, A.; Rubinov, A., (2001). Modified versions of the cutting angle method, in Convex analysis and global optimization; N. Hadjisavvas and P. M. Pardalos, eds.; Kluwer: Dordrecht, Vol. 54; p 245.
Bagirov, A.; Rubinov, A., (2000). Global minimization of increasing positively homogeneous function over the unit simplex, Annals of Operations Research, 98, 171.
Bagirov, A.; Rubinov, A., (2003). Cutting angle method and a local search, Journal of Global Optimization, 27, 193.
Bagirov, A., (2000). Numerical methods for minimizing quasidifferentiable functions: A survey and comparison, in Quasidifferentiability and Related Topics; V. F. Demyanov and A. M. Rubinov, eds.; Kluwer Academic Publishers; p 33.
Beliakov, G., (2003). Geometry and combinatorics of the cutting angle method, Optimization, 52, 379.
Beliakov, G., (2004). The Cutting Angle Method - a tool for constrained global optimization, Optimization Methods and Software, 19, 137.
Beliakov, G., (2005). Extended Cutting Angle method of constrained global optimization, in Optimisation in Industry; L. Caccetta, ed.; Springer: New York; 2007, in press.
Beliakov, G., (2005). A review of applications of the Cutting Angle methods, in Continuous Optimization; A. Rubinov and V. Jeyakumar, eds.; Springer: New York; p 209.
Beliakov, G.; Bagirov, A.; Monsalve, J. E., (2003). Parallelization of the discrete gradient method of non-smooth optimization and its applications, in Proceedings of the 3rd International Conference on Computational Science; Springer-Verlag: Heidelberg; Vol. 3; p 592.
Beliakov, G.; Ting, K. M.; Murshed, M.; Rubinov, A.; Bertoli, M., (2003). Efficient Serial and Parallel Implementations of the Cutting Angle Method, in High Performance Algorithms and Software for Nonlinear Optimization; G. Di Pillo, ed.; Kluwer Academic Publishers: p 57.
Mammadov, M.A. (2004). A new global optimization algorithm based on dynamical systems approach. In: Alex Rubinov and Moshe Sniedovich (Editors), Proceedings of The Sixth International Conference on Optimization: Techniques and Applications (ICOTA6), Ballarat, December 2004, Article index number 198 (94th article), 11pp. University of Ballarat, also in: Research Report 04/04, University of Ballarat, 2004, http://www.ballarat.edu.au/ard/itms/publications/researchPapers.shtml
Mammadov, M.A. and Orsi, R., (2005). H_infinity synthesis via a nonsmooth, nonconvex optimization approach, Pacific Journal of Optimization, Volume 1, Number 2, May 2005, pp. 405-420.
Mammadov, M.A., Rubinov, A. and Yearwood, J., (2005). Dynamical systems described by relational elasticities with applications to global optimization. In: Continuous Optimisation: Current Trends and Modern Applications, V. Jeyakumar and A. Rubinov, Eds. Springer, pp. 365-385.
 

Copyright by the Centre for Informatics and Applied Optimization, University of Ballarat, 2006.  

Please consult http://www.ganso.com.au for updates. 

 

We appreciate your comments about your experiences using GANSO and this Maple toolbox. Please send your feedback to 

j.ugon@ballarat.edu.au 


Legal Notice: The copyright for this application is owned by the author(s). Neither Maplesoft nor the author are responsible for any errors contained within and are not liable for any damages resulting from the use of this material. This application is intended for non-commercial, non-profit use only. Contact the author for permission if you wish to use this application in for-profit activities.
 
