Optimization Routine

History
06/06/2003: Created the page


INDEX

  Introduction
  Overview of the Algorithm
  How to use
  Download of the Program

Introduction

I developed an optimization routine a few years ago for my studies. There are a variety of existing algorithms, and most of them are very efficient. Even so, I developed one myself. One of the reasons is that I simply wanted to develop an algorithm of my own. Another is that the efficient algorithms do not cope well with problems that have a large number of free variables, and I had to solve problems with about a hundred variables.

My algorithm is quite simple; it is a kind of Downhill Simplex Method, though possibly much simpler. The advantages of my algorithm are its simplicity, its robustness against local minima, and that it works even for discontinuous functions.

The disadvantage is that it converges very slowly when there are many variables, although with only a few variables it converges as fast as other, more sophisticated algorithms.

The interface of the optimization routine is very simple. All you have to do is prepare the function you want to minimize and specify a few tolerances to terminate the iteration. The routine also has some debugging options, as well as flags associated with the variables, with which you can specify which variables are to be varied (tested) during a minimization.

All copyrights of the optimization routine belong to Hiroyuki Toyozumi, and the scope of the rights is as written below. You may use the routine freely, in any form and by any method. I would appreciate it very much if you wrote to me before you use it.

HIROYUKI TOYOZUMI





Overview of the Algorithm

The basic idea is

  1. to roughly estimate the values of the variables, and
  2. to find a better, not the best, value at each iteration step.

The reason is to prevent the search from being trapped at a local minimum point during a minimization. If you always move to the best minimum point at each iteration step, you are easily trapped at a local minimum point, and this should be avoided. The algorithm tests all variables by varying their values slightly, finds a better point to move to, and varies only one variable in each iteration step. This sounds as if the algorithm were extremely inefficient. It is, to some extent, but not as much as you might expect, and it is very robust.

The optimization routine uses the gradients of the supplied test function to find the direction in which to move. However, it DOES NOT use the gradients to determine the next values. Therefore, the routine works properly even for discontinuous functions.
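To make the idea concrete, here is a minimal sketch in C of what one such iteration step might look like. It is NOT the actual Nextvalues() routine; the relative step size, the scanning order, and the first-improvement rule are assumptions made only for illustration.

/*
 * A minimal sketch of one iteration step, written only to illustrate the
 * idea above; it is not the actual Nextvalues() routine. The relative
 * step size, the scanning order, and the first-improvement rule are
 * assumptions made for this example.
 */
#include <math.h>

static int take_one_step(double (*func)(double[], double[]),
                         int nvar, const int flags[],
                         double var[], double param[])
{
    const double rel_step = 1e-3;             /* assumed small relative step */
    double f0 = func(var, param);             /* current value of the function */
    int i;

    for (i = 0; i < nvar; i++) {
        double h, save, fplus, fminus, trial, ftrial;

        if (!flags[i])                        /* this variable is held fixed */
            continue;

        save = var[i];
        h = rel_step * (fabs(save) > 0.0 ? fabs(save) : 1.0);

        /* probe both sides: the finite-difference slope is used only to
           choose the DIRECTION of the move, never the size of the step   */
        var[i] = save + h;  fplus  = func(var, param);
        var[i] = save - h;  fminus = func(var, param);

        trial  = (fplus < fminus) ? save + h : save - h;
        ftrial = (fplus < fminus) ? fplus    : fminus;

        if (ftrial < f0) {                    /* a better (not the best) point */
            var[i] = trial;                   /* vary only this one variable   */
            return 1;
        }
        var[i] = save;                        /* no improvement: restore value */
    }
    return 0;                                 /* no single variable improved   */
}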

I don't want to dig into the details of the algorithm here. Please have a look at the function Nextvalues() to see how it finds the next values in each iteration step. You may be surprised that it is only a 40-line routine.





How to use

The syntax of the optimization routine is

int optimize(double (*func)(), int nvar, int flags[], double var[], double param[], double ftol, double vtol, int maxiteration, int dflag, double *err);

The routine returns 0 or 1 to tell whether the optimization was successful; 0 means the optimization terminated successfully. The func() is the function you want to minimize, and the interface of func() has to be

double func(double var[], double param[]);
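For example, a test function that minimizes a simple sum of squared differences could look like the following. This is only a hypothetical example (the name myfunc and this use of param[] are not part of the routine); the real function is whatever you need to minimize.

/*
 * A purely hypothetical test function matching the interface above:
 * it returns the sum of squared differences between var[] and three
 * target values stored in param[].
 */
double myfunc(double var[], double param[])
{
    double sum = 0.0;
    int i;

    for (i = 0; i < 3; i++) {
        double d = var[i] - param[i];
        sum += d * d;
    }
    return sum;
}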

The func() has to return a double-precision value computed from var[]. The nvar is the total number of variables for the minimization, i.e. the size of the arrays var[] and flags[]. The flags[] is an array of flags associated with the variables var[]; the optimization routine varies only those entries of var[] whose flag in flags[] is non-zero. The var[] is the array of variables to optimize. You have to set reasonable initial values in it; non-zero values are preferable. The param[] is an array that is passed directly to the test function func(); the optimization routine does not refer to its values and simply passes the pointer on, so you can use the array for any parameters the test function needs. The ftol is a tolerance used to terminate the optimization iteration. The routine stops the iteration when the relative change of func() falls below it:

δfunc()/func() < ftol

The vtol is another tolerance used to terminate the optimization iteration. The routine stops the iteration when one of the variables satisfies

δvar[n]/var[n] < vtol

The maxiteration is the maximum number of iterations to perform. The dflag is a variable holding flags that control messaging and debugging; see optimizer.h for details. Finally, err is a variable in which the final value of func() is returned.
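Putting everything together, a call to the routine might look like the following sketch. The tolerance values, the iteration limit, and dflag = 0 are illustrative guesses only, and myfunc is the hypothetical test function from the example above; check optimizer.h for the actual debugging flag values.

#include <stdio.h>
#include "optimizer.h"                       /* declares optimize(); shipped with the routine */

double myfunc(double var[], double param[]); /* the example test function above */

int main(void)
{
    double var[3]   = { 1.0, 1.0, 1.0 };     /* initial guesses; non-zero values preferred */
    int    flags[3] = { 1, 1, 0 };           /* vary var[0] and var[1], hold var[2] fixed  */
    double param[3] = { 2.0, -3.0, 0.5 };    /* passed straight through to myfunc()        */
    double err;
    int status;

    /* the tolerances, the iteration limit and dflag = 0 are illustrative guesses */
    status = optimize(myfunc, 3, flags, var, param,
                      1e-6, 1e-6, 1000, 0, &err);

    if (status == 0)
        printf("minimum %g found at (%g, %g, %g)\n", err, var[0], var[1], var[2]);
    else
        printf("the optimization did not terminate successfully\n");

    return 0;
}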





Download of the Program

To download, .... click!!