EASEA examples


Examples

Some working examples are already present in the examples/ directory. As a new user of EASEA, you may find it useful to walk through the proposed examples while changing various parameters.

Your first Multi-Objective Optimization Problem example

As a simple example, we will show how to define the 3-objective DTLZ1 problem and solve it with the NSGA-II algorithm. First of all, we move to the test folder (examples/dtlz1):

$ cd examples/dtlz1/

Now you can begin defining your multi-objective problem in a *.ez file. In this example, the problem has to be defined in the dtlz1.ez file as follows:

I. First, in section \User declarations:

1. The header files for the problem interface and the genetic operators must be included:

  #include <problems/CProblem.h> // header for the problem description
  #include <operators/crossover/continuous/CsbxCrossover.h> // header for the SBX crossover operator
  #include <operators/mutation/continuous/CGaussianMutation.h> // header for the Gaussian mutation operator (CPolynomialMutation.h is also available)


2. Then define the main parameters of the problem (the number of decision variables and the number of objectives):

  #define NB_VARIABLES 10 // here we set 10 variables
  #define NB_OBJECTIVES 3 // here we set 3 objectives


3. Genetic operator parameters have to be defined:

  #define XOVER_DIST_ID 20 // crossover distribution index
  #define MUT_DIST_ID 20 // mutation distribution index

4. Now it is time to use the Problem constructor, which is responsible for initializing the problem. A description of the constructor parameters is shown below:

/*
 * param[in 1] - number of objectives
 * param[in 2] - number of decision variables
 * param[in 3] - problem boundary
 */

TP m_problem(NB_OBJECTIVES, NB_VARIABLES, TBoundary(NB_VARIABLES, std::make_pair<TT, TT>(0, 1)));

This constructor declares a problem with 3 objectives and 10 decision variables, each bounded below by 0 and above by 1.
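The third constructor argument suggests that TBoundary is simply a vector of per-variable (min, max) pairs. A minimal standalone sketch of that assumption (makeUnitBoundary is a hypothetical helper, not part of EASEA):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Assumed shape of TBoundary: one (min, max) pair per decision variable.
typedef double TT;
typedef std::vector<std::pair<TT, TT> > TBoundary;

// Builds the same boundary as TBoundary(NB_VARIABLES, std::make_pair<TT, TT>(0, 1)):
// every decision variable is constrained to the interval [0, 1].
TBoundary makeUnitBoundary(std::size_t nbVariables) {
    return TBoundary(nbVariables, std::make_pair<TT, TT>(0, 1));
}
```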

5. Finally, the genetic operators must be defined:

typedef easea::operators::crossover::continuous::sbx::CsbxCrossover<TT, TRandom &> TCrossover; //Type of crossover
typedef easea::operators::mutation::continuous::pm::CGaussianMutation<TT, TRandom &> TMutation; //Type of mutation

To define crossover operator parameters:
/*

* param[in 1] - random generator
* param[in 2] - probability
* param[in 3] - problem boundary
* param[in 4] - distribution index
* */

TCrossover crossover(m_generator, 1, m_problem.getBoundary(), XOVER_DIST_ID);

To define mutation operator parameters:
/*

* param[in 1] - random generator
* param[in 2] - probability
* param[in 3] - problem boundary
* param[in 4] - distribution index
*/

TMutation m_mutation(m_generator, 1. / m_problem.getBoundary().size(), m_problem.getBoundary(), MUT_DIST_ID);
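One detail worth noting in the expression above: getBoundary().size() returns an unsigned integer, so a mutation probability written with integer operands truncates to zero. A minimal sketch of the pitfall (truncated and intended are hypothetical helpers):

```cpp
#include <cstddef>

// Hypothetical helpers illustrating the pitfall: with an unsigned (size_t)
// divisor, 1 / n is integer division, so for n > 1 the intended per-gene
// probability 1/n silently truncates to 0 and mutation never fires.
double truncated(std::size_t n) { return 1 / n; }    // integer division: 0 for n > 1
double intended(std::size_t n)  { return 1. / n; }   // floating-point division
```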

II. Accepted notations. Important to know!
TI - type of an individual;
TI::m_variable - vector of decision variables of an individual;
TI::m_objective - vector of objective function values of an individual;
TT - type of the decision variables;
TIter - iterator over decision variables.
An example of how to use these notations is given in the next section.

III. In order to define extra functions, the section \User functions has to be used. For example:
template <typename TT, typename TIter>
TT userFunction1(TIter begin, TIter end) {

       TT sum = 0;
       for (TIter it = begin; it != end; ++it)
       {
               const TT tmp = *it - 0.5;
               sum += tmp * tmp - cos(20 * PI * tmp);
       }
       return 100 * (distance(begin, end) + sum);

}
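userFunction1 above is the DTLZ1-style g helper: it reaches its minimum, 0, when every variable in the range equals 0.5. A standalone copy of the same body makes this easy to check:

```cpp
#include <cmath>
#include <iterator>
#include <vector>

#ifndef PI
#define PI 3.141592654
#endif

// Same body as userFunction1 in the .ez file: at x_i = 0.5 every term of the
// sum is 0*0 - cos(0) = -1, so the sum cancels distance(begin, end) exactly.
template <typename TT, typename TIter>
TT userFunction1(TIter begin, TIter end) {
    TT sum = 0;
    for (TIter it = begin; it != end; ++it) {
        const TT tmp = *it - 0.5;
        sum += tmp * tmp - cos(20 * PI * tmp);
    }
    return 100 * (std::distance(begin, end) + sum);
}
```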

IV. In order to define the problem evaluation function, the section \GenomeClass::evaluator has to be used. For example:

\GenomeClass::evaluator :

 // uses Genome to evaluate the quality of the individual
       const size_t pVariables = getNumberOfObjectives() - 1;
       const TT g = (1 + userFunction1(TI::m_variable.begin() + pVariables, TI::m_variable.end())) * 0.5;
       userFunction2(TI::m_variable.begin(), TI::m_variable.begin() + pVariables, TI::m_objective.begin(), TI::m_objective.end(), g);
       return 1;

\end
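userFunction2 itself is not shown in this example. For reference, the textbook DTLZ1 objective formulas it would correspond to are sketched below (an assumption based on the standard DTLZ1 definition, not necessarily EASEA's exact code); a handy property is that the objectives always sum to 0.5 * (1 + g):

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Textbook DTLZ1 objectives (assumption: this mirrors what userFunction2
// computes). With M objectives, the first M-1 variables x_1..x_{M-1} and the
// helper value g define:
//   f_1 = 0.5 * x_1 * x_2 * ... * x_{M-1} * (1 + g)
//   f_i = 0.5 * x_1 * ... * x_{M-i} * (1 - x_{M-i+1}) * (1 + g)
//   f_M = 0.5 * (1 - x_1) * (1 + g)
std::vector<double> dtlz1Objectives(const std::vector<double>& x,
                                    std::size_t nbObjectives, double g) {
    std::vector<double> f(nbObjectives, 0.5 * (1 + g));
    for (std::size_t i = 0; i < nbObjectives; ++i) {
        for (std::size_t j = 0; j < nbObjectives - 1 - i; ++j)
            f[i] *= x[j];            // product of leading position variables
        if (i > 0)
            f[i] *= 1 - x[nbObjectives - 1 - i];  // complement factor
    }
    return f;
}
```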


V. Compilation and running
After following the instructions above, the problem is defined and you can compile and run your program.
Select the MOEA (possible options: -nsgaii, -nsgaiii, -asrea, -cdas) by editing the script compile.sh, then run it:
$ ./compile.sh

If your .ez file compiled successfully, you will find the generated .cpp, .h, Makefile, executable and .prm files. By modifying the .prm file, you can set the number of generations (nbGen) and the population size (popSize and nbOffspring must be the same).
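An illustrative .prm fragment (the parameter names below follow the "Default run parameters" section of an .ez file; the exact layout of your generated .prm may differ):

```
Number of generations : 200		// nbGen
Population size : 100			// popSize
Offspring size : 100			// nbOffspring, must equal popSize
```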

Now you can run the selected MOEA to solve your problem and plot the results:
$ ./launch.sh

After execution, you will find the following files in the same folder:
- .png - a figure of the obtained Pareto front
- objectives - the values of the objective functions (an approximation of the Pareto optimal set)

VI. Performance metrics
When a multi-objective algorithm finishes its run on a multi-objective problem, it outputs an approximation of the Pareto optimal set. If the true Pareto front (PF, the set of optimal solutions) is known, this approximation can be used to measure the quality of the algorithm on that particular problem.

  1. Hypervolume (HV), maximisation: it measures the volume of the objective space dominated by a PF, bounded by a reference point, which is usually set by taking a worst-case value for each objective. It reflects both the convergence quality towards the PF and the diversity of the obtained solution set. The main disadvantage of the HV is its computational complexity, O(n^(m-1)), where n is the size of the non-dominated set and m is the number of objectives.
  2. Generational Distance (GD), minimization: it measures the average Euclidean distance between the solutions obtained by the algorithm and the nearest members of the Pareto optimal front.
  3. Inverted Generational Distance (IGD), minimization: an inverted variant of GD that i) calculates the minimum Euclidean distance between each point of the true PF and the obtained set, and ii) measures both the diversity and the convergence of the obtained set towards the PF (if enough members of the PF are known).
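GD and IGD are the same average-of-nearest-distances computation with the roles of the two sets swapped. A minimal sketch, assuming plain Euclidean distance and unweighted averaging (implementations often normalize differently):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <limits>
#include <vector>

typedef std::vector<std::vector<double> > TFront;

// Average Euclidean distance from each point of `from` to its nearest point in `to`.
//   GD  = meanNearestDistance(obtained, trueFront)
//   IGD = meanNearestDistance(trueFront, obtained)
double meanNearestDistance(const TFront& from, const TFront& to) {
    double total = 0;
    for (std::size_t i = 0; i < from.size(); ++i) {
        double best = std::numeric_limits<double>::max();
        for (std::size_t j = 0; j < to.size(); ++j) {
            double d2 = 0;
            for (std::size_t k = 0; k < from[i].size(); ++k) {
                const double diff = from[i][k] - to[j][k];
                d2 += diff * diff;
            }
            best = std::min(best, std::sqrt(d2));
        }
        total += best;
    }
    return total / from.size();
}
```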

To use these performance metrics:

in your ez-file:

  #define QMETRICS
  #define PARETO_TRUE_FILE "pareto-true.dat"

where "pareto-true.dat" is a file containing the Pareto optimal front (located in the same folder as the ez-file). See the examples in examples/zdt(4,6).ez and examples/dtlz(1-3).ez.

Weierstrass

Here is a complete EASEA program, as found in the examples/ directory.

In file weierstrass.ez:

/*_________________________________________________________

Test functions

log normal adaptive mutation

Selection operator: Tournament

__________________________________________________________*/

\User declarations :

  #define SIZE 100
  #define X_MIN -1.
  #define X_MAX 1.
  #define ITER 120
  #define Abs(x) ((x) < 0 ? -(x) : (x))
  #define MAX(x,y) ((x)>(y)?(x):(y))
  #define MIN(x,y) ((x)<(y)?(x):(y))
  #define SIGMA 1. /* mutation parameter */
  #define PI 3.141592654


float pMutPerGene=0.1;

\end

\User functions:

  #include <math.h>

__device__ __host__ inline static float SQR(float d) {

 return (d*d);

}

__device__ __host__ inline float rosenbrock( float const *x) {

 float qualitaet;
 int i;
 int DIM = SIZE;
       qualitaet = 0.0;
       for( i = DIM-2; i >= 0; --i)
         qualitaet += 100.*SQR(SQR(x[i])-x[i+1]) + SQR(1.-x[i]);
       return ( qualitaet);

} /* f_rosenbrock() */
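At x_i = 1 for all i, every term of the sum above vanishes, so this Rosenbrock variant returns 0. A host-only copy of the same body (CUDA qualifiers dropped, SIZE fixed to 5 for the check):

```cpp
// Host-only copy of rosenbrock() above; its global minimum is 0 at x_i = 1.
#define SIZE 5

inline static float SQR(float d) { return d * d; }

inline float rosenbrock(float const *x) {
    float qualitaet = 0.0f;  // "quality", keeping the original German name
    for (int i = SIZE - 2; i >= 0; --i)
        qualitaet += 100.f * SQR(SQR(x[i]) - x[i + 1]) + SQR(1.f - x[i]);
    return qualitaet;
}
```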

__device__ __host__ inline float Weierstrass(float x[SIZE], int n) // multidimensional Weierstrass, h = 0.35
{
  float res = 0.;
  float val[SIZE];
  float b = 2.;
  float h = 0.35;
  for (int i = 0; i < n; i++) {
    val[i] = 0.;
    for (int k = 0; k < ITER; k++)
      val[i] += pow(b, -(float)k * h) * sin(pow(b, (float)k) * x[i]);
    res += Abs(val[i]);
  }
  return (res);
}

float gauss() /* Generates a normally distributed random value with variance 1 and mean 0.
   Algorithm based on "gasdev" from Numerical Recipes, p. 203. */
{
  static int iset = 0;
  static float gset = 0.0;  // must be static: it caches the second deviate
  float v1 = 0.0, v2 = 0.0, r = 0.0;
  float factor = 0.0;
  if (iset) {
    iset = 0;
    return gset;
  }
  else {
    do {
      v1 = (float)random(0., 1.) * 2.0 - 1.0;
      v2 = (float)random(0., 1.) * 2.0 - 1.0;
      r = v1 * v1 + v2 * v2;
    } while (r > 1.0);
    factor = sqrt(-2.0 * log(r) / r);
    gset = v1 * factor;
    iset = 1;
    return (v2 * factor);
  }
}
\end
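The same polar (Marsaglia) Box-Muller method can be sketched as self-contained host code, with std::mt19937 standing in for EASEA's random() (an assumption made only so the sketch runs on its own):

```cpp
#include <cmath>
#include <random>

// Polar (Marsaglia) variant of the Box-Muller transform, as in gauss() above.
// Each accepted (v1, v2) pair yields two independent standard normal deviates;
// one is returned immediately and the other is cached for the next call.
class Gauss {
public:
    explicit Gauss(unsigned seed)
        : m_rng(seed), m_uni(-1.0, 1.0), m_iset(false), m_gset(0.0) {}

    double operator()() {
        if (m_iset) { m_iset = false; return m_gset; }
        double v1, v2, r;
        do {
            v1 = m_uni(m_rng);
            v2 = m_uni(m_rng);
            r = v1 * v1 + v2 * v2;       // accept points inside the unit disc
        } while (r >= 1.0 || r == 0.0);
        const double factor = std::sqrt(-2.0 * std::log(r) / r);
        m_gset = v1 * factor;            // cache the second deviate
        m_iset = true;
        return v2 * factor;
    }

private:
    std::mt19937 m_rng;
    std::uniform_real_distribution<double> m_uni;
    bool m_iset;
    double m_gset;
};
```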

\User CUDA: \end

\Before everything else function: { } \end

\After everything else function:

 //cout << "After everything else function called" << endl;

\end

\At the beginning of each generation function:{ } \end

\At the end of each generation function:

 //cout << "At the end of each generation function called" << endl;

\end

\At each generation before reduce function:

 //cout << "At each generation before replacement function called" << endl;

\end

\User classes :

GenomeClass {

 float x[SIZE];
 float sigma[SIZE]; // auto-adaptative mutation parameter

} \end

\GenomeClass::display :
 /* for( size_t i=0 ; i<SIZE ; i++){ */
 /*   // cout << Genome.x[i] << ":" << Genome.sigma[i] << "|"; */
 /*   printf("%.02f:%.02f|",Genome.x[i],Genome.sigma[i]); */
 /* } */
\end

\GenomeClass::initialiser : // "initializer" is also accepted

 for(int i=0; i<SIZE; i++ ) {
    Genome.x[i] = (float)random(X_MIN,X_MAX);
    Genome.sigma[i] = (float)random(0.,0.5);
 }
\end

\GenomeClass::crossover :

 for (int i=0; i<SIZE; i++)
 {
   float alpha = (float)random(0.,1.); // barycentric crossover
    child.x[i] = alpha*parent1.x[i] + (1.-alpha)*parent2.x[i];
 }

\end
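Because alpha is drawn from [0, 1], the barycentric child gene is a convex combination of the parent genes and can never leave the interval they span, which is why this crossover needs no bound clipping. A one-line sketch of the per-gene operation:

```cpp
// Barycentric crossover for a single gene, as in \GenomeClass::crossover:
// the child is a convex combination of the two parent genes.
inline float barycentric(float p1, float p2, float alpha) {
    return alpha * p1 + (1.f - alpha) * p2;
}
```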

\GenomeClass::mutator : // Must return the number of mutations

 int NbMut=0;
 float pond = 1./sqrt((float)SIZE);
   for (int i=0; i<SIZE; i++)
   if (tossCoin(pMutPerGene)){
   	NbMut++;
      	Genome.sigma[i] = Genome.sigma[i] * exp(SIGMA*pond*(float)gauss());
      	Genome.sigma[i] = MIN(0.5,Genome.sigma[i]);              
      	Genome.sigma[i] = MAX(0.,Genome.sigma[i]);
      	Genome.x[i] += Genome.sigma[i]*(float)gauss();
      	Genome.x[i] = MIN(X_MAX,Genome.x[i]);              // clamp to stay within the bounds
      	Genome.x[i] = MAX(X_MIN,Genome.x[i]);
   	}

return NbMut; \end

\GenomeClass::evaluator : // Returns the score
{
 float Score = 0.0;
 Score = Weierstrass(Genome.x, SIZE);
 //Score = rosenbrock(Genome.x);
 return Score;
}
\end

\User Makefile options: \end

\Default run parameters : // Please let the parameters appear in this order

 Number of generations : 100   	// NB_GEN
 Time limit: 0 			// In seconds, 0 to deactivate
 Population size : 2048			//POP_SIZE
 Offspring size : 2048 // 40% 
 Mutation probability : 1       // MUT_PROB
 Crossover probability : 1      // XOVER_PROB
 Evaluator goal : minimise      // Maximise
 Selection operator: Tournament 2.0
 Surviving parents: 100%		// percentage or absolute
 Surviving offspring: 100%
 Reduce parents operator: Tournament 2
 Reduce offspring operator: Tournament 2
 Final reduce operator: Tournament 2
 Elitism: Strong			//Weak or Strong
 Elite: 1
 Print stats: true				//Default: 1
 Generate csv stats file:false			
 Generate gnuplot script:false
 Generate R script:false
 Plot stats:true				//Default: 0
 Remote island model: true
 IP file: ip.txt 			//File containing all the remote island's IP
 Server port : 2929
 Migration probability: 0.33
 Save population: false
 Start from file:false

\end