Jan Stypka edited this page Feb 18, 2015 · 10 revisions

This tutorial shows step-by-step how to use the EMAS algorithm to optimise your own specific problem.

First, you need to clone and build the application itself. If you don't know how, follow the detailed description provided in the README file.

The application in its default version optimises the multimodal Rastrigin function, which serves as an example application of the algorithm. All the functions and operators needed for this use case are located in the src/emas_test_ops.erl module. For the program to optimise our custom function, we have to provide a similar module implementing our specific problem. You can also edit the existing emas_test_ops.erl file if you wish.

A module that provides problem operators should implement the Erlang behaviour emas_genetic_ops. This behaviour defines a few callbacks that have to be provided for the program to work. The previously mentioned example file emas_test_ops.erl also implements this behaviour.

Let's assume that we want to find a maximum of a simple polynomial function:

f(x) = -x^4 + 5*x^3 - 7*x^2 + 3*x

which has a plot:

[plot of f(x)]

As mentioned in the Custom operators wiki section, there are five functions that one needs to implement for the program to run:

  • solution/1
  • evaluation/2
  • mutation/2
  • recombination/3
  • config/0

Almost all of them take as an argument a sim_params() record, a structure containing the basic parameters of the algorithm. You can see what it contains, and its default values, in the file src/emas_config.erl.

Let's go through the callbacks one by one.

Solution

This is the simplest function we have to implement. It takes the parameter record as an argument and generates a single random solution to the problem. A solution is usually a vector of numbers, but in our case it is just one float, because the function has only one dimension. You don't have to worry about seeding the random number generator. We need a random real number, so our function can look like this:

solution(_Parameters) ->
	1 / random_denominator(random:uniform()) * math:pow(-1, random:uniform(2)).

random_denominator(X) when X == 0 ->
	random_denominator(random:uniform());
random_denominator(X) ->
	X.
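As a quick sanity check, here is a shell sketch of the same generator (using the newer rand module in place of the since-deprecated random one). Because the denominator is drawn from below 1, every generated solution has a magnitude greater than 1:

```erlang
%% Shell sketch: sample the solution generator and check that every
%% value has magnitude greater than 1 (the denominator is below 1).
Solution = fun Sol() ->
    case rand:uniform() of
        X when X == 0.0 -> Sol();                  %% re-draw on zero
        X -> 1 / X * math:pow(-1, rand:uniform(2)) %% random sign
    end
end,
Samples = [Solution() || _ <- lists:seq(1, 100)],
true = lists:all(fun(S) -> abs(S) > 1 end, Samples).
```

That gap around zero is harmless here, since the interesting region of our function lies away from the origin.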

Evaluation

The next callback to implement is the evaluation function, which transforms a solution (as produced by the function above) into a float representing its fitness. Like every callback, it also receives the simulation parameters as an argument.

The function should judge how good a given solution is and assign a value to it: the higher the value, the better the solution. In our case the function is given, so we just need to type in the formula:

evaluation(X, _Parameters) ->
	-math:pow(X, 4) + 5 * math:pow(X, 3) - 7 * math:pow(X, 2) + 3 * X.
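As a sanity check (a shell sketch), the polynomial factors as f(x) = -x(x - 1)^2(x - 3), so it is zero at x = 0, 1 and 3, and positive between the outer roots:

```erlang
%% Shell sketch: check the evaluation formula at the known roots of
%% f(x) = -x(x - 1)^2(x - 3), i.e. x = 0, 1 and 3.
F = fun(X) -> -math:pow(X, 4) + 5 * math:pow(X, 3)
              - 7 * math:pow(X, 2) + 3 * X end,
true = F(0) == 0.0,
true = F(1) == 0.0,
true = F(3) == 0.0,
true = F(2) > 0.          %% the maximum lies between x = 1 and x = 3
```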

Mutation

Mutation is one of the most important operators. It is applied rarely, yet it has a critical influence on the algorithm. It randomly modifies an individual, but the change should be relatively small. Its purpose is to add variety to the population and help the search avoid getting stuck in local optima. In our case we can implement mutation as adding or subtracting 1 from the solution. It can look like this:

mutation(X, _Parameters) ->
	X + math:pow(-1, random:uniform(2)).
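A step of exactly 1 is quite coarse. A common alternative, shown here as a hedged sketch rather than part of the original module, is to perturb the solution with small Gaussian noise via rand:normal/0 (the step size 0.5 is an arbitrary choice):

```erlang
%% Alternative mutation (sketch): add Gaussian noise with standard
%% deviation 0.5 instead of a fixed +/-1 step.
mutation(X, _Parameters) ->
    X + 0.5 * rand:normal().
```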

Recombination

This callback creates new individuals by mixing two existing ones. It is the essence of the evolutionary part of the algorithm and should be designed thoughtfully. We want to produce two new solutions incorporating properties of both parents. For the global maximum of a one-dimensional function, which is admittedly an artificial problem, we can pick two values lying between the parents. We could simply use the mean of the two values, but we should generate two distinct individuals. The callback can look like this:

recombination(X, Y, _Parameters) ->
	D = (abs(X - Y) / 2) * random:uniform(),
	Mean = (X + Y) / 2,
	{Mean - D, Mean + D}.
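A quick property check (shell sketch, with rand in place of random): both children always lie inside the interval spanned by the parents, and they are symmetric around the parents' mean:

```erlang
%% Shell sketch: verify that recombination keeps children between
%% the parents and symmetric around their mean.
Recombine = fun(X, Y) ->
    D = (abs(X - Y) / 2) * rand:uniform(),
    Mean = (X + Y) / 2,
    {Mean - D, Mean + D}
end,
{A, B} = Recombine(-2.0, 4.0),
true = A >= -2.0 andalso B =< 4.0,
true = abs((A + B) / 2 - 1.0) < 1.0e-9.  %% mean of -2.0 and 4.0 is 1.0
```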

Config

The last callback is designed to let you pass custom parameters to the other operators. You may create an arbitrary data structure in this callback; the resulting term is stored in the sim_params() record under the field extra, which you can access in every other callback should you need your own specific parameters. In our very simple example no additional parameters are necessary, so we can leave the field blank:

config() -> undefined.
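If you did need extra parameters, a hedged sketch could look like the following. It assumes the sim_params record (with its extra field, as described above) is in scope in your module, for example via an include from the emas application; the key mutation_step is purely hypothetical:

```erlang
%% Hypothetical sketch: pass a mutation step size through config/0.
%% Assumes the sim_params record definition is in scope, e.g. via an
%% -include_lib from the emas application.
config() -> #{mutation_step => 0.5}.

mutation(X, #sim_params{extra = #{mutation_step := Step}}) ->
    X + Step * math:pow(-1, rand:uniform(2)).
```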

Running

That's it. Our operators module now looks like this:

-module(simple_ops).
-behaviour(emas_genetic_ops).

-export([config/0, evaluation/2, mutation/2, recombination/3, solution/1]).

solution(_Parameters) ->
    1 / random_denominator(random:uniform()) * math:pow(-1, random:uniform(2)).

random_denominator(X) when X == 0 ->
    random_denominator(random:uniform());
random_denominator(X) ->
    X.


evaluation(X, _Parameters) ->
    -math:pow(X, 4) + 5 * math:pow(X, 3) - 7 * math:pow(X, 2) + 3 * X.


mutation(X, _Parameters) ->
    X + math:pow(-1, random:uniform(2)).


recombination(X, Y, _Parameters) ->
    D = (abs(X - Y) / 2) * random:uniform(),
    Mean = (X + Y) / 2,
    {Mean - D, Mean + D}.


config() -> undefined.

The last step is to compile and run the program. Save this file in the src directory, then compile the project and start the shell with the make shell command.

You can launch the calculation by typing:

emas:start(5000, [{genetic_ops, simple_ops}, {model, mas_sequential}]).

The genetic_ops parameter is mandatory, because it selects your operators module. The model parameter and the time (5000 above) can be tweaked according to your needs.

Congratulations, you launched your first simulation with custom operators!
