How is it possible that gamultiobj gives worse solution when the number of MaximumGenerations is raised?

Steffen Kuehl on 14 Nov 2016
Commented: Steffen Kuehl on 16 Nov 2016
Hi,
My multiobjective optimization model (gamultiobj) does not get continuously better when I raise the number of MaxGenerations. Sometimes a solution is worse than another solution that was generated with the same parameters but a smaller number of MaxGenerations. Since gamultiobj is a variant of NSGA-II (I can't find anything more specific in the MATLAB documentation), shouldn't the elitism of the algorithm make sure that good solutions found in early generations are kept until the end, so that it is not possible to get a worse solution when more generations are calculated?
I use the same starting/initial population for all my runs, and I created my own creation, crossover, and mutation functions. Since the elitism is supposed to be located in the selection function, I don't think that creating my own functions should interfere with the elitism.
I appreciate any ideas and help. Thanks, Steffen
And to clarify, here is some of my code.
These are the options for the algorithm.
options = optimoptions('gamultiobj', ...
    'UseParallel', true, ...
    'UseVectorized', false, ...
    'CreationFcn', @popFun, ...
    'CrossoverFcn', @crossoverBinary, ...
    'MutationFcn', @mutFun, ...
    'PopulationSize', InitPop, ...
    'MaxStallGenerations', MaxStallG, ...
    'MaxGenerations', MaxGamultiobj);
This is the function call. A and B are the linear inequality constraints.
[X,fval,exitflag] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);
This is my creation function.
function Population = popFun(GenomeLength,~,options)
%POPFUN Custom creation function.
%   Every individual must satisfy the linear constraints, so each one is
%   generated by solving an integer linear program with a random objective.
global Zusatz
rng(Zusatz); % seed once so that every run uses the same random numbers
Population = zeros(options.PopulationSize,GenomeLength);
intcon = 1:GenomeLength;
B = ones(85,1);
B(36,1) = -1;
lb = zeros(1,GenomeLength);
ub = ones(1,GenomeLength);
opts = optimoptions('intlinprog','IntegerTolerance',1e-06,'Display','off');
for a = 1:options.PopulationSize
    f = randn(GenomeLength,1); % random objective -> a different feasible point
    % Note: A (the linear inequality matrix) must be in scope here as well,
    % e.g. shared the same way as Zusatz.
    [x,~,~] = intlinprog(f,intcon,A,B,[],[],lb,ub,opts);
    Population(a,:) = x;
end
end
This is the crossover function.
function xoverKids = crossoverBinary(parents,options,GenomeLength,~,~,thisPopulation)
%CROSSOVERBINARY Single-point crossover for binary genomes.
global Zusatz
rng(Zusatz); % seed once so that every run uses the same random numbers
% Extract information about the linear constraints, if any
linCon = options.LinearConstr;
nKids = length(parents)/2;
index = 1;
xoverKids = nan(nKids,GenomeLength);
for k = 1:nKids
    % Get the parents from the population
    parent1 = thisPopulation(parents(index),:);
    index = index + 1;
    parent2 = thisPopulation(parents(index),:);
    index = index + 1;
    % Single-point crossover: pick a random cut point and splice the parents
    idx = randi(GenomeLength,1);
    xoverKids(k,1:idx) = parent1(1:idx);
    xoverKids(k,idx+1:end) = parent2(idx+1:end);
    % Ensure that the kid satisfies the linear inequality constraints;
    % retry with a new cut point until it does
    flag = any(linCon.Aineq*(xoverKids(k,:)') > linCon.bineq,1);
    while flag % does not satisfy the constraints
        idx = randi(GenomeLength,1);
        xoverKids(k,1:idx) = parent1(1:idx);
        xoverKids(k,idx+1:end) = parent2(idx+1:end);
        flag = any(linCon.Aineq*(xoverKids(k,:)') > linCon.bineq,1);
    end
end
end
And this is the mutation function.
function mutationChildren = mutFun(parents,options,GenomeLength,~,~,~,thisPopulation)
%MUTFUN Bit-flip mutation for binary genomes.
global Zusatz
rng(Zusatz); % seed once so that every run uses the same random numbers
linCon = options.LinearConstr;
% Initialize the output
mutationChildren = nan(length(parents),GenomeLength);
for k = 1:length(parents)
    mut = thisPopulation(parents(k),:)';
    % Bit-flip mutation: flip one randomly chosen gene
    idx = randi(GenomeLength,1);
    mutated = mut;
    mutated(idx) = 1 - mutated(idx);
    % Check that the linear constraints are satisfied;
    % retry with a different gene until they are
    flag = any(linCon.Aineq*mutated > linCon.bineq,1);
    while flag
        idx = randi(GenomeLength,1);
        mutated = mut;
        mutated(idx) = 1 - mutated(idx);
        flag = any(linCon.Aineq*mutated > linCon.bineq,1);
    end
    mutationChildren(k,:) = mutated';
end
end
Steffen Kuehl on 15 Nov 2016
Thank you for your answer, Brendan. Zusatz is just an integer seed to ensure that every run uses the same random numbers. I changed that in the code I uploaded.
I noticed (thanks to your tip) that my code doesn't work when I run it without parallel processing, but it displays no error with parallel processing turned on. At the moment I'm trying to find out why that is.
If you want to run the code, the file "Kombiniert.m" is the one to start. The function call for gamultiobj is in "gamultiobjop.m".
Steffen Kuehl on 15 Nov 2016
Edited: Steffen Kuehl on 15 Nov 2016
Actually, the code does run without parallel processing. I accidentally set 'UseVectorized' to true instead of using serial evaluation. My bad.
But the problem has nothing to do with parallel processing. I get the same result with and without parallel processing turned on.

Brendan Hamm on 15 Nov 2016
OK. So there is an additional change you made here that was not mentioned and that is affecting the output: you changed MaxStallGenerations, which caused one of the runs to go through more generations and produce "better" results. While they may have been better with respect to some of the objective functions, they were not necessarily better with respect to the others.
The basic idea is that if the average change in the best objective function values between the current generation and the generation that occurred MaxStallGenerations ago is less than FunctionTolerance (1e-4 by default), the algorithm terminates.
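To make runs that differ only in MaxGenerations comparable, the stall-based stopping settings can be pinned explicitly. A minimal sketch, assuming the documented defaults of 1e-4 for FunctionTolerance and 100 for MaxStallGenerations (the MaxGenerations value shown is arbitrary):

```matlab
% Pin the stall-based stopping criteria so that only MaxGenerations varies
options = optimoptions('gamultiobj', ...
    'FunctionTolerance', 1e-4, ...      % default tolerance on the average change
    'MaxStallGenerations', 100, ...     % default stall window
    'MaxGenerations', 200);             % the one parameter being varied
```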
So, I have a few suggestions:
1. Keep the MaxStallGenerations the same between different runs.
2. There is no need to change the RNG in each function; in fact, I would discourage this. Instead, you can get the 4th output of gamultiobj, which contains the state of the RNG at the start of the algorithm's run. See the link for information on this.
3. Increase the population size. For a problem with this many variables I would consider having in the range of 500 different initial populations; this may provide you with a large enough "elite" population to carry over to the next generation.
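Suggestion 2 could look something like the following sketch, which follows the documented pattern of restoring the generator state from `output.rngstate`; `objFun`, `n`, `A`, `B`, and `options` are taken from the question:

```matlab
% First run: the 4th output records the RNG state the solver started from
[X,fval,exitflag,output] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);

% Later: restore that state to reproduce the run exactly,
% with no rng calls needed inside the custom operator functions
stream = RandStream.getGlobalStream;
stream.State = output.rngstate.State;
[X2,fval2] = gamultiobj(objFun,n,A,B,[],[],[],[],[],options);
```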
Brendan Hamm on 16 Nov 2016
I am referring to setting the PopulationSize to a scalar value of 500. The sub-population idea refers to the Migration options, which are not actually implemented. Presumably this is something that has been considered for a future release (I am speculating and have no actual knowledge of this plan), but currently the option does nothing except apparently give you an error :).
Steffen Kuehl on 16 Nov 2016
Brendan, I thought you were referring to the possibility of using more than one (i.e. 500) initial population, not to the size of a single population.
Thank you for all the work and thought you put into your answers :)

John D'Errico on 14 Nov 2016
This is a stochastic solver: it generates points using random methods. Moreover, ANY numerical optimization tool can only find an approximate solution; they do not find an exact solution.
So there is no presumption that one run of a stochastic optimizer will always give as good a solution as another call to the same optimizer. Yes, by allowing it more iterations you increase the chance that it will be able to improve, but it is only a better chance, not an assurance of success.
Steffen Kuehl on 14 Nov 2016
Thank you for your answer, John. I seeded everything that is based on random numbers so that I get reproducible results; if I run the exact same calculation twice, I get the exact same results. So shouldn't there be a better (or at least equally good) solution when the number of maximum generations is raised, even though gamultiobj is a stochastic solver?

Walter Roberson on 14 Nov 2016
One aspect that is not always obvious is that a later configuration, with a larger (less optimal) objective function value, might have a smaller constraint violation. The magnitude of the constraint violation is compared as well as the objective function value.
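One way to see this for a returned point is to measure the inequality violation directly. A minimal sketch, assuming `A`, `B`, and `X` are the constraint matrices and solution matrix from the question:

```matlab
% Amount by which each linear inequality A*x <= B is violated (0 if satisfied)
x = X(1,:)';                      % first point on the returned Pareto front
violation = max(A*x - B, 0);      % elementwise violation of each inequality
totalViolation = sum(violation);  % 0 means the point is fully feasible
```

Comparing totalViolation between two runs shows whether an apparently "worse" objective value actually belongs to a more feasible point.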
Steffen Kuehl on 16 Nov 2016