Optimization routine for a function that is not smooth and also has many local minima

Hi, can anyone give me some suggestions about an optimization routine for a function that is not smooth and also has many local minima?
José-Luis on 18 Feb 2013
Edited: José-Luis on 18 Feb 2013
There are many, many optimization routines. Given the characteristics of your function, none is guaranteed to give you the correct result short of exhaustive testing.


Answers (5)

Alan Weiss on 19 Feb 2013
If you want to minimize a nonsmooth function, the general recommendation is to try patternsearch. If there are many local minima, the general recommendation is to use a variety of start points. If you have finite bounds on all components, lb and ub, then you can obtain random start points as follows:
x0 = lb + rand(size(lb)).*(ub - lb);  % uniform random point between lb and ub
For more methods of choosing start points, see the Global Optimization Toolbox documentation.
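For instance, a minimal sketch that combines the two suggestions above, assuming your objective fun and finite bounds lb and ub are already defined (names are placeholders):
% Run patternsearch from several random start points and keep the best result.
nStarts = 20;
bestF = Inf;
for k = 1:nStarts
    x0 = lb + rand(size(lb)).*(ub - lb);   % random point in [lb, ub]
    [x, f] = patternsearch(fun, x0, [], [], [], [], lb, ub);
    if f < bestF
        bestF = f;
        bestX = x;                         % best minimizer found so far
    end
end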
Good luck,
Alan Weiss
MATLAB mathematical toolbox documentation
xueqi on 20 Feb 2013
Thanks! So you think the problem is not coming from the function being non-smooth? I will try to add some constraints soon and see how it goes. But please stay with me on this post. I will tell you what happens and may still need your advice.
xueqi on 1 Mar 2013
Yes. I do find that the value of the function changes dramatically with respect to one variable. Near one particular number (which is the value that I want to estimate) it shows concavity, but as it gets far from that value, the function goes to Inf quickly. I think that is why fmincon can't work correctly. But I can't really add a proper constraint on this variable... What should I do?



Thorsten on 18 Feb 2013
http://en.wikipedia.org/wiki/Simulated_annealing
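In MATLAB, simulated annealing is available as simulannealbnd in the Global Optimization Toolbox. A minimal sketch, with a made-up nonsmooth objective standing in for the real one:
% Simulated annealing with bound constraints on a toy nonsmooth function.
fun = @(x) sum(abs(x)) + 0.5*sum(sin(5*x));   % placeholder objective
x0  = [0.5; -0.5];                            % initial guess
lb  = [-2; -2];                               % lower bounds
ub  = [ 2;  2];                               % upper bounds
[x, fval] = simulannealbnd(fun, x0, lb, ub);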

Mohsen Davarynejad on 18 Feb 2013
All algorithms suitable for black-box optimization can be used. These algorithms, including genetic algorithms (GA) and particle swarm optimization (PSO), make no assumptions about the problem at hand, so they neither require the function to be convex nor require the gradient of the function to be available.
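As an illustration, here is a hedged sketch of calling ga on the Rastrigin function, a standard test problem with many local minima (your actual objective, dimension, and bounds would replace these):
% ga needs only function values, no gradient.
rastrigin = @(x) 10*numel(x) + sum(x.^2 - 10*cos(2*pi*x));   % many local minima
nvars = 2;
lb = [-5 -5];
ub = [ 5  5];
[x, fval] = ga(rastrigin, nvars, [], [], [], [], lb, ub);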
xueqi on 20 Feb 2013
I have 8 independent variables to estimate. This is the code I use (here it calls fmincon; the ga call is commented out):
A = [1 1 1 0 0 0 0 0];   % linear inequality A*x <= b, i.e. x1 + x2 + x3 <= 1
b = 1;
lb = [0.1; 0.1; 0.1; 0.0001; 1.1; 0.001; 1.1; 0.001];   % lower bounds
ub = [1; 1; 1; 1; Inf; 1; Inf; 1];                      % upper bounds
options = optimset('Algorithm','interior-point','Display','iter','MaxIter',50,'TolFun',1e-1);
[x,fval,exitflag,output,lambda,grad] = fmincon(@beta,[0.2 0.2 0.2 0.01 100 0.1 100 0.1],A,b,[],[],lb,ub,[],options)
% x = ga(@beta,8,A,b,[],[],lb,ub,[],options)   % ga variant; note ga expects gaoptimset options, not optimset
xueqi on 20 Feb 2013
It just seems to take forever for ga to get a result... A single evaluation of my function beta at a particular point takes nearly 1 minute. But when minimizing beta, it takes several days even for just around 10 iterations. I am not sure this is normal....
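ga evaluates an entire population of candidate points each generation, so a one-minute objective adds up quickly. A hedged sketch of options that can cut the wall-clock time (option names assume a recent MATLAB release; on older releases use gaoptimset instead of optimoptions, and parallel evaluation requires the Parallel Computing Toolbox):
opts = optimoptions('ga', ...
    'PopulationSize', 20, ...    % fewer candidate points per generation
    'MaxGenerations', 30, ...    % cap the number of generations
    'UseParallel', true, ...     % evaluate the population in parallel
    'Display', 'iter');
x = ga(@beta, 8, A, b, [], [], lb, ub, [], opts);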



Juan Camilo Medina on 2 Mar 2013
Edited: Juan Camilo Medina on 2 Mar 2013
Based on my experience, the best way is to use simulated annealing, which is less heuristic than GA and is backed by Markov-chain theory. fmincon() or any other gradient-based algorithm won't work well, since the function is non-smooth.
[x,fval,exitFlag,output] = simulannealbnd(ObjectiveFunction,X0)
You can also try
x = fminsearch(fun,x0)

Matt J on 3 Mar 2013
Edited: Matt J on 3 Mar 2013
I mentioned this to you in another post, but I'll document it here as well. I think the best way to deal with an ill-behaved objective function is to replace it with a better one. I.e., question your initial reasoning behind this function and see if there are alternative formulations that you haven't considered.
If nothing else, it is suspicious that you think you absolutely must accept the non-smoothness of the function. If you've thought about the problem formulation carefully enough, there are almost always smooth approximations that you can find. The local minima are another matter.
If you can't see an alternative yourself, it may be time to walk us through the development of the function you're thinking of so we can see where you might have taken a wrong turn.
Matt J on 10 Mar 2013
Edited: Matt J on 10 Mar 2013
One idea would be, for large alpha and beta, to write the log-likelihood in terms of Stirling's approximation of Beta(alpha,beta):
llh(x,alpha,beta) = alpha*log(x) + beta*log(1-x) - 0.5*log(2*pi) + ...
    (alpha+beta-.5)*log(alpha+beta) - ...
    (alpha-.5)*log(alpha) - ...
    (beta-.5)*log(beta)
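As a MATLAB function this might look like the following sketch (the function name is hypothetical; for moderate alpha and beta, the built-in betaln(alpha,beta) computes log(Beta(alpha,beta)) directly and is usually the safer choice):
function llh = betaLogLikStirling(x, alpha, beta)
% Log-likelihood term using Stirling's approximation of Beta(alpha,beta),
% intended for large alpha and beta where beta(alpha,beta) would overflow.
llh = alpha.*log(x) + beta.*log(1-x) - 0.5*log(2*pi) + ...
      (alpha+beta-0.5).*log(alpha+beta) - ...
      (alpha-0.5).*log(alpha) - ...
      (beta-0.5).*log(beta);
end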
xueqi on 12 Mar 2013
There is something wrong with the maxmin function. Maybe it is the cause of all the problems...

