Fmincon Warning: undesired gradient requirement

Hi,
I'm having a problem using Fmincon and I cannot find a solution online. I have to minimize a function ('ikSolver' in the code below) whose input is a [10x1] vector of doubles:
%Get lower and upper boundaries
[lb,ub] = getLegJointLimits('A')
% Random starting point initialization
rng('shuffle');
qlegs = rand(10,1);
% Boundaries on the parameters must be satisfied by
% the initial value in the optimization procedure, thus:
qlegs = lb + (ub-lb).*qlegs;
options = optimset('Display','iter','MaxFunEvals',50000,...
'TolFun',1e-06,...
'MaxIter',50000,'LargeScale','on');
[fqs,Fval,EXITFLAG] = fmincon('ikSolver',qlegs,[],[],[],[],lb,ub,'constraint',options);
The values of the boundary vectors are:
lb =
-0.6685
-0.9745
0.1193
-1.5464
-0.6229
-0.2624
-1.5480
0.1282
-0.9782
-0.2811
ub =
0.2712
0.7202
1.8977
0.2598
0.3003
0.6735
0.2582
1.8921
0.7114
0.6523
and the functions involved (simplified for clarity):
function F = ikSolver(theta)
global Rmatr % Relabeling matrix
global gt gteq
global H
Qplus = Rmatr*theta;
F = sum(Qplus)^2;
gt = [ ];   % no nonlinear inequality constraints
gteq = [ ]; % no nonlinear equality constraints
end
with:
Rmatr =
0 0 0 0 0 0 0 0 0 1
0 1 1 1 0 0 -1 -1 0 0
0 0 0 0 0 0 0 1 0 0
0 0 0 0 0 0 1 0 0 0
0 0 0 0 0 1 0 0 0 0
0 0 0 0 1 0 0 0 0 0
0 0 0 1 0 0 0 0 0 0
0 0 1 0 0 0 0 0 0 0
0 1 0 0 0 0 0 0 0 0
1 0 0 0 0 0 0 0 0 0
And 'constraint':
function [gtr,gteqr] = constraint(Jsolcons)
global gt gteq
gtr = gt;     % nonlinear inequality constraints
gteqr = gteq; % nonlinear equality constraints
return
The warning is the following:
Warning: To use the default trust-region-reflective algorithm you must supply the gradient in the objective function and set the GradObj option to 'on'. FMINCON will use the active-set algorithm instead. For information on applicable algorithms, see Choosing the Algorithm in the documentation. > In fmincon at 492
The same warning (and the consequent wrong result) appears if I change the arguments of fmincon, for example:
[fqs,Fval,EXITFLAG] = fmincon('ikSolver',qlegs,[],[],[],[],lb,ub);
[fqs,Fval,EXITFLAG] = fmincon('ikSolver',qlegs,eye(10),ub);
options = optimset('Display','off','MaxFunEvals',50000,...
'TolFun',1e-06,...
'MaxIter',50000);
I don't understand where I am asking for the trust-region-reflective algorithm, which requires the gradient...
Thank you very much!!
  1 Comment
Matt J
Matt J on 17 Apr 2013
Edited: Matt J on 17 Apr 2013
Are you sure you didn't really mean
F = sum(Qplus.^2);
instead? If you really did mean
F = sum(Qplus)^2
then your objective function is equivalent to
Rsum = sum(Rmatr,1);
F = (dot(Rsum,theta))^2;
It is then obviously more efficient to pre-compute Rsum once rather than to repeatedly perform the more expensive matrix multiplication Rmatr*theta in every iteration of the algorithm.
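For example, a minimal sketch along these lines (assuming Rmatr is fixed before the optimization starts; ikObjective is just an illustrative name):
Rsum = sum(Rmatr,1);                   % pre-compute the column sums once, outside the objective
ikObjective = @(theta) (Rsum*theta)^2; % same value as sum(Rmatr*theta)^2, but only a dot product per call
[fqs,Fval,EXITFLAG] = fmincon(ikObjective,qlegs,[],[],[],[],lb,ub,@constraint,options);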


Accepted Answer

Kye Taylor
Kye Taylor on 17 Apr 2013
Edited: Kye Taylor on 17 Apr 2013
The trust-region-reflective algorithm is the default option for fmincon.
You can select the algorithm by adding the name-value pair
'Algorithm','active-set' to the optimset call, as in the command
options = optimset('Display','off','MaxFunEvals',50000,...
'TolFun',1e-06,...
'MaxIter',50000,...
'Algorithm','active-set');
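With those options, the call from the question could then look like this (a sketch based on the code posted above, using function handles rather than function-name strings):
options = optimset('Display','iter','MaxFunEvals',50000,...
'TolFun',1e-06,...
'MaxIter',50000,...
'Algorithm','active-set');
[fqs,Fval,EXITFLAG] = fmincon(@ikSolver,qlegs,[],[],[],[],lb,ub,@constraint,options);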
  1 Comment
Matilde
Matilde on 17 Apr 2013
Thank you, that solved the problem. I don't know how I was using fmincon like that for the past few months without specifying the gradient and it always worked...


More Answers (1)

Matt J
Matt J on 17 Apr 2013
Edited: Matt J on 17 Apr 2013
Instead of using an alternative algorithm that doesn't require the gradient of the objective function, why not just supply the trust-region-reflective algorithm with the gradient it requires? In your case the gradient has a very simple form:
gradient = 2*(Rmatr.'*Qplus)
I assume here that you really meant to write
F = sum(Qplus.^2);
as per my comment above.
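For reference, a sketch of what that could look like (ikSolverWithGrad is a hypothetical name, and the objective is assumed to really be sum(Qplus.^2); the empty 'constraint' function is dropped because trust-region-reflective only handles bound constraints):
function [F,g] = ikSolverWithGrad(theta)
% Sketch: objective plus its gradient, so trust-region-reflective can be used
global Rmatr % relabeling matrix, as in the original ikSolver
Qplus = Rmatr*theta;
F = sum(Qplus.^2);
if nargout > 1
    g = 2*(Rmatr.'*Qplus); % gradient of sum((Rmatr*theta).^2) with respect to theta
end
end
called with the gradient enabled:
options = optimset(options,'GradObj','on');
[fqs,Fval,EXITFLAG] = fmincon(@ikSolverWithGrad,qlegs,[],[],[],[],lb,ub,[],options);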
  1 Comment
Matilde
Matilde on 17 Apr 2013
Thank you very much for your advice. Actually, as I tried to say (not very clearly), my real ikSolver is much more complex and solves a completely different kind of problem; I thought there was no point in copying it here without explaining its meaning :-) That is why I put in that stupid example! I'm not able to define a gradient for my objective function by hand, since it is a bit too complex!

