Minimization of a function with unknown gradient but known sparsity pattern of its Hessian
Dear colleagues,
is there an fmincon option to minimize a function without knowledge of its gradient, while providing the sparsity pattern of its Hessian?
My function comes from an FEM formulation of an energy in nonlinear mechanics of solids and it is too difficult to differentiate analytically.
However, the sparsity pattern of the Hessian is easily available through the FEM connectivity of the variables (see the sketch below).
Is there a way to exploit it efficiently? If I run with 'Algorithm','quasi-newton', it does not seem to accept the 'HessPattern' option. An alternative would be to obtain an approximate gradient (can you suggest one?) and use 'Algorithm','trust-region' instead. Does anyone have experience with this?
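For completeness, what I mean by the pattern being available from the connectivity is roughly the following (a rough sketch only; the array names are placeholders):
% Build the Hessian sparsity pattern from an FEM connectivity array 'elems'
% (nElem-by-nodesPerElem; row e lists the global DOF indices of element e).
nDof = max(elems(:));
I = []; J = [];
for e = 1:size(elems,1)
    dofs = elems(e,:);
    [jj,ii] = meshgrid(dofs,dofs);   % every pair of DOFs within an element may couple
    I = [I; ii(:)];
    J = [J; jj(:)];
end
Hpattern = spones(sparse(I,J,1,nDof,nDof));   % sparse 0-1 pattern for 'HessPattern'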
Best wishes,
Jan
Accepted Answer
Alan Weiss
on 29 Oct 2019
Sorry, I am afraid that the available options don't work efficiently for your case. The HessPattern option is available only for the 'trust-region-reflective' algorithm, but for that algorithm you need to supply a derivative.
I am not sure what to suggest that you have not already tried. For the default 'interior-point' algorithm you can try setting the HessianApproximation option to 'lbfgs' or {'lbfgs',n} for a positive integer n, but that does not directly use the sparsity pattern that you know. Or, and this may seem crazy, you could code a finite-difference gradient in your objective function, bypassing MATLAB's internal one, and then use the 'trust-region-reflective' algorithm with the HessPattern option (roughly as in the sketch below). I am not sure that the 'trust-region-reflective' algorithm would satisfy you anyway, as it accepts only bound constraints or only linear equality constraints.
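Something along these lines is what I mean (an untested sketch; femEnergy and the variable names are placeholders, and the step-size choice is just one simple option):
function [f,g] = energyWithFDGradient(x)
% Objective that returns the FEM energy and a forward-difference gradient,
% so fmincon can be told SpecifyObjectiveGradient = true.
f = femEnergy(x);                      % your FEM energy (placeholder name)
if nargout > 1
    n = numel(x);
    g = zeros(n,1);
    h = sqrt(eps)*max(1,abs(x));       % simple per-component step size
    for k = 1:n
        xk    = x;
        xk(k) = xk(k) + h(k);
        g(k)  = (femEnergy(xk) - f)/h(k);   % forward difference
    end
end
end
Then, with bounds only (remember that 'trust-region-reflective' does not take general constraints), the call could look roughly like:
opts = optimoptions('fmincon', ...
    'Algorithm','trust-region-reflective', ...
    'SpecifyObjectiveGradient',true, ...
    'HessPattern',Hpattern);           % your sparse 0-1 Hessian pattern
x = fmincon(@energyWithFDGradient,x0,[],[],[],[],lb,ub,[],opts);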
Sorry.
Alan Weiss
MATLAB mathematical toolbox documentation
5 Comments
Catalytic
on 29 Oct 2019 (edited)
One possibility might be to use a one-iteration call to fmincon itself to obtain the gradient. This uses MATLAB's own finite differencing and so might be faster than third-party implementations:
function [f,numerical_grad] = myObjective(x)
% Evaluate the objective value f at x here
f = ...
if nargout > 1
    % One-iteration fmincon call whose sixth output is the finite-difference
    % gradient computed by MATLAB's own differencer. fmincon calls this
    % function with nargout == 1, so there is no infinite recursion.
    options = optimoptions('fmincon','MaxIterations',1, ...
        'SpecifyObjectiveGradient',false,'Display','off');
    [~,~,~,~,~,numerical_grad] = fmincon(@myObjective,x,[],[],[],[], ...
        [],[],[],options);
end
end
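The outer call could then look roughly like this (untested; Hpattern, x0, lb, ub are placeholders), combining the gradient-returning objective with the 'trust-region-reflective' algorithm and the known sparsity pattern:
outerOpts = optimoptions('fmincon', ...
    'Algorithm','trust-region-reflective', ...
    'SpecifyObjectiveGradient',true, ...   % myObjective now returns [f,grad]
    'HessPattern',Hpattern);               % sparse 0-1 Hessian pattern
x = fmincon(@myObjective,x0,[],[],[],[],lb,ub,[],outerOpts);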