
Hi,

I'm currently practicing numerical root-finding on simple sets of nonlinear equations, writing my own solvers -- with an eye towards using MATLAB's fsolve for my real problem, which has many more variables. That way, when I switch to fsolve I'll have a good sense of what root-finding methods generally do, and of the pros and cons of the various methods.

Since my practice problems are easy for now, I can differentiate the functions by hand to compute the Jacobian matrices, and then code them up in my scripts. However, I imagine this isn't best practice in the long run, especially once I move to many more variables and the Jacobian/Hessian matrices get larger.

What's considered "best practice" for computing derivatives to use in, say, a root-finding method? Should I purchase and use the Symbolic Math Toolbox, or is there another way to approach differentiation without having to do it by hand?

I think I'll eventually get into some "finite-differencing" methods, but I'm not there yet and know nothing about them -- I'm maybe a few weeks away. So, any thoughts and recommendations are welcome.

Thanks,

Matt J
on 21 Sep 2020

Edited: Matt J
on 21 Sep 2020

However, I imagine it's not best practice to continue doing this, especially when I start considering many more variables, and the Jacobian / Hessian matrices get larger.

On the contrary, if you have many variables, it is best to differentiate the functions yourself analytically. However, the efficiency of doing it this way only manifests if you express your Jacobians in vectorized terms. For example, you should recognize that a quadratic form x.'*Q*x with symmetric Q has a Jacobian efficiently implemented as 2*x.'*Q (for a general Q it is x.'*(Q + Q.')). The Symbolic Math Toolbox is not capable of differentiating in matrix-vector form and will give you very unwieldy expressions expanded into a large number of scalar variables.
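As a concrete sketch of what "vectorized" means here, consider a made-up residual F(x) = A*x + (x.'*Q*x)*c - b. Its Jacobian is A plus a rank-one term, and can be written in a few matrix operations with no symbolic expansion (all the names below are hypothetical test data, not from your problem):

```matlab
% Toy residual and its vectorized analytic Jacobian (hypothetical data).
n = 5;
A = randn(n);  b = randn(n,1);  c = randn(n,1);
Q = randn(n);  Q = (Q + Q.')/2;          % symmetrize Q

F = @(x) A*x + (x.'*Q*x)*c - b;          % F: R^n -> R^n
% d/dx (x.'*Q*x) = 2*x.'*Q for symmetric Q, so:
J = @(x) A + c*(2*x.'*Q);                % n-by-n Jacobian, rank-1 update of A
```

The same pattern -- a handful of matrix products instead of n^2 scalar expressions -- is what keeps hand-coded Jacobians cheap as the problem grows.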

There are tools on the File Exchange that will help with numerical differentiation if you want to go that route,

https://www.mathworks.com/matlabcentral/fileexchange/13490-adaptive-robust-numerical-differentiation

but the most efficient code will always come from an analytical differentiation customized to the objective function at hand.
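For reference, the simplest numerical scheme is a forward-difference Jacobian; the following is a minimal sketch (not the File Exchange code, which is adaptive and more robust), using the common sqrt(eps)-scaled step to balance truncation against round-off error:

```matlab
% Forward-difference approximation of the Jacobian of F: R^n -> R^m.
% x is a column vector; one extra evaluation of F per variable.
function J = fdjac(F, x)
    f0 = F(x);
    n  = numel(x);
    J  = zeros(numel(f0), n);
    for k = 1:n
        h      = sqrt(eps)*max(abs(x(k)), 1);  % per-coordinate step size
        xk     = x;
        xk(k)  = xk(k) + h;
        J(:,k) = (F(xk) - f0)/h;               % one column per variable
    end
end
```

Note the cost: n extra function evaluations per Jacobian, versus essentially one for a vectorized analytic expression.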

Matt J
on 22 Sep 2020

You should never form the inverse of a matrix to solve a linear system J*x = -g. You should be solving it with mldivide():

x = J\(-g)
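In the context of Newton's method, that line is the update step. A minimal sketch of the loop, assuming function handles F and J as discussed above (names are illustrative):

```matlab
% Newton iteration for F(x) = 0, assuming handles F (residual) and
% J (Jacobian) and a starting point x0 are already defined.
x = x0;
for iter = 1:50
    g = F(x);
    if norm(g) < 1e-10, break; end   % converged
    x = x + J(x)\(-g);               % backslash: factorize and solve,
end                                  % never x = x - inv(J(x))*g
```

mldivide factorizes J (e.g. by LU) and back-substitutes, which is both cheaper and numerically better conditioned than forming inv(J) explicitly.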
