Higher order derivatives for dlarray
Hi,
I was able to get the first-order derivative using the 'dlgradient' function.
Unfortunately, I found that 'dlgradient' does not seem to support higher-order derivatives, but I need the second derivative.
Is there any way I can do this?
Answers (1)
Maksym Tymchenko
on 15 May 2024
You can definitely compute higher-order derivatives with dlgradient. Here is an example of how to compute the second derivative of the cube function:
x = dlarray(5);
[y, d2YdX2] = dlfeval(@cube, x);
% extractdata converts the dlarray result to an ordinary double.
fprintf("The second derivative is 6*x = 6*5 = %.3f\n", extractdata(d2YdX2));

% In a script, local functions must appear after all other code.
function [y, d2YdX2] = cube(x)
    y = x^3;
    % EnableHigherDerivatives=true records the trace of this first
    % derivative so that it can be differentiated again.
    dYdX = dlgradient(y, x, EnableHigherDerivatives=true);
    d2YdX2 = dlgradient(dYdX, x);
end
A few things to note:
- When you calculate higher-order derivatives, make sure that the EnableHigherDerivatives option is set to true; otherwise you will get an error.
- A dlgradient call must be inside a function. To obtain a numeric value of a gradient, you must evaluate the function using dlfeval, and the argument to the function must be a dlarray. See Use Automatic Differentiation In Deep Learning Toolbox.
- As of R2024a, the dlgradient function does not support calculating higher-order derivatives that depend on the following functions: gru, lstm, embed, prod, interp1. (Possibly one of these appeared in your dlgradient computation, which is why higher-order derivatives seemed unsupported.)
- For a full list of limitations, see dlgradient limitations.
If you are interested in learning how to use higher-order derivatives in the loss function of a neural network, see the higher-order derivative examples in the Deep Learning Toolbox documentation; a minimal sketch of the pattern follows below.
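For instance, a second derivative can appear inside a model loss function and still be differentiated with respect to the learnable parameters. This is only a minimal sketch under assumed names (modelLoss, net, X, and the smoothness penalty are illustrative, not from the original answer):
function [loss, gradients] = modelLoss(net, X)
    Y = forward(net, X);
    % First derivative of the summed output with respect to the input.
    dYdX = dlgradient(sum(Y, "all"), X, EnableHigherDerivatives=true);
    % Second derivative, used here as an illustrative smoothness penalty.
    d2YdX2 = dlgradient(sum(dYdX, "all"), X, EnableHigherDerivatives=true);
    loss = mean(d2YdX2.^2, "all");
    % Gradient of the loss with respect to the learnable parameters,
    % which is what a custom training loop uses to update the network.
    gradients = dlgradient(loss, net.Learnables);
end
% Called as, for example:
% [loss, gradients] = dlfeval(@modelLoss, net, dlarray(X, "CB"));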