Neural Networks Gradient - what is a good value?
When building neural networks with the toolbox, the training display shows a performance parameter, which is the Mean Squared Error (MSE), so lower is better.
My problem is with the gradient parameter. I think it is associated with the MSE, but what is a good value for it? If I get a gradient of 0.03, should I increase the number of epochs to make it lower?
I would appreciate some suggestions on this. Should I judge my model only by the MSE/R2, or should the decision also take the gradient parameter into account?
Mahesh Taparia on 10 Dec 2019
The gradient is the rate of change of the MSE with respect to the weights. You can't judge model performance from the gradient value alone. The optimization algorithm tries to locate a minimum of the loss function, and at a global or local minimum the gradient is very small and tends to zero, so a shrinking gradient mainly tells you that training is converging, not that the model is good.
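To illustrate the point above, here is a minimal NumPy sketch (not the toolbox's internal computation; the data and model are made up) fitting a one-parameter linear model by gradient descent on the MSE. The gradient of the MSE shrinks toward zero as the weights approach the minimum, regardless of how large the remaining MSE is:

```python
import numpy as np

# Toy 1-D linear fit: y ≈ w*x + b (data generated for illustration only)
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.5 + 0.1 * rng.standard_normal(100)

w, b = 0.0, 0.0   # initial weights
lr = 0.1          # learning rate
grad_norms = []

for epoch in range(200):
    err = w * x + b - y
    # Gradient of MSE = mean(err^2) with respect to w and b
    gw = 2.0 * np.mean(err * x)
    gb = 2.0 * np.mean(err)
    grad_norms.append(float(np.hypot(gw, gb)))
    w -= lr * gw
    b -= lr * gb

mse = float(np.mean((w * x + b - y) ** 2))
print(f"final MSE = {mse:.4f}, gradient norm = {grad_norms[-1]:.2e}")
```

Note that the final MSE settles near the noise floor of the data while the gradient norm keeps decreasing toward zero: the gradient signals convergence, and the MSE/R2 measure quality.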