Loss function for multi-output regression

David Sola on 14 Dec 2022
Answered: Udit06 on 30 Aug 2024
Hi Community, I'm currently training a NN that has several real-valued outputs. Unfortunately, they differ greatly in magnitude: for example, Output 1 ranges from 1 to 50, while Output 2 ranges from 0.0001 to 0.0003. I have some issues during training, especially for the outputs with the small range. Can someone explain whether the different magnitudes negatively influence the training? I can only assume that the loss, and therefore the gradients, for the low-magnitude outputs are smaller. If this is the case, could someone explain what to do? Thanks a lot, David

Answers (1)

Udit06 on 30 Aug 2024
Hi,
Yes, you are correct: the different magnitudes of the outputs will affect the training of the neural network. "Output 1", which has the larger magnitude, will dominate the loss and the gradients, so the model will not capture the information related to "Output 2" effectively.
You can normalize your output targets to bring them onto a similar scale. For example, scale each output to have a mean of 0 and a standard deviation of 1 (z-score), or rescale it to a fixed range such as [0, 1]. After training, transform the predictions back to the original scale.
You can refer to the following discussion, which addresses a similar issue:
To understand how to transform the normalized predictions back to the original scale, you can refer to the following MATLAB Answer:
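For illustration, here is a minimal MATLAB sketch of the z-score approach, using made-up target data with roughly the ranges described in the question; the variable names and the training step are placeholders, not your actual setup:

% Sketch: z-score normalize each target column before training,
% then map predictions back to the original scale afterwards.
% Y is an N-by-2 matrix of dummy targets (column 1 ~ 1..50, column 2 ~ 1e-4..3e-4).
Y = [randi([1 50], 1000, 1), 0.0001 + 0.0002*rand(1000, 1)];

mu    = mean(Y, 1);          % per-output mean
sigma = std(Y, 0, 1);        % per-output standard deviation

Ynorm = (Y - mu) ./ sigma;   % targets on a comparable scale; train on these

% ... train the network on Ynorm (e.g. with trainnet or trainNetwork) ...

% After prediction, undo the normalization to recover the original units:
YpredNorm = Ynorm;                     % placeholder for the network's predictions
Ypred     = YpredNorm .* sigma + mu;   % back to the original scale

If you prefer rescaling to [0, 1] instead, the built-in normalize function with the 'range' method, i.e. normalize(Y, 'range'), does this per column; store the per-column minimum and maximum so you can invert the mapping after prediction.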
