Why is my network not giving the desired output?
vaishnavi potharaju on 12 Feb 2019
Commented: Greg Heath on 21 Feb 2019
I'm trying to design a neural network using nntool in MATLAB R2015a, with an input layer of 27 neurons, an output layer of 2 neurons, and one hidden layer of 10 neurons. I have scaled the input and output data to (0, 1) for a logsig activation function in the hidden layer with purelin in the output layer. For a tansig activation function in the hidden layer, I have scaled the data to (-0.5, 0.5). I have trained the network with 1155 training patterns. My MSE and R are very good, but the network does not give the expected result when tested with new data. I have tried almost all the combinations possible from the user interface (nntool) and am quite confused. It would be very helpful if this were answered.
Thank you.
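
For reference, here is a rough command-line sketch of the configuration described in the question. The variable names (x, t, xnew) and the use of feedforwardnet and mapminmax are assumptions for illustration only; nntool itself is a GUI front end to the same toolbox.

% Assumed data: x is 27-by-1155 (inputs), t is 2-by-1155 (targets).
[xn, xs] = mapminmax(x, -0.5, 0.5);    % scale inputs to [-0.5, 0.5] for tansig
[tn, ts] = mapminmax(t, -0.5, 0.5);    % scale targets with their own settings

net = feedforwardnet(10);              % one hidden layer with 10 nodes
net.layers{1}.transferFcn = 'tansig';  % hidden-layer activation
net.layers{2}.transferFcn = 'purelin'; % linear output layer
[net, tr] = train(net, xn, tn);        % default train/val/test division

% New data must be scaled with the SAME settings obtained from the design data:
ynewn = net(mapminmax('apply', xnew, xs));
ynew  = mapminmax('reverse', ynewn, ts);   % back to the original target units

% Note: feedforwardnet also applies mapminmax internally by default, so the
% explicit scaling above mainly mirrors what was done in nntool.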
Accepted Answer
Greg Heath on 18 Feb 2019
- Design (training + validation), test, and new data should all have the same summary statistics BEFORE NORMALIZATION. This may require mixing all of the data together before creating the train/val/test subsets.
- I prefer zero-mean, unit-variance normalization. It is very helpful for spotting outliers. (A sketch of this normalization follows the list.)
- One hidden layer of tanh (aka tansig) hidden nodes is sufficient. However, occasionally, problem specifics make using 2 or more appropriate.
- Use as few hidden nodes as possible to reinforce stability. I start with 1 and use an outer loop to increase the number until the training MSE is less than 0.01 times the training target variance.
- An inner loop is used to obtain 10 designs that differ because of random weight initializations. The MSE criterion above corresponds to a training Rsquare of 0.99.
- For each candidate number of hidden nodes, design 10 or more nets that differ by their initial random weights. (A sketch of this double loop also follows the list.)
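
A minimal sketch of the normalization advice above, assuming inputs are stored with variables in rows and cases in columns; the variable names (xdesign, xtest, xnew) are illustrative, and mapstd from the toolbox is used for the zero-mean, unit-variance scaling.

% Statistics are computed from the design (train + val) data only ...
[xdesignN, ps] = mapstd(xdesign);        % zero mean, unit variance per row
xtestN = mapstd('apply', xtest, ps);     % test data reuses the SAME settings
xnewN  = mapstd('apply', xnew,  ps);     % as does any future operational data

% Rough check that the subsets share similar summary statistics; normalized
% values far outside about +/- 3 are candidates for outlier inspection.
disp([mean(xtestN, 2)  std(xtestN, 0, 2)])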
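A rough sketch of the outer/inner design loop described in the answer, assuming already-normalized inputs xn and targets tn (columns are cases) and using fitnet; the loop limits, variable names, and the use of tr.best_perf as the training MSE are illustrative assumptions.

% Stopping rule: training MSE <= 0.01 * average target variance (Rsquare >= 0.99).
MSEgoal = 0.01 * mean(var(tn', 1));
Ntrials = 10;                            % nets per hidden-node candidate
bestnet = [];  bestmse = Inf;
for H = 1:20                             % outer loop: grow the hidden layer slowly
    for trial = 1:Ntrials                % inner loop: different random weights
        net = fitnet(H);
        net.trainParam.goal = MSEgoal;   % let training stop once the goal is met
        net = configure(net, xn, tn);
        net = init(net);                 % fresh random initial weights each trial
        [net, tr] = train(net, xn, tn);
        msetrn = tr.best_perf;           % training MSE at the best epoch
        if msetrn < bestmse
            bestmse = msetrn;  bestnet = net;
        end
    end
    if bestmse <= MSEgoal                % stop at the smallest adequate H
        break
    end
end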