Pretrained neural network AlexNet training process
Deepika B on 13 Feb 2020
Commented: Srivardhan Gadila on 25 Feb 2020
Is the model shown below overfitting or not? Sometimes the mini-batch accuracy is less than the validation accuracy. Why?

Accepted Answer
Srivardhan Gadila on 19 Feb 2020
I think the model is not overfitting. The validation loss normally decreases during the initial phase of training, as does the training loss. However, when the network begins to overfit the data, the loss on the validation set typically begins to rise. Here there is not much difference between the training loss and the validation loss, which is why the model does not appear to be overfitting. You can refer to Improve Shallow Neural Network Generalization and Avoid Overfitting for more background on overfitting and the steps to avoid it.
The validation accuracy can be higher than the training (mini-batch) accuracy. One possible situation is when the network has layers that behave differently during prediction than during training, for example dropout layers: dropout is active while computing the mini-batch accuracy, which makes the training-mode predictions noisier, but it is disabled during validation, so the full network is used. It also depends on how the training and validation data are split.
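The dropout effect described above can be demonstrated outside of MATLAB as well. Below is a minimal NumPy sketch (a hypothetical toy model, not the poster's AlexNet network) that applies inverted dropout to the inputs of a fixed classifier in "training mode" and disables it in "evaluation mode": with dropout active, predictions are noisier, so the measured accuracy drops, mirroring why mini-batch accuracy can sit below validation accuracy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly-separable data: labels come from a fixed weight vector,
# so the model with dropout disabled classifies everything correctly.
X = rng.normal(size=(200, 10))
w = rng.normal(size=10)
y = (X @ w > 0).astype(int)

def predict(X, w, dropout_p=0.0, rng=None):
    """Classify rows of X; optionally apply inverted dropout to the inputs,
    as a network would during training."""
    if dropout_p > 0:
        mask = (rng.random(X.shape) >= dropout_p) / (1 - dropout_p)
        X = X * mask  # randomly zero features, rescale the survivors
    return (X @ w > 0).astype(int)

# "Training-mode" accuracy: dropout active, averaged over 50 noisy passes.
acc_train_mode = np.mean(
    [(predict(X, w, dropout_p=0.5, rng=rng) == y).mean() for _ in range(50)]
)

# "Evaluation-mode" accuracy: dropout disabled, full model used.
acc_eval_mode = (predict(X, w) == y).mean()

print(f"training-mode accuracy:   {acc_train_mode:.3f}")
print(f"evaluation-mode accuracy: {acc_eval_mode:.3f}")
```

In this construction the evaluation-mode accuracy is 1.0 by design, while the training-mode accuracy is strictly lower because dropout perturbs some predictions; the gap is the same mechanism the answer points to, not evidence of underfitting or a data problem.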