
CNN deep learning: data size vs. iterations per epoch

I need your help to understand why the data size affects the number of iterations per epoch. See figures A and B below.
[Figures A and B: training-progress plots showing the iterations per epoch for the two runs]
Figure A shows the iterations per epoch when the training data size is 3,700 images; figure B shows them when the training data size is 57,000 images. I did not change any settings in my CNN (and, in both cases, the input images had the same size). Can you please explain why increasing the data size increased the number of iterations per epoch? In other words, what is the relationship between data size and the number of iterations per epoch?
Ritu Panda on 21 Sep 2020
The number of iterations per epoch depends on the number of training samples the model is trained on in each epoch.
In each epoch, your training data is divided into mini-batches (whose size is specified by the MiniBatchSize parameter of the training options). The model trains on each mini-batch and updates its weight parameters.
Hence, iterations per epoch = number of training samples ÷ MiniBatchSize
(rounded down when the division is not exact, since by default trainNetwork discards the training data that does not fit into a final complete mini-batch). In other words, it is the number of forward and backward passes performed per epoch while training the network.
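A quick way to check this relationship against the two runs in the question (a sketch in Python; the mini-batch size of 128 is an assumption here, taken as MATLAB's default MiniBatchSize, and the floor division reflects trainNetwork discarding the final partial mini-batch by default):

```python
def iterations_per_epoch(num_samples, mini_batch_size=128):
    """Number of complete mini-batches processed per epoch.

    128 is MATLAB's default MiniBatchSize; floor division models
    the default behavior of discarding a final partial mini-batch.
    """
    return num_samples // mini_batch_size

# The two dataset sizes from the question:
print(iterations_per_epoch(3700))   # 3700 // 128 = 28 iterations per epoch
print(iterations_per_epoch(57000))  # 57000 // 128 = 445 iterations per epoch
```

So with the same network settings, growing the dataset from 3,700 to 57,000 images multiplies the iterations per epoch by roughly the same factor, which matches the behavior observed in figures A and B.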


Answers (0)
