https://stats.stackexchange.com/questions/164876/tradeoff-batch-size-vs-number-of-iterations-to-train-a-neural-network
It has been observed in practice that when using a larger batch there is a significant degradation in the quality of the model, as measured by its ability to generalize.
https://stackoverflow.com/questions/4752626/epoch-vs-iteration-when-training-neural-networks/31842945
In neural network terminology:
- One Epoch = One forward pass and one backward pass of all the training examples
- Batch Size = The number of training examples in one forward/backward pass. The higher the batch size, the more memory space you'll need.
- Number of iterations = number of passes, each pass using [batch size] number of examples. To be clear, one pass = one forward pass + one backward pass (we do not count the forward pass and backward pass as different passes).
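The relationship between these three terms can be sketched in a few lines of Python; the function name here is illustrative, not from any library:

```python
import math

def iterations_per_epoch(num_examples: int, batch_size: int) -> int:
    """Forward/backward passes needed to see every training example once.

    The last batch may be smaller than batch_size, hence the ceiling.
    """
    return math.ceil(num_examples / batch_size)

print(iterations_per_epoch(1000, 500))  # -> 2
print(iterations_per_epoch(1001, 500))  # -> 3 (last batch has 1 example)
```

Total iterations over a whole training run is simply this value multiplied by the number of epochs.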
Example: if you have 1000 training examples, and your batch size is 500, then it'll take 2 iterations to complete 1 epoch.
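The example above can be made concrete with a minimal mini-batch loop; this is a sketch that only counts iterations (the data and variable names are illustrative, and no real model or gradient step is involved):

```python
data = list(range(1000))  # stand-in for 1000 training examples
batch_size = 500
epochs = 1

iterations = 0
for epoch in range(epochs):
    # Walk over the dataset in chunks of batch_size.
    for start in range(0, len(data), batch_size):
        batch = data[start:start + batch_size]
        # A forward pass + backward pass on `batch` would happen here.
        iterations += 1

print(iterations)  # -> 2 iterations to complete 1 epoch
```

Running more epochs simply repeats the inner loop, so the iteration count scales linearly with the number of epochs.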
What's the difference between iterations and epochs in convolutional neural networks?