1. Epoch
The N_epochs parameter appears frequently in training code; what exactly does it mean? The answer is:
In one epoch, every sample in the training set is used exactly once.
One Epoch = one forward pass and one backward pass over all of the training examples.
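To make the definition concrete, here is a minimal sketch of "one epoch": a single full pass in which every training example is used exactly once. The dataset and the forward/backward placeholders are hypothetical, not a real API.

```python
training_set = [0, 1, 2, 3, 4, 5, 6, 7]  # tiny hypothetical dataset

def run_one_epoch(dataset):
    """One epoch: visit every example exactly once."""
    used = 0
    for example in dataset:
        # forward_pass(example)   # placeholder for the forward pass
        # backward_pass(example)  # placeholder for the backward pass
        used += 1
    return used

# After one epoch, the number of examples processed equals the dataset size.
assert run_one_epoch(training_set) == len(training_set)
```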
2. Batch_size
In general, a training set contains a large number of samples. To speed up training, the whole training set is divided into N_batch groups, each containing Batch_size samples.
That is: number of samples in the entire DataSet = Batch_size * N_batch.
Batch Size = the number of training examples in one forward/backward pass. The larger the batch size, the more memory you will need.
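As a concrete check of the relation above, with hypothetical numbers chosen only for illustration:

```python
NumSamples = 100000                  # size of the entire DataSet
Batch_size = 100                     # samples in one forward/backward pass
N_batch = NumSamples // Batch_size   # number of batches per epoch

# The split is exact when Batch_size divides NumSamples evenly.
assert NumSamples == Batch_size * N_batch
print(N_batch)  # 1000
```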
3. Iterations
The terms iteration and epoch are easy to confuse, and it is often unclear how they differ; in fact they are completely different concepts.
In each iteration, the model is trained on the samples of one batch.
Number of iterations = number of passes, each pass using Batch_size examples. To be clear: one pass = one forward pass + one backward pass (the forward pass and the backward pass are not counted as different passes).
In particular:

    # number of epochs
    N_epochs = 100
    # total number of samples
    NumSamples = 100000
    # split the samples into N_batch groups
    N_batch = 10
    # each batch contains Batch_size samples
    Batch_size = NumSamples / N_batch
    # iteration counter for training
    iterations = 0
    for i in range(N_epochs):
        for j in range(N_batch):
            # use batch j for training
            train(j)
            # increment the iteration count
            iterations = iterations + 1
Clearly: iterations = N_epochs * N_batch.
That is, each epoch performs N_batch training steps, and each step uses Batch_size samples.
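The whole relationship can be verified with a small self-contained sketch; the numbers and the train placeholder are hypothetical, but the loop structure mirrors the code above:

```python
NumSamples = 12
Batch_size = 4
N_batch = NumSamples // Batch_size   # 3 batches per epoch
N_epochs = 2
data = list(range(NumSamples))

iterations = 0
for epoch in range(N_epochs):
    for j in range(N_batch):
        # slice out batch j: Batch_size consecutive samples
        batch = data[j * Batch_size : (j + 1) * Batch_size]
        # train(batch)  # placeholder: one forward + one backward pass
        iterations += 1

# total iterations = epochs * batches per epoch
assert iterations == N_epochs * N_batch  # 2 * 3 = 6
```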