N_batches in the train_setup UDF for RNNs

The function used here is
~
def train_setup(net, lr=0.01, n_batches=100, batch_size=10, momentum=0.9, display_freq=5, device='cpu'):
~

I guess "n_batches" is inspired by "epochs" in FNNs/CNNs, i.e. iterate 100 times (or over 100 epochs) to refine the model (i.e. 'net' here). So, 100 random samples of data, each of size 10, are taken. Is this the correct understanding?

Yes. n_batches in that function means the number of iterations to train for.
1 iteration is equal to doing 1 forward and 1 backward pass.
If the training is mini-batch SGD, each iteration uses batch_size samples.
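
For a concrete picture, here is a minimal sketch of what such a loop might look like. The loop body, loss function, and random-sampling scheme are assumptions for illustration, not the course's actual train_setup implementation:
~
import torch

def train_sketch(net, data, targets, lr=0.01, n_batches=100, batch_size=10,
                 momentum=0.9, device='cpu'):
    # Hypothetical loop illustrating n_batches = number of iterations.
    net = net.to(device)
    optimizer = torch.optim.SGD(net.parameters(), lr=lr, momentum=momentum)
    loss_fn = torch.nn.MSELoss()
    for _ in range(n_batches):                             # one pass = one iteration
        idx = torch.randint(0, len(data), (batch_size,))   # random mini-batch
        x, y = data[idx].to(device), targets[idx].to(device)
        optimizer.zero_grad()
        loss = loss_fn(net(x), y)                          # forward pass
        loss.backward()                                    # backward pass
        optimizer.step()                                   # one parameter update
~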

No, 1 iteration is not equal to 1 epoch in mini-batch SGD.
1 epoch means one full pass through all the samples in the dataset exactly once.

But if you were to do (full) batch gradient descent, then the number of iterations and the number of epochs are the same, since each iteration processes the entire dataset.
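
To make the distinction concrete with some hypothetical numbers: with a dataset of 1,000 samples and batch_size = 10, one epoch takes 1000 / 10 = 100 iterations, so n_batches = 100 amounts to roughly one epoch's worth of updates (only roughly, because random sampling does not guarantee every sample is seen). With full-batch gradient descent, batch_size = 1000, so each iteration is exactly one epoch.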
