What exactly does initialise = True mean in the context of getting a consistent reduction in loss while training the model?

def fit(self, X, Y, epochs = 1, learning_rate = 1, initialise = True, loss_fn = "mse", display_loss = False):

What is the difference between setting initialise = True and initialise = False when training the model, in terms of obtaining a consistent reduction in loss?

My understanding is that if initialise = True, then after each epoch the weights are re-drawn from a normal distribution, which means the weights updated in the second and subsequent epochs are not a continuation of the previous epoch. So how does this achieve a consistent reduction in loss, when re-initialising the weights to random values at every epoch could cause fluctuations in learning?

Wouldn’t initialise = False help in achieving a more consistent reduction in loss than initialise = True? Kindly help me with my understanding of this part.

Hi @4deep.prk,

The initialise argument is used to reset the weights every time fit() is called, not at every epoch.

initialise comes into use when you have changed some hyperparameters and want to retrain the weights from scratch.
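To make the distinction concrete, here is a minimal sketch (a hypothetical implementation, not the actual course code; the class name, the sigmoid model, and the simplified gradient are my assumptions) showing where the reset happens: inside fit(), before the epoch loop, and only when initialise=True.

```python
import numpy as np

class SigmoidNeuron:
    """Hypothetical sketch of a model with an `initialise` flag.
    Weights are reset at most once per fit() call, never between epochs."""

    def __init__(self):
        self.w = None
        self.b = None

    def fit(self, X, Y, epochs=1, learning_rate=1, initialise=True):
        # Reset weights only when initialise=True; with initialise=False,
        # training continues from wherever the previous fit() call stopped.
        if initialise:
            self.w = np.random.randn(X.shape[1])
            self.b = 0.0
        losses = []
        for _ in range(epochs):  # note: weights are NOT re-drawn inside this loop
            y_pred = 1.0 / (1.0 + np.exp(-(X @ self.w + self.b)))
            grad = y_pred - Y    # simplified gradient for illustration
            self.w -= learning_rate * (X.T @ grad) / len(X)
            self.b -= learning_rate * grad.mean()
            losses.append(((y_pred - Y) ** 2).mean())
        return losses
```

So within a single fit() call the loss can decrease consistently across epochs regardless of the flag; calling fit(..., initialise=False) a second time simply resumes training from the current weights instead of starting over.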


Thank you for helping me to fix my bad observation of the fit function :smiley: