Exercise: Generic FFNN with Backpropagation Implementation

Hello Everyone,

I have tried to implement a generic FFNN for a simple multi-class classification problem, along the lines of "vectorization of weights & inputs".

Observations:

  1. The loss varies drastically each time I run the fit function of the generic class. Why is this happening?
  2. I would like to know whether the implementation is conceptually correct.

Your feedback would help me correct my mistakes.

Link to my code: https://github.com/abhiramangit/Imag_reco/blob/master/Generic_FFNN_Class.ipynb

Check the initialization of the weights and biases. The weights are initialized randomly, so every time you run the fit function they start from different values and the loss plot varies. To solve this, first run the fit function for one epoch with the 'initialize' argument set to 'True'. Then run the fit function in another cell for however many epochs you want, but with the initialize argument set to 'False'. That way, the loss plot will show the same results for the same data.
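
In code, the pattern looks roughly like the sketch below. The class name, constructor, and layer handling are placeholders I made up for illustration; only the 'initialize' argument of fit corresponds to what I remember from the notebook:

```python
import numpy as np

class FFNN:
    """Placeholder class; names are illustrative, not the notebook's exact API."""

    def __init__(self, layer_sizes):
        # e.g. [4, 8, 3] -> two weight matrices: (4, 8) and (8, 3)
        self.layer_sizes = layer_sizes
        self.weights, self.biases = None, None

    def fit(self, X, Y, epochs=1, initialize=True, learning_rate=0.01):
        if initialize:
            # Fresh random weights: every call starts from a new point,
            # so the loss curve differs from run to run.
            self.weights = [np.random.randn(i, o)
                            for i, o in zip(self.layer_sizes[:-1],
                                            self.layer_sizes[1:])]
            self.biases = [np.zeros((1, o)) for o in self.layer_sizes[1:]]
        for _ in range(epochs):
            pass  # forward pass, backprop, and weight update go here
```

So the usage would be, for example:

```python
X_train = np.random.rand(150, 4)                    # dummy data for illustration
Y_train = np.eye(3)[np.random.randint(0, 3, 150)]   # dummy one-hot labels

net = FFNN([4, 8, 3])
net.fit(X_train, Y_train, epochs=1, initialize=True)      # draws the weights once
net.fit(X_train, Y_train, epochs=500, initialize=False)   # reruns reuse them
```

An alternative is to call np.random.seed(0) (or any fixed seed) before fitting, which makes the random draw itself repeatable across runs.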

Sorry for the late answer; I only checked the forum today. I hope this clears your doubt.