DL#105 Generic Network for back propagation

Dear All,

I have tried to implement a Generic FFNN with Back Propagation for a multi-class classification problem.

I am facing a ValueError while running the function, and I am unable to figure out where I am going wrong. Please check the code and help me get through this. I have understood how the weights, biases, and input data are vectorized, and I have implemented them accordingly.

Error:
ValueError: Input contains NaN, infinity or a value too large for dtype('float64').

But the input does not contain NaN or infinity values; it is the same make_blobs input.
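For what it's worth, you can verify programmatically that the raw input is clean; if the check below passes, the NaN/infinity must be produced inside the network rather than in the data. This is a minimal sketch assuming scikit-learn's make_blobs with arbitrary illustrative parameters, since I don't know the exact call in your notebook:

```python
import numpy as np
from sklearn.datasets import make_blobs

# Illustrative parameters only; substitute the ones from your notebook.
X, y = make_blobs(n_samples=1000, centers=4, random_state=0)

# Confirm the raw features contain no NaN or infinity values.
print(np.isnan(X).any())   # False: no NaNs in the input
print(np.isinf(X).any())   # False: no infinities in the input
```

If both checks print False, the bad values are being created during the forward or backward pass, not read from the dataset.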

I have posted the github code link below:

https://github.com/rasunag27/Deep_Learning/blob/master/Generic_Network.ipynb

Thanks and Regards,

Sunag R A.

The intermediate pre-activation values (A1, A2) are becoming very large positive or negative numbers, so the exponential inside the subsequent sigmoid overflows and the activation saturates to exactly 0 or 1; a later log or division on those saturated outputs then produces the NaN/infinity that triggers the error.
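A minimal sketch of the failure mode and one common fix, assuming a plain NumPy sigmoid (I can't see the exact implementation in the notebook, so the function names here are hypothetical). Clipping the pre-activation keeps exp() within float64 range; standardizing the make_blobs features before training also helps keep A1, A2 small in the first place:

```python
import numpy as np

def sigmoid_naive(a):
    # Overflows for large |a|: np.exp(1000) -> inf (RuntimeWarning),
    # and the output saturates to exactly 0.0 or 1.0.
    return 1.0 / (1.0 + np.exp(-a))

def sigmoid_stable(a):
    # Clip the pre-activation so np.exp never overflows float64
    # (exp overflows near |a| ~ 710; +/-500 leaves a safe margin).
    a = np.clip(a, -500, 500)
    return 1.0 / (1.0 + np.exp(-a))

big = np.array([1000.0, -1000.0, 0.0])
print(sigmoid_stable(big))  # finite values; no overflow warning
```

A related safeguard on the loss side is to clip predicted probabilities away from exactly 0 and 1 (e.g. with np.clip(y_hat, 1e-8, 1 - 1e-8)) before taking the log, so cross-entropy never evaluates log(0).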