Error: grad_short code in DL#105

In " [Backpropagating using Python: Shortened Backpropagation Code
self.db3 is repeated where it should be self.db4, self.db5, self.db6 (marked below). Possibly gone unnoticed since grad_short is not used sunsequently

def grad_short(self, x, y):
    self.forward_pass(x)
    self.y1, self.y2, self.y3, self.y4 = y 
 
    self.da3 = (self.h3-self.y1)
    self.da4 = (self.h4-self.y2)
    self.da5 = (self.h5-self.y3)
    self.da6 = (self.h6-self.y4)

    self.dw5 = self.da3*self.h1
    self.dw6 = self.da3*self.h2
    self.db3 = self.da3  

    self.dw7 = self.da4*self.h1
    self.dw8 = self.da4*self.h2
    self.db3 = self.da4      ## This should be self.db4

    self.dw9 = self.da5*self.h1
    self.dw10 = self.da5*self.h2
    self.db3 = self.da5    ## This should be self.db5

    self.dw11 = self.da6*self.h1
    self.dw12 = self.da6*self.h2
    self.db3 = self.da6    ## This should be self.db6

    self.dh1 = self.da3*self.w5 + self.da4*self.w7 + self.da5*self.w9 + self.da6*self.w11
    self.dh2 = self.da3*self.w6 + self.da4*self.w8 + self.da5*self.w10 + self.da6*self.w12   
    self.da1 = self.dh1 * self.h1*(1-self.h1)
    self.da2 = self.dh2 * self.h2*(1-self.h2)

    self.dw1 = self.da1*self.x1
    self.dw2 = self.da1*self.x2
    self.db1 = self.da1

    self.dw3 = self.da2*self.x1
    self.dw4 = self.da2*self.x2
    self.db2 = self.da2
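
For reference, the corrected bias-gradient lines would read as follows (a sketch, keeping the same attribute names as the course code; only the left-hand sides change, so each output neuron's bias gets its own gradient):

    self.db3 = self.da3
    self.db4 = self.da4
    self.db5 = self.da5
    self.db6 = self.da6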

Hi @parsar0,
Thanks for pointing this out! 🙂


Hello

Please see the generic feed-forward class FSNNetwork.
The grad function has self.dA[L] = (self.H[L] - y)
Instead, it should be self.dA[L] = (self.H[L] - y) * self.grad_sigmoid(self.H[L])

Because of this, the loss oscillates widely in the initial epochs. After adding the sigmoid derivative term, it smooths out and the results match the corresponding scalar models exactly.
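
For example, assuming the class stores the output activations in self.H[L] and that grad_sigmoid is the usual sigmoid derivative written in terms of the activation (a sketch, not the exact course code), the fix amounts to:

    def grad_sigmoid(self, h):
        # derivative of the sigmoid expressed via its output: sigma'(a) = h * (1 - h)
        return h * (1 - h)

    # output-layer gradient for squared-error loss with a sigmoid output layer
    self.dA[L] = (self.H[L] - y) * self.grad_sigmoid(self.H[L])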

Thanks
Partha
