Seq2Seq Translation - Loss Keeps Increasing

Hi, I tried to implement the code for language translation (French to English), the same as given in the PyTorch NLP tutorial, but with a few changes to the attention mechanism (Bahdanau attention). Everything looks right to me, but the loss keeps increasing.
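For context, the change I made is additive (Bahdanau) attention, roughly like this minimal sketch (illustrative names, not my exact code; it assumes the encoder and decoder share one hidden size):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BahdanauAttention(nn.Module):
    """Additive attention: score(s, h_i) = v^T tanh(W_s s + W_h h_i)."""
    def __init__(self, hidden_size):
        super().__init__()
        self.W_s = nn.Linear(hidden_size, hidden_size)  # projects decoder state s
        self.W_h = nn.Linear(hidden_size, hidden_size)  # projects encoder outputs h_i
        self.v = nn.Linear(hidden_size, 1)              # reduces to a scalar score

    def forward(self, dec_state, enc_outputs):
        # dec_state: (batch, hidden); enc_outputs: (batch, src_len, hidden)
        scores = self.v(torch.tanh(
            self.W_s(dec_state).unsqueeze(1) + self.W_h(enc_outputs)
        ))                                    # (batch, src_len, 1)
        weights = F.softmax(scores, dim=1)    # normalize over source positions
        context = (weights * enc_outputs).sum(dim=1)  # weighted sum -> (batch, hidden)
        return context, weights.squeeze(-1)
```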
Here are the screenshots:

1.) Encoder
2.) Decoder
3.) Train function (a rough sketch of my train step follows this list)
4.) TrainIter
5.) Loss
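In case the screenshot is hard to read, my train step mirrors the tutorial's train() structure, roughly like this (a simplified sketch without the teacher-forcing coin flip, assuming the tutorial's EncoderRNN/AttnDecoderRNN interfaces; not a copy of my exact code):

```python
import torch

SOS_token = 0   # start-of-sequence index, as in the tutorial
MAX_LENGTH = 10

def train(input_tensor, target_tensor, encoder, decoder,
          encoder_optimizer, decoder_optimizer, criterion,
          max_length=MAX_LENGTH):
    encoder_hidden = encoder.initHidden()

    # Zero gradients before the forward pass; forgetting this makes
    # gradients accumulate across iterations and can send the loss up.
    encoder_optimizer.zero_grad()
    decoder_optimizer.zero_grad()

    # Encode the source sentence one token at a time.
    encoder_outputs = torch.zeros(max_length, encoder.hidden_size)
    for ei in range(input_tensor.size(0)):
        encoder_output, encoder_hidden = encoder(input_tensor[ei], encoder_hidden)
        encoder_outputs[ei] = encoder_output[0, 0]

    decoder_input = torch.tensor([[SOS_token]])
    decoder_hidden = encoder_hidden
    loss = 0

    # Teacher forcing: feed the ground-truth token at every step.
    for di in range(target_tensor.size(0)):
        decoder_output, decoder_hidden, _ = decoder(
            decoder_input, decoder_hidden, encoder_outputs)
        loss += criterion(decoder_output, target_tensor[di])
        decoder_input = target_tensor[di]

    loss.backward()
    encoder_optimizer.step()
    decoder_optimizer.step()

    return loss.item() / target_tensor.size(0)
```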

The input pipeline is exactly the same as given in the PyTorch NLP tutorial.

Please help; I have been stuck here for quite a while.

Please share your full notebook (maybe a Colab link with outputs) so that we can look into it in detail.
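In the meantime, the usual suspects for a loss that keeps climbing are a missing zero_grad() before the backward pass, a decoder that ends in softmax while the criterion is NLLLoss (the tutorial pairs log_softmax with NLLLoss), or a learning rate that is too high. A quick check, assuming the tutorial's train() signature and its tensorsFromPair() helper, is to overfit one pair; if the loss will not fall even on a single memorized example, the bug is in the model or the train step rather than the data:

```python
# Sanity check: the loss on a single memorized pair should drop fast.
pair = tensorsFromPair(pairs[0])  # tutorial helper; names assumed from the notebook
for step in range(200):
    loss = train(pair[0], pair[1], encoder, decoder,
                 encoder_optimizer, decoder_optimizer, criterion)
    if step % 50 == 0:
        print(step, round(loss, 4))
```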

Here is the link to the Colab Notebook:

Colab Notebook

And here is the zip file:

zip-file