Machine Transliteration

I tried running the given transliteration code with attention for more than 3000 epochs, but the accuracy still doesn't cross 70%. The inference output is very poor as well. Is this model not the right approach for transliteration tasks?

How can we proceed with the capstone project if the first phase isn't working at all? Please share your thoughts; I'm attaching the notebook and inference output below.

[transliteration notebook](https://colab.research.google.com/drive/1DBM88kQRoM4wu77BrD_mvvZS7MGlTuHQ?usp=sharing)

GOWER - ङ्षल
FELETI - भॉऴॉडू
INSAAF - उऄऺीभ
KUNWAR - गृषषील
MANGALAGIRI - रऄङऴङुलूू
BALVEER - मऴषूलल
OF - ओभ
BHAYANAK - यीऱीग
PLACE - बॏऴॉऺ


Did you find a solution? We are trying too, but I guess something is wrong with the infer function.

Yes, the error is in the infer function: it must be [index - 1] instead of [index + 1]. You can also try other attention mechanisms (e.g., transformers).
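For anyone else hitting this, here is a minimal sketch of the off-by-one, assuming the notebook builds its vocabulary the usual Keras way (index 0 reserved for padding, so the character list is offset by one from the model's output indices); the names and values here are illustrative, not the notebook's actual ones:

```python
import numpy as np

# Illustrative vocabulary and model output; in the real notebook these come
# from the tokenizer and the decoder's softmax, respectively.
target_chars = ["अ", "आ", "इ", "ई"]               # character list, no padding entry
probs = np.array([0.05, 0.10, 0.70, 0.10, 0.05])  # softmax over pad + 4 chars

index = int(np.argmax(probs))   # -> 2 (the padding slot shifts everything by 1)

# Buggy lookup: walks one character too far past the padding offset.
# char = target_chars[index + 1]  # -> "ई" (wrong)

# Fixed lookup: subtract one to undo the padding slot at index 0.
char = target_chars[index - 1]  # -> "आ"
print(char)
```

With that fix, the decoded characters line up with the model's predictions again instead of being shifted through the vocabulary, which is why the earlier inference output looked like noise.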

I completed the capstone project with low accuracy because I skipped the data augmentation step, but the process works.

GitHub repo: HEIST
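For reference, here is a minimal sketch of what the skipped augmentation step could look like, assuming the training data is a list of (roman, devanagari) string pairs; the random re-casing rule and the sample pairs are just one illustrative option, not the course's prescribed method:

```python
import random

def augment(pairs, n_copies=1, seed=0):
    """Return the original pairs plus randomly re-cased roman-side variants."""
    rng = random.Random(seed)
    augmented = list(pairs)
    for roman, deva in pairs:
        for _ in range(n_copies):
            # Randomly flip the case of each roman character so the model
            # sees casing variants of the same word with the same target.
            variant = "".join(
                c.lower() if rng.random() < 0.5 else c.upper() for c in roman
            )
            augmented.append((variant, deva))
    return augmented

# Hypothetical pairs for illustration only.
pairs = [("INSAAF", "इंसाफ"), ("KUNWAR", "कुंवर")]
print(augment(pairs))
```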