Batch Norm Problem

In the batch norm lecture, he says that after calculating the activations of the different layers, we normalize them again.
But I think that if we normalize only the input, the subsequent activations will automatically be normalized!
Please clear up this doubt.

Hi @Anand_Deep,
That seems interesting, but can you explain why you think the subsequent activations will automatically be normalized?
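
In the meantime, one quick way to test the claim empirically is to pass a normalized batch through a single affine layer followed by a ReLU and look at the statistics of the resulting activations. Here is a minimal NumPy sketch (the layer sizes, weight scale, and seed are arbitrary assumptions, not taken from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Batch of inputs, normalized to zero mean and unit variance per feature
x = rng.normal(size=(1000, 64))
x = (x - x.mean(axis=0)) / x.std(axis=0)

# One hidden layer: affine transform followed by ReLU
W = rng.normal(size=(64, 64)) * 0.5   # arbitrary weight scale
b = rng.normal(size=(64,))
a = np.maximum(0, x @ W + b)          # ReLU activations

print("input  mean/std:", x.mean(), x.std())   # ~0 and ~1 by construction
print("hidden mean/std:", a.mean(), a.std())   # drifts away from 0 and 1
```

Running something like this shows the hidden activations no longer have zero mean and unit variance: the weights rescale the variance, and the ReLU shifts the mean. That is worth keeping in mind when forming your argument.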