Order of lines of code in the PyTorch abstraction

Attaching the screenshot from the optim.SGD docs that shows how the three lines are written.

However, the mentor writes them in a different order in the video tutorial, as follows:
~
loss.backward()
opt.step()
opt.zero_grad()
~

Wouldn’t this affect the final result in any way?

Hi @mishramarnath,
No, it doesn’t make any difference. Notice that either way, the gradients are zeroed between one `opt.step()` and the next `loss.backward()`, so they never accumulate across iterations.
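
To see why, here is a minimal pure-Python sketch (no PyTorch, so the `train` function and its loss `(w - 3)^2` are illustrative stand-ins) of one scalar parameter trained with SGD. It compares the docs order (zero, backward, step) against the tutorial order (backward, step, zero): both reset the gradient between a step and the next backward pass, so the resulting parameter trajectories are identical.

~
# Compare the two orderings of zero_grad() in a hand-rolled SGD loop
# on the loss (w - 3)^2, whose gradient w.r.t. w is 2 * (w - 3).

def train(order, steps=5, lr=0.1):
    w, grad = 0.0, 0.0
    for _ in range(steps):
        if order == "zero_first":      # docs order: zero_grad, backward, step
            grad = 0.0
        grad += 2 * (w - 3.0)          # "loss.backward()": gradients accumulate
        w -= lr * grad                 # "opt.step()"
        if order == "zero_last":       # tutorial order: backward, step, zero_grad
            grad = 0.0
    return w

print(train("zero_first") == train("zero_last"))  # True: the orders are equivalent
~

The one caveat: the orderings stop being interchangeable if something reads the gradients between iterations (e.g. gradient accumulation over several batches, or logging gradient norms after the loop), since `zero_grad()` at the end wipes them.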

@Ishvinder
Got it…thanks!