I'm attaching a screenshot from the `optim.SGD` docs showing the three lines of the training step.
However, the mentor writes them in a different order in the video tutorial, as follows:
```python
loss.backward()
opt.step()
opt.zero_grad()
```
Would this different ordering affect the final result in any way?
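For reference, here is a minimal sketch (assuming a toy one-parameter model, not the actual code from the tutorial) that runs both orderings and compares the resulting parameter. As long as `zero_grad()` is called somewhere between one `backward()` and the next, the two loops compute identical updates, because gradients start out empty on a fresh parameter:

```python
import torch

def train(order: str) -> torch.Tensor:
    # Hypothetical one-parameter "model": minimize (2w - 3)^2 with SGD.
    torch.manual_seed(0)
    w = torch.nn.Parameter(torch.tensor([1.0]))
    opt = torch.optim.SGD([w], lr=0.1)
    for _ in range(5):
        loss = (w * 2 - 3).pow(2).sum()
        if order == "docs":
            # Order from the optim.SGD docs: zero_grad -> backward -> step
            opt.zero_grad()
            loss.backward()
            opt.step()
        else:
            # Order from the video: backward -> step -> zero_grad
            loss.backward()
            opt.step()
            opt.zero_grad()
    return w.detach().clone()

# Both orderings clear the gradients before the next backward() call,
# so the parameter trajectories coincide.
print(torch.allclose(train("docs"), train("video")))
```

The difference would only matter if `zero_grad()` were skipped entirely (gradients would then accumulate across iterations) or if you read `w.grad` after the loop, since the video's ordering leaves it zeroed.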