I was trying to train on the MNIST dataset using the LeNet architecture. I wrote the same code as shown in the video DL-108 - Training LeNet, but I'm not sure why my output is coming out differently. Below is the code, and the output screenshot is attached.
```python
max_epochs = 16
for epoch in range(max_epochs):
    for i, data in enumerate(trainloader, 0):
        inputs, labels = data
        opt.zero_grad()
        outputs = net(inputs)
        loss = loss_fn(outputs, labels)
        loss.backward()
        opt.step()
        loss_arr.append(loss.item())       # was missing its closing parenthesis
    loss_epoch_arr.append(loss.item())     # per-epoch, as in the video
    print('Epoch: %d/%d, Test acc: %0.2f, Train acc: %0.2f' % (
        epoch, max_epochs, evaluation(testloader), evaluation(trainloader)))
```
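The print line actually had two separate problems, and either one alone is enough to stop the epoch message from appearing: the second format spec was written as `0.2f` (missing its `%`, so it is just literal text), and the string was never joined to the value tuple with the `%` operator. A minimal sketch in isolation, with made-up placeholder values standing in for the real accuracies:

```python
# Placeholder values only -- in the real loop these come from evaluation().
epoch, max_epochs = 0, 16
test_acc, train_acc = 97.12, 99.50

# Corrected: every spec starts with %, and the % operator applies the tuple.
msg = 'Epoch: %d/%d, Test acc: %0.2f, Train acc: %0.2f' % (
    epoch, max_epochs, test_acc, train_acc)
print(msg)
```

Without the `%` operator the original line was just a string followed by a parenthesized tuple, which Python treats as an attempt to call the string, raising `TypeError: 'str' object is not callable`.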
I think the 'Epoch' message is not printing because the loop is failing before it reaches that line. Please let me know if I'm doing anything wrong here.
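For completeness, the loop above also depends on an `evaluation` helper that isn't shown in the snippet. A minimal sketch of what it might look like, assuming (as in the course videos) it computes argmax accuracy over a dataloader; the two-argument signature here is my own choice for testability, whereas the version in the video likely closes over the global `net`:

```python
import torch
import torch.nn as nn

def evaluation(loader, net):
    # Accuracy in percent: fraction of argmax predictions matching labels.
    correct, total = 0, 0
    with torch.no_grad():
        for inputs, labels in loader:
            outputs = net(inputs)
            _, pred = torch.max(outputs, 1)
            correct += (pred == labels).sum().item()
            total += labels.size(0)
    return 100.0 * correct / total

# Tiny smoke test: an Identity "net" on 2-class logits, one batch of two samples.
batch = (torch.tensor([[2.0, 1.0], [0.0, 3.0]]), torch.tensor([0, 1]))
acc = evaluation([batch], nn.Identity())
```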