Regarding the CNN model LeNet

When I use the LeNet architecture with a dropout of 0.2, batch size 256, and 100 epochs,
the loss decreases very slowly.
Learning rate = 0.001.
The screenshot shows accuracy and loss vs. epoch.
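For reference, here is a minimal PyTorch sketch of the setup described above (LeNet with dropout 0.2, Adam at lr 0.001). The exact placement of the dropout layers and the choice of optimizer are assumptions, since the post doesn't include code:

```python
import torch
import torch.nn as nn

class LeNet(nn.Module):
    """Classic LeNet-5 for 28x28 single-channel input, with dropout added."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5, padding=2),  # 28x28 -> 28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                            # -> 14x14
            nn.Conv2d(6, 16, kernel_size=5),            # -> 10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                            # -> 5x5
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 5 * 5, 120),
            nn.Tanh(),
            nn.Dropout(p=0.2),  # dropout rate from the post
            nn.Linear(120, 84),
            nn.Tanh(),
            nn.Dropout(p=0.2),  # assumed second dropout location
            nn.Linear(84, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

model = LeNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # lr from the post
# The training loop itself would iterate with batch_size=256 for 100 epochs.
```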

The slope of the loss curve doesn't seem to be zero yet, so maybe try increasing the number of epochs by a factor of two, I guess. Just an experiment.

But the accuracy increases very slowly: in the last 50 epochs it climbed only 2%.
Moreover, the loss decreased from 1.7 to only 0.9 over 100 epochs.

It seems to be getting stuck somewhere around a local minimum.
What other hyperparameter combinations have you tried so far?
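One thing worth trying, if it isn't already in place, is lowering the learning rate when the loss plateaus. A sketch using PyTorch's `ReduceLROnPlateau` (the `Linear` stand-in model and the constant validation loss are placeholders for illustration):

```python
import torch

model = torch.nn.Linear(10, 2)  # stand-in; swap in the LeNet model here
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)  # lr from the post

# Cut the learning rate by 10x whenever the monitored loss fails to
# improve for more than `patience` consecutive epochs.
scheduler = torch.optim.lr_scheduler.ReduceLROnPlateau(
    optimizer, mode="min", factor=0.1, patience=2)

for epoch in range(10):
    val_loss = 0.9  # placeholder: a flat validation loss, as in the post
    scheduler.step(val_loss)

# After repeated non-improving epochs the lr has dropped below 1e-3.
print(optimizer.param_groups[0]["lr"])
```

If the loss really is flattening out at 0.9, this kind of schedule often squeezes out further progress without hand-tuning a new fixed learning rate for each run.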