Why is Dropout in neural networks so important?

Why can't a fully connected network learn by itself which nodes to drop, instead of us applying Dropout externally?

Dropping out simply means not using the complete model on each training pass: a random subset of units is zeroed out every step.
The main reason we use it is to reduce the model's capacity to memorize (overfit) incidental structure in the training data. The randomness is the point: if the network could learn which nodes to drop, that choice would itself be fit to the training data, defeating the regularization.
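As a rough illustration, here is a minimal sketch of "inverted" dropout in NumPy (the function name and shapes are just for this example, not from any particular framework):

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Inverted dropout: zero each unit with probability p during training,
    scaling survivors by 1/(1-p) so the expected activation is unchanged."""
    if not training or p == 0.0:
        return x  # at inference the complete model is used, no mask, no scaling
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1-p
    return x * mask / (1.0 - p)

activations = np.ones((4, 8))
out = dropout(activations, p=0.5, rng=np.random.default_rng(0))
# Each surviving entry becomes 2.0 (scaled up), the rest become 0.0,
# so a different random "thinned" sub-network is trained on every step.
```

Because a fresh random mask is drawn each step, no single hidden unit can be relied on, which pushes the network toward redundant, more robust features.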