Knowledge Nugget

Dropout in Machine Learning
Author: Process Fellows
"Dropout" is a regularization technique used to reduce overfitting. Neural networks (NNs) are often very complex and can adapt too closely to the training data (overfitting). The consequence: good performance on the training data, but poor generalization to new data.

Dropout means that during training some neurons are randomly deactivated (set to zero) to make the model more robust. The model learns redundant structures and becomes less dependent on individual neurons. Dropout is not applied during prediction, only during training; in the common "inverted dropout" variant, the surviving activations are scaled up during training so that the expected activation stays the same at prediction time.
Advantages: reduces overfitting, increases robustness, promotes better generalization
Disadvantages: can slow down training, does not always work well for small NNs
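
The behavior described above can be sketched in a few lines. The following is a minimal illustration (not from the original nugget) of inverted dropout using NumPy; the function name `dropout` and the parameter `p` (drop probability) are illustrative choices:

```python
import numpy as np

def dropout(x, p=0.5, training=True, rng=None):
    """Illustrative inverted dropout.

    During training, each unit is zeroed with probability p and the
    survivors are scaled by 1/(1-p), so the expected activation is
    unchanged. During prediction, the input passes through untouched.
    """
    if not training or p == 0.0:
        return x  # dropout is a no-op at prediction time
    rng = rng or np.random.default_rng()
    mask = rng.random(x.shape) >= p  # True = neuron stays active
    return x * mask / (1.0 - p)
```

Because a random subset of neurons is silenced on every training step, the network cannot rely on any single neuron, which is exactly the redundancy effect described above. Frameworks such as PyTorch and Keras provide this as a ready-made layer (e.g. `torch.nn.Dropout`), so hand-rolling it is rarely necessary in practice.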

Mapped with these items:
  • Automotive SPICE 4.0
    • MLE.3.BP1 Specify ML training and validation approach.