Knowledge Nugget

Learning Algorithms for Neural Networks
Author: Process Fellows
Neural networks can be trained with various learning algorithms that differ in how they update the weights. Most of them, however, are based on gradient descent and its variants.
Gradient descent explained simply: Imagine a mountain hike:
  • You are standing on a hill (high error value).
  • You want to get down into the valley (the minimum of the error function).
  • The gradient (slope) tells you in which direction the terrain descends.
  • You take small steps in the direction of the steepest descent.
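The mountain-hike analogy above can be sketched in a few lines of code. This is a minimal illustration, not from the source material: the "valley" is the minimum of a simple error function f(x) = (x - 3)², and the loop repeatedly steps against the gradient.

```python
# Minimal gradient-descent sketch (illustrative example, not from the source):
# walk downhill on the error surface f(x) = (x - 3)**2, whose minimum
# ("the valley") lies at x = 3.

def gradient(x):
    # Slope of f at x: f'(x) = 2 * (x - 3)
    return 2 * (x - 3)

x = 10.0             # start "on the hill" (high error value)
learning_rate = 0.1  # size of each small step

for _ in range(100):
    x -= learning_rate * gradient(x)  # step toward steepest descent

print(round(x, 4))  # → 3.0, the bottom of the valley
```

The learning rate controls the step size: too large and you overshoot the valley, too small and the descent takes many more iterations.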
Examples of algorithms: Backpropagation, Stochastic Gradient Descent (SGD), Mini-Batch Gradient Descent, Adaptive Moment Estimation (Adam), etc.
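To illustrate one of the listed variants, the following is a hedged sketch of mini-batch SGD: instead of computing the gradient over the whole dataset, each update uses a small random subset (a mini-batch). All names and values here are illustrative assumptions; the task is fitting a single weight w in the toy model y = w·x.

```python
# Hedged sketch of Mini-Batch (Stochastic) Gradient Descent, fitting y = w * x.
# The data, learning rate, and batch size are illustrative assumptions.
import random

random.seed(0)
data = [(x, 2.0 * x) for x in range(1, 51)]  # true weight is 2.0
w = 0.0            # initial weight
lr = 1e-4          # learning rate
batch_size = 8

for _ in range(500):
    batch = random.sample(data, batch_size)  # draw a random mini-batch
    # Gradient of the mean squared error with respect to w over the batch
    grad = sum(2 * (w * x - y) * x for x, y in batch) / batch_size
    w -= lr * grad

print(round(w, 2))  # → 2.0
```

Compared with plain gradient descent, the mini-batch variant trades a noisier gradient estimate for much cheaper updates, which is why it dominates in practice; Adam additionally adapts the step size per parameter.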
Mapped to these items:
  • Automotive SPICE 4.0
    • MLE.3.BP3 Create and optimize ML model.