Gradient Descent

Definition

Gradient Descent is an iterative optimization algorithm used to find a minimum of a function. It works by repeatedly taking steps in the direction of steepest decrease, given by the negative of the gradient: at each iteration the parameters are updated as θ ← θ − η∇f(θ), where η is the learning rate controlling the step size. This method is fundamental in machine learning, where models are trained by adjusting parameters to minimize an error function.
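The update rule can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the quadratic f(x) = (x − 3)², the starting point, the learning rate, and the fixed step count are all illustrative choices.

```python
def gradient_descent(grad, x0, learning_rate=0.1, steps=100):
    """Repeatedly step opposite the gradient to approach a minimum."""
    x = x0
    for _ in range(steps):
        x -= learning_rate * grad(x)  # update: x <- x - eta * grad(x)
    return x

# Example: f(x) = (x - 3)^2 has gradient 2(x - 3) and minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # -> 3.0
```

Each iteration moves the estimate a fraction (the learning rate) of the gradient's magnitude in the downhill direction; too large a rate can overshoot and diverge, while too small a rate converges slowly.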