Stochastic Gradient Descent

Definition

Stochastic Gradient Descent (SGD) is an iterative optimization algorithm used to train machine learning models by minimizing a cost function. Instead of computing the gradient over the entire dataset, it approximates the gradient from a single randomly selected data point or a small mini-batch, then adjusts the model parameters incrementally after each such step. This makes training faster and more memory-efficient, especially on large datasets.
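
As a concrete illustration, here is a minimal NumPy sketch of mini-batch SGD applied to linear regression with a mean-squared-error cost; the function name, learning rate, epoch count, and synthetic data are assumptions made for this example, not part of the definition above.

```python
import numpy as np

def sgd_linear_regression(X, y, lr=0.05, epochs=50, batch_size=1, seed=0):
    """Fit weights w and bias b for y ≈ X @ w + b by minimizing
    mean squared error, updating on one mini-batch at a time."""
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(epochs):
        order = rng.permutation(n_samples)      # fresh random order each epoch
        for start in range(0, n_samples, batch_size):
            batch = order[start:start + batch_size]
            Xb, yb = X[batch], y[batch]
            error = Xb @ w + b - yb             # residuals on this mini-batch only
            # Gradients of the batch-mean squared error w.r.t. w and b.
            grad_w = 2.0 * Xb.T @ error / len(batch)
            grad_b = 2.0 * error.mean()
            w -= lr * grad_w                    # incremental parameter update
            b -= lr * grad_b
    return w, b

# Usage on synthetic data (assumed for illustration): w should approach
# the true coefficients [2.0, -1.0, 0.5] and b should approach 0.3.
X = np.random.default_rng(1).normal(size=(200, 3))
y = X @ np.array([2.0, -1.0, 0.5]) + 0.3
w, b = sgd_linear_regression(X, y)
```

With batch_size=1 each update uses a single random example, the classic stochastic setting; raising batch_size trades noisier, cheaper updates for smoother, more expensive ones, while never touching the full dataset in one gradient computation.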