Gradient Sharing

Definition

Gradient sharing is a technique used in distributed machine learning, particularly in federated learning, in which multiple parties collaboratively train a model without directly sharing their raw data. Instead, each participant computes local gradients of the model's loss on its private data and shares these gradients with a central server or with the other participants, which aggregate them (typically by averaging) to update the shared model. This method allows for collective model improvement while keeping raw data local, and it is a key component in privacy-preserving AI.
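The workflow above can be sketched in a minimal simulation. This is an illustrative example, not a reference implementation: the client data, the mean-squared-error loss, the learning rate, and the helper `local_gradient` are all assumptions chosen for clarity. Each simulated client computes a gradient on its private data, and a central server averages the shared gradients to update the model without ever seeing the raw data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: three clients, each holding a private slice of
# linear-regression data generated from the same underlying weights.
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

def local_gradient(w, X, y):
    """Gradient of the mean-squared-error loss on one client's private data."""
    residual = X @ w - y
    return 2 * X.T @ residual / len(y)

# Central server: receives only gradients, never the raw (X, y) data.
w = np.zeros(2)
lr = 0.1
for step in range(100):
    grads = [local_gradient(w, X, y) for X, y in clients]
    w -= lr * np.mean(grads, axis=0)  # aggregate by averaging, then update

# After training, w approaches true_w even though no client shared its data.
print(np.round(w, 2))
```

In practice the shared gradients themselves can leak information about the underlying data, which is why gradient sharing is often combined with defenses such as secure aggregation or differential privacy.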