Secure Gradient Sharing

Definition

Secure gradient sharing is a cryptographic technique used in federated learning, where multiple parties collaboratively train a machine learning model without directly sharing their raw data. Instead of raw data, each party exchanges only model updates, or “gradients,” which are typically protected and aggregated using privacy-preserving methods such as homomorphic encryption or secure multi-party computation, so that the aggregator learns only the combined update rather than any individual contribution. This enables distributed model training while protecting the confidentiality of each participant's data, addressing the privacy concerns inherent in traditional centralized machine learning and strengthening data sovereignty.
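
To make the data flow concrete, the sketch below illustrates one common building block of secure gradient sharing, pairwise-masked secure aggregation, in plain Python. Everything in it (the number of parties, the toy gradients, and the way masks are distributed) is an illustrative assumption rather than any specific library's API; a production protocol would derive the pairwise masks from an authenticated key exchange and handle participant dropout.

```python
import random

# Hypothetical toy setup: 3 parties, each holding a local gradient vector.
# NUM_PARTIES, VECTOR_LEN, and the random gradients are illustrative only.
NUM_PARTIES = 3
VECTOR_LEN = 4
local_gradients = [
    [random.uniform(-1, 1) for _ in range(VECTOR_LEN)]
    for _ in range(NUM_PARTIES)
]

# Each unordered pair of parties (i, j) shares a random mask vector.
# In a real protocol these masks would come from pairwise key agreement,
# not from a single trusted source of randomness as in this toy example.
pair_masks = {
    (i, j): [random.uniform(-1, 1) for _ in range(VECTOR_LEN)]
    for i in range(NUM_PARTIES)
    for j in range(i + 1, NUM_PARTIES)
}

def masked_update(party, gradient):
    """Add +mask toward higher-indexed peers and -mask toward lower-indexed
    peers, so all masks cancel when the aggregator sums the contributions."""
    out = list(gradient)
    for (i, j), mask in pair_masks.items():
        if party == i:    # lower index in the pair: add the shared mask
            out = [o + m for o, m in zip(out, mask)]
        elif party == j:  # higher index in the pair: subtract the shared mask
            out = [o - m for o, m in zip(out, mask)]
    return out

# Each party sends only its masked gradient; raw gradients never leave the party.
masked = [masked_update(p, g) for p, g in enumerate(local_gradients)]

# Aggregator side: summing the masked updates cancels every pairwise mask,
# revealing only the aggregate gradient, not any individual contribution.
aggregate = [sum(vals) for vals in zip(*masked)]
true_sum = [sum(vals) for vals in zip(*local_gradients)]
print("aggregate matches true sum:",
      all(abs(a - t) < 1e-9 for a, t in zip(aggregate, true_sum)))
```

The design choice to make the masks cancel in pairs is what lets the aggregator compute the exact sum of all gradients while each individual masked update it receives looks statistically random on its own.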