Data Poisoning

Definition

Data poisoning is an adversarial attack in which malicious actors deliberately corrupt the data used to train machine learning models, causing those models to produce biased or incorrect outputs. In decentralized AI systems or data markets, poisoning can compromise the integrity of shared datasets and undermine the reliability of the predictions and decisions that AI systems make from them. Because such attacks both degrade model performance and introduce exploitable vulnerabilities, they pose a serious threat to data-driven blockchain applications.
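A minimal sketch of how poisoned training data skews a model, using an illustrative toy setup (the 1-D dataset, the injection attack at `x = 2.0`, and the nearest-centroid classifier are all assumptions chosen for clarity, not a specific real-world attack): an attacker injects fake samples far from class 0's true cluster but labeled as class 0, dragging its learned centroid away and causing genuine class-0 inputs to be misclassified.

```python
import random

random.seed(0)

def make_data(n):
    """Toy 1-D dataset: class 0 clusters near 0.0, class 1 near 1.0."""
    return ([(random.gauss(0.0, 0.1), 0) for _ in range(n)]
            + [(random.gauss(1.0, 0.1), 1) for _ in range(n)])

def poison(data, n_fake):
    """Injection attack: add fake samples at x = 2.0 labeled as class 0,
    pulling class 0's centroid far away from its true cluster."""
    return data + [(2.0, 0)] * n_fake

def fit_centroids(data):
    """Nearest-centroid classifier: predict the class whose mean is closest."""
    means = {}
    for label in (0, 1):
        xs = [x for x, y in data if y == label]
        means[label] = sum(xs) / len(xs)
    return lambda x: min(means, key=lambda c: abs(x - means[c]))

def accuracy(model, data):
    return sum(model(x) == y for x, y in data) / len(data)

train, test = make_data(50), make_data(50)
clean_acc = accuracy(fit_centroids(train), test)
poisoned_acc = accuracy(fit_centroids(poison(train, 100)), test)
print(f"clean accuracy:    {clean_acc:.2f}")
print(f"poisoned accuracy: {poisoned_acc:.2f}")
```

On this toy data the poisoned model's class-0 centroid ends up closer to class 1's than to the true class-0 cluster, so genuine class-0 test points are systematically misclassified while the attack remains invisible to anyone who only inspects individual samples' features.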