Hybrid Synchronous BFT Model Achieves Low Latency by Separating Message Sizes
AlterBFT introduces a hybrid synchronous model that assumes timely delivery only for small coordination messages, drastically reducing consensus latency.
Zero-Knowledge Proof of Training Secures Decentralized Federated Learning
ZKPoT consensus uses zk-SNARKs to verify machine learning contributions privately, resolving the privacy-verifiability trade-off for decentralized AI.
