Formalizing Data Availability Sampling as a New Cryptographic Commitment Primitive
Researchers formalize Data Availability Sampling as a cryptographic primitive, introducing a new commitment scheme that puts light client verification on rigorous security foundations.
Zero-Knowledge Finality Enables Constant-Time Light Client Verification
A novel ZKP system proves block finality in constant time, decoupling verification cost from chain length and history and enabling trustless cross-chain interoperability.
Universal Vector Commitments Achieve Constant-Time Data Availability Sampling
A novel Universal Vector Commitment scheme achieves constant-time data availability sampling, addressing the verifier's dilemma and removing a key bottleneck to L2 scalability.
FRIDA: FRI-based Data Availability Sampling without Trusted Setup
Leverages a novel property of the FRI proof system to construct a trustless, efficient data availability sampling scheme for modular blockchains.
Probabilistic Sampling Verifies Data Availability Securing Modular Blockchain Scaling
Data Availability Sampling leverages erasure coding to enable light nodes to probabilistically verify that block data is available, addressing the data-availability bottleneck in L2 scaling.
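A minimal sketch of the probabilistic argument behind this, with illustrative (not protocol-specific) chunk and sample counts: under rate-1/2 erasure coding, a block producer must withhold more than half of the coded chunks to make the data unrecoverable, so each uniformly random sample hits a withheld chunk with probability at least 1/2, and a handful of samples suffices to detect withholding with overwhelming probability.

```python
def miss_probability(total_chunks: int, withheld: int, samples: int) -> float:
    """Probability that `samples` uniformly random chunk requests (without
    replacement) all land on chunks the producer actually serves."""
    available = total_chunks - withheld
    prob = 1.0
    for i in range(samples):
        prob *= (available - i) / (total_chunks - i)
    return prob

# Illustrative numbers: a block split into 256 chunks and extended to 512 by
# rate-1/2 erasure coding.  Reconstruction needs any 256 chunks, so an
# adversary must withhold at least 257 chunks to make the block unrecoverable.
TOTAL, WITHHELD = 512, 257

for k in (10, 20, 30):
    miss = miss_probability(TOTAL, WITHHELD, k)
    print(f"{k:2d} samples: detection probability {1 - miss:.10f} "
          f"(chance of being fooled ~ {miss:.1e})")
```

With 30 samples the chance of a light node being fooled is already below one in a billion, which is why per-node sampling cost stays tiny even as blocks grow.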
Zero-Knowledge State Accumulators Democratize Validator Participation and Finality
Introducing Zero-Knowledge State Accumulators, a primitive that compresses blockchain state into a succinct proof, sharply lowering the cost of running a validator and strengthening decentralization.
Vector Commitments Enable Modular Blockchain Scalability and Asynchronous Security
A new Probabilistically Verifiable Vector Commitment scheme secures Data Availability Sampling, decoupling execution from data availability and enabling large-scale asynchronous designs.
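The paper's specific commitment construction is not detailed here; as a point of reference, the sketch below shows the simplest standard vector commitment, a Merkle tree, in which commit produces a single root, open produces a logarithmic-size proof for one position, and verify checks that proof against the root. All function names and parameters are illustrative, not the paper's scheme.

```python
import hashlib

def _h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def commit(leaves: list[bytes]) -> tuple[bytes, list[list[bytes]]]:
    """Commit to a vector of byte strings; returns (root, tree levels)."""
    level = [_h(b"\x00" + leaf) for leaf in leaves]      # domain-separated leaves
    levels = [level]
    while len(level) > 1:
        if len(level) % 2:                               # duplicate last node on odd levels
            level = level + [level[-1]]
        level = [_h(b"\x01" + level[i] + level[i + 1])
                 for i in range(0, len(level), 2)]
        levels.append(level)
    return level[0], levels

def open_at(levels: list[list[bytes]], index: int) -> list[bytes]:
    """Produce the sibling path proving the leaf at `index`."""
    path = []
    for level in levels[:-1]:
        if len(level) % 2:
            level = level + [level[-1]]
        path.append(level[index ^ 1])
        index //= 2
    return path

def verify(root: bytes, index: int, leaf: bytes, path: list[bytes]) -> bool:
    """Check that `leaf` sits at position `index` under `root`."""
    node = _h(b"\x00" + leaf)
    for sibling in path:
        if index % 2 == 0:
            node = _h(b"\x01" + node + sibling)
        else:
            node = _h(b"\x01" + sibling + node)
        index //= 2
    return node == root

# Usage: commit to 8 chunks, open position 5, verify the opening.
chunks = [f"chunk-{i}".encode() for i in range(8)]
root, levels = commit(chunks)
proof = open_at(levels, 5)
assert verify(root, 5, chunks[5], proof)
```

Schemes aimed at data availability sampling replace the logarithmic Merkle path with shorter (ideally constant-size) openings, which is what makes per-sample verification cheap for light clients.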
Coin Holder Checkpointing Secures Proof-of-Stake History against Long-Range Attacks
Winkle introduces coin-holder-driven decentralized checkpointing, cryptoeconomically securing Proof-of-Stake history against deep chain rewrites.
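A minimal sketch of coin-holder checkpointing under simplifying assumptions (not Winkle's exact protocol): each transaction carries a vote for a recent block, votes are weighted by the sender's coins, a vote for a block also endorses its ancestors, and a block becomes an irreversible checkpoint once votes cover a fixed fraction of the total supply. The threshold and data structures below are illustrative.

```python
from collections import defaultdict

# Illustrative threshold (an assumption, not Winkle's parameter): a block is
# checkpointed once votes cover 2/3 of the coin supply.
CHECKPOINT_FRACTION = 2 / 3

def tally_checkpoints(total_supply: float,
                      votes: list[tuple[str, float, int]]) -> int | None:
    """Each vote is (sender, coins_held, voted_block_height).  Only the latest
    vote per sender counts, and a vote for height h also endorses every
    height <= h.  Returns the highest checkpointed height, or None."""
    latest: dict[str, tuple[float, int]] = {}
    for sender, coins, height in votes:
        latest[sender] = (coins, height)            # later votes override earlier ones

    weight_at_height: dict[int, float] = defaultdict(float)
    for coins, height in latest.values():
        weight_at_height[height] += coins

    # Walk heights from highest to lowest, accumulating weight, since a vote
    # for height h implicitly endorses every lower height.
    cumulative = 0.0
    for height in sorted(weight_at_height, reverse=True):
        cumulative += weight_at_height[height]
        if cumulative >= CHECKPOINT_FRACTION * total_supply:
            return height
    return None

# Usage: holders with 40, 35, and 25 coins vote for heights 120, 118, and 90.
print(tally_checkpoints(100.0, [("a", 40, 120), ("b", 35, 118), ("c", 25, 90)]))
# -> 118 (40 + 35 = 75 coins >= 2/3 of the supply at height 118)
```

Because an attacker rewriting history from a deep fork would need to re-collect votes from holders of a supermajority of coins, checkpointed blocks resist long-range rewrites even if old validator keys are compromised.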
Data Availability Sampling Secures Modular Blockchain Scalability
Modular architecture decouples execution, settlement, consensus, and data availability, using Data Availability Sampling and erasure coding to enable trust-minimized, high-throughput rollups.
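A minimal sketch of the erasure-coding step these designs rely on, using systematic Reed-Solomon over a toy prime field (production systems use much larger fields and optimized encoders): the original chunks are treated as evaluations of a low-degree polynomial, the codeword extends those evaluations to twice as many points, and any half of the coded chunks suffice to reconstruct the block by Lagrange interpolation. Field size and chunk counts are illustrative.

```python
P = 2**31 - 1  # toy prime field modulus

def lagrange_eval(points: list[tuple[int, int]], x: int) -> int:
    """Evaluate, at x, the unique polynomial of degree < len(points)
    passing through the given (x_i, y_i) pairs, over GF(P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * ((x - xj) % P) % P
                den = den * ((xi - xj) % P) % P
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

def extend(data: list[int]) -> list[int]:
    """Rate-1/2 systematic Reed-Solomon: coded chunk i is the data
    polynomial evaluated at x = i, for i in [0, 2n)."""
    n = len(data)
    base = list(enumerate(data))                      # evaluations at x = 0..n-1
    return data + [lagrange_eval(base, x) for x in range(n, 2 * n)]

def reconstruct(samples: list[tuple[int, int]], n: int) -> list[int]:
    """Recover the n original chunks from any n surviving (index, value) pairs."""
    assert len(samples) >= n
    return [lagrange_eval(samples[:n], x) for x in range(n)]

# Usage: 4 data chunks extended to 8; lose half the coded chunks and recover.
data = [7, 13, 42, 99]
coded = extend(data)
surviving = [(i, coded[i]) for i in (1, 4, 6, 7)]
assert reconstruct(surviving, len(data)) == data
```

This redundancy is what turns "is the data available?" into a statistical question that light nodes can answer with a few random samples instead of downloading whole blocks.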
