Quantum Rewinding Secures Succinct Arguments against Quantum Threats
A novel quantum rewinding strategy enables provably post-quantum secure succinct arguments, safeguarding cryptographic protocols from future quantum attacks.
NFT-Authenticated DAOs: Private Governance via Punishment, Not Reward
Dual-NFT DAOs achieve private, accountable governance through reputational penalties, shifting away from financial rewards toward sustainable decentralized systems.
Sublinear Memory Zero-Knowledge Proofs Democratize Verifiable Computation
The first ZKP system whose memory scales with the square root of the computation size, this breakthrough enables privacy-preserving verification on edge devices.
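To make the square-root memory claim concrete, here is a toy comparison of prover memory under linear versus square-root scaling. The function names and the unit (field elements) are illustrative assumptions, not details from the system itself:

```python
import math

def linear_memory(t: int) -> int:
    """Memory of a conventional prover that materializes
    the full computation trace: O(T) field elements."""
    return t

def sqrt_memory(t: int) -> int:
    """Memory of a square-root-space prover: O(sqrt(T)) field elements."""
    return math.isqrt(t)

# Illustrative sizes: a billion-step computation needs only
# ~31,623 units of memory under square-root scaling.
for t in (10**6, 10**9, 10**12):
    print(f"T={t:>14,}  linear={linear_memory(t):>14,}  sqrt={sqrt_memory(t):>9,}")
```

The gap is what moves proving from servers to phones: at a trillion steps, a linear-memory prover needs a million times more working memory than a square-root one.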
Verifiable Computation for Approximate FHE Unlocks Private AI Scalability
This new cryptographic framework efficiently integrates Verifiable Computation with approximate Homomorphic Encryption, enabling trustless, private AI computation at scale.
Subspace Codes Enable Polynomial Commitments with Logarithmic Proof Size and Constant Verification Time
A novel polynomial commitment scheme based on subspace codes achieves logarithmic proof size and constant-time verification, improving rollup efficiency.
Post-Quantum Non-Malleable Commitment from One-Way Functions
A novel cryptographic commitment scheme achieves post-quantum security and constant-round efficiency using only one-way functions, establishing a new foundational primitive for secure computation.
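For readers unfamiliar with the primitive: a commitment lets a party fix a value now and reveal it later, with hiding and binding guarantees. The sketch below is a plain hash-based commitment using SHA-256 as a stand-in one-way function; it illustrates the interface only, and does not capture the non-malleability or round structure of the scheme announced above:

```python
import hashlib
import os

def commit(message: bytes) -> tuple[bytes, bytes]:
    """Commit to a message. Hiding comes from the random opening key r;
    binding comes from collision resistance of the hash."""
    r = os.urandom(32)
    return hashlib.sha256(r + message).digest(), r

def verify(commitment: bytes, message: bytes, r: bytes) -> bool:
    """Check that (message, r) opens the commitment."""
    return hashlib.sha256(r + message).digest() == commitment

c, r = commit(b"vote: yes")
print(verify(c, b"vote: yes", r))   # the honest opening succeeds
print(verify(c, b"vote: no", r))    # a different message fails
```

The interesting part of the result is achieving this from one-way functions alone, in constant rounds, against quantum adversaries; the toy above assumes a concrete hash rather than a generic one-way function.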
New Lattice-Based Zero-Knowledge Proofs Achieve Post-Quantum Compactness
A novel polynomial product technique efficiently proves short vector norms in lattice-based cryptography, delivering compact, quantum-resistant ZKPs.
Polynomial Commitments Secure Erasure Codes for Scalable Data Availability Sampling
Cryptographically-secured erasure codes enable light clients to verify data availability by sampling, resolving the scalability bottleneck for modular architectures.
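The sampling argument behind data availability sampling is simple probability: if a block is erasure-coded so that withholding must hide a constant fraction of chunks to make the data unrecoverable, a light client that checks a handful of random chunks catches the withholding with overwhelming probability. A minimal sketch, with illustrative parameters (n = 512 chunks, rate-1/2 code, so more than half must be withheld):

```python
def miss_probability(n: int, withheld: int, samples: int) -> float:
    """Probability that `samples` distinct uniformly random chunks all
    avoid the withheld set, i.e. the light client fails to detect the
    withholding. Sampling is without replacement."""
    available = n - withheld
    p = 1.0
    for i in range(samples):
        p *= (available - i) / (n - i)
    return p

# For a rate-1/2 code over n=512 chunks, the data is unrecoverable only
# if more than half the chunks are withheld; check detection odds there.
n, withheld = 512, 257
for s in (10, 20, 30):
    print(f"{s} samples -> miss probability {miss_probability(n, withheld, s):.2e}")
```

With just 20 samples the chance of missing an unavailable block is below one in a million, which is why a constant number of queries per client suffices regardless of block size; the polynomial commitments in the headline are what stop the coder from cheating about the chunks themselves.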
