Definition ∞ Linear computation is a computational process whose resource requirements, such as execution time or memory, grow in direct proportion to the size of the input, commonly written as O(n). In blockchain and cryptography, algorithms with linear complexity remain relatively efficient as data volumes increase, which is why many cryptographic primitives aim for linear or near-linear performance characteristics.
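As a rough illustration (not drawn from the source), the sketch below times a linear-time operation, hashing a byte string with SHA-256, over inputs of increasing size; because the hash processes its input in fixed-size blocks, the measured time should grow roughly in proportion to the number of input bytes.

```python
# Minimal sketch: observing linear scaling of a hash over growing inputs.
import hashlib
import time

def time_hash(n_bytes: int) -> float:
    """Return the seconds taken to SHA-256 hash n_bytes of data."""
    data = b"\x00" * n_bytes
    start = time.perf_counter()
    hashlib.sha256(data).digest()
    return time.perf_counter() - start

if __name__ == "__main__":
    for size in (1_000_000, 2_000_000, 4_000_000, 8_000_000):
        elapsed = time_hash(size)
        # Doubling the input size should roughly double the elapsed time.
        print(f"{size:>10} bytes: {elapsed:.6f} s")
```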
Context ∞ Discussions of blockchain scalability and cryptographic efficiency frequently cite linear computation as a desirable property for optimizing network performance. Achieving linear scaling in transaction processing or proof generation is an ongoing goal for protocol developers, and reports of advancements in this area typically signal improvements in the practical utility and adoption potential of new blockchain technologies.