Linear Computation

Definition

Linear computation is a computational process whose resource requirements, such as time or memory, grow in direct proportion to the size of the input: doubling the input roughly doubles the cost, commonly written as O(n) complexity. In the context of blockchain and cryptography, linear-time algorithms are considered relatively efficient for processing increasing data volumes, and many cryptographic primitives strive for linear or near-linear performance characteristics.
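The sketch below is a minimal illustration of this scaling behavior, not part of the original definition. It assumes SHA-256 hashing as an example of a linear-time cryptographic primitive: the time to hash a message grows roughly in proportion to its length.

```python
# Illustrative sketch: measuring roughly linear (O(n)) scaling of SHA-256 hashing.
# SHA-256 is chosen here only as an example of a linear-time primitive.
import hashlib
import time


def time_hash(num_bytes: int) -> float:
    """Return the time in seconds to SHA-256 hash `num_bytes` of data."""
    data = b"\x00" * num_bytes
    start = time.perf_counter()
    hashlib.sha256(data).digest()
    return time.perf_counter() - start


if __name__ == "__main__":
    # Doubling the input size should roughly double the hashing time.
    for size in (1_000_000, 2_000_000, 4_000_000, 8_000_000):
        print(f"{size:>9} bytes -> {time_hash(size):.4f} s")
```

Running this on typical hardware shows the measured time roughly doubling as the input doubles, which is the signature of linear computation.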