Linear Complexity

Definition

Linear complexity, in the context of algorithms or protocols, describes a system where resource consumption grows in direct proportion to the size of the input or workload, commonly written as O(n). For distributed ledger technologies, this means that processing more transactions or data demands a proportional increase in computational effort. Because the cost per item stays roughly constant, linear scaling is generally regarded as efficient compared with quadratic or exponential growth.
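
To make the idea concrete, the sketch below shows a hypothetical function that tallies the value of a block's transactions; the transaction list and amounts are illustrative assumptions, not part of any specific ledger's API. The key point is that each item is visited exactly once, so doubling the input roughly doubles the work.

```python
from typing import Iterable

def total_transferred(transactions: Iterable[int]) -> int:
    """Sum the value of every transaction in a block.

    Each transaction is visited exactly once, so the work grows
    linearly with the number of transactions: O(n).
    """
    total = 0
    for amount in transactions:  # one constant-time step per transaction
        total += amount
    return total

# Doubling the input size roughly doubles the number of steps performed.
print(total_transferred([5, 10, 20]))       # 3 transactions -> 3 iterations
print(total_transferred([5, 10, 20] * 2))   # 6 transactions -> 6 iterations
```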