Linear Complexity

Definition ∞ Linear complexity, written O(n) in big-O notation, describes an algorithm or protocol whose resource consumption grows in direct proportion to the size of its input or workload: doubling the input roughly doubles the time or memory required. For distributed ledger technologies, this means processing twice as many transactions demands roughly twice the computational effort. This scaling behavior is generally regarded as efficient, since any operation that must examine every input element at least once cannot do better.
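As an illustration, consider a minimal sketch in Python (the transaction fields and function name here are hypothetical, chosen only to show the scaling behavior): summing the value of a block by visiting each transaction once runs in O(n) time, because the single loop performs a constant amount of work per item.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    sender: str
    recipient: str
    amount: int

def total_value(transactions: list[Transaction]) -> int:
    """Sum transaction amounts in O(n): one constant-time step per item."""
    total = 0
    for tx in transactions:   # n iterations, one per transaction
        total += tx.amount    # constant work inside the loop
    return total

# Doubling the input list doubles the iterations: 2n instead of n.
block = [Transaction("a", "b", 5), Transaction("b", "c", 3)]
print(total_value(block))  # 8
```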
Context ∞ When assessing the performance of new cryptographic protocols or consensus algorithms, linear complexity is a preferred attribute because it implies predictable resource allocation. Discussions in crypto news often compare systems that scale linearly with those exhibiting superlinear behavior, such as the quadratic, O(n²), message complexity of some classical consensus protocols, especially when addressing transaction throughput limitations. Achieving linear scalability is a goal for many projects seeking to support a growing user base and transaction volume without a disproportionate rise in cost.
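To make that comparison concrete, the following sketch contrasts the two growth rates (the workload sizes are illustrative only and not drawn from any specific protocol): a linear cost stays proportional to n, while a quadratic cost, such as all-to-all message exchange among n validators, quickly dominates as n grows.

```python
# Illustrative step counts only; real systems add constant factors
# and overheads that this sketch deliberately ignores.
for n in (100, 1_000, 10_000):
    linear = n         # O(n):   e.g., validating each transaction once
    quadratic = n * n  # O(n^2): e.g., all-to-all messages among n nodes
    print(f"n={n:>6}: linear={linear:>10,} quadratic={quadratic:>15,}")
```

At n = 10,000 the quadratic cost is already ten thousand times the linear one, which is why superlinear protocols hit throughput limits far sooner as participation grows.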