Linear Latency

Definition ∞ Linear Latency is a characteristic of a system where the time delay for an operation grows in direct proportion to the size of the input or the number of participants. In distributed systems, this means that doubling the number of nodes or the volume of data roughly doubles the communication or processing time; in asymptotic terms, latency scales as O(n). Minimizing this linear growth, by lowering the per-unit cost or avoiding it altogether, is a design goal for efficient networks.
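A minimal sketch of this proportional relationship, assuming latency follows L(n) = L0 + c · n, where L0 is a fixed base overhead and c is a constant per-participant cost. The names and numeric values below are illustrative assumptions, not measurements from any particular system:

```python
def total_latency(n_nodes: int,
                  base_latency: float = 0.005,
                  per_node_cost: float = 0.002) -> float:
    """Model latency that grows linearly with participant count:
    L(n) = L0 + c * n, with L0 a fixed overhead and c the
    per-node communication/processing cost, both in seconds.
    These default values are hypothetical, for illustration only."""
    return base_latency + per_node_cost * n_nodes

# Doubling the node count roughly doubles the variable component.
for n in (10, 20, 40):
    print(f"{n:>3} nodes -> {total_latency(n) * 1000:.1f} ms")
```

In this toy model, going from 10 to 40 nodes raises latency from 25 ms to 85 ms: the fixed overhead L0 stays constant while the c · n term dominates as the network grows, which is exactly the scaling behavior the term describes.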
Context ∞ Low latency is a critical performance requirement for real-time applications and high-frequency trading within decentralized finance, so how latency scales with network size matters directly as systems grow. Research continually seeks to optimize consensus algorithms and network topologies to reduce this scaling factor.