Linear Latency

Definition

Linear latency is a characteristic of a system in which the time delay for an operation grows proportionally with the size of the input or the number of participants: doubling the input roughly doubles the delay. This can be modeled as T(n) = a + b·n, where a is a fixed overhead, b is the per-item (or per-node) cost, and n is the input size or participant count. In distributed systems, this means that as nodes or data are added, communication or processing time scales in direct proportion, so avoiding or minimizing linear latency is a design goal for efficient networks.
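
As a minimal illustration of the T(n) = a + b·n model, the Python sketch below simulates a coordinator that contacts peers one at a time. The per-message delay and peer counts are illustrative assumptions, not measurements from any real system.

```python
import time

def sequential_broadcast(num_peers: int, per_message_delay: float = 0.001) -> float:
    """Simulate contacting each peer one at a time.

    Total time grows linearly with num_peers: T(n) ~ a + b*n,
    where b is the per-message delay. The delay value is an
    illustrative assumption, not a measured figure.
    """
    start = time.perf_counter()
    for _ in range(num_peers):
        time.sleep(per_message_delay)  # stand-in for one network round trip
    return time.perf_counter() - start

if __name__ == "__main__":
    # Doubling the participant count roughly doubles the total latency.
    for n in (10, 20, 40, 80):
        print(f"{n:>3} peers -> {sequential_broadcast(n):.3f} s")
```

Running the sketch shows total time climbing in direct proportion to the peer count, which is the linear scaling the definition describes.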