Linear Time

Definition

Linear time, in a computational context, refers to an algorithm whose execution duration grows in direct proportion to the size of its input: if the input doubles, the processing time roughly doubles. This proportional growth, written O(n) in Big O notation, is a common measure of computational efficiency, and algorithms operating in linear time are generally considered efficient even for large datasets.
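As a minimal sketch of the idea, consider finding the largest value in a list by scanning it once. The function name and the example inputs below are illustrative, not drawn from any particular library; the point is only that the work done scales with the number of elements.

```python
def find_max(values):
    """Return the largest element by scanning the list exactly once.

    The loop body runs once per element, so the total work grows in
    direct proportion to len(values): linear time, O(n).
    """
    if not values:
        raise ValueError("values must be non-empty")
    largest = values[0]
    for v in values[1:]:  # one constant-time comparison per element
        if v > largest:
            largest = v
    return largest


# Doubling the input roughly doubles the number of comparisons:
# find_max(list(range(1_000)))  -> about 1,000 comparisons
# find_max(list(range(2_000)))  -> about 2,000 comparisons
```

Because each element is visited exactly once and each visit does a constant amount of work, the running time tracks the input size, which is the defining property of a linear-time algorithm.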