Linear Time, in a computational context, describes an algorithm whose running time grows in direct proportion to the size of its input, commonly written O(n) in Big O notation. If the input doubles, the processing time roughly doubles. Algorithms operating in linear time are generally considered efficient for large datasets.
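To illustrate, any single pass over the input runs in linear time, since it does a constant amount of work per element. A minimal sketch (the `sum_list` function is illustrative, not a reference to any particular library):

```python
def sum_list(values):
    """Sum a sequence in O(n): one constant-time addition per element."""
    total = 0
    for v in values:  # visits each element exactly once
        total += v
    return total

# Doubling the input doubles the number of additions performed,
# so the running time grows roughly in proportion to the input size.
print(sum_list(range(1, 101)))  # 5050
```

Here the work scales with the length of `values`: a list twice as long requires twice as many loop iterations.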
Context
In blockchain technology, understanding linear time can be relevant when assessing the scalability and performance of transaction processing. For example, verifying a block of transactions might ideally operate in a manner approaching linear time relative to the number of transactions. News reports discussing blockchain throughput or network congestion implicitly relate to the efficiency of underlying processes, where linear time performance is often a desirable attribute for high-volume operations.
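The idea can be sketched as follows: if each transaction can be checked independently in constant time, verifying a whole block scales linearly with the number of transactions. This is a simplified illustration, assuming a hypothetical `verify_block` helper and a caller-supplied validity check; real blockchain verification involves signature checks and state lookups not modeled here:

```python
def verify_block(transactions, is_valid):
    """Verify a block in O(n): one constant-time check per transaction.

    `transactions` and `is_valid` are illustrative stand-ins, not any
    real blockchain API.
    """
    for tx in transactions:
        if not is_valid(tx):
            return False  # reject the block on the first invalid transaction
    return True

# Example: treat a transaction as valid when its amount is positive.
block = [{"amount": 5}, {"amount": 12}, {"amount": 3}]
print(verify_block(block, lambda tx: tx["amount"] > 0))  # True
```

Because the loop touches each transaction once, doubling the block size roughly doubles the verification work, which is the behavior throughput discussions implicitly assume.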
As one example from recent research, a new zero-knowledge argument system reports optimal linear prover time, removing a major computational bottleneck for verifiable execution of large programs.