Logarithmic Entropy

Definition

Logarithmic entropy is a mathematical measure of the uncertainty or randomness in a system, expressed in logarithmic units such as bits (base-2 logarithm). It quantifies the average amount of information required to describe the state of the system: for a discrete random variable X with outcome probabilities p(x), the entropy is H(X) = -∑ p(x) log₂ p(x). In information theory and cryptography, higher entropy indicates greater unpredictability, and the metric is fundamental for assessing the strength of random number generation.
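
As an illustration only, the following minimal Python sketch estimates Shannon entropy in bits per byte from observed byte frequencies; the function name shannon_entropy and the sample inputs are assumptions for demonstration, not part of the original text.

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Estimate the Shannon entropy of a byte string, in bits per byte."""
    if not data:
        return 0.0
    counts = Counter(data)
    total = len(data)
    # H = -sum(p * log2(p)) over the observed byte frequencies
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A good random source should score close to the 8-bit-per-byte maximum,
# while highly repetitive input scores much lower.
print(shannon_entropy(os.urandom(4096)))     # typically around 7.9
print(shannon_entropy(b"aaaaaaaaaaaaaaaa"))  # 0.0
```

Note that this estimates entropy from a finite sample of outputs; it measures the distribution of bytes actually observed, not the true unpredictability of the underlying generator.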