Full entropy

In cryptography, full entropy is a property of the output of a random number generator. The output has full entropy if it cannot practically be distinguished from the output of a theoretically perfect random number source, i.e., an n-bit output carries almost n bits of entropy.[1]

The term is used extensively in the NIST random number generator standards NIST SP 800-90A and NIST SP 800-90B. With full entropy, the per-bit entropy of the random number generator output is close to one: [math]\displaystyle{ 1-\epsilon }[/math], where NIST considers [math]\displaystyle{ \epsilon \lt 2^{-32} }[/math] achievable in practice.[1]
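
A minimal arithmetic sketch in Python (an illustration only; the 256-bit output length is an arbitrary choice, not something the standards prescribe) of how little entropy an n-bit full-entropy output is allowed to lack:

```python
# Illustrative calculation: with per-bit entropy of at least 1 - epsilon and
# NIST's practical bound epsilon < 2**-32, an n-bit output may fall short of
# ideal by less than n * 2**-32 bits in total.
n = 256                    # example output length in bits (arbitrary choice)
epsilon = 2.0 ** -32       # NIST's practical per-bit entropy deficit bound
total_entropy = n * (1 - epsilon)
deficit = n * epsilon
print(f"{n}-bit output: at least {total_entropy:.8f} bits of entropy "
      f"(deficit below {deficit:.1e} bits)")
# -> 256-bit output: at least 255.99999994 bits of entropy (deficit below 6.0e-08 bits)
```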

Some sources use the term to describe an ideal random bit string (exactly one bit of entropy per bit of output). In this sense, "getting to 100% full entropy is impossible" in the real world.[2]

Definition

The mathematical definition relies on a "distinguishing game": an adversary with unlimited computing power is given two sets of random numbers, each containing W elements of length n. One set is ideal: it contains bit strings from a theoretically perfect random number generator. The other set is real: it contains bit strings from a practical random number source passed through a randomness extractor. Full entropy for particular values of W and a positive parameter [math]\displaystyle{ \delta }[/math] is achieved if the adversary cannot guess the real set with probability higher than [math]\displaystyle{ \frac 1 2 + \delta }[/math].[3]
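
A toy Python sketch of the game's structure (the biased "real" source and the simple bias-counting adversary are placeholders invented for illustration; the definition itself places no computational limit on the adversary):

```python
import random
import secrets

def ideal_strings(W, n):
    """W n-bit strings from a (practically) ideal source, via the OS CSPRNG."""
    return [secrets.randbits(n) for _ in range(W)]

def real_strings(W, n, bias=0.48):
    """Placeholder 'real' source: each bit is 1 with probability 0.48,
    i.e. slightly biased, so it is clearly NOT a full-entropy source."""
    return [int("".join("1" if random.random() < bias else "0" for _ in range(n)), 2)
            for _ in range(W)]

def adversary_guess(first, second, n):
    """Toy distinguisher: call 'real' whichever set's ones-density strays
    further from 1/2.  (The formal adversary may use unlimited computation.)"""
    density = lambda s: sum(bin(x).count("1") for x in s) / (len(s) * n)
    return 0 if abs(density(first) - 0.5) > abs(density(second) - 0.5) else 1

def play(trials=200, W=256, n=128):
    wins = 0
    for _ in range(trials):
        sets = [ideal_strings(W, n), real_strings(W, n)]   # index 1 is the real set
        order = random.sample([0, 1], 2)                   # referee shuffles the two sets
        guess = adversary_guess(sets[order[0]], sets[order[1]], n)
        wins += (order[guess] == 1)                        # guessed the real set?
    return wins / trials

# Full entropy (for given W and delta) requires that no adversary wins with
# probability above 1/2 + delta; this biased placeholder source fails badly:
print(f"adversary success rate: {play():.2f}")   # close to 1.00 here
```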

Additional entropy

The practical way to achieve full entropy is to obtain from an entropy source bit strings longer than n bits, apply to them a high-quality randomness extractor that produces an n-bit result, and build the real set from these results. The ideal elements by nature have an entropy of n bits, so the inputs of the conditioning function need a higher min-entropy value H to satisfy the full-entropy definition. The number of additional bits of entropy, H-n, depends on W and [math]\displaystyle{ \delta }[/math]; the following table lists a few representative values (an illustrative sketch follows the table):[4]

Minimum value of additional entropy H-n, in bits
W [math]\displaystyle{ \delta = 2^{-20} }[/math] [math]\displaystyle{ \delta = 2^{-10} }[/math]
[math]\displaystyle{ 2^{32} }[/math] 67.3 47.3
[math]\displaystyle{ 2^{48} }[/math] 83.3 63.3
[math]\displaystyle{ 2^{56} }[/math] 91.3 71.3
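
A short Python sketch using only the values from the table above (the 256-bit output size is an illustrative choice; the closing comment notes a pattern in the tabulated numbers, not the underlying NIST bound):

```python
# The table gives the extra min-entropy (beyond the n output bits) that the
# conditioning-function input must carry for a given W and delta.
EXTRA_ENTROPY_BITS = {       # (log2 W, log2(1/delta)) -> H - n, copied from the table
    (32, 20): 67.3, (32, 10): 47.3,
    (48, 20): 83.3, (48, 10): 63.3,
    (56, 20): 91.3, (56, 10): 71.3,
}

n = 256                      # illustrative output length in bits
log2_W, log2_inv_delta = 48, 20
H = n + EXTRA_ENTROPY_BITS[(log2_W, log2_inv_delta)]
print(f"input min-entropy needed: {H} bits for {n} full-entropy output bits")
# -> input min-entropy needed: 339.3 bits for 256 full-entropy output bits

# Observation about the tabulated values (not the NIST derivation itself): each
# entry equals log2(W) + 2*log2(1/delta) minus roughly 4.7.
for (lw, ld), extra in EXTRA_ENTROPY_BITS.items():
    assert abs((lw + 2 * ld - 4.7) - extra) < 1e-6
```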

Randomness extractor requirements

Not every randomness extractor produces the desired result. For example, the Von Neumann extractor, while producing unbiased output, does not decorrelate groups of bits, so for serially correlated inputs (typical of many entropy sources) the output bits are not independent.[5] NIST therefore defines "vetted conditioning components" in its NIST SP 800-90B standard, including AES-CBC-MAC.[5]
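
A Python sketch illustrating this point (the Markov-chain source and its 0.8 persistence are invented for the demonstration and are not taken from the cited sources):

```python
# Feed a serially correlated bit stream through the Von Neumann extractor and
# measure the output: it comes out unbiased, but consecutive output bits remain
# correlated, which is why the extractor alone does not deliver full entropy.
import random

def markov_bits(count, persistence=0.8, seed=1):
    """Correlated toy source: each bit equals the previous one with
    probability `persistence` (a stand-in for a real entropy source)."""
    rng = random.Random(seed)
    bit, out = 0, []
    for _ in range(count):
        if rng.random() >= persistence:
            bit ^= 1
        out.append(bit)
    return out

def von_neumann(bits):
    """Classic Von Neumann extractor: take non-overlapping pairs, emit the
    first bit of each 01/10 pair, discard 00 and 11 pairs."""
    return [a for a, b in zip(bits[0::2], bits[1::2]) if a != b]

raw = markov_bits(1_000_000)
out = von_neumann(raw)
bias = sum(out) / len(out)
same = sum(1 for a, b in zip(out, out[1:]) if a == b) / (len(out) - 1)
print(f"output bits: {len(out)}, P(bit=1) ~ {bias:.3f}, "
      f"P(next bit equals previous) ~ {same:.3f}")
# Typical run: P(bit=1) ~ 0.5 (debiased), but P(same) ~ 0.38, well below 0.5,
# i.e. the output bits are visibly anti-correlated rather than independent.
```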

References

Sources