Bit-length



Bit-length or bit width is the number of binary digits, called bits, necessary to represent an unsigned integer[1] as a binary number. Formally, the bit-length of a natural number [math]\displaystyle{ n \geq 0 }[/math] is

[math]\displaystyle{ \ell(n) = \lceil \log_2(n+1) \rceil }[/math]

where [math]\displaystyle{ \log_2 }[/math] is the binary logarithm and [math]\displaystyle{ \lceil \cdot \rceil }[/math] is the ceiling function.

{| class="wikitable"
|+ Bit lengths (ε denotes the empty string)
|-
! decimal
| ε || 1 || 2 || 3 || 4 || 5 || 6 || 7 || 8 || 9 || 10
|-
! binary
| ε || 1 || 10 || 11 || 100 || 101 || 110 || 111 || 1000 || 1001 || 1010
|-
! bit length
| 0 || 1 || 2 || 2 || 3 || 3 || 3 || 3 || 4 || 4 || 4
|}
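
The values in the table can be reproduced directly from the definition. The following is a minimal Python sketch (the helper name bit_length_of is ours, not part of any standard library); it also cross-checks against Python's built-in int.bit_length(), which computes the same quantity.

<syntaxhighlight lang="python">
import math

def bit_length_of(n: int) -> int:
    """Bit-length of a natural number n >= 0: ceil(log2(n + 1))."""
    if n < 0:
        raise ValueError("bit-length is defined here for n >= 0 only")
    # Note: floating-point log2 can be inexact for very large n;
    # Python's exact built-in int.bit_length() is preferable in practice.
    return math.ceil(math.log2(n + 1))

for n in range(11):
    assert bit_length_of(n) == n.bit_length()  # built-in agrees
    print(n, bin(n)[2:] if n > 0 else "ε", bit_length_of(n))
</syntaxhighlight>

Running the loop prints exactly the three rows of the table above: the decimal value, its binary representation (ε for 0), and its bit length.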

At their most fundamental level, digital computers and telecommunications devices (as opposed to analog devices) process data encoded in binary format: a series of values, of arbitrary length, in which each value takes one of two choices (yes/no, 1/0, true/false, etc.), all of which can be realized electronically as on/off. For information technology applications, the amount of information being processed is an important design consideration, and bit-length is the technical shorthand for this measure.

For example, computer processors are often designed to process data grouped into words of a given bit-length (8-bit, 16-bit, 32-bit, 64-bit, etc.). The bit-length of a word determines, among other things, how many memory locations the processor can address independently: a w-bit address can distinguish [math]\displaystyle{ 2^w }[/math] locations. In cryptography, the key size of an algorithm is the bit-length of the keys used by that algorithm, and it is an important factor in an algorithm's strength.
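
To make both examples concrete, here is a short Python sketch of our own (it assumes, for simplicity, that the address width equals the word size, which is not true of every real architecture). It shows how both the addressable memory space and the brute-force key search space grow as 2 raised to the bit-length.

<syntaxhighlight lang="python">
# Addressable memory locations for common word sizes, assuming the
# address width equals the word size (a simplification; many real
# processors use different address and word widths).
for bits in (8, 16, 32, 64):
    print(f"{bits}-bit addresses: {2**bits:,} locations")

# Brute-force key search space for some common symmetric key sizes:
# an attacker must try up to 2**k keys for a k-bit key.
for k in (56, 128, 256):
    print(f"{k}-bit key: {2**k:.3e} possible keys")
</syntaxhighlight>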

References