Bit

This article is about the unit of information. For other meanings, see Bit (disambiguation).

A bit (abbreviated b) is the most basic unit of information used in computing and information theory. A single bit is a one or a zero, a true or a false, a "flag" that is "on" or "off", or, more generally, either of any two mutually exclusive states.

Claude E. Shannon first used the word bit in a 1948 paper. Shannon's bit is a portmanteau of binary digit. He attributed its origin to John W. Tukey.

A byte is a collection of bits, originally variable in size but now almost always eight bits. Eight-bit bytes, also known as octets, can represent 256 values (2⁸ values, 0–255). A four-bit quantity is known as a nibble and can represent 16 values (2⁴ values, 0–15). In some architectures, 16 bits make up a word and 32 bits a double word (dword); see word size.
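The count of representable values follows directly from the width: an n-bit quantity can take 2ⁿ distinct values. As an illustration only, the following minimal C sketch prints this count for the widths mentioned above.

    #include <stdio.h>

    int main(void)
    {
        /* An n-bit quantity can represent 2^n distinct values, 0 .. 2^n - 1. */
        unsigned int widths[] = {1, 4, 8, 16, 32};

        for (int i = 0; i < 5; i++) {
            unsigned int n = widths[i];
            unsigned long long count = 1ULL << n;   /* 64-bit shift avoids overflow at n = 32 */
            printf("%2u bits: %llu values (0 to %llu)\n", n, count, count - 1);
        }
        return 0;
    }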

Terms for large quantities of bits can be formed using the standard range of prefixes, e.g., kilobit (kbit), megabit (Mbit) and gigabit (Gbit). Note that much confusion exists regarding these units and their abbreviations; see binary prefixes. Although it is clearer to write "bit" in full for the bit, "b" is often used for bit and "B" for byte. (In SI, B stands for the bel.)
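How far the decimal and binary interpretations of these prefixes diverge can be illustrated with a short C sketch; the figures below follow the SI definitions of the decimal prefixes and the IEC definitions of the binary prefixes.

    #include <stdio.h>

    int main(void)
    {
        /* Decimal (SI) prefixes versus binary (IEC) prefixes, counted in bits. */
        unsigned long long kilobit = 1000ULL;                /* kbit  */
        unsigned long long kibibit = 1024ULL;                /* Kibit */
        unsigned long long megabit = 1000ULL * 1000;         /* Mbit  */
        unsigned long long mebibit = 1024ULL * 1024;         /* Mibit */
        unsigned long long gigabit = 1000ULL * 1000 * 1000;  /* Gbit  */
        unsigned long long gibibit = 1024ULL * 1024 * 1024;  /* Gibit */

        printf("kilobit = %llu bits, kibibit = %llu bits\n", kilobit, kibibit);
        printf("megabit = %llu bits, mebibit = %llu bits\n", megabit, mebibit);
        printf("gigabit = %llu bits, gibibit = %llu bits\n", gigabit, gibibit);
        return 0;
    }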

Certain bitwise processor instructions (such as xor) operate on individual bits rather than on data interpreted as an aggregate of bits.
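In a language such as C, for instance, the bitwise operators (which typically compile to such instructions) can set, clear, toggle and test individual bits within a byte. The following is a minimal illustrative sketch:

    #include <stdio.h>

    int main(void)
    {
        unsigned char flags = 0x0A;       /* binary 00001010 */

        flags |= 1u << 0;                 /* set bit 0:             00001011 */
        flags &= ~(1u << 3);              /* clear bit 3:           00000011 */
        flags ^= 1u << 1;                 /* toggle bit 1 with xor: 00000001 */

        if (flags & (1u << 0))            /* test whether bit 0 is set */
            printf("bit 0 is set; flags = 0x%02X\n", flags);

        return 0;
    }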

Telecommunications or computer network transfer rates are usually described in terms of bits per second.
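As a worked example (the file size and link rate below are arbitrary illustrative values), the time needed to transfer a file is its size in bits divided by the link rate in bits per second:

    #include <stdio.h>

    int main(void)
    {
        /* Illustrative figures: a 10-megabyte file over a 100 megabit-per-second link. */
        double file_bytes = 10e6;
        double rate_bits_per_second = 100e6;

        double file_bits = file_bytes * 8.0;               /* 8 bits per byte */
        double seconds = file_bits / rate_bits_per_second;

        printf("Transfer time: %.2f seconds\n", seconds);  /* prints 0.80 */
        return 0;
    }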

The bit is the smallest unit of storage currently used in computing, although much research is ongoing in quantum computing with qubits.


The contents of this article are licensed from Wikipedia.org under the GNU Free Documentation License.