A sequence of adjacent bits operated on as a unit by a computer. A byte usually consists of eight bits. Amounts of computer memory are often expressed in terms of megabytes (1,048,576 bytes) or gigabytes (1,073,741,824 bytes).
Our Living Language: The word bit is short for binary digit. A bit holds one of two values, usually written 0 or 1. Computers use bits because their counting system is based on two options: switches on a microchip that are either on or off. Thus, a computer counts from zero to seven in bits as follows: 0, 1, 10, 11, 100, 101, 110, 111. Notice that the higher the count, the more adjacent bits are needed to represent the number: it takes two adjacent bits to count from 0 to 3, and three adjacent bits to count from 0 to 7. A sequence of bits can represent not just numbers but other kinds of data, such as the letters and symbols on a keyboard. The 0s and 1s that make up data are usually counted in groups of eight, and these groups of eight bits are called bytes. The word byte is a deliberate respelling of bite, altered so that it would not be accidentally misread as bit. Transmitting one keystroke on a typical keyboard requires one byte of information (8 bits); transmitting a three-letter word requires three bytes (24 bits).
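The binary counting and byte sizes described above can be illustrated with a short Python sketch (not part of the original entry; it simply uses the standard library's formatting helpers):

```python
# Counting from 0 to 7 in binary, as in the note's example.
for n in range(8):
    print(n, "->", format(n, "b"))

# Two adjacent bits suffice to count from 0 to 3,
# and three adjacent bits to count from 0 to 7.
assert max(len(format(n, "b")) for n in range(4)) == 2
assert max(len(format(n, "b")) for n in range(8)) == 3

# One keystroke such as "A" is one 8-bit byte;
# a three-letter word is three bytes (24 bits).
print(format(ord("A"), "08b"))          # the letter A as eight bits
assert len("cat".encode("ascii")) == 3  # three bytes for "cat"
```

Here `format(n, "b")` renders an integer in base two, and `encode("ascii")` shows how each keyboard character occupies one byte.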
the basic unit of information in computer storage and processing. A byte consists of 8 adjacent binary digits (bits), each of which consists of a 0 or 1. The string of bits making up a byte is processed as a unit by a computer; bytes are the smallest operable units of storage in computer technology. A byte can represent the equivalent of a single character, such as the letter B, a comma, or a percentage sign; or it can represent a number from 0 to 255. Because a byte contains so little information, the processing and storage capacities of computer hardware are usually given in kilobytes (1,024 bytes) or megabytes (1,048,576 bytes). Still larger capacities are expressed in gigabytes (about one billion bytes) and terabytes (one trillion bytes).
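The figures given in this definition (256 possible byte values, and the power-of-two sizes of the storage units) can be verified with a brief Python snippet, included here only as an illustration:

```python
# A byte of 8 bits can hold 2**8 = 256 distinct values, 0 through 255.
byte_values = 2 ** 8
assert byte_values == 256

# The storage units named in the entry, as powers of 1,024:
kilobyte = 2 ** 10   # 1,024 bytes
megabyte = 2 ** 20   # 1,048,576 bytes
gigabyte = 2 ** 30   # 1,073,741,824 bytes (about one billion)
print(kilobyte, megabyte, gigabyte)

# A single character, such as the letter B, fits in one byte:
assert "B".encode("ascii") == bytes([66])
```

Note that the "about one billion" and "one trillion" glosses in the entry are decimal approximations of these binary quantities.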