binary coded decimal

Science Dictionary
binary coded decimal  
Computer Science
A code in which a string of four binary digits represents each decimal digit 0 through 9, as a means of preventing calculation errors due to rounding and conversion. For example, since the binary equivalent of 3 is 0011 and the binary equivalent of 6 is 0110, 36 is represented as 0011 0110.
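The digit-by-digit encoding described above can be sketched in a few lines of Python (the function name `to_bcd` is illustrative, not part of either dictionary entry):

```python
def to_bcd(n: int) -> str:
    """Encode a non-negative integer as space-separated 4-bit BCD groups,
    one group per decimal digit."""
    return " ".join(format(int(digit), "04b") for digit in str(n))

print(to_bcd(36))  # 0011 0110
```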
The American Heritage® Science Dictionary
Copyright © 2002. Published by Houghton Mifflin. All rights reserved.
FOLDOC
Computing Dictionary

binary coded decimal definition

data
(BCD, packed decimal) A number representation where a number is expressed as a sequence of decimal digits and then each decimal digit is encoded as a four-bit binary number (a nibble). E.g. decimal 92 would be encoded as the eight-bit sequence 1001 0010.
In some cases, the right-most nibble contains the sign (positive or negative).
It is easier to convert decimal numbers to and from BCD than to and from pure binary, and though BCD is often converted to binary for arithmetic processing, it is also possible to build hardware that operates directly on BCD.
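Hardware that operates directly on BCD typically adds nibble by nibble and applies a "+6" decimal-adjust correction whenever a nibble sum exceeds 9 (skipping the unused codes 1010 through 1111). A minimal software sketch of that idea, with illustrative names not drawn from the entry:

```python
def bcd_add(a: int, b: int) -> int:
    """Add two packed-BCD values nibble by nibble, applying the
    standard +6 correction when a nibble sum exceeds 9."""
    result, shift, carry = 0, 0, 0
    while a or b or carry:
        s = (a & 0xF) + (b & 0xF) + carry
        if s > 9:
            s += 6            # decimal adjust: skip codes 1010-1111
        carry = s >> 4
        result |= (s & 0xF) << shift
        shift += 4
        a >>= 4
        b >>= 4
    return result

# 0x36 is packed BCD for 36, 0x47 for 47; 36 + 47 = 83
print(hex(bcd_add(0x36, 0x47)))  # 0x83
```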
[Do calculators use BCD?]
(2001-01-27)

The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
Copyright © 2014 Dictionary.com, LLC. All rights reserved.