
Why are bytes treated like the base unit?

Problem Detail: 

If bits are the base unit of information, why are bytes treated like the base unit?

For example, usually values are expressed in Mega/Giga/Tera/Exa bytes instead of bits. I am aware that bits are sometimes used (e.g. sometimes for internet speed), but generally to me it seems like bytes are used as if they are the base unit instead of bits.

Asked By : AAM

Answered By : Yuval Filmus

CPUs operate on machine words and on bytes, never on single bits. Older CPUs (say, the 8086/8088 and 80286) had 16-bit machine words, which could be split into two 8-bit halves; machine code could operate on each of these bytes separately. Memory could also be read in byte-sized chunks, though in practice it was usually read a machine word at a time (the 8088, however, apparently read its memory byte by byte), and nowadays it is read in even larger cache lines.
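To make the word/byte split concrete, here is a small sketch (the register names AH/AL are the 8086's names for the two halves of its 16-bit AX register) showing how a 16-bit word decomposes into two separately addressable bytes:

```python
# Split a 16-bit machine word into its two 8-bit halves, mirroring
# how the 8086's 16-bit AX register was addressable as the high
# byte (AH) and the low byte (AL).
word = 0x1234               # a 16-bit value

high = (word >> 8) & 0xFF   # high byte, like AH -> 0x12
low = word & 0xFF           # low byte, like AL  -> 0x34

# Recombining the two bytes gives back the original word.
assert (high << 8) | low == word
print(hex(high), hex(low))  # -> 0x12 0x34
```

Note that the smallest unit this arithmetic ever isolates is a byte; extracting a single bit would take an extra mask-and-shift step, which is exactly why bytes, not bits, are the natural unit of addressing.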

Another reason to consider bytes is that files are stored as sequences of bytes (nowadays, actually in multiples of a larger unit, the disk sector). There are two reasons for this: one is that CPUs are byte-oriented, and the other is that encodings such as ASCII and EBCDIC use the byte as their atomic unit of data (that is, each byte stores one character). This legacy is still with us: if you read binary data from a file, it comes in chunks whose length is measured in bytes (even if in practice the data is read into a much larger buffer), and the length of a file is measured in bytes.
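The point about file I/O can be demonstrated directly; this minimal sketch (using a temporary file rather than any particular path) shows that both a file's contents and its length are reported in bytes:

```python
import os
import tempfile

# Write three ASCII characters (one byte each), then read them back.
# Every I/O API involved counts in bytes, never in bits.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"ABC")
    path = f.name

with open(path, "rb") as f:
    data = f.read()        # a bytes object: a sequence of byte values

print(len(data))               # -> 3, length counted in bytes
print(os.path.getsize(path))   # -> 3, file size also in bytes
os.remove(path)
```

There is simply no standard interface here for asking "how many bits did I read?"; the answer is always derived by multiplying the byte count by eight.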

Best Answer from StackExchange




