bits and bytes
bits and bytes

Yesterday someone asked me: "How was it determined that 1 byte would actually
(usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

Mon, 21 Apr 2003 08:30:22 GMT
bits and bytes

Quote:

> Yesterday someone ask me: "How was it determined that 1 byte would actually
> (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

Yup.

Mon, 21 Apr 2003 08:40:37 GMT
bits and bytes

Quote:

> Yesterday someone ask me: "How was it determined that 1 byte would actually
> (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

Because 8 is the smallest integral power of 2 that can hold all the
characters of C's character set.

(Okay, it's a wild guess.  Does anyone have a better idea?)

Gregory Pietsch

Mon, 21 Apr 2003 08:42:17 GMT
bits and bytes

Quote:
> Yesterday someone ask me: "How was it determined that 1 byte would actually
> (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

It isn't 8 bits, as far as the C language is concerned.  The number of bits in
a char/byte in the C language is CHAR_BIT bits, as defined in limits.h.
There are definitely systems on which it is 9 or more bits (up to at least 32
on a popular DSP).

As far as C is concerned, it cannot be 7 bits, since that would not allow
a conforming implementation of the character set.  Hence, it is _at least_
8 bits, but possibly more.
--
C-FAQ: http://www.eskimo.com/~scs/C-faq/top.html
"The C-FAQ Book" ISBN 0-201-84519-9
C.A.P. FAQ: ftp://cap.connx.com/pub/Chess%20Analysis%20Project%20FAQ.htm

Mon, 21 Apr 2003 08:42:47 GMT
bits and bytes
Charles Martin wrote in the message ...

Quote:
>Yesterday someone ask me: "How was it determined that 1 byte would actually
>(usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

How is this a C question?

<slightly ot>
It depends on the target. A byte (aka char in C) is the smallest amount of
addressable memory. From the C language point of view, a byte must have at
least 8 bits.

Some targets have bytes of 8, 9, 16, 32, 36 bits, whatever...

Note: An 8-bit unit of data is called an octet.
</ot>

--
-hs-    Tabs out, spaces in.
CLC-FAQ: http://www.eskimo.com/~scs/C-faq/top.html
ISO-C Library: http://www.dinkum.com/htm_cl
FAQ de FCLC : http://www.isty-info.uvsq.fr/~rumeau/fclc

Mon, 21 Apr 2003 08:48:02 GMT
bits and bytes

Quote:
>Yesterday someone ask me: "How was it determined that 1 byte would actually
>(usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

Early computers were word addressed (an address was a word address) and the
words were divided into bytes.  The sizes of these bytes were typically 6,
7 or 9 bits (typical word sizes were 12, 18 and 36 bits).  Then IBM came
out with the byte-addressed 360, which had 8-bit bytes.  A few years later,
DEC came out with the byte-addressed PDP-11, which also had 8-bit bytes.
Some three or four years after that, Intel came out with the 8080, an
8-bit microprocessor.  Each of these systems was the most popular in its
class (mainframe, mini and micro) at the time, so the rest of the
industry followed the trend.

By the time C was standardised, 8-bit bytes were the de facto industry
standard, so the ANSI committee that did the job decided to require
that a C byte have *at least* 8 bits, although the minimum number of
bits required by the C character set is 7.

Dan
--
Dan Pop
CERN, IT Division

Mail:  CERN - IT, Bat. 31 1-014, CH-1211 Geneve 23, Switzerland

Mon, 21 Apr 2003 10:58:50 GMT
bits and bytes

Quote:
>Yesterday someone ask me: "How was it determined that 1 byte would actually
>(usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

It wasn't determined. A byte could be 9 bits or 7 bits.  In the C language, a
byte is at least 8 bits wide; the standard specifies that a byte is the
smallest individually addressable unit of storage, and that it is at least
eight bits wide.

You are confusing octets and bytes. An octet is a unit of information
consisting of 8 ordered bits. It is a common unit of data transfer in
networking. Since C bytes are 8 *or more* bits wide, the type unsigned
char can always store the binary value of an octet.


Mon, 21 Apr 2003 11:20:12 GMT
bits and bytes

Quote:

>> Yesterday someone ask me: "How was it determined that 1 byte would actually
>> (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

>Because 8 is the smallest integral power of 2 that can hold all the
>characters of C's character set.

>(Okay, it's a wild guess.  Does anyone have a better idea?)

Yes. The two nibbles of an 8 bit byte can each store a BCD digit.
Also, an 8 bit byte can represent an ASCII character plus a parity bit.

This is a good topic for alt.folklore.computers; I would guess that if
that newsgroup has a FAQ, this would probably be in it.


Mon, 21 Apr 2003 11:22:20 GMT
bits and bytes
On Wed, 1 Nov 2000 16:42:47 -0800, "Dann Corbit"

Quote:

> > Yesterday someone ask me: "How was it determined that 1 byte would actually
> > (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

> It isn't 8 bits, as far as the C language is concerned.  The number of bits in
> a char/byte in the C language is CHAR_BIT bits, as defined in limits.h.
> There are definitely systems on which it is 9 or more bits (up to at least 32
> on a popular DSP).

Actually on at least two DSPs that I know of, TI and AD.

Quote:

> As far as C is concerned, it cannot be 7 bits, since that would not allow the
> implementation of conforming characters.  Hence, it is _at least_ 8 bits, but
> possibly more.

Jack Klein
--
Home: http://jackklein.home.att.net

Mon, 21 Apr 2003 11:42:25 GMT
bits and bytes

Quote:
>On Wed, 1 Nov 2000 16:42:47 -0800, "Dann Corbit"

>> > Yesterday someone ask me: "How was it determined that 1 byte would actually
>> > (usually) be equal to 8 bits?  Why not 9 bits or 7 bits?"  I didn't know the

>> It isn't 8 bits, as far as the C language is concerned.  The number of bits in
>> a char/byte in the C language is CHAR_BIT bits, as defined in limits.h.
>> There are definitely systems on which it is 9 or more bits (up to at least 32
>> on a popular DSP).

>Actually on at least two DSPs that I know of, TI and AD.

I am intrigued.

My fallible understanding of history was that, in the beginning, there
was the bit (binary digit) and there was the word.

The word size was architecture dependent, and common values included
4, 8, 12, 16, 24 and 32 bits.

However, the 8-bit unit proved so useful (ASCII = 7 bits + parity,
two hex digits, etc.) that it became a widespread common denominator,
irrespective of machine word size. So widespread that it needed a
name.

I didn't think the term 'byte' existed in the C standard at all - we
simply have a coincidental correlation between the most common
representations of a character and what we independently think of as a
byte. Character sizes can obviously be bigger than 8 bits (Unicode,
for instance), but why redefine the byte?  The word was meant to be
redefined, but the byte is a kind of consensus-based standard that is
too useful to arbitrarily break.

--
Steve Horne

Mon, 21 Apr 2003 20:56:44 GMT
bits and bytes
Quote:

> >On Wed, 1 Nov 2000 16:42:47 -0800, "Dann Corbit"

<snip "how many bits per byte?">

Quote:

> >> It isn't 8 bits, as far as the C language is concerned.  The number of bits in
> >> a char/byte in the C language is CHAR_BIT bits, as defined in limits.h.
> >> There are definitely systems on which it is 9 or more bits (up to at least 32
> >> on a popular DSP).

> >Actually on at least two DSPs that I know of, TI and AD.

> I am intrigued.

<snip>

> I didn't think the term 'byte' existed in the C standard at all

I hope you will allow me to show you that the term 'byte' not only
*exists* in the C Standard, but also is in fact *defined* (for the
purposes of C programmers) in that Standard. The definition that follows
is from the n869 draft of ISO/IEC 9899:1999, and as far as I know it's
identical to the wording in the current Standard (but I can check at the
weekend if need be):

3.4
[#1] byte
addressable  unit  of  data storage large enough to hold any
member  of  the  basic  character  set  of   the   execution
environment

And, a little later, we have:

5.2.4.2  Numerical limits

[#1] A conforming  implementation  shall  document  all  the
limits  specified  in this subclause, which are specified in
the headers <limits.h> and <stdint.h>.

5.2.4.2.1  Sizes of integer types <limits.h>

[#1]  The  values  given below shall be replaced by constant
expressions  suitable   for   use   in   #if   preprocessing
directives.   Moreover,  except for CHAR_BIT and MB_LEN_MAX,
the following shall be replaced by expressions that have the
same  type  as  would an expression that is an object of the
corresponding  type  converted  according  to  the   integer
promotions.   Their  implementation-defined  values shall be
equal or greater in  magnitude  (absolute  value)  to  those
shown, with the same sign.

-- number  of  bits for smallest object that is not a bit-
field (byte)
CHAR_BIT                         8

Thus, the C Standard defines a byte to be at least 8 bits in size (note
the wording - the value of CHAR_BIT is implementation-defined, and must
have a magnitude of at least 8 (and a positive sign!)).

Quote:
> - we
> simply have a coincidental correlation between the most common
> representations of a character and what we independently think of as a
> byte. Character sizes can obviously be bigger than 8 bits (unicode,
> for instance) but why redefine the byte - the word was meant to be
> redefined, but the byte is a kind of concensus-based standard that is
> too useful to arbitrarily break.

<shrug> If you want to know /why/ the C Standard "redefines" the byte,
you may care to check with the guys in comp.std.c, since several ISO C
committee guys subscribe to that group. They rarely poke their noses
into comp.lang.c, because they think we're too loud. :-)

--
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html

Mon, 21 Apr 2003 21:11:29 GMT
bits and bytes
On Thu, 02 Nov 2000 13:11:29 +0000, Richard Heathfield

Quote:

>> I didn't think the term 'byte' existed in the C standard at all

>I hope you will allow me to show you that the term 'byte' not only
>*exists* in the C Standard, but also is in fact *defined* (for the
>purposes of C programmers) in that Standard. The definition that follows

OK - I wasn't aware of that.

Quote:
><shrug> If you want to know /why/ the C Standard "redefines" the byte,
>you may care to check with the guys in comp.std.c, since several ISO C
>committee guys subscribe to that group. They rarely poke their noses
>into comp.lang.c, because they think we're too loud. :-)

Here I was wrong, though - see the 7 and 9 bit bytes that others have
mentioned. The standards guys presumably just set a constraint on
something that might otherwise have been variable.

Oh - that's the English meaning of the word variable, not a variable as
defined in the C standard  ;-)

--
Steve Horne

Mon, 21 Apr 2003 22:27:51 GMT
bits and bytes

<snip>

Quote:
> The standards guys presumably just set a constraint on
> something that might otherwise have been variable.

> Oh - thats the English meaning of the word variable, not a variable as
> defined in the C standard  ;-)

Er... the C Standard doesn't define the term "variable". :-)

--
Richard Heathfield
"Usenet is a strange place." - Dennis M Ritchie, 29 July 1999.
C FAQ: http://www.eskimo.com/~scs/C-faq/top.html

Mon, 21 Apr 2003 22:48:11 GMT
bits and bytes

Quote:
>Er... the C Standard doesn't define the term "variable". :-)

Not in the text, no -- but it does use the term, and thus presumably
falls back on the definition in one of its references, probably the
first of these two:

ISO/IEC   2382-1:1993,   Information   technology    -
Vocabulary - Part 1: Fundamental terms.

ISO/IEC TR 10176, Information technology -  Guidelines
for the preparation of programming language standards.

(I do not have either one so I cannot check.)
--

Mon, 21 Apr 2003 22:51:15 GMT
bits and bytes

Quote:

>I am intrigued.

>My fallible understanding of history was that, in the beginning, there
>was the bit (binary digit) and there was the word.

>The word size was architecture dependent, and common values included
>4, 8, 12, 16, 24 and 32 bits.

And 7, 9 and other values.  Powers of two only became popular when
people _stopped_ doing parity properly (gd&r).

Quote:
>However, the 8-bit unit prooved so useful (ASCII = 7 bits + parity,

You do realise that your PC memory is probably NINE bits wide, don't
you?  8 plus parity...

Quote:
>two hex digits, etc) that it became a widespread common denominator,
>irrespective of machine word size. So widespread that it needed a
>name.

The byte existed before the "standardization" onto 8 bits.  Don't let
mainframe people hear you call 8 bits a standard, by the way.

Quote:
>I didn't think the term 'byte' existed in the C standard at all - we

As Richard pointed out... it does.

--
Mark McIntyre
C- FAQ: http://www.eskimo.com/~scs/C-faq/top.html

Tue, 22 Apr 2003 07:02:46 GMT
