[ih] byte order, was Octal vs Hex, not Re: Dotted decimal notation

John Levine johnl at iecc.com
Tue Dec 29 15:20:59 PST 2020


In article <5e41a3fe-07dd-daa5-8781-4010e59f4835 at 3kitty.org> you write:
>Any historian interested in the 70s era of bits and bytes in computers
>and networks should also read:
>
>https://www.ietf.org/rfc/ien/ien137.txt

[ the byte order argument ]

I have been trying for years to figure out where the little-endian
byte order came from. The first machine with addressable 8-bit bytes
was S/360, which was big-endian. There were a bunch of 360 clones and
semi-clones, also all big-endian.

The first little-endian design was the PDP-11. I have never been able
to find anything about why they used the opposite byte order. The DEC
Computer Engineering book says nothing about it, nor do any of the DEC
internal memos I've found at bitsavers. Getting a little-endian design
right was surprisingly hard; some of the 11's arithmetic options got
it wrong and produced middle-endian values, with the high-order 16-bit
word stored first but the low byte of each word stored first. The VAX
finally got consistent little-endian addressing right, as do the Intel
x86 and other subsequent chips.
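
To make the difference concrete, here is a minimal C sketch of how a
32-bit integer lands in memory under each convention (the value
0x0A0B0C0D is just an example I picked, and the arrays simply spell
out the layouts by hand):

    #include <stdio.h>
    #include <stdint.h>

    /* Byte layouts of the 32-bit value 0x0A0B0C0D under three orderings. */
    int main(void)
    {
        uint32_t v = 0x0A0B0C0DUL;

        /* Big-endian (S/360, network order): most significant byte first. */
        unsigned char big[4]    = { 0x0A, 0x0B, 0x0C, 0x0D };

        /* Little-endian (VAX, x86): least significant byte first. */
        unsigned char little[4] = { 0x0D, 0x0C, 0x0B, 0x0A };

        /* PDP-11 "middle-endian": high-order 16-bit word stored first,
           but the low byte within each word stored first. */
        unsigned char middle[4] = { 0x0B, 0x0A, 0x0D, 0x0C };

        printf("value:  0x%08lX\n", (unsigned long)v);
        printf("big:    %02X %02X %02X %02X\n", big[0], big[1], big[2], big[3]);
        printf("little: %02X %02X %02X %02X\n", little[0], little[1], little[2], little[3]);
        printf("middle: %02X %02X %02X %02X\n", middle[0], middle[1], middle[2], middle[3]);
        return 0;
    }

The middle line, 0B 0A 0D 0C, is the mixed layout described above.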

I've seen plenty of speculation about why they might have used a
different byte order from prior designs, but no actual facts.

R's,
John

>Thankfully that architecture died out so all we have now is binary, at
>least until someone discovers that computers don't necessarily even have
>to be binary....

Of course not, they could be decimal like the 650 and 1620.


