[ih] byte order, was Octal vs Hex, not Re: Dotted decimal notation
Jack Haverty
jack at 3kitty.org
Tue Dec 29 16:18:22 PST 2020
Back then, there were no established designs for computers, nor any
strong sense of The Right Way. Or perhaps there were too many such
strong senses, each attached to a different design.
You'd probably have to find a PDP-11 designer to get some insight into
why that little-endian decision was made. In my own work, I remember
that sometimes design decisions were made for very pragmatic reasons.
For example, the pinouts of logic ICs and the complexity of routing
traces on a PC board could have motivated a design choice. While I was
working as a student at MIT Draper Labs on Apollo-related equipment, I
remember design decisions made specifically to minimize the number of
"pins" on a board connector. Experience had shown that computer
failures were strongly correlated with the number of pins and their
associated connectors, which tended to corrode. So minimizing pinouts
was the driving design parameter, especially for equipment that ended
up in space.
Perhaps the PDP-11 designers made a similarly pragmatic decision when
they settled on little-endian byte order.
I wrote the first TCP for PDP-11 Unix. I remember that after getting
it working, I made a pass through the code to remove a bunch of SWAB
instructions (SWAp Bytes, which exchanges the two bytes of a 16-bit
memory location), because the bytes in IP headers always seemed to be
in the wrong place. As I was debugging, I had SWABed wherever needed
to get it to work. It was a common experience for PDP-11 programmers.
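
For anyone who has not fought this battle: on a little-endian machine,
every 16-bit field in an IP or TCP header arrives off the wire in the
"wrong" byte order and has to be swapped before use. A minimal C
sketch of what each SWAB accomplished (the name swab16 and the sample
value are illustrative, not from the original code):

    #include <stdint.h>
    #include <stdio.h>

    /* Exchange the two bytes of a 16-bit word, as the PDP-11 SWAB
       instruction did.  On a little-endian host this converts a
       big-endian ("network order") header field to host order. */
    static uint16_t swab16(uint16_t w)
    {
        return (uint16_t)((w << 8) | (w >> 8));
    }

    int main(void)
    {
        /* The wire bytes 0x00 0x14 (a header-length field of 20),
           read as one 16-bit word on a little-endian host, appear
           as 0x1400 until swapped back into place. */
        uint16_t wire = 0x1400;
        printf("raw 0x%04x -> swapped 0x%04x\n", wire, swab16(wire));
        return 0;
    }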
/Jack
On 12/29/20 3:20 PM, John Levine wrote:
> In article <5e41a3fe-07dd-daa5-8781-4010e59f4835 at 3kitty.org> you write:
>> Any historian interested in the 70s era of bits and bytes in computers
>> and networks should also read:
>>
>> https://www.ietf.org/rfc/ien/ien137.txt
> [ the byte order argument ]
>
> I have been trying for years to figure out where the little-endian
> byte order came from. The first machine with addressable 8-bit bytes
> was S/360, which was big-endian. There were a bunch of 360 clones and
> semi-clones, also all big-endian.
>
> The first little-endian design was the PDP-11. I have never been able
> to find anything about why they used the opposite byte order. The DEC
> Computer Engineering book says nothing about it, nor do any of the DEC
> internal memos I've found at bitsavers. Getting a little-endian design
> right was surprisingly hard; some of the 11's arithmetic options got
> it wrong and had middle-endian values with the high-order 16-bit word
> stored first, but the low byte in each word stored first. The VAX
> finally got consistent little-endian addressing right, as do the Intel
> x86 and other subsequent chips.
>
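
To make those three layouts concrete, here is a minimal C sketch; the
example value 0x0A0B0C0D is illustrative, the comment lists how each
convention stores it, and the program prints what the host it is
compiled on actually does:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* How the 32-bit value 0x0A0B0C0D is laid out in memory,
           lowest address first, under each convention:
           big-endian    (S/360):           0A 0B 0C 0D
           little-endian (VAX, x86):        0D 0C 0B 0A
           middle-endian (PDP-11 options):  0B 0A 0D 0C
           (high-order 16-bit word stored first, but the low byte
           stored first within each word) */
        uint32_t v = 0x0A0B0C0Du;
        const uint8_t *p = (const uint8_t *)&v;
        printf("this host stores it as: %02X %02X %02X %02X\n",
               p[0], p[1], p[2], p[3]);
        return 0;
    }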
> I've seen plenty of speculation about why they might have used a
> different byte order from prior designs, but no actual facts.
>
> R's,
> John
>
>> Thankfully that architecture died out so all we have now is binary, at
>> least until someone discovers that computers don't necessarily even have
>> to be binary....
> Of course not, they could be decimal like the 650 and 1620.