[ih] Octal vs Hex Re: Dotted decimal notation

Jack Haverty jack at 3kitty.org
Tue Dec 29 15:00:37 PST 2020


Any historian interested in the 70s era of bits and bytes in computers
and networks should also read:

https://www.ietf.org/rfc/ien/ien137.txt

which summarizes a lot of the chaos and warring factions that we
programmers had to deal with back then, when lots of different computers
had lots of different ideas about the "right" way to design such
things.   How to represent the data in a "notation" suitable for humans
was only part of the problem.

I remember once even playing around with "ternary" hardware in the late
60s -- where each "bit" could have one of three values: +1, 0, -1.  
Most people today probably think that computers always use binary [0,1],
but it was certainly possible to implement a computer using ternary logic. 
What I built used current flow to represent a value - no current=0;
current in one direction=+1; current in the other direction=-1.
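That +1/0/-1 encoding is what's now called "balanced ternary."  As an
illustration only (not a description of any particular historical
machine), here is a minimal Python sketch of how an integer can be
encoded into and decoded from balanced-ternary digits:

```python
def to_balanced_ternary(n):
    """Encode an integer as balanced-ternary digits (-1, 0, +1),
    least-significant digit first."""
    digits = []
    while n:
        r = n % 3
        n //= 3
        if r == 2:          # remainder 2 becomes digit -1, with a carry
            r = -1
            n += 1
        digits.append(r)
    return digits or [0]

def from_balanced_ternary(digits):
    """Decode least-significant-first balanced-ternary digits back
    to an integer."""
    return sum(d * 3**i for i, d in enumerate(digits))

# Example: 5 encodes as [-1, -1, 1], i.e. -1 - 3 + 9 = 5
print(to_balanced_ternary(5))
```

One appeal of balanced ternary is that negative numbers need no separate
sign convention: negating a value just flips every digit's sign.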

Thankfully that architecture died out, so all we have now is binary, at
least until someone discovers that computers don't necessarily even have
to be binary....

/Jack Haverty




More information about the Internet-history mailing list