[ih] Dotted decimal notation

John Levine johnl at iecc.com
Tue Dec 29 12:20:49 PST 2020


In article <386aba57-7d56-6725-9d35-d3e200d0cac7 at channelisles.net> you write:
>I spent a lot of time writing code in Macro-11 in the early 80s. I
>personally found octal FAR easier to deal with intuitively than the hex
>used by microprocessor code.
>
>I wonder what was better about it? (Apart from 'it goes up to 16')??

Octal was great on machines where the word size was a multiple of 3,
like the 36 bit 709x and PDP-6/10 or the 12 bit PDP-8, with 6 or 9 bit
characters. It's not so great on machines with 8 bit bytes or 16 bit
words, since you have to do masking and shifting in your head at byte
boundaries.
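A quick illustration of that point (my sketch, not from the original
post): a 16 bit value in hex splits into its two bytes digit-for-digit,
while in octal the 3-bit digits straddle the byte boundary, so the
bytes' own octal values don't appear anywhere in the word's octal form.

    # Hex digits (4 bits) line up with 8-bit bytes; octal digits (3 bits) don't.
    word = 0xABCD                      # a 16-bit value
    hi, lo = word >> 8, word & 0xFF    # its two bytes

    print(f"hex:   {word:04X} -> bytes {hi:02X} {lo:02X}")
    # hex:   ABCD -> bytes AB CD      (bytes are just the digit pairs)

    print(f"octal: {word:06o} -> bytes {hi:03o} {lo:03o}")
    # octal: 125715 -> bytes 253 315  (neither byte is visible in 125715)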

R's,
John


