[ih] Dotted decimal notation
John Day
jeanjour at comcast.net
Tue Dec 29 13:56:59 PST 2020
+1
> On Dec 29, 2020, at 15:20, John Levine via Internet-history <internet-history at elists.isoc.org> wrote:
>
> In article <386aba57-7d56-6725-9d35-d3e200d0cac7 at channelisles.net> you write:
>> I spent a lot of time writing code in Macro-11 in the early 80s. I
>> personally found octal FAR easier to deal with intuitively than the hex
>> used by microprocessor code.
>>
>> I wonder what was better about it? (Apart from 'it goes up to 16')??
>
> Octal was great on machines where the word size was a multiple of 3,
> like the 36-bit 709x and PDP-6/10 or the 12-bit PDP-8, with 6- or 9-bit
> characters. It's not so great on 8-bit bytes or 16-bit machines, since
> you have to do masking and shifting in your head at byte boundaries.
>
> R's,
> John
> --
> Internet-history mailing list
> Internet-history at elists.isoc.org
> https://elists.isoc.org/mailman/listinfo/internet-history
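
A quick sketch in C of the byte-boundary point quoted above (illustrative only; the value 0xABCD is an arbitrary choice, not from the thread): every hex digit covers exactly 4 bits, so each byte of a 16-bit word is simply two of the word's hex digits, while 3-bit octal digits straddle the byte boundary, which is what forces the masking and shifting in your head.

    #include <stdio.h>
    #include <stdint.h>

    int main(void)
    {
        uint16_t word = 0xABCD;        /* arbitrary 16-bit example value */
        uint8_t  hi   = word >> 8;     /* high byte: 0xAB                */
        uint8_t  lo   = word & 0xFF;   /* low byte:  0xCD                */

        /* Hex: the word's digits are just the two bytes' digits side by side. */
        printf("hex:   word=%04X  hi=%02X  lo=%02X\n", word, hi, lo);

        /* Octal: 0xABCD prints as 0125715, but the bytes print as 0253 and
           0315 -- the bytes cannot be read off the word's octal digits
           without masking and shifting. */
        printf("octal: word=%06o  hi=%03o  lo=%03o\n", word, hi, lo);
        return 0;
    }

On the 36-bit and 12-bit machines mentioned above, with 6- or 9-bit characters, every character boundary falls on an octal-digit boundary, so this problem never arises.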