[ih] Octal vs Hex Re: Dotted decimal notation

Bill Ricker bill.n1vux at gmail.com
Tue Dec 29 13:57:06 PST 2020


On Tue, Dec 29, 2020 at 3:52 PM Dave Crocker via Internet-history <
internet-history at elists.isoc.org> wrote:

> On 12/29/2020 12:20 PM, John Levine via Internet-history wrote:
> > It's not so great on 8 bit bytes or 16 bit machines since
> > you have to do masking and shifting in your head at byte boundaries.
>
> thereby, nicely distinguishing what makes for geek macho-hood.  back
> then, at least.


Indeed there was some geek-macho in working in hex _or_ in having a larger
wordsize.
MULTICS and the DEC PDP-10, with their 36-bit words, won on wordsize while
fitting octal better than hexadecimal :-D.
(I even remember being annoyed that DEC PDP-10 Fortran packed 5 x
7-bit-ASCII chars into a word, but the spare bit was NOT the sign bit,
which would have been easy to check as an out-of-band flag bit. COBOL
allowed both 6 x 6-bit and 5 x 7-bit, iirc.)
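
A quick sketch of that packing arithmetic (mine, not from the thread; it
assumes the spare bit lands in the low-order position, as I recall for the
PDP-10 seven-bit layout, and pack5 is just a name I made up):

    # Pack 5 seven-bit ASCII chars into a 36-bit word: 5 x 7 = 35 bits,
    # leaving one spare bit at the low-order end (not the sign/high bit).
    def pack5(s):
        assert len(s) == 5
        word = 0
        for ch in s:
            word = (word << 7) | (ord(ch) & 0x7F)
        return word << 1          # the leftover low-order bit stays 0

    w = pack5('HELLO')
    print(format(w, '012o'), w.bit_length(), '<= 36')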

Since the PDP-11 was from DEC, octal was the official binary representation
*despite* the 16-bit wordsize; the front-panel address/data load toggles
were tinted in alternating triples, the same as on their 18- and 36-bit
systems. (The VAX-11/780 dropped the front panel.)

The advantage of octal even with 8/16/32-bit words - such as on the 16-bit
Macro-11 mentioned by Nigel, and despite the variant (partial)
most-significant digit that appears when the word size isn't divisible by
3 - is that it uses only a subset of the decimal digits familiar from
grade school.
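
For a concrete look at that partial top digit (my example, not from the
thread; the value is arbitrary):

    # A 16-bit word needs 6 octal digits, but the top digit can only be 0 or 1,
    # because 16 isn't a multiple of 3; hex lines up cleanly at 4 bits per digit.
    w = 0xDEAD
    print(format(w, '06o'))   # 157255  <- leading '1' is the partial octal digit
    print(format(w, '04X'))   # DEAD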

While we do learn to do clock arithmetic in our heads in base 12 and base
24 - both in school and in real life, for timezone conversions and for
figuring durations that span Noon or Midnight, or Military 24:00 <=> Civil
12:00 AM/PM times - we do not do it with single-glyph digits invented to
represent the hours 10 through 23. We do it punctuated with colons,
HH:MM:SS, BCD style, much as with the dotted-decimal notation of the OP;
so this base-12/base-24 practice is only partially helpful in shifting
mentally into hexadecimal arithmetic.
(Also, the use of 12:30 AM rather than 00:30 makes it not quite proper
arithmetic!)
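
To make the analogy concrete (a sketch of mine; the function names and the
particular time and address are arbitrary):

    # Both HH:MM:SS and dotted-decimal IPv4 write one value as punctuated
    # groups, each group in plain decimal, rather than as one digit string.
    def hms_to_seconds(h, m, s):
        return (h * 60 + m) * 60 + s          # mixed radix 24:60:60

    def dotted_quad_to_int(quad):
        a, b, c, d = (int(x) for x in quad.split('.'))
        return ((a * 256 + b) * 256 + c) * 256 + d   # base 256 per octet

    print(hms_to_seconds(13, 57, 6))         # 50226
    print(dotted_quad_to_int('10.0.0.51'))   # 167772211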

Hexadecimal made sense for me when doing S/370 assembler in the late
1970s, when I taught myself to count in binary on my fingers.

4 fingers (not incl. thumbs) per hand = a nibble; 8 fingers = a byte.
If *o* is the knuckle of a retracted finger and *&* is an extended finger
(don't get ahead of me here ...):
oooo = 0
o&oo = 4
o&&& = 7
&oo& = 9
&o&o = A,
&&oo = C,
&&&& = F .
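
A tiny sketch in code (mine; it just reproduces the table above, most
significant finger first):

    # Render a nibble as fingers: '&' = extended, 'o' = retracted.
    def nibble_to_fingers(n):
        return ''.join('&' if n & (1 << i) else 'o' for i in range(3, -1, -1))

    for v in (0x0, 0x4, 0x7, 0x9, 0xA, 0xC, 0xF):
        print(nibble_to_fingers(v), '=', format(v, 'X'))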

It helps to know your powers of two, and some of their smaller multiples of 3
(3, 6, xC=12., x18=24., x30=48., x60=96., xC0=192.),
some of which we recognize from partially filled memory and intermediate
RSA key sizes (the next are x180=384., x300=768.)!
One can shift left and right on the fingers to learn these!
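
One way to see those doublings in both bases (a throwaway sketch, nothing
more):

    # 3 shifted left repeatedly, shown as hex = decimal,
    # reproducing the list above: 3, 6, C=12, 18=24, ... 300=768.
    v = 3
    for _ in range(9):
        print(format(v, 'X'), '=', v)
        v <<= 1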

I taught this trick to my offspring in grade school, and they found that
pretty much all Math teachers would let her "flip the bird" at obnoxious
classmates if she counted 1 2 3 4 simultaneously out loud and on fingers in
binary, with emphasis on FOUR as if playing golf (FORE!), because it
returned the discordant discourse to Maths!
Their imagination wasn't limited to 8-bit words or bytes, so they included
thumbs (so let's make *-* a thumb that's down);
thence a double-fisted pair of birds was 2^7+2^2 = 132 = oo&o- -o&oo,
"ONE THIRTY TWO, for those that deserve the very best".
(And I'm not going to tell them that when we twist our wrists for display,
each hand's 5-bit bitfield reverses bit order without the hands swapping,
to -o&oo oo&o-, *which works for 132 but not in general*!)

And if I'm working with small *uint*s rather than *uchar*s, I do likewise
with thumbs: 10 bits is enough for most of my favorite integers. Except
1792, alas, that's more than 1024.
