[ih] Internet analyses (Was Re: IPv8...)

Karl Auerbach karl at iwl.com
Fri Apr 24 00:15:42 PDT 2026


On 4/23/26 4:00 PM, Jack Haverty via Internet-history wrote:
> At all sorts of trade shows we usually showed the capabilities of the 
> "Interchange".   We also bought booths several times at Interop to 
> show what we offered.  Interop itself at one point had become a 
> multiprotocol network, probably at the Interop after it merged with 
> Networld so TCP was no longer the only protocol in use on the show 
> floor; Netware and SPX/IPX were now important too.  I don't recall 
> what others were active, maybe Karl A does.... 

The Interop show network was multi-protocol pretty much from the outset.

By 1988/89 we were running TCP/IP with DIX Ethernet underneath, DECnet, 
Netware, and ISO/OSI via CLNP (I think).  I remember that we had a 
mountain of routers of all brands - Cisco, Wellfleet, 3Com, Proteon - 
even some Doug Karl boxes, I suspect.  We even had RS-232 pathways 
(usually asynchronous), often used to attach to what we called "Milking 
Machines" - mostly Livingston Portmasters - which gave us access to the 
console ports of many devices out on the show floor.

We ran IP multicast almost from the start.  I remember one show - 
probably around 1997 - where we had a fairly extensive VoIP network.  I 
remember calling into a Federal government meeting and mentioning that I 
was coming in via the Internet - which surprised a lot of folks.  I also 
had jury-rigged an IBM ThinkPad laptop with an early Wi-Fi adapter, 
external battery, some microphones, a camera, and a hard hat - I walked 
around the show floor interviewing people.  (And because the camera was 
mounted on the hard hat my wife could see exactly who I was looking at.)

Carl Malamud used our multicast/MBONE links to run RTFM radio.

We had a /8 network block that we hauled around the world - often with 
only weeks between shows.  That, plus the fact that we were almost 
always dual-homed to different external providers, meant that we got 
lots of experience putting out fires like route flapping, and trying to 
convince some NOC somewhere to remove a block that had been put in 
place while we were setting things up and our routes were bouncing up 
and down like a yo-yo.

We also had all kinds of lower layers - thicknet (yellow hose) Ethernet, 
thinnet (thin coax) Ethernet, ATM, FDDI.  Twisted pair for Ethernet came 
in via SynOptics and then 10BASE-T (largely from David Systems).  Most 
of us used DIX Ethernet framing, but some companies insisted on using 
the full 802.?? SNAP and other framings - and thus made themselves 
non-interoperable with the rest of the show network.  We also had radio 
and laser links to reach the remote hotels and sites.  I also know 
that we ran some fiber through active railroad tunnels (without telling 
the railroad).

We pretty much tried anything and everything we could get our hands on.

We very early on learned the dark side of twisted pair wiring - first, 
that there is a difference between solid and stranded wire and that one 
must use the proper connectors for each (we, of course, learned this 
the hard way).  Second, AT&T in its "logic" had decided that each pair 
would be one solid-color wire and one white wire - the idea being that 
this would discourage people from untwisting the twisted pairs very 
much.  It didn't work - we had to untwist the wires in order to attach 
the connectors, and that stupid non-color coding led to a lot of 
failures.

One aspect of the Interop show networks that few saw was our "spy" 
network - which was a parallel network that allowed us in the operations 
area to drop into any segment of the show network to monitor traffic or 
even to inject traffic.  For this we used things like passive fiber 
optic taps and boxes with movable/steerable mirrors driven by 
piezoelectric crystals.  We had programs to figure out how to arrange 
the mirrors to create a channel out to the target segment.

Another aspect was our pre-show warehouse where we built and configured 
the entire core network (without the vendor booths).  We then dismantled 
it and loaded it onto trucks (for one Atlanta show our network gear 
filled 43 large trucks).  There is a definite choreography needed to get 
that kind of stuff into a convention center; protect it from the trucks, 
Teamsters, and Electricians; and then drop it down (or in some cases, 
pull it up) into the various vendor booths.

We also developed some useful test tools - I built the Internet's first 
"butt set" (Dr. Watson, The Network Detective's Assistant) to help get 
up and running, doing tests and diagnostics within a few seconds after 
arrival.  That tool, useful as it was, died with the change from PC-DOS 
to Windows and also due to financial shenanigans by a practitioner of 
Oracle's Attila-the-Hun style of "business" methods.

We had to do this under rather stressful conditions (including mountains 
of steaming compost that were moved in during a major snow storm).  Some 
convention centers were freezing and I know we exceeded 105°F in the Las 
Vegas convention center before they closed the truck doors and turned on 
the air conditioning.  We consumed a lot of alcohol - and charged it all 
to Dan Lynch's credit card.  (We also partied pretty hard - we rented 
nice places, like the Howard Hughes penthouses in Las Vegas and the Air 
and Space museum in DC.  We also did things like white water rafting.)  
And other things happened too - several marriages resulted from working 
on the show network.

Another aspect of the Interop nets was that they served as a metronome 
driving product releases - each show was a tick of the metronome, and 
companies had to have their products ready for the shows.  In addition, 
during the earlier decades of the show net the authors of many an RFC 
were on hand to explain and refine.

>
> At Interop, several people I had known for a long time tried to 
> convince me that Oracle should offer the Interchange architecture as a 
> general-purpose scheme for the networking world of that time.

I was one who took a somewhat deeper look at IBM's SNA LU 6.2 stuff - 
which I called "yenta networking" because the connections were arranged 
to pass through necessary proxies and translators. I am seeing a bit of 
that coming back in the land of web-based services.

Oracle was not a friendly company, and even if their ideas had technical 
merit, I doubt they could have overcome the reluctance of many of us to 
have anything to do with Oracle.

         --karl--