[ih] state of the internet probes? (was Re: AOL in perspective)

Karl Auerbach karl at iwl.com
Wed Sep 17 12:52:58 PDT 2025


Ah, automatic probing of the net... that got me into a bit of hot water.

From my experiences building and operating the Interop show networks (late 
1980s through 2000+) I realized that we very much needed an Internet 
equivalent of the "butt set" that telco repair folks carry.

So in the early 1990s I formed a company (Empirical Tools and 
Technologies [also known as Empirical Tools and Toys] - ETNT) to build 
such a thing.

And I built it.  It was called "Dr. Watson, The Network Detective's 
Assistant" (DWTNDA).  It ran on PC-DOS and could be up and running (on 
those early laptops) within a few seconds.  That mattered because, in my 
experience, network problems always require somebody to get into an 
uncomfortable place (such as a cold, damp wiring closet) where nobody 
wants to spend much time (not to mention the time pressure to resolve 
the problem).

In one of its modes of operation DWTNDA would flip in and out of 
promiscuous mode (many times a second), sampling local traffic (IP, 
DECnet, NetWare) as a way of capturing addresses and names to explore 
further.
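For flavor, here is a minimal modern sketch of that sampling idea in 
Python - assuming Linux, root privileges, and the scapy library; the 
interface name and window are illustrative, and DWTNDA itself of course 
ran on PC-DOS against its own packet drivers:

    # Briefly sniff in promiscuous mode, harvest the IP addresses seen,
    # then drop back out.  Scapy enables promiscuous mode for the
    # duration of the sniff() call.
    from scapy.all import sniff, IP

    def sample_addresses(iface="eth0", window=0.25):
        """Sample traffic for a short window and return the set of
        IP addresses observed, as candidates for further probing."""
        seen = set()

        def note(pkt):
            if IP in pkt:
                seen.add(pkt[IP].src)
                seen.add(pkt[IP].dst)

        sniff(iface=iface, prn=note, timeout=window, store=False)
        return seen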

Once it had an address (and was not in promiscuous mode) DWTNDA would 
begin to explore using a variety of methods such as ARP, ICMP ECHO, 
reverse DNS, and SNMP queries (from which it tried to obtain things 
like the remote machine's ARP, routing, and TCP connection tables - 
thus feeding more addresses into the list of "nodes I should look at").

Trouble was that the early versions of DWTNDA had no limits on the range 
or pace of exploration.  It was pretty amazing to watch it when it 
started up and began exploring: the display would quickly fill with 
nodes it had found and was probing.  It also used various heuristics, 
such as presuming that IP and NetWare probably had similar routing.
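For contrast, the kind of pace limit those early versions lacked is 
only a few lines today.  A token-bucket sketch, with illustrative 
numbers:

    import time

    class Pacer:
        """Allow at most `rate` probes per second, with a small burst
        allowance - the brake early DWTNDA did not have."""
        def __init__(self, rate=2.0, burst=5):
            self.rate, self.cap = rate, burst
            self.tokens = float(burst)
            self.last = time.monotonic()

        def wait(self):
            while True:
                now = time.monotonic()
                self.tokens = min(self.cap,
                                  self.tokens + (now - self.last) * self.rate)
                self.last = now
                if self.tokens >= 1.0:
                    self.tokens -= 1.0
                    return
                time.sleep((1.0 - self.tokens) / self.rate)

Calling wait() before each probe would have kept the exploration polite.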

The tool was quite remarkable - we usually had several running at all 
times on the Interop show network.  (I also got a couple of 
"Troubleshooting Product of the Year" awards.  And it was robust - it 
would, and did, run for years without stopping - a unique attribute back 
in the early 1990s.)

The lack of limits on DWTNDA's exploration first came to my attention 
when I came in one morning after leaving a unit running. There was a 
very angry phone message from Columbia University.  I thought, "I am in 
Santa Cruz, California - why is someone at Columbia calling and 
screaming at me?"  Well, it turned out that my test machine had caught 
an address or name at Columbia and had begun to probe it.  The target 
machine at Columbia turned out to be a big IBM mainframe with a buggy 
SNMP agent that would crash when probed.  Crashing mainframes tend to 
get attention - and this one got a lot of attention, because this early 
version of DWTNDA was relentless and kept probing.

(I kinda had to bite my tongue and not tell the folks at Columbia that 
they ought to have used a better SNMP agent engine - in particular, one 
that I had built and sold via my previous company, Epilogue Technology.  
Had they done so they would not have crashed.  But had I mentioned it, I 
am sure somebody from Columbia would have sent Guido From Baltimore to 
pay me a visit.)

It was from these events that I also began to realize that there is a 
mutual embrace between security and diagnosis/repair - and that we are 
going increasingly far out onto the security end of the seesaw without 
putting much onto the counterbalancing monitoring/diagnosis/repair end 
of the seesaw.  I foresee trouble coming to the net from this imbalance - 
see my note "Is The Internet At Risk From Too Much Security?" at 
https://www.cavebear.com/cavebear-blog/netsecurity/

Unfortunately DWTNDA died due to the machinations of an unscrupulous 
investor, as well as my not paying adequate attention to the financial 
ledgers and statements.  (It was that event that started me on the path 
of paying attention to financial data - particularly the raw ledgers and 
supporting documents (purchase orders, invoices, checks, etc.) - when I 
was on the boards of directors of various companies.)

I've had an on-again/off-again project to resurrect DWTNDA in a modern 
form as part of an overall Internet fault detection and isolation 
system.  The idea is similar to RIPE Atlas but far more elaborate, and 
it also incorporates some ideas that I have borrowed from my local UC 
Santa Cruz astrophysics friends.

One of the ideas that I want to explore is based on the wisdom that you 
can't be in two places at once and that you are rarely at the right 
place at the right time with the right tools. I was beginning to define 
a system of electronic work requests and work reports that test tools 
could exchange with one another, so that tools that were in the right 
place could run tests and then, at some later time, deliver the 
results.  (There is some similarity to how things like Apple's AirTags 
work.  And there are lots of big security issues.)
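Something like the following gives the flavor of the records such an 
exchange would need - a hypothetical sketch, not the actual design, with 
the signature field merely standing in for all of those security issues:

    from dataclasses import dataclass, field
    import uuid

    @dataclass
    class WorkRequest:
        test: str          # e.g. "icmp-echo", "reverse-dns"
        target: str        # address or name to be tested
        reply_to: str      # where the WorkReport should eventually go
        not_before: float = 0.0           # earliest run time (epoch)
        expires: float = float("inf")     # discard after this time
        request_id: str = field(default_factory=lambda: str(uuid.uuid4()))
        signature: bytes = b""            # the hard security part

    @dataclass
    class WorkReport:
        request_id: str    # ties the report back to its request
        ran_at: float      # when the test actually ran
        results: dict      # test-specific observations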

This idea extended rather further.  On the data-gathering side I wanted 
everything from vibration and sound sensor data from network devices 
(power supply noises are great sources of diagnostic data), to inference 
engines (I used Prolog in my prototypes), to an expert database of 
network pathologies (with paths leading from symptoms, to methods to 
differentiate among the possible causes, to the running of those 
methods, etc.).  One important aspect was the creation of baselines so 
one could notice deviations.  I suspect some modern AI tools could be 
helpful, much as they are at the new Vera Rubin Observatory, quickly 
flagging events of interest so that more focused telescopes can be 
re-targeted to capture hot-off-the-star data.
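The baseline part, at least, is easy to sketch: keep a running mean and 
variance per metric and flag samples that stray too far.  This uses 
Welford's algorithm; the 3-sigma threshold is illustrative, not from 
the original design:

    class Baseline:
        """Running mean/variance for one metric (Welford's method),
        used to flag deviations from the learned normal."""
        def __init__(self):
            self.n, self.mean, self.m2 = 0, 0.0, 0.0

        def update(self, x):
            self.n += 1
            delta = x - self.mean
            self.mean += delta / self.n
            self.m2 += delta * (x - self.mean)

        def is_deviant(self, x, sigmas=3.0):
            if self.n < 2:
                return False
            stddev = (self.m2 / (self.n - 1)) ** 0.5
            return abs(x - self.mean) > sigmas * stddev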

I've been working on this idea for at least 40 years - I keep getting 
interrupted and I doubt I'll survive long enough to finish it.  But 
somebody will do something like this, eventually, at which point 
Internet probing will become rather more formalized and intensive.  (But 
hopefully not as wild-west crazed as the web and AI data mining bots 
that plague so many of us who run network servers.)

It bothers me that the IETF does not have a formal effort to ensure that 
all Internet protocol designs are coupled to evaluations of failure 
modes, designation of test points, and tools to exercise those test 
points.  I really fear a Rube-Goldberg-ization of the Internet.

             --karl--



