[ih] Recently restored and a small ARPANET was run using simulated IMP hardware. (was: TTL [was Exterior Gateway Protocol])

Bill Ricker bill.n1vux at gmail.com
Wed Sep 9 15:21:24 PDT 2020


> On 9/8/20 2:49 AM, vinton cerf wrote:
> > i also wonder whether dense crowds of wifi users creates a big
> > desensing risk?
>

With both WiFi and cellular technology, the transmitters are low power
because they are designed for a dense network of nearby receivers, even
in comparison to early cell phones (which could run 3 W into the antenna at
full power, which is a bit much for microwaves that close to the eyeballs,
and is why I had an antenna on the roof of the truck).
On Wed, Sep 9, 2020 at 5:41 PM Jack Haverty wrote:

> Radio technology has gotten a lot better over the last 40 years; my
> experience is that desensing isn't as much of an issue now.


Absolutely. Technologies like direct-sequence spread spectrum (DSSS) allow
(a) reliable synchronous reception of low-power signals, and
(b) sharing of frequencies among users.
This permits the lower transmit power that minimizes desense distance, and
with synchronous decoding, a weak or weakened signal is still decodable.
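The processing gain behind that claim can be sketched in a few lines. This is a toy model only, not a real PN code, radio channel, or synchronization scheme: each data bit is multiplied by a pseudo-random +/-1 chip sequence, buried in noise stronger than any single chip, then recovered by correlating against the same sequence.

```python
import random

random.seed(1)

CHIPS_PER_BIT = 64
# Illustrative pseudo-random spreading code of +/-1 chips (not a real PN code)
code = [random.choice((-1, 1)) for _ in range(CHIPS_PER_BIT)]

def spread(bits):
    """Map each data bit to +/-1 and multiply it by the chip sequence."""
    out = []
    for b in bits:
        s = 1 if b else -1
        out.extend(s * c for c in code)
    return out

def despread(samples):
    """Correlate each chip-length block against the code; the sign gives the bit."""
    bits = []
    for i in range(0, len(samples), CHIPS_PER_BIT):
        block = samples[i:i + CHIPS_PER_BIT]
        corr = sum(s * c for s, c in zip(block, code))
        bits.append(1 if corr > 0 else 0)
    return bits

bits = [1, 0, 1, 1, 0, 0, 1, 0]
tx = spread(bits)
# Weak received signal: each chip at amplitude 0.1, noise stronger than any chip
rx = [0.1 * s + random.gauss(0, 0.15) for s in tx]
print(despread(rx) == bits)
```

Summing 64 chips coherently grows the signal 64-fold while the noise grows only as the square root of 64, which is the "synchronous decode" advantage described above.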

> Also, signal strength decreases rapidly with distance,


Well Actually™
The *power* (delivered per unit area) decreases rapidly with distance
(as R^-2), but the detectable signal is *voltage* (per unit length), which
decreases only linearly with distance (as R^-1), so one has to multiply the
distance by 100 to reduce the received signal voltage to 1/100.
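The arithmetic is worth making explicit (a minimal sketch of free-space falloff only, ignoring antenna gain and real propagation effects):

```python
# Free-space falloff: received power goes as 1/R^2, field strength (voltage) as 1/R.
def power_ratio(r1, r2):
    """Received power at distance r2 relative to distance r1."""
    return (r1 / r2) ** 2

def voltage_ratio(r1, r2):
    """Received voltage at distance r2 relative to distance r1."""
    return r1 / r2

# Multiplying the distance by 100 cuts the voltage to 1/100 ...
print(voltage_ratio(1, 100))   # prints 0.01
# ... but cuts the delivered power by a factor of 10 000 (R^-2).
print(power_ratio(1, 100))
```

So a receiver limited by detectable voltage degrades far more slowly with distance than the inverse-square power law alone would suggest.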

(Optimizing for power vs. voltage is why TV receive antennae are 75 ohms,
TV/radio station transmitter antennae are 33 ohms, and two-way radios'
antennae, Ethernet coax (thick and thin), and test equipment are all 50 ohms.)

> so socially-distanced
> cellphone users shouldn't have a problem, except perhaps if they try to
> use a phone on each ear.  /j
>

A truly large crowd *will* saturate the network for cellular phone voice
calling, but AFAIK it's the network capacity of the cell nodes, not desense
in the handsets' receivers.

(If the cell phone network is working *properly*, any desense at the handset
should cause the partnered *node* to increase its power, NOT the handset;
the handset increases power only if the *node* is receiving poorly, and
backs its power down for user safety once it's plenty "loud" at the node.
This is side-channel signal-strength feedback.)
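That feedback loop can be sketched as a toy model (illustrative only; the target level, step size, and power limits are made-up values, not any real cellular standard's): each side adjusts its *own* transmit power based on the *other* side's reported receive level.

```python
TARGET_DBM = -90.0   # assumed target receive level at the peer (made up)
STEP_DB = 1.0        # assumed per-update power step (made up)

def adjust_tx_power(current_tx_db, reported_rx_dbm,
                    max_tx_db=24.0, min_tx_db=-10.0):
    """Raise our power if the peer hears us too weakly, lower it if too loudly."""
    if reported_rx_dbm < TARGET_DBM:
        current_tx_db += STEP_DB      # peer's receiver is struggling: step up
    elif reported_rx_dbm > TARGET_DBM:
        current_tx_db -= STEP_DB      # plenty "loud" at the peer: back off
    return max(min_tx_db, min(max_tx_db, current_tx_db))

# Handset is desensed and reports a weak signal -> the NODE steps up:
node_tx = adjust_tx_power(10.0, -95.0)      # -> 11.0 dB
# Node hears the handset plenty "loud" -> the HANDSET steps down:
handset_tx = adjust_tx_power(5.0, -80.0)    # -> 4.0 dB
print(node_tx, handset_tx)
```

The asymmetry in the paragraph above falls out naturally: desense at the handset shows up as a weak *reported* level at the handset, so it is the node's transmitter, not the handset's, that ramps up.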

(Even on 2001-09-11, outside of lower Manhattan, when the cell voice
network rapidly saturated as everyone called everyone else NxN repeatedly,
the SMS/text facilities were initially fine and took hours to saturate. I
did get home before my message saying I was finally getting on the train,
though.)
Desense does still happen, though, in this marvellous 21st Century ...

I have observed desense in a **cheap** Bluetooth speaker (Bluetooth being
DSSS or a related spread-spectrum scheme). I can carry it while listening
to a podcast player on the desktop into the kitchen, but acceptable, safe
levels of microwave leakage from the oven (I checked with an antique
RadioShack™ analog kitchen microwave safety tester!) will desense the BT
speaker unless I shield it with a metal muffin pan.

73 de Bill n1vux



More information about the Internet-history mailing list