Ethernet started as a network that runs over coaxial cable. Over the years Ethernet has moved mainly to twisted-pair wiring and fiber optics, but questions about running Ethernet over coaxial cabling still come up every now and then. This is the first part of a multi-part post on running Ethernet over coaxial cable.
Old coaxial Ethernet
Experimental Ethernet started in 1972 as a 2.94 Mb/s network over 50 ohm coaxial cable. 10BASE5 (also known as thick Ethernet or thicknet) was the first commercially available variant of Ethernet; the technology was standardized in 1982 as IEEE 802.3. 10BASE5 uses a thick and stiff coaxial cable with a maximum segment length of 500 meters (1,600 ft).
10BASE2 (also known as cheapernet, thin Ethernet, thinnet, and thinwire) is a variant of Ethernet that uses thin coaxial cable terminated with BNC connectors to build a local area network. The 10 Mb/s "thinnet" 10BASE2 was released in 1985; it uses 50 ohm RG-58A/U or similar coaxial cable with a maximum segment length of 185 m.
During the mid to late 1980s 10BASE2 was the dominant 10 Mbit/s Ethernet standard. 10BASE2 coax segments have a maximum length of 185 metres (607 ft), and the maximum practical number of nodes on a segment is limited to 30. Each segment has to be terminated with a 50 ohm resistor at each end of the cable.
Due to the immense demand for high-speed networking, the low cost of Category 5 cable, and the popularity of 802.11 wireless networks, both 10BASE2 and 10BASE5 have become increasingly obsolete, though devices still exist in some locations. As of 2011, IEEE 802.3 has deprecated the 10BASE2 standard for new installations.
Using other coaxial cable types with 10BASE2
If you have some 50 ohm coaxial cable you want to use for Ethernet (it does not need to be exactly RG-8 or RG-58), 10 megabits is enough for you, and you have suitable old Ethernet hardware around, then you can try to run Ethernet over it.
Coaxial cables come in various impedances: typically 50, 75 and 93 ohms. 50 ohm coax is typically used for radio transmitters and for Ethernet. 75 ohm cable does a better job of maintaining signal strength and is primarily used for connecting receiving devices such as CATV receivers, high-definition TVs and digital recorders. Originally used in IBM mainframe networks in the 1970s and early 1980s, 93 ohm coax is a rare and expensive special cable.
ANSI/TIA-568-4.D specifies requirements for 75 ohm broadband coaxial cabling, cords and connecting hardware to support CATV, satellite television and other broadband applications. For the Cabling Subsystem 1 between the outlet and the first distribution point, the length limit is 46 meters for RG6 and 90 meters for RG11. For a Cabling Subsystem 2 between distribution points, the length limit remains at 46 meters for RG6 and increases to 100 meters for RG11. For each of these deployments, TIA-568-4.D specifies insertion loss limits over the frequency range of 5 to 1002 MHz.
What if you have 75 ohm TV antenna coax run instead? There was once an obsolete computer network standard 10BROAD36 that was designed to support 10 Mbit/s Ethernet signals over standard 75 ohm cable television (CATV) cable over a 3600-meter range. 10BROAD36 was less successful than its contemporaries because of the high equipment complexity (and cost) associated with it. It was forgotten and replaced later with cable modems.
What about using 10BASE2 hardware with 75 ohm coax? Of course, there will be an impedance mismatch between the 10BASE2 transceivers (50 ohms) and the TV coax (75 ohms). Some of the signal energy will be lost due to reflection. An impedance mismatch like that is generally a bad idea: it's not just the energy loss, but the reflections rattling all over your circuit can lead to interference. There are many stories of 10BASE2 networks being crippled to very slow speeds when someone accidentally installs a piece of 75 ohm cable into them.
Use of all-75-ohm cable may work in some cases, or it may not. Here is one report from https://groups.google.com/g/alt.cable-tv/c/3se4xxQ-z48:
“But the 10Base-2 is based on 50ohm RG58 thin coax cable. And CATV is supposed to be based on 75ohm RG59 thin coax cable. Go ahead, it’s going to work quite well. I have tried it at home on my 10base2 network and it did work. My cables are all RG58 (50 ohms) and someone asked the same question as you; I was curious and I spliced in 100 feet of RG59 (with all the adapters from “f” connectors to BNC) and not a single error. Obviously you would not want to go 185 meters with RG59, but between two rooms or inside a house, no problems.”
Yes, RG58 can be used, and yes, you can sometimes get away with RG59. From a technical point of view, the problem in using 75 ohm cable is an impedance matching issue, since Ethernet is designed for 50 ohms in many ways. The Ethernet terminators are (supposed to be) 50 ohms, so a transmitting card sees a 25 ohm DC load. When connected to the ends of a 75 ohm cable, instead of absorbing all the signal hitting them, they will reflect a significant part of it back. If you used regular 75 ohm TV terminators instead, the correct termination impedance would stop the signal reflections, but they would increase the DC load seen by the transmitter to 37.5 ohms. This higher-than-expected impedance can cause issues with the collision detection of the Ethernet network, which is based on transmitters sending a known current into the line and monitoring the voltage on the line (it gets higher when two stations are transmitting). When the cable system has a higher impedance than the right one, the voltage produced by a single transmitter gets closer to the “collision detect” limit than it should.
Article at https://groups.google.com/g/alt.cable-tv/c/3se4xxQ-z48 says:
“The terminators, by the way, must be 50 ohms at each end regardless of the impedance of the cable. Yes, this is an impedance mismatch to the cable, but it is required by the *DC* signalling requirements of 10-Base-2. If you’re using RG59 and so you thought you’d need 75-ohm terminators, it may limp along, but collision detection will no longer be reliable and so there will be even more retries by the upper layers… If your cable runs are reasonably short, the 75 ohm cable will probably work.”
With 50% more resistance than you’re supposed to have, the receivers will see 50% more voltage than they were designed for. Since collision detection is based on seeing more voltage than a single transmitter’s current would imply, you end up with a collision whenever you try to talk.
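The termination arithmetic above can be sketched in a few lines. This is a back-of-the-envelope illustration, assuming a nominal 41 mA average drive current (the middle of the 37-45 mA range quoted later in this post); the exact collision-detect threshold depends on the transceiver.

```python
# DC load seen by a 10BASE2 transmitter: the two end terminators in parallel.
def dc_load(terminator_ohms):
    """Two equal terminators at the cable ends act as parallel resistors."""
    return terminator_ohms / 2

I_DRIVE = 0.041  # A, assumed nominal average DC current of one transmitter

v_correct = I_DRIVE * dc_load(50)  # proper 50 ohm terminators -> 25 ohm load
v_wrong   = I_DRIVE * dc_load(75)  # 75 ohm TV terminators -> 37.5 ohm load

print(f"voltage with 50 ohm terminators: {v_correct:.2f} V")  # ~1.03 V
print(f"voltage with 75 ohm terminators: {v_wrong:.2f} V")    # ~1.54 V
print(f"increase: {100 * (v_wrong / v_correct - 1):.0f} %")   # 50 %
```

The 50% extra voltage from a single transmitter is what pushes the line toward the level the receivers interpret as two simultaneous transmitters, i.e. a collision.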
When we put 50 ohm terminators on 75 ohm cables, there will be some reflections, but the *DC* signalling mechanism in 10BASE2 requires 50 ohm terminators, i.e., a 25 ohm DC load. The reflections aren’t usually too much of a problem with short cable lengths and small numbers of taps (preferably just a point-to-point connection). A lot of folks in that thread have reported getting away with using RG59. If you can terminate both ends, you are sure there are no other devices connected to the cable, and the devices you are connecting at each end are supposed to be connected to each other, it could work. If you have old Ethernet hardware laying around, then feel free to test. If it works, leave it like that; if it does not, maybe forget the idea. I don’t think it would be worth the trouble trying to hunt down or buy old 10BASE5 hardware just for testing.
There are impedance matching transformers, but generally 50-to-75-ohm baluns won’t work with 10BASE2 because they typically don’t preserve the DC levels. Passive resistor-based impedance matching circuits, on the other hand, cause a lot of signal attenuation.
Yes, RG58 can be used, and yes, you can sometimes get away with RG59. I wouldn’t suggest it for a business, but for home use it could work (especially with simple point-to-point links). Even if it “works” with your RG59, look at your NIC and protocol error counters: collisions, bad packets received, retries, etc. You might be unpleasantly surprised. It may be “working” only because your upper-level protocols are doing far more retries than you would really like, and this can hurt throughput in a big way.
Tomi Engdahl says:
Effects of impedance matching between 50 and 75 Ohm coaxial cables for 10 Mbit/s, Manchester-coded signals (20 MHz)
This is quite a lot of text because I have included plenty of background info. However, there will finally be a short, precise question: Should I use an impedance matching network when connecting cables of different impedances, such as 50 Ω and 75 Ω? Possible answers will likely start with “It depends…”, and this is why I provide a ton of background info first.
We’re looking at a combination of the signals transmitted and received by the ethernet hub near the oscilloscope. Judging by the “clean” part, the transmitted signal has approx. 1.9 Vpkpk and the received signal 1.6 Vpkpk. If it’s safe to assume that both drivers have an output of the same amplitude, we can even calculate the loss introduced by the cable: 20×log10(1.6/1.9) dB ≈ −1.5 dB, i.e. about 1.5 dB of loss. Good enough, because the calculation for 15 m of typical coax with 6.6 dB/100 m yields 1 dB.
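The loss estimate above is simple enough to sketch as code; the amplitudes and the 6.6 dB/100 m figure are the ones quoted in the comment.

```python
import math

def loss_db(v_in_pkpk, v_out_pkpk):
    """Cable loss in dB from two peak-to-peak voltage readings."""
    return -20 * math.log10(v_out_pkpk / v_in_pkpk)

measured = loss_db(1.9, 1.6)   # from the oscilloscope readings, ~1.5 dB
expected = 6.6 * 15 / 100      # 6.6 dB/100 m over 15 m of cable, ~1.0 dB

print(f"measured: {measured:.1f} dB, expected from cable spec: {expected:.1f} dB")
```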
The noise is greatly reduced when a matching network is inserted at the near or far ends of the 75 Ω part of the coax. It looks like this (Credits to this source)…
.. there are still some reflections visible travelling back from the unmatched far end.
With the matching network at the far end, there must also be reflections along the comparatively short 50 Ω cable between the hub and the discontinuity labeled “near”, but as I’ve learned from a friend, the scope can’t “see” them, because they are absorbed by the driver. Also, a part of the signal from the “far” driver is reflected and travels back along the 75 Ω cable, and gets terminated into the matching network on the far end.
Compared to the unmatched setup, the amplitude of the signal from the far end is approximately halved (-6 dB), and this is in good agreement with the theory that predicts a loss of 5.6 dB over the network and the impedance it “looks” into.
Now, why not use two matching networks at “near” and “far”? Well, 10base2 is designed for a maximum length of 185 m of RG58, having a loss of 6.6 dB/100 m or 12.2 dB/185 m. Therefore, two of my resistive matching networks would already eat almost all the signal and bring me so close to the allowed limit that, including the cable, there is too much loss altogether. I am still in doubt that a low-loss, transformer-based solution would work because I think 10base2 (“cheapernet”) needs a DC path: “DC LEVEL: The DC component of the signal has to be between 37 mA and 45 mA. The tolerance here is tight since collisions are detected by monitoring the average DC level on the coax.” (Source: p.4; also backed up by this data sheet) Then again; the resistive matching network will also put any DC bias in trouble…
… the short question again: Should I use an impedance matching network when connecting cables of different impedance such as 50 Ω and 75 Ω?
Anything between “I prefer the unmatched/matched setup because I like this/that oscillogram better” to answers with plenty of background info on RF or the low-level hardware of 10base2 is greatly appreciated.
If you have access to the inside of the Coaxial Transceiver Interface (CTI), you can modify the circuit between the chip (8392 seems to be the type made by a large variety of manufacturers and also the type that’s used almost exclusively for pretty much any interface made by anyone for 10base2 adapters) and the BNC connector. A trade-off for cables with 75 Ω and 93 Ω is possible at the cost of allowed bus length. National Semiconductor made an Application Note on this topic, called AN-620 (pdf, Sept. 1992).
But even after finding this app’note, it would be great to find some background info about what’s inside an 8392, i.e. what one would have to use to build the interface using discrete parts and maybe some glue logic and opamps.
Tomi Engdahl says:
The reflection coefficient due to an impedance mismatch is: Γ = (R − Zo) / (R + Zo)
Where Zo is the impedance of the cable and R is the source or load resistance.
And for your 50/75 ohm setup it will be −0.2. So a signal you put down the cable of (say) 3 Vp-p will produce a reflection of 0.6 Vp-p. Is this too much? It’s not great, but it’s certainly not terrible.
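The arithmetic above, as a small sketch:

```python
def reflection_coefficient(z0, r):
    """Gamma = (R - Zo) / (R + Zo), with Zo the cable impedance and R the load."""
    return (r - z0) / (r + z0)

gamma = reflection_coefficient(75, 50)  # 50 ohm termination on 75 ohm cable
v_reflected = abs(gamma) * 3.0          # reflection from a 3 Vp-p signal

print(f"gamma = {gamma:.2f}")               # -0.20
print(f"reflected: {v_reflected:.1f} Vp-p") # 0.6 Vp-p
```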
Tomi Engdahl says:
Ethernet coaxial receiver-transmitter interface for driving 75 ohm coaxial cable
The utility model discloses an Ethernet coaxial transceiver interface for driving 75 ohm coaxial cables. A DC biasing voltage device is connected to the collision detection sense port (CDS pin) of a DP8392 (National Semiconductor chip) or DP8392-compatible chip (a chip with a CDS collision detection sense pin) in the coaxial transceiver interface circuit of a 10 Mb/s baseband thick-cable (10BASE5) or thin-cable (10BASE2) Ethernet conforming to the IEEE 802.3 standard. The utility model provides transmission of the Ethernet signal over 75 ohm coaxial cable, and can extend the transmission distance of the Ethernet to more than 2 km, from the 500 m and 185 m limits, without a repeater.