The Importance of SNR in Cable Testing

Signal to noise ratio

As Internet connection speeds have increased, so has the need to ensure that equipment is suited to these higher data rates. This applies especially to Ethernet cable, because this is where most problems occur: as speed increases, the signal becomes more susceptible to noise. For example, Cat 3 cable is designed for 10 Mbps and uses undemanding Manchester encoding; a Cat 5 cable carrying 100 Mbps Ethernet operates with 3 logic levels; and Gigabit Ethernet, with still more logic levels, requires superior signal-to-noise tolerances.

To achieve these high speeds, the signal-to-noise ratio (SNR) becomes important: a noisy cable degrades signal quality, causing errors and unreliable connections, which in turn leads to a downgraded service.

Impact of Signal-to-Noise Ratio on Service

The signal-to-noise ratio of a data cable is a measure of how strongly the cable is affected by noise. It is measured in decibels (dB), and the higher the SNR, the greater the cable’s immunity to noise. SNR matters because, when it is low (poor), noise from outside sources of electromagnetic interference (EMI), such as power-system transients and crosstalk, overwhelms the signal to the extent that error rates increase sharply and transmission speeds are severely downgraded. Other factors that affect SNR include cable damage, split pairs, poor connections, poor-quality cable, and cable length.
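The dB relationship described above can be sketched in a few lines. This is a minimal illustration of the standard power-ratio formula, not output from any particular tester; the example powers are made up:

```python
import math

def snr_db(signal_power_mw: float, noise_power_mw: float) -> float:
    """SNR in decibels from signal and noise power (any matching units)."""
    return 10 * math.log10(signal_power_mw / noise_power_mw)

# Illustrative values: a 1.0 mW signal against 0.02 mW of noise
# works out to roughly 17 dB, the threshold discussed below.
print(round(snr_db(1.0, 0.02), 1))  # 17.0
```

Because the scale is logarithmic, every 3 dB of SNR lost corresponds to the noise power doubling relative to the signal.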

Earlier cables such as Cat 3 were limited to 10 Mbps and had relatively high noise immunity, but as data transmission speeds increased, the debilitating effects of noise correspondingly increased. This is especially so for Gigabit Ethernet, which packs 5 logic levels into a 2 volt envelope, making the signal especially vulnerable to interference.

The total noise contributing to the SNR is the combination of transmission echoes, near-end crosstalk (NEXT), far-end crosstalk (FEXT), noise coupled from the adjacent twisted pairs, and alien crosstalk caused by coupling of signals from nearby cables.
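Because these contributions are expressed in dB but combine as powers, they cannot simply be added. A sketch of the usual approach, with hypothetical per-source levels chosen purely for illustration:

```python
import math

def combined_noise_dbm(sources_dbm):
    """Total noise from independent sources: convert each dBm value
    to linear power (mW), sum, and convert back to dBm."""
    total_mw = sum(10 ** (p / 10) for p in sources_dbm)
    return 10 * math.log10(total_mw)

# Hypothetical echo, NEXT, FEXT and alien-crosstalk levels (dBm):
print(round(combined_noise_dbm([-40, -43, -46, -50]), 1))  # -37.3
```

Note that the total is dominated by the strongest contributor: the three weaker sources together raise the -40 dBm term by under 3 dB.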

An SNR of less than 17 dB is considered extremely poor, with a strong likelihood of data errors. An SNR of around 17 dB may be acceptable depending on the application, while an SNR well above 17 dB is preferred.

Cable Testing Standards

Two standards are used to assess cable performance. The older is TIA 568, introduced in 1991 and subsequently updated. This standard sets acceptable parameters for cable capability, and testing against it is carried out over a wide range of frequencies: a process that is time consuming and expensive. A weakness of this approach is that the test only compares the cable to a set of standards; it does not give a pass or fail rating based on functionality, which is what most technicians need.

This was followed by the IEEE 802.3 standard for Ethernet networks. This standard takes a different approach, defining the requirements for Gigabit Ethernet and allowing technicians to decide what is acceptable.

Practical Testing Using a Digital Tester

As a result of the publication of these standards and in response to the need for technicians to have an easy-to-use portable cable tester, T3 Innovation developed a handheld tester, the Net Chaser, which is able to test cables to both TIA 568 connection specifications and the IEEE 802.3 standards at speeds up to 1 Gbps.

The Net Chaser can measure the signal-to-noise ratio as well as carry out bit error rate testing by sending data packets down a defined cable at user-selected data rates. Other tests include checking whether the cable has the required speed capability and identifying and locating cable faults.
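Bit error rate (BER) itself is a simple ratio: errored bits received divided by total bits sent. This sketch shows the arithmetic only, with made-up counts; it does not reflect any particular tester's output:

```python
def bit_error_rate(errored_bits: int, total_bits: int) -> float:
    """Fraction of transmitted bits received in error."""
    return errored_bits / total_bits

# Illustrative count: 3 errored bits out of one billion sent.
print(bit_error_rate(3, 1_000_000_000))  # 3e-09
```

At gigabit rates a billion bits pass in about a second, which is why a tester can accumulate statistically meaningful error counts quickly.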
