If you’re a library network admin, you are probably familiar with complaints concerning slow download speeds on your local network. “The Internet is crawling today! What’s wrong?”
At OPLIN, this is certainly familiar to us, too. After all, library network administrators often end up calling us about exactly this kind of issue. Some of them have even used an online tool to test the speed of their network, to have some numerical backup when they call. Unfortunately, this kind of “data” (yes, the quotes are there for a reason, I’m getting there) is extremely unreliable and often misleading. Here’s why:
- The main problem with online speed testers is that there is no test control. Yes, these tools are testing something, but there are so many variables that it’s usually impossible to know what exactly is being tested. Karl Jendretzky, OPLIN’s Technology Project Manager, explains it this way: “You’re testing the maximum speed that a third-party application tries to measure throughput with, between your machine inside your network and an unknown device on the commodity internet, with several uncontrollable networks between you and it, with an unknown amount of bandwidth available.” That’s an awful lot of unknowns, which is why the results are almost never meaningful.
- Even if the tests worked (which they basically don’t), many people misinterpret the results. For example, if a T1 is currently at 50% utilization, a speed test that somehow worked perfectly would tell you that you have a 750Kbps connection. It’s not uncommon for people to read that incorrectly and call us saying, “I’m supposed to have a 1.5Mbps connection!” They believe the test is telling them how much bandwidth they have total, when it’s actually telling them how much of their connection is currently free; the short sketch after this list walks through the numbers.
- Testing itself affects the results. These tools attempt to consume whatever free bandwidth is on your WAN connection at the time. If you’re already low on bandwidth, running a test makes the network appear even worse.
- Sometimes, the online tools are just plain wrong. In one instance, Karl installed a Flash-based testing tool on an OPLIN server. In tests with one library’s network, results varied from 11Mbps to 75Mbps. The problem? The library only had a 50Mbps connection to begin with, so getting to 75Mbps was impossible. Yikes.
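To see the arithmetic behind that T1 example, here’s a tiny Python sketch using the same numbers from above (nothing here is measured; the figures are just the example’s):

```python
# The T1 example as arithmetic: a "perfect" speed test reports the
# *free* bandwidth on the line, not the total capacity.
T1_CAPACITY_KBPS = 1544   # a T1 carries roughly 1.544 Mbps
utilization = 0.50        # half the line is already in use

available_kbps = T1_CAPACITY_KBPS * (1 - utilization)
print(f"Total capacity:   {T1_CAPACITY_KBPS} Kbps")
print(f"Currently in use: {T1_CAPACITY_KBPS * utilization:.0f} Kbps")
print(f"Speed test shows: ~{available_kbps:.0f} Kbps -- free, not total")
```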
So, what should you do instead? Karl recommends two strategies:
- Monitor the edge of your network with an SNMP monitor or a packet sniffer, so you always know your current utilization (there’s a rough sketch of the SNMP approach after this list).
- Use iperf (a free tool) to verify your local throughput at the edge of your network. Karl suggests starting by installing iperf on a couple of machines and running a test between them across just a switch. This gives you a control case: once you know how much the two machines can push on their own, you can put various devices or network segments in the middle and verify throughput through each one (see the iperf sketch below). Without first knowing what works, you can’t spot what doesn’t.
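To give a flavor of the SNMP approach, here’s a rough sketch that polls an interface’s inbound octet counter twice and turns the delta into a utilization figure. It assumes the standard net-snmp `snmpget` tool is installed and your edge router answers SNMPv2c queries; the hostname, community string, interface index, and link speed below are all placeholders. In practice you’d let a monitoring package (MRTG, Cacti, and the like) do this polling continuously rather than running a one-off script.

```python
# Rough sketch: inbound utilization from SNMP interface counters.
# All of the constants below are placeholders for your own network.
import subprocess
import time

HOST = "router.example.org"              # placeholder: your edge router
COMMUNITY = "public"                     # placeholder: your SNMP community
IF_IN_OCTETS = "1.3.6.1.2.1.2.2.1.10.1"  # IF-MIB ifInOctets, ifIndex 1
LINK_BPS = 1_544_000                     # link capacity in bits/sec (a T1)
INTERVAL = 60                            # seconds between the two samples

def get_counter(oid: str) -> int:
    """Read one counter with net-snmp's snmpget (-Oqv prints only the value)."""
    out = subprocess.check_output(
        ["snmpget", "-v2c", "-c", COMMUNITY, "-Oqv", HOST, oid]
    )
    return int(out)

first = get_counter(IF_IN_OCTETS)
time.sleep(INTERVAL)
second = get_counter(IF_IN_OCTETS)

# Counter delta * 8 = bits received during the interval.
# (This ignores 32-bit counter wrap, which a real monitor must handle.)
bits = (second - first) * 8
print(f"Inbound utilization over {INTERVAL}s: {bits / (INTERVAL * LINK_BPS):.1%}")
```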
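And here’s what the iperf control case might look like, sketched with iperf3 (the current version of the tool). Start `iperf3 -s` on one machine, connect the second machine to it through just a switch, and run something like this from the second machine; the server address below is a placeholder. Once the back-to-back number is consistent, move the machines to opposite sides of the device or segment you want to test and run it again.

```python
# Minimal sketch of an iperf3 control-case test. Assumes iperf3 is
# installed here and running as a server ("iperf3 -s") on the far machine.
import json
import subprocess

SERVER = "192.0.2.10"  # placeholder: address of the iperf3 server

# -c: client mode, -t 10: run for ten seconds, -J: emit a JSON report
raw = subprocess.check_output(["iperf3", "-c", SERVER, "-t", "10", "-J"])
report = json.loads(raw)

# For a TCP test, end.sum_received is the throughput the server actually saw.
bps = report["end"]["sum_received"]["bits_per_second"]
print(f"Measured throughput: {bps / 1e6:.1f} Mbps")
```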
What does this mean to me, Laura?
Online testing of network speeds doesn’t work and doesn’t provide usable information. Your urge to confirm your suspicions about network slowdowns is understandable; just be sure to use real tools to get your data.