Consumer Broadband Test Update
March 17th, 2010 by Jordan Usdan - Attorney-Advisor, Broadband Task Force

Thanks to the over 150,000 unique users who have taken more than 300,000 Consumer Broadband Tests, as well as the nearly 4,000 addresses submitted to the Broadband Dead Zone Report. The popularity of these consumer tools has exceeded our expectations.

We've made some text changes to the short "About" section found on a tab below the Consumer Broadband Test tool. Some users have been confused by the differences between the two testing platforms presented by the FCC – Ookla and M-Lab – and this section explains the variability.

Over the weekend, the FCC also updated both the Android and iPhone FCC apps to improve the user experience. The FCC app can be found by searching for "FCC" in either the Android or iPhone app store.
The FCC chose to use two testing applications for the Beta version of the Consumer Broadband Test. The two applications are among the most popular on the Internet, and the FCC hopes to make additional testing platforms available in the future. However, software-based broadband testing is not an exact science and contains inherent variability, as described in the About section. This is why the FCC will also be conducting a hardware-based scientific study of broadband quality across the country. See this recent blog post about this venture, and the RFQ here. The FCC will use the results of this hardware study for analytical purposes. The results of the software-based testing (see data below) are interesting and show broad trends, but the FCC is not relying on that data for analytical purposes.
Here are the differences users have experienced between the two testing platforms:
| Metric | M-Lab | Ookla |
| --- | --- | --- |
| Average download speed (Mbps) | 7.04 | 11.5 |
| Median download speed (Mbps) | 3.95 | 8.14 |
| Average upload speed (Mbps) | 2.74 | 2.09 |
| Median upload speed (Mbps) | 0.87 | 1.01 |
You will see that Ookla reports higher overall average and median download speeds than M-Lab. This is likely due to the different methodologies the two testing applications use. The difference stems from the fact that broadband speeds vary over time, even within a single second. Ookla measures peak performance and ignores short periods of slow speed, which it considers momentary bumps in performance, while M-Lab takes many rapid speed measurements and averages them all. For more detail, see the Ookla and M-Lab methodology sections. Additionally, Ookla and M-Lab each have testing servers geographically distributed across the country, and an individual's proximity to these servers can also affect results.
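To make the contrast concrete, here is a minimal sketch of the two aggregation styles described above. It is a deliberate simplification, not the actual Ookla or M-Lab algorithm: the sample values, the function names, and the 30% discard fraction are all hypothetical, chosen only to show why a "peak performance" average comes out higher than an average over every sample.

```python
def average_all_samples(samples_mbps):
    """M-Lab-style aggregation (simplified): average every
    per-interval throughput sample, including momentary dips."""
    return sum(samples_mbps) / len(samples_mbps)

def average_peak_samples(samples_mbps, discard_fraction=0.3):
    """Ookla-style aggregation (simplified): drop the slowest
    samples before averaging, so brief slowdowns are ignored.
    The 30% discard fraction is an assumption for illustration."""
    drop = int(len(samples_mbps) * discard_fraction)
    kept = sorted(samples_mbps)[drop:]
    return sum(kept) / len(kept)

# A hypothetical connection that mostly runs near 10 Mbps
# but has two brief slowdowns during the test:
samples = [10.0, 9.8, 2.0, 10.1, 9.9, 1.5, 10.0, 9.7, 10.2, 9.9]

print(round(average_all_samples(samples), 2))   # 8.31 – dips included
print(round(average_peak_samples(samples), 2))  # 9.99 – dips ignored
```

The same connection yields two different "speeds" depending only on how the samples are aggregated, which is the variability the About section describes.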
Although software-based testing cannot provide users with 100% reliable measures of broadband quality, the FCC makes these tools available because they provide comparative, real-time performance information and help the FCC collect broadband availability data.
Here are some interesting data and maps from the first six days of the Consumer Broadband Test. This data is derived from the results of both testing applications.
As you can see, 87% of test takers are home users, which is the FCC's target audience for this application. Additionally, clear trends are visible: differences across business sizes, higher-bandwidth connectivity for community institutions, and lower bandwidth for mobile connections. Again, these results are non-scientific extrapolations from the Beta version of the Consumer Broadband Test. Additionally, about 98% of user-submitted addresses are geocoding correctly, which is a very good rate.
Given that this is the Beta version, we want to hear from you about additional features we can add to this interface. We already have internal plans to roll out an updated version in the near future that provides users greater context about the meaning of their test results.