For most people, internet performance is a mystery. Many subscribers do not even know the level of bandwidth they are paying for, let alone how to test if they are actually receiving the sorts of speeds their ISP advertises. Canadian regulators have often been in the dark as well, which is a problem when their decisions are supposed to take the availability and geographic distribution of broadband into account.
Regulators have traditionally depended on information provided by industry as a basis for policy decisions, but this information can be inaccurate or incomplete. There are ample cases in the US and Canada where certain regions have been listed as having access to a certain level of broadband, or choice of ISPs, whereas the reality on the ground has been far less than what is supposedly available. This problem is not unknown to regulators. Network BC, working with Industry Canada and the CRTC, launched its broadband mapping initiative in 2014. This included consultations with the various ISPs spread across the province to determine what services were actually available in what locations, resulting in an interactive connectivity map. Industry Canada watched the efforts in BC closely, and is currently soliciting information from ISPs to carry out a national broadband availability mapping project. However, such efforts do not include any independent means of actually measuring internet performance in these areas.
Up until now, the go-to place for Canadian internet performance surveys that utilize a third-party service (one that doesn't rely on ISPs for information) has been averages of Ookla's speedtest.net (see here and here), which is the same service typically used by individuals to see how their internet connections measure up. But the results are not really meant to be a basis for policy decisions, since the averages are not pulled from a representative sample, and the (mean) speeds are often higher than what is available to a "typical" internet subscriber.
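To see why a mean can mislead here, consider a quick sketch with invented numbers: speed-test users skew toward enthusiasts on fast connections, so a handful of very fast results pulls the mean well above what the typical subscriber experiences. The median is far less sensitive to those outliers. (The figures below are purely illustrative, not real survey data.)

```python
# Hypothetical download-speed samples (Mbps) from a self-selected
# speed-test population: mostly modest connections, plus a few very
# fast ones that drag the mean upward.
from statistics import mean, median

speeds_mbps = [5, 6, 6, 8, 10, 12, 15, 25, 150, 300]

print(f"mean:   {mean(speeds_mbps):.1f} Mbps")   # pulled up by the outliers
print(f"median: {median(speeds_mbps):.1f} Mbps")  # closer to a "typical" subscriber
```

In this toy sample the mean is 53.7 Mbps while the median is 11.0 Mbps, which is why self-selected averages are a shaky basis for policy.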
The big news in recent weeks has been the entry of new players in the internet metrics game. First, CIRA kicked off its own broadband mapping effort, which anyone can participate in and provide information to (an appropriate browser/OS combo may be required to participate). The map is very much a work-in-progress, which will fill out as individuals add more data points, and as new features and methods are added. Not long after, the CRTC announced its own internet measuring initiative. This is new territory for the CRTC, which has never had much of an ability to independently investigate or collect data about the telecom industry it regulates. However, the plan has been in the works since at least 2013, and may be based on the FCC’s Measuring Broadband America project, which has been underway since 2011. As in the US (along with Europe, Brazil, Singapore, and other nations), the CRTC’s program depends on the use of the SamKnows “whiteboxes” deployed at participating locations (the CRTC is currently looking for volunteers to receive and set up the devices). These devices measure connectivity between the subscriber’s premises and major connection points between ISPs.
There are a number of concerns (see here and here) with the CRTC's efforts. ISPs could try to "game" the metrics to make their network's performance appear better (ISPs know which of their subscribers have the boxes, since they use this information to make sure the testing doesn't contribute to a subscriber's data cap). SamKnows might only measure internet performance in off-peak hours, when connectivity is less likely to be a problem, since the boxes are intended to operate when subscribers aren't making full use of their bandwidth (on another page, the CRTC has gone even further, saying the information will be gathered "when users are not connected"). Not all ISPs are participating in the program, raising the concern that smaller players and rural areas that are most disadvantaged in terms of connectivity are being left out. This last point relates to the importance of having a representative sample, which is a fundamental precondition for any survey that attempts to calculate meaningful (or generalizable) statistics. All of the above can be addressed with a properly designed methodology, full transparency of these methods, and careful qualification of the results. Here, the CRTC has plenty of international examples to draw from, and SamKnows has built its business around such openness, but we will have to wait for more details to weigh in on whether this particular partnership has done a good job.
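The off-peak concern is easy to illustrate. If measurements are taken only overnight, when the network is lightly loaded, the reported figures can look very different from what subscribers see during the evening rush. The sketch below groups hypothetical measurements by hour of day and compares medians; the numbers and the 7pm-11pm definition of "peak" are assumptions for illustration, not drawn from any real program.

```python
# Illustrative only: (hour_of_day, measured_mbps) samples, invented to
# show how off-peak-only sampling can overstate typical performance.
from statistics import median

samples = [(3, 24.8), (4, 25.0), (10, 24.5), (14, 23.9),
           (19, 14.2), (20, 12.5), (21, 11.8), (22, 13.0)]

PEAK_HOURS = range(19, 23)  # 7pm-11pm, one common definition of peak usage

peak = [mbps for hour, mbps in samples if hour in PEAK_HOURS]
off_peak = [mbps for hour, mbps in samples if hour not in PEAK_HOURS]

print(f"peak median:     {median(peak):.2f} Mbps")
print(f"off-peak median: {median(off_peak):.2f} Mbps")
```

A methodology that only reported the off-peak figure would paint a rosier picture than what subscribers actually experience in the evening, which is why sampling across the full day (and disclosing when samples were taken) matters.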
Finally, it is important to realize that no test can ever truly gauge the speed of “the internet” from a given location. Typically, the best that can be achieved is a measurement from a subscriber’s home to a “major internet gateway”, where an ISP connects to the rest of the world. The ISP has no control over how fast the rest of the world’s internet is, and limited control over the performance of services that aren’t hosted on its network. Even the fastest gigabit networks are no faster than their connections to services “upstream,” like Netflix – a problem the FCC had to contend with as it tried to measure the performance of ISPs that were engaged in peering disputes that limited their connections to the streaming service.
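At bottom, what any of these tests can measure is simple: transfer a known payload and divide the bytes moved by the time it took. The sketch below shows that core calculation; the download helper and the placeholder URL are illustrative assumptions, not how SamKnows or the CRTC actually implement their tests (which use controlled servers near major gateways, not arbitrary hosts).

```python
# A minimal sketch of the throughput calculation at the heart of any
# speed test: bytes transferred over elapsed time, expressed in Mbps.
import time
import urllib.request


def throughput_mbps(total_bytes: int, elapsed_seconds: float) -> float:
    """Convert a byte count and duration into megabits per second."""
    return (total_bytes * 8) / (elapsed_seconds * 1_000_000)


def measure_download_mbps(url: str, chunk_size: int = 64 * 1024) -> float:
    """Time a full download of `url` and return the observed throughput.

    Note: this measures the path from this machine to whatever server
    hosts `url` -- not the speed of "the internet" as a whole.
    """
    total_bytes = 0
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        while chunk := response.read(chunk_size):
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    return throughput_mbps(total_bytes, elapsed)


# Example usage (placeholder URL -- substitute a real test file):
# print(f"{measure_download_mbps('https://example.com/testfile.bin'):.1f} Mbps")
```

Even this toy version makes the article's point concrete: the result depends entirely on which server you measure against, so a test against a congested upstream service will report a very different number than one against a server inside the ISP's own network.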
Ultimately, all of this indicates a broader trend towards data gathering to erase some of the mystery about how the internet actually “performs”. For individuals, these are welcome steps towards becoming better informed about what one’s ISP actually provides, but also about what goes into determining internet speed or performance in the first place. For regulators, accurate and comprehensive information is a precondition for effective public policy, and it’s great to see Industry Canada and the CRTC taking steps to refine the picture they have of Canadian connectivity as they come to decide important questions about the future of Canada’s internet.