Measuring Canada’s Internet

For most people, internet performance is a mystery. Many subscribers do not even know the level of bandwidth they are paying for, let alone how to test if they are actually receiving the sorts of speeds their ISP advertises. Canadian regulators have often been in the dark as well, which is a problem when their decisions are supposed to take the availability and geographic distribution of broadband into account.

Regulators have traditionally depended on information provided by industry as a basis for policy decisions, but this information can be inaccurate or incomplete. There are ample cases in the US and Canada where certain regions have been listed as having access to a certain level of broadband, or a choice of ISPs, whereas the reality on the ground has fallen far short of what is supposedly available. This problem is not unknown to regulators. Network BC, working with Industry Canada and the CRTC, launched its broadband mapping initiative in 2014. This included consultations with the various ISPs spread across the province to determine what services were actually available in what locations, resulting in an interactive connectivity map. Industry Canada watched the efforts in BC closely, and is currently soliciting information from ISPs to carry out a national broadband availability mapping project. However, such efforts do not include any independent means of actually measuring internet performance in these areas.

Up until now, the go-to place for Canadian internet performance surveys that utilize a third-party service (one that doesn’t rely on ISPs for information) has been averaged results from Ookla’s speedtest.net (see here and here), which is the same service typically used by individuals to see how their internet connections measure up. But the results are not really meant to be a basis for policy decisions, since the averages are not pulled from a representative sample, and the (mean) speeds are often higher than what is available to a “typical” internet subscriber.
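To see why a simple average can mislead, consider a toy example: in a self-selected sample, a handful of very fast connections will pull the mean well above what the typical subscriber experiences. The numbers below are invented purely for illustration and are not drawn from any real survey.

```python
# Toy illustration (invented numbers, not real survey data) of how a skewed
# sample makes the mean overstate the "typical" subscriber's speed.
import statistics

# Hypothetical download speeds in Mbps from ten self-selected test runs
speeds_mbps = [5, 6, 6, 7, 8, 10, 12, 15, 150, 300]

print("mean:  ", statistics.mean(speeds_mbps))    # 51.9 Mbps
print("median:", statistics.median(speeds_mbps))  # 9.0 Mbps
```

Here two gigabit-class outliers push the mean to roughly 52 Mbps even though most of the sampled connections sit under 15 Mbps.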

The big news in recent weeks has been the entry of new players in the internet metrics game. First, CIRA kicked off its own broadband mapping effort, which anyone can participate in and provide information to (an appropriate browser/OS combo may be required to participate). The map is very much a work-in-progress, which will fill out as individuals add more data points, and as new features and methods are added. Not long after, the CRTC announced its own internet measuring initiative. This is new territory for the CRTC, which has never had much of an ability to independently investigate or collect data about the telecom industry it regulates. However, the plan has been in the works since at least 2013, and may be based on the FCC’s Measuring Broadband America project, which has been underway since 2011. As in the US (along with Europe, Brazil, Singapore, and other nations), the CRTC’s program depends on the use of the SamKnows “whiteboxes” deployed at participating locations (the CRTC is currently looking for volunteers to receive and set up the devices). These devices measure connectivity between the subscriber’s premises and major connection points between ISPs.

There are a number of concerns (see here and here) with the CRTC’s efforts. ISPs could try to “game” the metrics to make their network’s performance appear better (ISPs know which of their subscribers have the boxes, since they use this information to make sure the testing doesn’t contribute to a subscriber’s data cap). SamKnows might only measure internet performance in off-peak hours, when connectivity is less likely to be a problem, since the boxes are intended to operate when subscribers aren’t making full use of their bandwidth (on another page, the CRTC has gone even further, saying the information will be gathered “when users are not connected”). Not all ISPs are participating in the program, raising the concern that smaller players and rural areas that are most disadvantaged in terms of connectivity are being left out. This last point relates to the importance of having a representative sample, which is a fundamental precondition for any survey that attempts to calculate meaningful (or generalizable) statistics. All of the above can be addressed with a properly designed methodology, full transparency of these methods, and careful qualification of the results. Here, the CRTC has plenty of international examples to draw from, and SamKnows has built its business around such openness, but we will have to wait for more details to weigh in on whether this particular partnership has done a good job.
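To make the off-peak concern concrete, here is a rough sketch of how a probe that only tests on an “idle” line might behave. The threshold, the stubbed functions, and the numbers are assumptions made for illustration, not SamKnows’ actual implementation.

```python
# A rough sketch (not SamKnows' actual logic) of a probe that only tests
# when the household line looks idle. If the idle check keeps failing during
# busy evening hours, recorded results naturally skew toward off-peak times.
import random

IDLE_THRESHOLD_KBPS = 64      # assumed cutoff for "line is idle"; illustrative only

def cross_traffic_kbps() -> float:
    """Placeholder: a real probe would read router or interface counters here."""
    return random.choice([10.0, 2000.0])   # sometimes idle, sometimes busy

def measure_throughput_mbps() -> float:
    """Placeholder for a timed download against the ISP's test server."""
    return 48.7

for window in range(3):        # pretend the probe gets three scheduled test windows
    if cross_traffic_kbps() > IDLE_THRESHOLD_KBPS:
        print(f"window {window}: line busy, skipping (no data point recorded)")
    else:
        print(f"window {window}: measured {measure_throughput_mbps():.1f} Mbps")
```

The bias is in what never gets recorded: every skipped window during a congested evening is a missing data point, so the published averages describe the network at its quietest.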

Finally, it is important to realize that no test can ever truly gauge the speed of “the internet” from a given location. Typically, the best that can be achieved is a measurement from a subscriber’s home to a “major internet gateway”, where an ISP connects to the rest of the world. The ISP has no control over how fast the rest of the world’s internet is, and limited control over the performance of services that aren’t hosted on its network. Even the fastest gigabit networks are no faster than their connections to services “upstream,” like Netflix – a problem the FCC had to contend with as it tried to measure the performance of ISPs that were engaged in peering disputes that limited their connections to the streaming service.
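A crude way to see this endpoint dependence is to time the same download against an on-net test server and an off-net service. The URLs below are placeholders rather than real test servers; the point is only that the measured rate is a property of the whole path, not of the ISP’s network alone.

```python
# Minimal sketch: the "speed" you measure depends on where the far endpoint
# sits. Both URLs are hypothetical placeholders, not real test servers.
import time
import urllib.request

ENDPOINTS = {
    "ISP test server (on-net)":  "https://speedtest.my-isp.example/50MB.bin",
    "Distant service (off-net)": "https://cdn.far-away.example/50MB.bin",
}

for label, url in ENDPOINTS.items():
    start = time.time()
    size_bytes = len(urllib.request.urlopen(url).read())
    mbps = size_bytes * 8 / 1_000_000 / (time.time() - start)
    print(f"{label}: {mbps:.1f} Mbps")
```

A gigabit subscriber could see the first number come in near the advertised rate while the second is throttled by a congested peering link the ISP may or may not control.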

Ultimately, all of this indicates a broader trend towards data gathering to erase some of the mystery about how the internet actually “performs”. For individuals, these are welcome steps towards becoming better informed about what one’s ISP actually provides, but also about what goes into determining internet speed or performance in the first place. For regulators, accurate and comprehensive information is a precondition for effective public policy, and it’s great to see Industry Canada and the CRTC taking steps to refine the picture they have of Canadian connectivity as they come to decide important questions about the future of Canada’s internet.

Positive and Negative Responsibilities for Internet Intermediaries

I’m interested in the responsibilities of various “internet intermediaries”. These might be internet service providers (ISPs), online service providers (like Google or Netflix), or increasingly, some combination of the two functions under the same organizational umbrella.

Regulations require these intermediaries to do certain things and avoid doing others. Child pornography or material that infringes copyright must be taken down, but personal communications or online behaviours cannot be tracked without consent and a valid reason. Certain protocols might be throttled where necessary for “network management”, but otherwise ISPs should not discriminate between packets. It strikes me that these responsibilities – duties to intervene and duties not to intervene – can be likened to the idea of positive and negative rights or duties in philosophy, where positive rights oblige action, and negative rights oblige inaction.

If notified of the presence of illicit content, a host must take action or face sanctions. This is a positive responsibility to intervene given certain conditions. Privacy protections and net-neutrality regulations are often negative responsibilities, in that they prevent the intermediary from monitoring, collecting, or discriminating between data flows.

However, as with positive and negative rights, it is not always easy to tease the two apart. Negative responsibilities can have a positive component, and the two are often bundled together. For example, the positive duty to install a court-ordered wiretap is typically tied to the negative duty of not informing the wiretap’s target. Non-discrimination is a negative responsibility, but US ISPs have been accused of discriminating against Netflix by not upgrading links to handle the traffic coming from the video service. Under this logic, an ISP has a positive responsibility to ensure its customers have adequate access to Netflix. Anything less amounts to discrimination against Netflix. In Canada, ISPs also have a negative responsibility not to discriminate against video services like Netflix, particularly since Netflix competes with incumbent ISPs’ own video offerings. However, the Canadian regulatory regime seems to be headed towards imposing the positive responsibility on these ISPs to make their own video services available through other providers under equal terms, under the reasoning that equal treatment and exclusivity cannot coexist.

I think the distinction between positive and negative responsibilities can be useful, particularly since the majority of the academic literature about internet intermediaries has emphasized their positive responsibilities. There has been less discussion of all the things that intermediaries could be doing with our traffic and data, but which they choose not to, or are constrained from doing.