Measuring Canada’s Internet

For most people, internet performance is a mystery. Many subscribers do not even know the level of bandwidth they are paying for, let alone how to test if they are actually receiving the sorts of speeds their ISP advertises. Canadian regulators have often been in the dark as well, which is a problem when their decisions are supposed to take the availability and geographic distribution of broadband into account.

Regulators have traditionally depended on information provided by industry as a basis for policy decisions, but this information can be inaccurate or incomplete. There are ample cases in the US and Canada where certain regions have been listed as having access to a certain level of broadband, or choice of ISPs, whereas the reality on the ground has been far less than what is supposedly available. This problem is not unknown to regulators. Network BC, working with Industry Canada and the CRTC, launched its broadband mapping initiative in 2014. This included consultations with the various ISPs spread across the province to determine what services were actually available in which locations, resulting in an interactive connectivity map. Industry Canada watched the efforts in BC closely, and is currently soliciting information from ISPs to carry out a national broadband availability mapping project. However, such efforts do not include any independent means of actually measuring internet performance in these areas.

Up until now, the go-to source for Canadian internet performance surveys that utilize a third-party service (rather than relying on ISPs for information) has been averages of Ookla’s speed test results (see here and here), which is the same service typically used by individuals to see how their internet connections measure up. But the results are not really meant to be a basis for policy decisions, since the averages are not pulled from a representative sample, and the (mean) speeds are often higher than what is available to a “typical” internet subscriber.
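The skew described here is easy to illustrate: a handful of very fast connections pull the mean well above what a typical subscriber actually sees, which is why the median is usually the better summary. A minimal sketch, using made-up speeds purely for illustration (these are not real survey numbers):

```python
# Hypothetical download speeds (Mbps) for ten subscribers -- illustrative only.
# Two outlier gigabit-class connections drag the mean upward, while the
# median stays close to what a "typical" subscriber experiences.
speeds = [5, 6, 7, 8, 10, 12, 15, 25, 150, 300]

mean = sum(speeds) / len(speeds)

ordered = sorted(speeds)
mid = len(ordered) // 2
median = (ordered[mid - 1] + ordered[mid]) / 2  # even-length list

print(f"mean:   {mean:.1f} Mbps")    # 53.8 -- inflated by the two outliers
print(f"median: {median:.1f} Mbps")  # 11.0 -- closer to a typical connection
```

A survey that reports only the mean of self-selected test results can therefore paint a much rosier picture than most subscribers would recognize.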

The big news in recent weeks has been the entry of new players in the internet metrics game. First, CIRA kicked off its own broadband mapping effort, which anyone can participate in and provide information to (an appropriate browser/OS combo may be required to participate). The map is very much a work-in-progress, which will fill out as individuals add more data points, and as new features and methods are added. Not long after, the CRTC announced its own internet measuring initiative. This is new territory for the CRTC, which has never had much of an ability to independently investigate or collect data about the telecom industry it regulates. However, the plan has been in the works since at least 2013, and may be based on the FCC’s Measuring Broadband America project, which has been underway since 2011. As in the US (along with Europe, Brazil, Singapore, and other nations), the CRTC’s program depends on the use of the SamKnows “whiteboxes” deployed at participating locations (the CRTC is currently looking for volunteers to receive and set up the devices). These devices measure connectivity between the subscriber’s premises and major connection points between ISPs.

There are a number of concerns (see here and here) with the CRTC’s efforts. ISPs could try to “game” the metrics to make their network’s performance appear better (ISPs know which of their subscribers have the boxes, since they use this information to make sure the testing doesn’t contribute to a subscriber’s data cap). SamKnows might only measure internet performance in off-peak hours, when connectivity is less likely to be a problem, since the boxes are intended to operate when subscribers aren’t making full use of their bandwidth (on another page, the CRTC has gone even further to say the information will be gathered “when users are not connected”). Not all ISPs are participating in the program, raising the concern that smaller players and rural areas that are most disadvantaged in terms of connectivity are being left out. This last point relates to the importance of having a representative sample, which is a fundamental precondition for any survey that attempts to calculate meaningful (or generalizable) statistics. All of the above can be addressed with a properly designed methodology, full transparency of these methods, and careful qualification of the results. Here, the CRTC has plenty of international examples to draw from, and SamKnows has built its business around such openness, but we will have to wait for more details to weigh in on whether this particular partnership has done a good job.

Finally, it is important to realize that no test can ever truly gauge the speed of “the internet” from a given location. Typically, the best that can be achieved is a measurement from a subscriber’s home to a “major internet gateway”, where an ISP connects to the rest of the world. The ISP has no control over how fast the rest of the world’s internet is, and limited control over the performance of services that aren’t hosted on its network. Even the fastest gigabit networks are no faster than their connections to services “upstream,” like Netflix – a problem the FCC had to contend with as it tried to measure the performance of ISPs that were engaged in peering disputes that limited their connections to the streaming service.
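The arithmetic behind any such test is simple: bytes transferred over elapsed time, converted to megabits per second. A minimal sketch of the idea (the test URL is a placeholder you would get from whichever measurement service you use, not a real endpoint; note that this only ever measures the path between your machine and that one server, never “the internet” as a whole):

```python
import time
import urllib.request


def throughput_mbps(num_bytes: int, seconds: float) -> float:
    """Convert a transfer of num_bytes over seconds into megabits per second."""
    return (num_bytes * 8) / (seconds * 1_000_000)


def measure(url: str) -> float:
    """Download url once and report the observed throughput.

    This measures only the path between this machine and the host serving
    `url` -- an ISP has no control over performance beyond its own network.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    elapsed = time.monotonic() - start
    return throughput_mbps(len(data), elapsed)


# Sanity check of the unit conversion:
# 125,000,000 bytes (one gigabit) transferred in 10 s -> 100 Mbps.
print(throughput_mbps(125_000_000, 10.0))  # 100.0
```

Real measurement platforms like SamKnows run many such transfers, against multiple servers and at scheduled times, precisely because a single download to a single host is so sensitive to where that host sits.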

Ultimately, all of this indicates a broader trend towards data gathering to erase some of the mystery about how the internet actually “performs”. For individuals, these are welcome steps towards becoming better informed about what one’s ISP actually provides, but also about what goes into determining internet speed or performance in the first place. For regulators, accurate and comprehensive information is a precondition for effective public policy, and it’s great to see Industry Canada and the CRTC taking steps to refine the picture they have of Canadian connectivity as they come to decide important questions about the future of Canada’s internet.

Positive and Negative Responsibilities for Internet Intermediaries

I’m interested in the responsibilities of various “internet intermediaries”. These might be internet service providers (ISPs), online service providers (like Google or Netflix), or increasingly, some combination of the two functions under the same organizational umbrella.

Regulations require these intermediaries to do certain things and avoid doing others. Child pornography or material that infringes copyright must be taken down, but personal communications or online behaviours cannot be tracked without consent and a valid reason. Certain protocols might be throttled where necessary for “network management”, but otherwise ISPs should not discriminate between packets. It strikes me that these responsibilities – duties to intervene and duties not to intervene – can be likened to the idea of positive and negative rights or duties in philosophy, where positive rights oblige action, and negative rights oblige inaction.

If notified of the presence of illicit content, a host must take action or face sanctions. This is a positive responsibility to intervene given certain conditions. Privacy protections and net-neutrality regulations are often negative responsibilities, in that they prevent the intermediary from monitoring, collecting, or discriminating between data flows.

However, as with positive and negative rights, it is not always easy to tease the two apart. Negative responsibilities can have a positive component, and the two are often bundled together. For example, the positive duty to install a court-ordered wiretap is typically tied to the negative duty of not informing the wiretap’s target. Non-discrimination is a negative responsibility, but US ISPs have been accused of discriminating against Netflix by not upgrading links to handle the traffic coming from the video service. Under this logic, an ISP has a positive responsibility to ensure its customers have adequate access to Netflix. Anything less amounts to discrimination against Netflix. In Canada, ISPs also have a negative responsibility not to discriminate against video services like Netflix, particularly since Netflix competes with incumbent ISPs’ own video offerings. However, the Canadian regulatory regime seems to be headed towards imposing the positive responsibility on these ISPs to make their own video services available through other providers under equal terms, under the reasoning that equal treatment and exclusivity cannot coexist.

I think the distinction between positive and negative responsibilities can be useful, particularly since the majority of the academic literature about internet intermediaries has emphasized their positive responsibilities. There has been less discussion of all the things that intermediaries could be doing with our traffic and data, but which they choose not to, or are constrained from doing.

On Cyberspace

When William Gibson coined “cyberspace” in the early 1980s, he was primarily interested in coming up with an exciting setting for science fiction, and one with a cool-sounding name. As he has told the story in numerous interviews, Gibson came across a Vancouver arcade one day and was struck by the intensity with which the gamers engaged with the screen, leaning ever closer as if they were trying to push through it to a world on the other side. He wanted to imagine what that world was like – to explore the “notional space” inside the computer. These days, Gibson has mixed feelings about the term he coined. In 2007 he was reported to have announced the demise of ‘cyber’ talk, and has joined many others in pointing out how unhelpful it is to think about cyberspace as some separate, virtual realm.

And yet, cyber talk keeps proliferating. Cyberspace has become a bloated, rudderless place-holder of a word. It means less and less every day, as it expands to encompass more and more. As the world fills up with networked computers, cyberspace is suddenly everywhere. Militaries have started slapping the ‘cyber’ label onto practices that fifty years ago had other names, like signals intelligence and electronic warfare. Now these are all ‘cyber operations’ and the domain of operations is cyberspace. In 2010 Canada’s government put forward a rather 1980s Gibsonian definition of cyberspace, and went about trying to secure it.

William Gibson is not the only one helpfully reminding people that cyberspace does not actually exist – that this is a word he invented to fill a storytelling need, which then took on a life of its own. Other writers have also been trying to get past the virtual, and point to the material. In the 1990s and 2000s, ‘cyber-utopians’ imagined they would have the freedom to build a new world in cyberspace. Some still do, but a realist backlash (of which Evgeny Morozov is the prime example) has reminded us that utopias can be dangerous, and that cyberspace is not somewhere we can go to escape power and exploitation. Our networks are material; they exist in governed territories; they must contend with states and other sovereigns.

My ongoing work is certainly an attempt to help ground internet studies in a material dimension, but I am struck by a vision similar to what Gibson saw in those kids in that arcade. These days, if you want to see someone getting immersed in a screen, you can likely just look across the room or out the window. Few of us imagine that we are somehow ‘in cyberspace’ when we hold the screen up to our face, and yet there is a world behind that screen. This world is largely invisible, sometimes secret, and usually hard to understand. It is a world of cables and switches, companies handing packets to one another on privately-agreed terms, while regulators and assorted security agents work to produce some sort of order.

Like a bad hangover from the 1980s and 90s, cyberspace persists in jargon and a great deal of government and academic discourse. One of the reasons is the difficulty of finding an adequate catch-all replacement. ‘The internet’ can be even more nebulous than cyberspace, and ‘online’ tends to be used as an adjective. At the present moment, it is more helpful to turn away from talk of virtual worlds, and focus on the material one we all have to contend with.