Blocking Bad Traffic

Canadian internet policy is currently dominated by discussions of blocking – what to block and how. Blocking copyright infringement has been debated in a number of forms in recent years, was implemented in a limited sense to deal with GoldTV, and is now the subject of a new Government consultation (see here). Cabinet is also promising new measures to deal with illegal pornography, which may include new blocking requirements (or build on existing ones, such as Cleanfeed). And then there is the CRTC’s proposal to mandate blocking cyber security threats – specifically botnets – though as a number of interveners (including the RCMP, CSE, and M3AAWG) have argued, the focus on botnets is too narrow and should be expanded or kept flexible enough to adapt to future threats. Internationally, the focus has been on how social media companies moderate their platforms, including blocking mis/disinformation and taking down accounts.

The common thread here is the understanding that some categories of internet traffic generate significant harm, and that there are centralized points of control where this traffic can be filtered out. As documented in Telecom Tensions (forthcoming May 2021), many of these debates are more than twenty years old (copyright owners have been arguing that ISPs should be responsible for infringing traffic since the mid-1990s), but they regularly re-emerge to focus on new methods, intermediaries, and targets. ISPs are often at the center of these controversies and participate as interested parties. Because they can exercise centralized control over their networks (DPI, DNS, BGP, etc.), this is where filtering is deemed most effective (mandating end-user controls is also generally off-limits), and this means they are most likely to be saddled with new responsibilities and liabilities. There are already many processes in Canada for filtering at the ISP level (particularly when it comes to cyber security), and the question is often how to better coordinate and standardize what ISPs are doing on their own. Incumbents and government agencies have pointed to the work of CSTAC and CTCP in regard to the botnet-blocking proposal, and the work of the CCAICE goes back to the mid-2000s.
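Since so much of this debate turns on those centralized control points, a small sketch may help make the mechanics concrete. This is my own illustration of the basic logic of DNS-level blocking, with a hypothetical blocklist entry; real resolver deployments typically rely on features such as DNS Response Policy Zones rather than standalone code like this:

```python
# A toy sketch of domain-based blocking at a resolver: check the
# queried name, and each of its parent domains, against a blocklist
# before resolving. The blocklist entry is hypothetical; this is an
# illustration of the logic, not any ISP's actual implementation.

BLOCKLIST = {"badsite.example"}  # hypothetical blocklist entry


def is_blocked(qname: str) -> bool:
    """True if qname or any parent domain appears on the blocklist."""
    labels = qname.rstrip(".").lower().split(".")
    return any(".".join(labels[i:]) in BLOCKLIST for i in range(len(labels)))


if __name__ == "__main__":
    for name in ("badsite.example", "cdn.badsite.example", "news.example"):
        print(name, "->", "blocked" if is_blocked(name) else "resolved normally")
```

Even this toy hints at the overblocking problem discussed below: block a domain and every subdomain goes with it, legitimate or not.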

Network neutrality or the “open internet” is often implicated in these discussions, as blocking traffic generally contradicts the principle that users should be able to connect to whatever they want (or that ISPs should not discriminate among traffic). Those in favor of blocking will typically focus on the illegality of specific kinds of traffic as a justification, even where they may have significant financial interests in engaging in blocking (intellectual property, gambling revenues, bandwidth recovery). Opposition to blocking is less of an issue the more universally the traffic is recognized to be ‘bad’, which is why the earliest blocking regimes focused on child abuse imagery and spam. Botnets are an example where the users of ‘infected’ devices may not be aware that they are sending and receiving such traffic, and this can take us to questions of personal autonomy/responsibility for our devices (the Australian iCode system involved ISPs notifying users who were infected, placing the responsibility on users to take action). Even when there is broad agreement that a category of traffic is undesirable and should be stopped, there are always questions over effectiveness and overblocking (some bad traffic will get through, and some legitimate traffic will be blocked inadvertently). Finally, debates over internet filtering are inherently also about internet surveillance – since to discriminate between categories of traffic, ISPs and other actors need to monitor flows and have some sense of their contents.

Underlying all of this are two different views of what an ISP is and should be: whether our networks should be “dumb pipes”, limited to being just a conduit for packets as imagined in the end-to-end internet, or whether the pipes should be “smart” and exercise agency over traffic. The smart pipe’s “intelligence is rooted in its ability to inspect and discriminate between traffic, to decide what should be permitted, prioritized, or cleaned”. Clearly, the current rounds of proposals envision even broader roles for intermediaries in governing traffic — what I describe as the process of intermediation. However, the specifics in each case matter, involving different business interests and public consequences (including unintended consequences). Any action needs to be informed by an understanding of the actors and processes already in place to deal with bad traffic, previous experience with filtering elsewhere, and the harms and risks of traffic discrimination. We need to be mindful of the fact that some actors will always try to mandate blocking in order to safeguard their narrow interests. Political actors will also recurrently feel the need to ‘do something’ about internet threats, and imposing additional surveillance and filtering responsibilities on intermediaries often seems like the easiest solution. Policies for copyright filtering, pornography, and child abuse imagery have a long history in internet governance, but cyber security remains in many ways a frontier of internet policy, and one worth watching closely given the expansive possibilities for what might be considered a cyber threat in the future.

Rogers + Shaw — What are the policy options?

So, the big news from last week is that Canada’s two biggest cablecos want to become Canada’s biggest cableco.

This shouldn’t be surprising – Canada’s two biggest telcos have thought about merging for years, but the political winds have apparently not been favorable. While you can generally (but not always) satisfy shareholders with a big enough bucket of cash, governments are another matter. Since telecom liberalization in the 1990s, public policy has tried to move the country away from monopoly and towards a competitive marketplace for telecom services. Incumbent telephone and cable companies were unmoored from their old territories to compete with one another across the country, but the practical consequence has been a gradual consolidation of the industry through mergers and acquisitions. Because it makes little sense to build infrastructure in a ‘territory’ that is home to a well-established incumbent, the big companies stuck to their domains, and the biggest grew their territories by gobbling up potential competitors in other regions. People have been saying that there isn’t much left to gobble – or not much room for further consolidation in the industry – since the 2000s. But deals that further reduced the number of incumbents have continued to be approved by regulators. Eastern cableco Rogers is still hungry, and it’s confident government will agree to its planned purchase of Western cableco Shaw. Perhaps they will get what they want, but this is not a done deal and will be debated for some time to come. So, what are the options?

Option 1: Let the capitalists have their way

Rogers acquiring Shaw wouldn’t change much of the status quo anyway. The two companies already don’t compete for wireline customers (although they have previously squabbled over whether there is actually a non-compete agreement). The big change is going to come in wireless, where Shaw does compete (as Freedom, formerly WIND) with Rogers and the rest of the Big Three (Bell & TELUS), and access to spectrum is a critical factor. Given how hard it has been to develop a fourth wireless competitor, we could just embrace the reality of the Big Three. But why stop there? Maybe two companies are all you need for competition? Much of the country already operates as a duopoly for wired internet access (where Canadians can choose between their local cableco and telco). Regulatory proceedings at the CRTC would be greatly simplified in a world with only two players, as long as the regulator can assure itself that this is what competition looks like and there’s no need to set rates. Canadians will pay more – shareholders will profit. However, this would in many ways give up on what liberalization was meant to accomplish. Sure, governments care about shareholder interests, but they also care about other things, and will have to draw the line somewhere. Apparently that line wasn’t reached when Bell acquired MTS (followed by former MTS customers paying more). Is Rogers-Shaw going to be the line?

Option 2: Release the Americans

If no facilities-based competitors are going to organically develop from within the Canadian telecom industry (by climbing the mythical “ladder of investment”), then one option is to attract competitive interest from outside the country, and U.S. companies offer the best potential given their infrastructure south of the border. Of course, it’s not as easy as simply relaxing foreign ownership rules. A new foreign-owned competitor would have to spend billions to build infrastructure to compete against the comfortably-situated incumbents. More realistically, foreign investors would buy an existing Canadian incumbent where they saw an opportunity. This does have the potential to shake things up, just as WIND Mobile did with its foreign backers (or when Verizon seemed interested in acquiring WIND and Mobilicity). Ultimately, WIND was bought by Shaw, and now Rogers is buying Shaw. As a new entrant, WIND required favorable regulatory treatment to stand any chance against the incumbents. New entrants need access to existing infrastructure (towers, poles, rights of way) and wireless spectrum. Long-time telecom journalist Rita Trichur (who was a correspondent during the ‘telecom war’ of 2013, when it seemed that Verizon might move into Canada) has come out advocating that government drop the foreign ownership rules as a way of getting fresh competition into the market. She imagines that Verizon or AT&T could buy up the infrastructure that Bell and/or TELUS have built. Perhaps customers would see lower costs, and she points out that we already coordinate with the Americans on spectrum and security. After all, Canada and the U.S. are both members of the Five Eyes security alliance. Those AT&T/NSA splitter rooms were just for foreign traffic, right? And Canada’s not really foreign, because the Five Eyes have a “gentleman’s agreement” not to spy on each other. Right?

Option 3: Service-based competition/structural separation

The “European” model – recurrently floated as a possibility in Canadian telecom, but never seriously considered as a broad approach given our commitment to facilities-based competition. If we can call facilities-based competition a failure, then how about we accept a leading alternative? Let service providers compete over some kind of shared infrastructure (with the infrastructure owner “separated”, so that they don’t compete with these service providers). Canada’s rejection of this approach in favor of facilities-based competition was once justified by the fact that the country had overlapping cableco and telco infrastructure in urban areas, but being stuck with just two competitors was not imagined as the final outcome (in wireless, there are the Big Three, but Bell and TELUS split the country between them and share facilities, rather than build competing networks in the same territories). If we don’t like the outcome of duopoly, let’s deal with the policy failure by pushing in a different “separated” direction. There are many different models such an approach could follow, but the path will be difficult, and many people will be very upset by a change in the status quo.

Option 3b: MVNOs

Given the implications of the deal for wireless competition specifically, this could be the time to extend mandated access to wireless facilities to MVNOs (Mobile Virtual Network Operators). MVNOs have been very controversial in Canada as another way of mandating access to incumbent facilities, over which new competitors provide (wireless) services — a specific kind of service-based competition. Canada has historically expanded mandated-access regimes for other forms of connectivity to deal with the flaws of facilities-based competition, but incumbents have opposed the introduction of MVNOs as they opposed other forms of mandated access. This opposition will be harder to maintain in the wake of a Rogers-Shaw deal.

Option 4: Just say No

In some ways, this is the conservative approach – reject the deal as bad for consumers, bad for competition, and keep things the way they are. In other ways, this would be a departure from Canada’s post-liberalization approach to competition policy, because some arm of government would have to finally signal that enough is enough. Perhaps this would just kick the can down the road and maintain regulatory uncertainty, but rejecting the deal provides an opportunity to articulate a new policy approach that the industry can orient to.

Are there other options? Sure – the deal can be approved with various conditions (as Rogers apparently expects), but these are not likely to make a fundamental difference to the outcome. Even if Shaw hands over Freedom and its spectrum to a different player, this won’t translate to a national competitor. And of course, we could go all-in on monopoly consolidation (let the biggest fish eat the rest) — which takes us back to the utility model and erases what liberalization was supposed to accomplish.

I do not want to weigh in on what is likely to happen, but to highlight the fact that this presents an opportunity to do something differently in Canada. My worry is that regulators will focus too much on how they might somehow make the Rogers-Shaw deal seem acceptable, and not enough on how they might steer policy in a different direction. The Competition Bureau is already getting an earful about the deal, and you can let them know your own thoughts.

Trouble in Olds

Canada’s best-known community network has met its existential threat — municipal politics. O-NET, based in Olds, achieved national fame back in 2013 as a community-owned ISP offering gigabit internet, in competition with local incumbents (TELUS and Shaw). If you are interested in community/municipal broadband in Canada, you will at some point have come across a version of the Olds story or heard reference to the “Olds model”. In general, O-NET has been presented as a success story, but this is a “success” that needs to be qualified. It’s remarkable that a community network was able to wire up a town of Olds’ size with fibre and establish itself as a full-service ISP. But the long-term financial picture has never been clear, given the expense of building the network. O-NET successfully achieved more than many community broadband projects in Canada, but like many such projects, its future depends on local political support. Instead, the last several months have seen political conflict and uncertainty.

I haven’t been to Olds since 2014 (when I helped with a “citizen planning circle” about broadband) or spoken with anyone affiliated with O-NET since 2017. Local media (Olds Albertan/Mountain View Today) have been my way of keeping in touch, and based on the news it appears that after years of fading into the background of local politics, O-NET has recently become a major object of contention. On May 22, 2020, Olds Town Council issued a statement calling in $14 million in loans related to the build, and pushing for increased oversight. Subsequent discussions took place in closed Council sessions as well as through public allegations. Most recently, Town Council has created a broadband investment committee to decide the project’s future. So far the committee has met behind closed doors, and there has been public disagreement over its makeup (specifically, the exclusion of Coun. Mitch Thomson, who has long played an important role in the Olds fibre project).

It wasn’t meant to be like this. In fact, a key reason why O-NET’s relationship with the Town of Olds is so difficult to explain is because the organizational/ownership structure was meant to provide some independence from Town politics. O-NET is not a municipal ISP in the straightforward sense — it (and its owners) have been a separate organization from the Town. However, the funding required to build the network depended on the support of the Town, which has since cut annual funding and called in loans given to the Olds Institute for Community and Regional Development (OICRD), or Olds Institute (OI). This non-profit organization “oversees Olds Fibre Ltd., a for-profit business that owns O-NET”. You see, the for-profit O-NET is really a creation of the Olds Institute (specifically, its Technology Committee), a “non-profit community and economic development organization” that had the Town of Olds as one of its four founding members. The project champions and volunteers that worked on establishing the community network benefited from the support of municipal politicians, but until now the Town limited itself to a supporting role. This included supporting O-NET in 2010 and 2014 when it required an additional $14 million to complete the fibre build — the Town was able to borrow the money from the Alberta Capital Finance Authority and then loan it to OI. O-NET used the funding to get its network built, but thereby became deeply indebted to the Town, which has recently pulled on the purse strings. It’s hard to know where all of this will lead, but there is certainly a political effort in Olds to do something different with how the fibre project has been managed. To the extent that the Town’s CAO Michael Merritt would comment on whether there might be a future divestment from O-NET, he blandly stated: “The town is working with Olds Fibre/O-NET to determine the next steps”.

Regardless of what the future holds, anyone bringing up fibre in Olds as a community network “success story” should probably add a significant footnote (this blog post is essentially that footnote for a chapter in my forthcoming book).

End-to-end versus end-to-end

If you’ve spent any amount of time reading about what the internet is, you will have heard it described as an end-to-end network.  End-to-end architecture is said to be a founding principle of the internet. It has often been invoked in discussions of net neutrality, or network management. It is typically understood to mean that the internet is decentralized: to the extent possible, control is best exercised by the computers at the ‘ends’ of the network, and everything in between (the internet ‘pipes’ and the networking hardware owned by the telecom companies) should be kept as simple (or ‘dumb’) as possible — limited to passing along data packets.
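The classic illustration of this argument involves reliability. The network itself need not guarantee delivery, because the endpoints can check and retransmit on their own. Here is a toy sketch of that idea (purely my illustration; real endpoints use TCP, which is far more involved):

```python
# A toy version of the end-to-end argument: the "pipe" does nothing
# but forward (or silently drop) packets, while reliability lives
# entirely at the edge, as retries by the sending endpoint.
# Purely illustrative, not how any real network stack is implemented.
import random


def dumb_pipe(packet, loss_rate=0.3):
    """The network: forwards a packet or drops it. No intelligence."""
    return packet if random.random() > loss_rate else None


def send_reliably(data, max_tries=10):
    """The smart endpoint: retransmits until delivery succeeds."""
    for attempt in range(1, max_tries + 1):
        if dumb_pipe(data) is not None:  # receiver would acknowledge here
            return attempt
    raise TimeoutError("no delivery after %d tries" % max_tries)


if __name__ == "__main__":
    print("delivered after", send_reliably("hello"), "attempt(s)")
```

The point is not the code but the division of labour: all of the ‘intelligence’ sits in the endpoints, and the middle just moves packets.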

Now, it is possible to argue that the internet has never been decentralized. There are points of control in its architecture, through state institutions, and the ISPs that manage our experience of the internet. But this is only surprising if we lose sight of what the end-to-end argument was meant to be — not a statement of what the internet is, but what it ought to be. This normative argument was premised on the notion of an authentic, original internet, but it was a fragile notion in danger of being lost, or intermediated out of existence. While the end-to-end principle was not intended as a description of reality, it did come to shape reality through its influence, as many people responsible for designing technologies, running networks, and governing connectivity took it seriously. However, countervailing pressures to move control from the edges to the center of the network will ensure that this remains a constant point of tension.

The above is an argument largely based on Tarleton Gillespie’s article, Engineering a Principle: ‘End-to-End’ in the Design of the Internet. This article provides some STS-informed discourse analysis of the end-to-end principle, tracing where the term came from and how it has shifted over time. But there was another kind of end-to-end principle that preceded the internet as we know it today, and became an obstacle on the path that led us to the internet.

Gillespie doesn’t dive into this earlier notion of end-to-end, although he does reference some (Bell) sources from the early 1980s that use it to discuss virtual circuits and the need to provide end-to-end digital networking over an infrastructure composed of both analog and digital parts (or over the old telephone system). What is missing from the account is the history of the term before the 1980s, which is the background that these Bell authors have in mind when they discuss end-to-end. This alternate meaning of end-to-end persists today in numerous contexts, to denote roughly: ‘from one end to the other, and everything in between’. In the history of telecom, this meaning has been intimately tied to the notion of system integrity in the monopoly era.

The old end-to-end principle argued that the network operator must have full control of the network, from one end to the other, or that the carrier had end-to-end responsibility for communications. This meant control throughout the network, but also of the endpoints, so that only approved devices could be connected to it. The principle was also sometimes cited to prevent interconnection with other networks. In the US, regulations prohibiting “foreign” attachments to the telephone system were challenged by the 1968 Carterfone decision, but in Canada it took until later in the 1970s to displace the notion of end-to-end control and system integrity in telecom regulation.

It is interesting how the internet’s end-to-end principle is not just different from this earlier notion, but in many ways its mirror image. In the mid-20th century the telephone was seen as a “natural monopoly”, because it was best managed by a single provider with centralized, end-to-end control, protecting the system’s “integrity” from any unauthorized elements. The internet, in contrast, is cobbled together from various networks and bits of hardware, its protocols allowing diverse autonomous systems to exchange traffic, and users can plug whatever devices they want into the endpoints.

What happened between the fall of natural monopoly arguments and the rise of the packet-switched commercial internet was the liberalization of telecom, which effectively opened up the monopoly telcos to competition. The loss of end-to-end control made it possible for the internet to emerge. The old monopolies became incumbents, and had to give up some control over what happened over their wires.

In recent years, with a flatter internet, online intermediaries maintaining more control by deploying their own transport infrastructure, and “middleware” in the network making more decisions, the internet’s end-to-end principle is arguably under threat. But these concerns have been around for most of the history of the commercial internet. Fundamentally, these are issues about who and what exercises control over communication, rather than what the nature of our communication infrastructure dictates. Because of this, tensions about the relative power of the edges versus the middle will continue, but this also means that the history of these arguments continues to be relevant.

Connected across Distance

I’ve recently seen television segments air on news channels that were clearly recorded before the pandemic, but broadcast after, with people happily mingling together and discussing things which no longer seem to matter. While I’m as tired as anyone of the fact that the news has become nearly 24/7 COVID-19, I do find myself feeling partly offended by these portals into a different world (how dare they cheerily show me this reality which no longer exists?). I then find myself wondering if these things will once again matter, one day. Like many of us, I’m sure, I know that things will be forever changed by this pandemic, but it’s still unclear how far those changes will go. To what extent will the previous order be transformed, and what has merely been suspended?

I ask this question as I edit a book manuscript that was written before the pandemic, and which will hopefully be published at a time when we have some answers to the above. What of Canadian telecom policy? What of the Canadian telecom industry? How will all of this change what we think about our intermediaries, how they are governed, and how they conduct themselves?

The telecom industry is among the more fortunate ones, given how essential this infrastructure is to anything else that is still operating. Being essential has other consequences however; right now voluntary actions by telecom companies have forestalled additional regulation, but this may not be enough in the medium-term. Wireless companies have suspended data caps, TELUS is giving students in Alberta and BC a deep discount, and no ISP wants to be in the news for disconnecting someone who is cash-strapped and under quarantine, but will government need to step in to guarantee some sort of connectivity safety net? Then there is the ongoing discussion over the role of telecom data for pandemic surveillance, and whether this can serve as a foundation for what ‘normal’ might look like several months from now.

Could the changes be even more fundamental? All of this presumes that capitalism will survive, that the political economy of telecom will largely be preserved, that various foundations of social order will be maintained. What I’m hoping we realize in all of this is how vital infrastructures like utilities, health care, and supply chains are, the importance of the people doing the truly essential work we rely upon, along with the value of long-term planning. But we are also being shaken out of our taken-for-grantedness, learning just how fragile social life can be. Many Canadians have had the privilege of assuming certain social structures are permanent, that certain institutions would be there for us. Residents of other countries (along with some Canadians and First Nations) have not shared this sense of security. Now we are all experiencing the same upheaval, manifesting differently as each locale follows its asynchronous path.

As the experience of watching media that was recorded pre-pandemic becomes more rare in the coming months, I wonder if watching it will feel even weirder. Will seeing strangers relax in close proximity fill us with nostalgia or dread? In the meantime, I have a lot to be thankful for, and there’s only so much guessing about the future that is helpful for the here and now.


Connected Coast

Where the subsea fibre is meant to meet the last mile

For the past few years, one of the most intriguing and exciting public connectivity projects has been underway along Canada’s Pacific Coast. Born of the fusion of two proposals – one from Prince Rupert’s municipally-owned telco CityWest, and the other from the Strathcona Regional District (the SRD, administered from Campbell River, Vancouver Island) – the Connected Coast project will lay fibre from Haida Gwaii and Prince Rupert, down the Pacific Coast, and around Vancouver Island (map). This is an ambitious build for the parties involved to undertake, with the potential to make a big difference in a lot of places that would otherwise be waiting for broadband. Because of funding constraints, it is on a strict timeline for completion (2021). Assuming everything goes well with the subsea portion, its success will ultimately depend on what happens when the cable reaches the shore.

I visited Vancouver Island in June to attend the final three presentations and meetings the SRD was conducting within its territory: in Tahsis, Gold River, and Cortes Island. Of the three, Cortes is the only area that currently has cell coverage, and availability of wireline depends on where one lives. Right now, internet access is often provided across a backbone of microwave links, or over satellite.

How TELUS keeps Cortes Island connected

As explained at the community meetings, on the SRD’s side the Connected Coast project began with the need for connectivity expressed by local representatives across the District. It became an infrastructure project after the SRD was informed by a telecom consultant that improvements in the last mile could only help insofar as there was a backbone available to connect to. Because the funding is limited to building this backbone, the SRD can only facilitate and instigate the last mile, leaving it to other local parties (TELUS was not a participant at the meetings).

Conuma Cable operates in Gold River and Tahsis, where TELUS also has a network.

Basically, the plan is for an open-access, middle-mile network. Open access is a condition of the funding (as it is for many publicly-funded backbones), so in theory, any last-mile ISP will be able to connect to the Connected Coast backbone. The extent to which this happens will depend a lot on local conditions, and given that these are ‘underserved’ communities, this isn’t a retail market with many competitors. If incumbents choose to pursue their own builds independently, independent ISPs (or community networks) will need to step up and find the resources to bridge the last mile.

The Uptown Cafe in Gold River. Great food, and connectivity with your meal!

In addition to talking specifics with local ISPs and letting residents know about the new cable arriving at their shores, this was also an opportunity for local residents to ask questions, imagine the possibilities that greater access to broadband could bring, and to voice concerns. This was especially the case on Cortes, where the meeting included a number of people strongly supportive of what Connected Coast could provide, as well as opposition from those who don’t see connectivity as an unbridled good (including over fears that cell towers and radiation might follow the wired infrastructure). All in all, it was great to have a reason to visit these communities, and the Connected Coast project remains one to watch as it develops.

Connected Coast Comes To Cortes

BCBA Conference and Artificial Intelligence in Telecom

I’m currently at the British Columbia Broadband Association (BCBA) Conference, reconnecting with perspectives from the industry and the provincial government in BC. Of the Canadian events I’ve attended, this is definitely one of my favorites – the local context keeps it at a manageable size and affordable, but still with quality speakers. BC stands out as a province in Canada because of how non-incumbent ISPs have historically organized through the BCBA (for other industry conferences one generally has to travel to Toronto). BC is also remarkable in the attention that connectivity has received from the provincial government. The province is involved primarily through Network BC, which has in recent years focused more on supporting local solutions and communities directly, rather than just partnering with the incumbent. While their materials/resources are intended for a BC audience, communities around Canada interested in improving connectivity can find some ideas and inspiration in what Network BC has recently been involved in.

One of the new developments since I was a regular at these industry conferences is that we are now firmly in the midst of the artificial intelligence (AI) wave, where it is basically impossible to attend any policy or industry/technology conference without hearing at least one presentation about the promise of AI. At the BCBA, this presentation was provided by Jim Radzicki from TELUS International, who spoke on AI and the customer experience, including a broad overview of approaches and technologies under the AI “umbrella”.

Here the focus was really on how to better provide “customer care” – addressing customer questions, troubleshooting, and so forth. The application of AI in handling customer calls (including helping human agents handle calls) and acting as chatbots is a natural extension of these technologies, especially if we define AI with reference to the Turing Test (as Radzicki did), which connects the high-tech hype of AI with the incredibly dumb chatbots of an earlier era, like ELIZA. But AI is far more than just ‘fooling’ someone into thinking they are talking to an intelligent being. As Radzicki also explained, what distinguishes these new technologies is their ability to organize and work with (or ‘learn’ from) vast data sets. And this is where AI already impacts our lives in a variety of ways, whether we know it or not – powering our Google searches, recommendations, or handling various functions behind the scenes. Elsewhere in the telecom industry, AI is touted as a solution for cybersecurity, network management, and traffic management, with companies offering products in these domains (beware, however: if it says “AI” on the box, you may just be being sold some clever marketing and little else).
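For a sense of how little machinery an ELIZA-style chatbot actually involves, here is a minimal sketch. The rules are invented for illustration (they are not from ELIZA’s original script, and certainly not from any TELUS system); the point is that there is no learning and no model of meaning, just surface pattern matching:

```python
# An ELIZA-style chatbot in miniature: regex rules mapped to canned
# response templates. No learning, no understanding, just pattern
# matching. The rules here are invented for illustration.
import re

RULES = [
    (re.compile(r"my internet is (.+)", re.I),
     "How long has your internet been {0}?"),
    (re.compile(r"i need (.+)", re.I),
     "Why do you need {0}?"),
    (re.compile(r"\b(hello|hi)\b", re.I),
     "Hello. What seems to be the problem?"),
]


def respond(utterance: str) -> str:
    """Return the first matching canned response, or a generic fallback."""
    for pattern, template in RULES:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Please tell me more."  # fallback when nothing matches


if __name__ == "__main__":
    print(respond("My internet is down again"))
    # -> "How long has your internet been down again?"
```

Contrast that with the systems Radzicki described, whose distinguishing feature is precisely their ability to ‘learn’ from vast data sets rather than follow hand-written rules.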

It’s great to be at a broadband conference again, and I will continue working on these topics and writing publications (particularly about community networks and public connectivity projects). Internet governance remains a fascinating domain, but this year I am also beginning a new long-term research project into AI governance: both how AI is governed, and how AI is used to deliver public services, or in the service of public policy. Governments at all levels in Canada are working to implement these technologies (machine learning, automation) with agencies such as TBS trying to guide developments across departments. I look forward to attending more AI-themed talks over the summer, getting a broad perspective on this fascinating set of social transformations, and seeing what stock images of humanoid robots the presenters have selected as illustrations.

All Forms of Competition

Well that didn’t take long. The CRTC just launched a review of mobile services, indicating that it is now in favor of allowing MVNOs. This comes just two days after receiving a new policy direction from ISED Minister Bains, and follows a period of strain in the CRTC’s arm’s-length relationship with the federal Cabinet, which had different ideas about competition policy. After sending two displeasing CRTC decisions back for review, ISED Minister Bains pulled the CRTC’s arm on Feb. 26 and gave the Commission a course correction.

Much to the dismay of former Industry Minister Bernier, the CRTC had come to effectively ignore the old 2006 policy direction, apart from the obligation to mention it in telecom decisions. It had applied further regulation to incumbents and accorded more legitimacy to leased access. The Commission extended mandated access to last-mile fibre networks in 2015, and claimed that forcing incumbents to share these facilities would promote facilities-based competition (somewhere else — in the middle mile). The fact that the Commission was required to tie its arguments into knots to explain itself, or could redefine the scope of facilities-based competition to justify shared facilities, was not particularly bothersome. The only body that could hold it accountable generally approved of these moves. Once the CRTC (under the direction of its new Chairman Ian Scott) began to behave in ways that made Cabinet uncomfortable, revisiting the policy direction became a way to instruct the Commission on how it was expected to behave.

The 2019 policy direction is more open-ended than the small-government directive of 2006, signaling that the CRTC can intervene to promote “all forms of competition”. In effect, Cabinet is telling the regulator that it has not been regulating enough, and that competition is still valuable even if it isn’t between competing facilities. Certainly, the CRTC could try to flout the policy direction or interpret it narrowly as it had previously done, but this would escalate the conflict with Cabinet. When sovereignty is contested between the two, Cabinet will eventually have its way. In 2019, the rationality of facilities-based competition was officially demoted.

The move by Bains has the effect of transforming an inconvenient policy failure into a horizon of possibilities. The vague fantasy that competition would lead to overlapping wires running into every home can be abandoned. In its place, consumer interests and competition have been reaffirmed as core values, but competition can be defined differently than before. Facilities-based competition is still an option, and one evidently favored by the CRTC even as it course-corrected after receiving its policy direction. In the Notice of Consultation for the new review, the CRTC is open to a mix of facilities-based and service-based competitors in wireless, but mandated access is still presented as a temporary stepping stone until “market forces take hold”. But according to Cabinet, all forms of competition are on the table, and facilities-based is not the only officially recognized option.

There’s still no reason to believe this is the first step to the ‘nuclear option’ of structural separation. More likely, the goal of competition policy will end up being something closer to the status quo than the elusive dream of a competitive facilities-based market. This would mean that mandated access won’t end up being presented as a temporary detour, but as the final destination.

2021 UPDATE: The MVNO decision in CRTC 2021-130 did go some ways to establishing a wholesale regime for wireless, but it limits MVNOs to those providers that already operate wireless networks. This effectively limits this form of service-based competition to facilities-based companies — therefore likely having minimal impact on the status quo.

ISPs as Privacy Custodians

Just published in the Canadian Journal of Law and Society (CJLS) is my article on Internet Service Providers as Privacy Custodians (a pre-print version is available here). The content is adapted (and updated) from a chapter of my PhD dissertation, wherein different chapters dealt with different social responsibilities of ISPs in Canada. The focus of this piece is on privacy responsibilities, but these are interrelated with ISPs’ other responsibilities and social roles, such as surveillance. For example, Canada’s Protection of Privacy Act was referred to as the “wiretap bill” while it was being debated in 1973, because while it criminalized the invasion of privacy, it also provided a formal legal route through which the police could obtain wiretaps (I particularly enjoyed studying the murky history of wiretapping in Canada for this piece, which I could only include in summary form).

The responsibilities of ISPs to protect privacy directly shape how police can carry out investigations involving subscriber information, how copyright enforcement operates, and the sorts of commercial relationships ISPs can enter into when monetizing subscriber information. I settled on the term “privacy custodians” to describe the role of ISPs in governing privacy for lack of a better one (the term is used in health care, and here I conceive of privacy governance as being broader than managing the personal information of users, encompassing a broader relationship to the public including policy advocacy, public accountability, and privacy education). I’ve been interested in how different ISPs approach the role of privacy custodian, at times through differing interpretations of legal obligations, but also through different kinds of voluntary efforts to go beyond legal obligations. I discuss these by distinguishing positive responsibilities (the responsibility to do something) and negative responsibilities (the responsibility to not do something). I argue that we should pay attention to the ways that ISPs are distinguishing themselves by carving out and asserting new positive responsibilities, while being mindful of the discretion with which they do so, and of the pressures to compromise privacy given the growing value of the data that these intermediaries can collect.

The abstract reads:

This article examines the role of internet service providers (ISPs) as guardians of personal information and protectors of privacy, with a particular focus on how telecom companies in Canada have historically negotiated these responsibilities. Communications intermediaries have long been expected to act as privacy custodians by their users, while simultaneously being subject to pressures to collect, utilize, and disclose personal information. As service providers gain custody over increasing volumes of highly-sensitive information, their importance as privacy custodians has been brought into starker relief and explicitly recognized as a core responsibility.

Some ISPs have adopted a more positive orientation to this responsibility, actively taking steps to advance it, rather than treating privacy protection as a set of limitations on conduct. However, commitments to privacy stewardship are often neutralized through contradictory legal obligations (such as mandated surveillance access) and are recurrently threatened by commercial pressures to monetize personal information.

While tensions over privacy and state surveillance have been long-lasting and persistent, in recent years the most interesting developments have been related to the monetization of personal information. Recent news has included the re-launch of Bell’s targeted ads program, and another U.S. privacy scandal involving the resale of location information. Canadian incumbents collaborate on location and identification services through EnStream, which has so far remained relatively quiet and scandal-free, but has also introduced a new role into the subscriber-provider relationship. We pay service providers to give us connectivity and some measure of privacy, but these companies are also serving the needs of customers who want information about us.

In short, the internet’s gatekeepers are also the gatekeepers of our identities and activities.

SuperNet 1.0 Post-Mortem

Earlier this month the Auditor General of Alberta released a major report, with a section on contract management at Service Alberta devoted to SuperNet. Several news outlets covered the release, and decided that the section dealing with SuperNet was the most newsworthy, summarizing it as mismanagement of a $1 billion contract.

The report is a bit of a strange read: it is on the topic of the original SuperNet and discusses it in the present tense, but the relationships and contracts that defined SuperNet 1.0 belong to the past. The Auditor General’s office effectively studied the period between 2005 and 2017, carrying out interviews and analyzing documents in 2017, and completing the audit in January 2018 — long before the last-minute hand-off of the network last summer. One of the report’s findings was that the Government of Alberta had identified risks in the 2018 transition and “incorporated mitigation strategies” into its planning, but the audit did not look at the procurement process, how the transition (to Bell) actually took place, or what is in the new contract.

Service Alberta Minister Brian Malkinson took the opportunity to offer reassurance that the GoA has already learned the report’s lessons, and will post the new contract online once Axia’s assets are transferred to Bell. Since Bell reported that it had completed its acquisition of Axia back in September, I’m not sure what the reason for the hold-up is. This is the sort of public accountability that was badly missing from the last SuperNet deal. This time, the public is being asked to accept the Minister’s reassurance that it’s all good, while we wait to learn what sort of deal was struck with Bell back in June.

So what did the Auditor General learn about SuperNet 1.0? A lot of the report puts an official stamp on what we already knew — lax oversight, a badly-written contract, Axia doing things the GoA thought were inappropriate, but the GoA feeling largely powerless to stop them:

“In 2011 the department sought legal advice on potential non-compliance with operator independence requirements. The department then sought additional information from the operator on services provided to third parties… The operator has asserted it is compliant with contract terms and obligations. As a result, again, the parties to the contract did not consistently interpret the terms and conditions”

Or:

“The department attempted to exercise these audit rights in 2015 as a result of a number of contract disputes, including those identified above. The department has not been successful in exercising that right… again because of differing interpretations of contract terms”

The report does offer a kind of explanation for something I’ve wondered since reading the SuperNet 1.0 contracts — what about all the regular reporting that Axia was supposed to provide GoA about how the SuperNet was running?

“We found that, since 2006, the department has not always received this reporting from all contract parties. We also found no evidence of the department routinely requesting this information from the parties. We asked department management why they have not obtained this information as required under the contract. Management stated that they considered the reporting to be more relevant to the initial construction of the network rather than ongoing operations.”

Right. I’m guessing that the various ministries responsible for SuperNet over much of this period really weren’t interested in, or capable of, monitoring what was happening with the network. What about when GoA started to have more serious concerns about the contract after 2011? Maybe no one at Service Alberta saw this information as valuable enough to ask for. Maybe some people weren’t aware this was in the contract. Whatever the reasons, I’ll be interested in what kind of accountability is written into SuperNet 2.0, and whether accountability on paper translates to accountability in practice.

So, we now have some more specifics about fundamental issues that Service Alberta was quite up-front about last year before the contract expired. The SuperNet seems to be chugging along in continuity mode for the time being, and the communities Axia fibred up are still getting the internet they agreed to for the same price. But last summer did not exactly inspire confidence about the SuperNet 2.0 transfer. What was all that last-minute contract negotiation about? What exactly is Alberta paying for right now? Here, the Auditor General can tell us nothing — these sorts of audits can take many months to pull together.

Maybe next year?