Lawful Access Consultation 2016

Another federal government consultation has recently wrapped up, this time with Public Safety asking about national security. Like other ongoing consultations, this one was criticized (for example, by Christopher Parsons and Tamir Israel) as framing the policy issue in a way that the government prefers, and as trying to legitimate ideas that should have been discredited by now. I would say that the consultation framed the issue very much as Public Safety (for instance, the RCMP) would prefer, repeating old rationales and seeing the world from a perspective where the ability to exercise sovereign will over information flows is paramount. The Green Paper provided for background reading foregrounds the concerns of law enforcement and security agencies, is peppered with the words “must” and “should”, and advances some dubious assumptions. Public Safety asked for feedback on terrorism-related provisions (including C-51), oversight, intelligence as evidence, and lawful access. The last of these has seen a number of previous consultations, but is back in the news as police make their case on the issue of “going dark” (which has become part of the RCMP’s “new public narrative” for a set of concerns that were once broadly discussed as lawful access).

I let this one get away from me, so I didn’t have anything ready for Dec. 15 when the online submission closed. Regardless, I’ve decided to answer most of the questions related to the topic of Investigative Capabilities in a Digital World as a blog post. I don’t feel particularly bad for missing the deadline, since several of these questions border on the ridiculous. For a true public consultation on what has long been a very contentious issue, it would be important for the questions to be informed by the arguments on both sides. Privacy experts would have asked very different questions about privacy and state power, and on a number of topics Public Safety seems to be trying to avoid mentioning the specific policies that are at stake here.

How can the Government address challenges to law enforcement and national security investigations posed by the evolving technological landscape in a manner that is consistent with Canadian values, including respect for privacy, provision of security and the protection of economic interests?

When I think of Canadian values, “privacy, provision of security and the protection of economic interests” are not what come to mind. When I ask my students what they associate with Canada, these particular values have never come up in an answer. I think we should consider democracy as a fundamental value, and understand that state secrecy is antithetical to democracy. When it comes to the relationship between citizens and the state, Canadian values are enshrined in the Charter, and the Supreme Court is ultimately responsible for interpreting what is consistent with the Charter. Therefore, Canadians deserve to understand what is being done in their name if we are to have a meaningful democracy, and this includes the existence of an informed, independent judiciary to decide what government actions are consistent with Canadian values.

In the physical world, if the police obtain a search warrant from a judge to enter your home to conduct an investigation, they are authorized to access your home. Should investigative agencies operate any differently in the digital world?

If we accept the digital/physical distinction, the answer is a definite yes — investigations carried out today operate differently than they did in the simpler, more “physical” 1980s. But it is important to keep in mind that analogies between the digital and physical environment can be misleading and dangerous. When it comes to the “digital world”, I prefer to talk about it in digital terms. The stakes are different, as are the meaning of terms like “to enter”. If we must make these comparisons, here is what treating these two “worlds” as analogous would mean:
The police can enter my home with authorization, and seize my computer with authorization. I am not required to make my computer insecure enough for the police to access easily, just as I am not required to keep my home insecure enough for the police to enter easily. I am not required to help the police with a search of my home, so I should not be required to help police search my computer. If I have a safe with a combination lock in my home, I cannot be compelled by police to divulge the combination, so by analogy, I should not be compelled to divulge the password for an encrypted disk.

But analogies can only take us so far. A computer is not a home. Metadata is not like the address on a physical envelope. We need to understand digital information in its own terms. To that end, some of the more specific questions found later in this consultation can produce more helpful answers. Before we get to those, however, this consultation requires me to answer a couple more questions based on the presumption of digital dualism.

This question is hard to answer without knowing what it means to “update these tools”, and seems to be intended to produce a “yes” response to a vague statement. Once again, digital/physical comparisons confuse more than they clarify — these are not separate worlds when we are talking about production orders and mandating the installation of hardware. We can talk about these topics in their own terms, and take up these topics one at a time (see further below).

If we could only get at the bad guys in the digital world, but there’s all this code in the way!

Is your expectation of privacy different in the digital world than in the physical world?

My answer to this question has to be both yes and no.

No, because I fundamentally reject the notion that these are separate worlds. I do not somehow enter the “digital world” when I check my phone messages, or when I interact with the many digitally-networked physical devices that are part of my lived reality. Privacy law should not be based on trying to find a digital equivalent for the trunk of a car, because no such thing exists.

Yes, expectations of privacy differ when it comes to “informational privacy” (the language of Spencer), because the privacy implications of digital information need to be considered in their own terms. Governments and public servants do Canadians a disservice with phonebook analogies, license plate analogies, or when they hold up envelopes to explain how unconcerned we should be about government access to metadata (all recurring arguments in the surveillance/privacy debate). In many cases, the privacy implications of access to digital information are much more significant than anything we could imagine in a world without digital networks and databases of our digital records.

Basic Subscriber Information (BSI)

As the Green Paper states, nothing in the Spencer decision prevents access to BSI in emergencies, so throwing exigent circumstances into the question confuses the issue, and once again seems designed to elicit a particular response that would be favorable to police and security agencies. In the other examples, “timely and efficient” is the problem. Agencies understandably want quicker and easier access to personal information. The Spencer decision has made this access more difficult, but any new law would still ultimately have to contend with Spencer. Government, police, and security agencies seem to be in a state of denial over this, but barring another Supreme Court decision there is no going back to a world where the disclosure of “basic” metadata avoids section 8 of the Charter, or where private companies can voluntarily hand over various kinds of personal information to police without fear of liability.
If the process of getting a court order is more onerous than police would like because preliminary investigations would be easier to carry out under a lesser standard, it is not the job of government to find ways to circumvent the courts. If the process takes too long, there are ways to grant the police or the courts more resources to make it more efficient.
There are ways to improve the ability of police to access metadata without violating the Charter, but any changes to the existing disclosure regime need to be accompanied by robust accountability mechanisms. Previous lawful access legislation (Bill C-30) was flawed, but it at least included such accountability measures. In their absence, we only know that in a pre-Spencer world, police and government agencies sought access to Canadians’ personal information well over a million times a year without a court order, and that a single court order can lead to the secret disclosure of personal information about thousands of Canadians. Police and security agencies have consistently advocated for these powers, but have failed to document and disclose how they actually use them. This needs to change, and the fear of disclosing investigative techniques cannot be used to prevent an informed discussion about the appropriateness of these techniques in a democratic society.
Do you consider your basic identifying information identified through BSI (such as name, home address, phone number and email address) to be as private as the contents of your emails? your personal diary? your financial records? your medical records? Why or why not? 
The answer to this question depends on an exhaustive list of what counts as BSI. It is important to have a clear definition of what counts as BSI, because otherwise we might be back in the pre-Spencer position where police are able to gain warrantless access to somebody’s password using powers that were meant for “basic identifying information”.
The answer to this question also depends on an explanation of what is done with this “basic” information. As was recognized in Spencer, we can no longer consider the privacy impact of a piece of personal information in isolation. This is how lawful access advocates prefer to frame the question, but it is not how investigations work in practice. BSI is useful only in combination with other information, and if we are talking about metadata (a term that, curiously, never appears in the Green Paper), it is now increasingly understood that metadata can be far more revealing than the content of a personal communication when it is used to identify people in large datasets, determine relationships between individuals, and reveal patterns of life.
So in short, yes — I am very concerned about BSI disclosures, particularly when I don’t know what counts as BSI, and what is being done with this information.
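To make this concrete, here is a minimal sketch (in Python, with entirely invented records) of how content-free call metadata alone can reveal relationships and routines:

```python
from collections import Counter

# Hypothetical call records: (caller, callee, timestamp). No content at all.
records = [
    ("alice", "clinic", "2016-11-01 09:02"),
    ("alice", "clinic", "2016-11-08 09:01"),
    ("alice", "lawyer", "2016-11-08 09:30"),
    ("bob",   "alice",  "2016-11-08 22:15"),
]

# Who contacts whom, and how often: a crude 'pattern of life' assembled
# without intercepting a single word of conversation.
edges = Counter((caller, callee) for caller, callee, _ in records)
for (caller, callee), n in edges.items():
    print(f"{caller} -> {callee}: {n} call(s)")
```

Scale this to millions of records, join it with other databases, and the analytical power of “basic” information becomes obvious.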
Do you see a difference between the police having access to your name, home address and phone number, and the police having access to your Internet address, such as your IP address or email address?
I see an enormous difference. As previously discussed, these are not analogous. An IP address is not where you “live” on the internet — it is an identifier that marks interactions carried out through a specific device.

Interception Capability

This is not a question… Yes, all of this is true.
Should Canada’s laws help to ensure that consistent interception capabilities are available through domestic communications service provider networks when a court order authorizing interception is granted by the courts?
The key word here is “consistent”, and the question of what standard will be required. It would be very easy for government to impose a standard that large telecom incumbents could meet, but which would be impossible for smaller intermediaries. As things are, the incumbents handle the vast majority of court orders, so I would love to see some recent statistics on problems with ‘less consistent’ intermediaries, particularly if this is a law that might put them out of business.

Encryption

I think the answer to this has to be never. People cannot be forced to divulge their passwords — in our society they can only be put in prison for very long periods of time. In other cases, assisting with decryption means forcing Apple to break through their own security (which was meant to keep even Apple out), or driving companies out of business unless they make products with weak security. This does not work in a world where a single individual can create an encryption app.

How can law enforcement and national security agencies reduce the effectiveness of encryption for individuals and organizations involved in crime or threats to the security of Canada, yet not limit the beneficial uses of encryption by those not involved in illegal activities?

By doing anything other than mandating insecurity for everyone. The answer cannot be to make technology insecure enough for the state to exploit, because this makes everyone insecure, except for those who use good encryption (which has become too commonplace to stamp out).
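To illustrate why, consider how little effort strong encryption takes today. Here is a minimal sketch using Python’s widely available cryptography package (illustrative only, not an endorsement of any particular tool):

```python
from cryptography.fernet import Fernet  # pip install cryptography

key = Fernet.generate_key()                   # a fresh random key
token = Fernet(key).encrypt(b"meet at noon")  # authenticated ciphertext
print(Fernet(key).decrypt(token))             # b'meet at noon'
```

If a few lines of code suffice, a mandate that weakens commercial products only harms the law-abiding users of those products.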

The final two questions deal with data retention, a topic I’ll leave for a later time…

Differential Pricing

The CRTC recently concluded its differential pricing (or net neutrality) hearing. If you weren’t glued to CPAC earlier this month, you can check out the transcripts while we wait on the Commission’s decision. Like any regulatory issue before the CRTC, this one has a long history. The hearings included several mentions of the Canadian Gamers Organization’s complaint against throttling of certain types of traffic associated with gaming, and the related Internet traffic management practices (ITMP) regime that developed out of complaints that certain peer-to-peer traffic (like BitTorrent) was being throttled. These cases clearly implicated net neutrality, because they involved ISPs treating some kinds of traffic differently than others, making certain applications perform worse. The CRTC took a dim view of this sort of discrimination unless ISPs could justify its necessity. For example, blocking ‘malicious’ traffic (like DDoS attacks) is acceptable under the ITMP regime because the reasons are deemed valid, but torrenting shouldn’t be blocked just because it is sometimes used to infringe copyright. In an alternate world of net neutrality absolutism, we might have ended up with a regulatory regime under which all traffic is protected, and ISPs are legally prohibited from mitigating the sorts of DDoS attacks that have been knocking services offline in recent years. However, most net neutrality advocates would not support such an extreme interpretation. Under the existing regulatory regime, Canadian ISPs can intervene when they can justify the need, but are not generally allowed to give some kinds of traffic preferential treatment over others.

Most recently, the question has been whether the practice of pricing certain types of traffic differently than others amounts to a similar kind of discrimination. For this, we owe a debt to Ben Klass, whose 2013 complaint (filed while he was an MA student in Manitoba) got the ball rolling. Klass is one of a small (but growing) number of individuals who have participated in a regulatory process that was really designed to serve the institutions being regulated (the ISPs). His work is a great example of how a regulatory system that depends on parties coming forward with complaints fails when the stakeholders (ISPs) who are meant to come forward don’t want to complain, even though the issue is of public policy importance. Differential pricing is clearly an important public policy debate to have, and the CRTC has recognized as much with the recent hearings.

While an ISP may treat traffic related to Netflix, YouTube, and CraveTV the same way, if two of these services count against a subscriber’s data cap while one does not, then that is a form of differential pricing (known as zero-rating). I may be able to watch Netflix or YouTube without buffering, but if an ISP makes Netflix zero-rated, I will end up paying more at the end of the month if I watch YouTube and exceed my cap. In this example, distinctions are being made about traffic passing through these networks, and they will presumably affect the behaviour of subscribers. The ethical dimensions of these discriminations become clear in situations where ISPs start favoring services in which they have an interest, or when money starts changing hands between companies so that ISPs treat certain applications more favorably than others. Instead of blocking content, an ISP might simply make it unaffordable, with roughly the same effect.
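A toy calculation makes the effect plain (the plan numbers below are hypothetical):

```python
# Hypothetical plan: 100 GB cap, $3.00 per GB of overage, one zero-rated service.
CAP_GB = 100
OVERAGE_PER_GB = 3.00

def overage_charge(zero_rated_gb: float, capped_gb: float) -> float:
    # Zero-rated traffic never counts against the cap; everything else does.
    return max(0.0, capped_gb - CAP_GB) * OVERAGE_PER_GB

# Two subscribers each stream 170 GB of video; only the mix differs.
print(overage_charge(zero_rated_gb=150, capped_gb=20))  # 0.0
print(overage_charge(zero_rated_gb=20, capped_gb=150))  # 150.0
```

Identical behaviour, very different bills: the pricing alone steers subscribers toward the zero-rated service.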

Many ISPs (large and small) have argued that these policies are not nefarious attempts to control subscriber behaviour, but are all about offering choice to consumers and differentiating themselves from their competition. Some have continued to claim these discriminations are about managing network congestion (much like the old rationale for throttling BitTorrent), but this argument took a beating at the CRTC hearings and isn’t likely to be very convincing. There are good business reasons why you might want to offer customers different options, including unlimited use of a particular app. However, if an ISP is concerned about the amount of bandwidth people are using, zero-rating certain services and imposing caps on the rest seems like a silly way to address the problem.

The CRTC’s forthcoming decision has to grapple with some tough questions, and some easy ones. Vertically-integrated companies using internet pricing to discriminate against competing services has analogies with common carriage in the railway/telegraph era, and feels like the sort of unjust discrimination the CRTC is meant to prevent. But if we are going to accept the existence of data caps (and not everyone agrees we should), then should it be a matter of principle to subject all traffic to the cap? If we can discriminate against malware, maybe we can discriminate in favor of security updates by zero-rating them, or zero-rate access to essential government services. Without data caps, these become non-issues, but a world without caps would have its own issues (which wireless and satellite providers are well aware of).

It’s times like these I don’t envy the regulator’s job.

Finally, we should remember that differential pricing, just like interventions against malicious traffic, presumes monitoring to accurately distinguish different applications and data usage. The ISP does not need to know exactly what subscribers are doing online, but it needs to be able to tell when subscribers are using a zero-rated service. Unless the ISP is somehow relying on the app to provide this information, this means using deep packet inspection (DPI) technology to inspect and categorize traffic. For better or worse, differential pricing is part of the process of intermediation, in which ISPs play a growing and more refined role in governing our digital flows.
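As a rough sketch of that classification step (the hostnames and matching rule are simplified assumptions, not any ISP’s actual method), the ISP needs only a flow identifier such as the TLS SNI hostname, not the contents of the communication:

```python
# Zero-rated services, identified by registrable domain (illustrative only).
ZERO_RATED_DOMAINS = {"netflix.com"}

def counts_against_cap(sni_hostname: str) -> bool:
    # Reduce e.g. "api.netflix.com" to "netflix.com" and check the list.
    registrable = ".".join(sni_hostname.split(".")[-2:])
    return registrable not in ZERO_RATED_DOMAINS

print(counts_against_cap("api.netflix.com"))  # False: does not count
print(counts_against_cap("www.youtube.com"))  # True: counts toward the cap
```

Even this crude rule only works if the ISP examines every flow, which is precisely the monitoring at issue here.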

Decision Time for the Future of Canadian Content

We’re still in the middle of public consultations on what seems like every domain of policy for the federal government, and that includes cultural policy. Canadian culture has a fascinating history, particularly as seen through various efforts over the years to shape, manage, and protect it. Before the Second World War, English Canada’s cultural identity was lodged firmly in the British Empire, and efforts to shape culture were targeted at groups who didn’t fit the mold, like First Nations. During and after the war, the state became involved in creating a national identity and a distinctly Canadian national culture, independent of Britain, and often in opposition to the cultural threat posed by the media industries of the United States. A variety of institutions were directed to this task, including the NFB, CBC, Canada Council for the Arts, CRTC, and the complex set of agencies that administer what we might call the Canadian content (CanCon) regime (though calling it a regime might suggest more coherence than is actually the case).

By the time the twentieth century ended, and the internet was opened to the public, efforts to actively shape Canadian culture into some prescribed form had largely been abandoned. Instead of creating a particular national identity, or telling a national narrative, the concern shifted to supporting CanCon creators and ‘telling Canadian stories’, whether those stories were Trailer Park Boys or Anne of Green Gables. However, this meant that justifications for government’s role in promoting Canadian culture often rested on fairly thin grounds — attracting or retaining cultural industries, or making sure the characters we saw in popular culture were ones we could relate to. In the 1990s, sweeping discussions of internet policy (such as the Information Highway Advisory Council reports) were still based on the assumptions of cultural protectionism — that we should find ways to promote and protect Canadian culture on the “information highway”, since international media flows were a threat to cultural sovereignty. In the end (and unsurprisingly, given its composition), IHAC was divided on this topic, and its final report didn’t support an ISP tax or any radical measures like a cultural firewall. Subsequent discussions of online cultural policy have been more limited, including debates over tariffs, copyright legislation, and how the CRTC should classify over-the-top services like Netflix.

Online services have largely avoided being subject to CanCon regulation, and according to Canada’s pollsters, the country’s population isn’t keen to change this (and this is especially true for people under 35). We’ve gotten quite used to our tax-free Netflix, and extending the scope of internet regulation is not an easy sell. I think many Canadians have no idea this whole world of content regulation even exists — I certainly had no clue until I started volunteering at the campus radio station, and even then it’s not like anyone sat me down to explain the purpose of this system.

Canadian culture, like telecom, is a domain of public policy that is governed by several (sometimes overlapping) federal departments. The Department of Canadian Heritage is largely responsible for culture, but the CRTC regulates telecom and broadcasting in Canada, administering their respective acts. This puts the Commission in charge of two rather different sets of priorities for what is often the same media infrastructure, one of which includes cultural promotion and protection.

In an age when telecom meant telephones, and broadcasting meant television and radio, the two really did appear to be distinct categories, but that appearance has faded, and so there have been numerous calls to somehow revise or consolidate these mandates. As former CRTC Commissioner Denton writes, there is a contradiction between the two statutes: “The Broadcasting Act says ‘go forth and discriminate in favour of Canadian programming’. The Telecommunications Act says ‘thou shalt not discriminate among signals except for very good reason'”. One ensures that intermediaries do not give preferential treatment to content without good reason, while the other sustains the privilege of a particular class of content — Canadian content (CanCon). Integrating cultural and telecom policy would be no easy feat, and would bring about some dramatic changes depending on what we wanted to prioritize.

CanCon requirements for broadcasters were recently reduced and refocused, and these obligations have not been extended to online providers. The spectre of such a move (presented by critics as a ‘Netflix tax’) is something politicians in this country have generally fought against rather than championed. In the last federal election, Stephen Harper presented himself as just a regular, Netflix-loving dude, and also the only thing standing between Canadians and higher monthly bills.

Under the Harper government, it was clear that the CRTC was powerless to go after Netflix even if it had wanted to. When the company defied the Commission in 2014, all Chairman Blais could do was act upset and offended, with Netflix effectively calling his formal authority a bluff. The Liberals have yet to fulfil the Harper prophecy of regulating Netflix, but Canadian media policy has recently been opened up to public consultation (and some quasi-public discussions), with our Heritage Minister declaring that “everything is on the table”.

In general, the government might choose to extend Canadian cultural policy online, or it might pull back the CanCon regime. We might even see a bit of both, though certainly nothing like a cultural firewall around a sovereign Canadian internet. The Government of Canada recognizes that “The way forward is not attempting to regulate content on the Internet”, and consultations are focused on how to support the production of Canadian culture in a “digital world”, with the real questions being who will benefit from this support, and how we are going to fund it.

Personally, I’m not opposed to public funding for culture, but I also don’t see it as a requirement in many cases. I value broadcasters like the CBC for helping me understand Canadian society, primarily through news and documentary rather than dramatic or comedy series (in that respect there isn’t much of CBC TV that I would be sad to see go, while there are still a number of good radio programs). Many kinds of art are not capital-intensive, and will be produced whether or not there are government programs in place. The fact that I came up in a thriving Canadian music culture operating largely independently of copyright or cultural funding no doubt shapes my thinking in this regard (a story I might share another time). The nightmare scenario for me is not about losing out to other cultural markets, fewer jobs in Canada’s cultural industries, or artists no longer being able to sustain careers as they once could. Culture is dynamic, and the most exciting forms come from below rather than from the top down. Cultural protectionism makes the most sense if we think of culture as expensive mass media, individuals as consumers of culture, and U.S. cultural industries as a threat to Canada’s cultural sovereignty. But artists will make art everywhere, some of these artists will be Canadian, and we may or may not end up with some sense of Canadian identity as a result.

I am more worried about the possibility that living in an information-rich world will also mean being ignorant about local events, and no one being rewarded for answering the sorts of questions that powerful interests in this country would rather not hear asked. Perhaps a public broadcaster can be well-resourced and independent enough to play this role, but in an ideal world this wouldn’t just be the CBC’s responsibility. I rely on journalism and related media that tell me what is happening in the world in order to actively participate in democracy. I rely on it to do my job (which often involves classroom discussions of Canadian society). I can do without other kinds of CanCon.

I’m also in support of public funding for indigenous cultural programs. For most of Canada’s history the state has tried to eradicate indigenous culture, systematically resocializing children in residential schools, banning ceremonies, and leaving behind broken communities and cultural dead-ends. The damage done by this cultural policy is hard to calculate and still ongoing. Its victims include young indigenous people who are unable to situate themselves in Canadian society because it does not speak to them, but who also lack a cultural understanding of their own because it was extinguished in previous generations. Some of this damage is irreversible, but in many cases knowledge, practices, and culture can be recovered, preserved, and kept alive. The least a Canadian cultural policy could do is try to address some of these wrongs and support efforts within First Nations communities to meet these cultural needs.

These are my opinions, so let the government know what you think before November 25. Otherwise, the voices that may be heard loudest are the ones that are most invested in the existing (and often declining) regime of cultural production.

On Standards

I’ve been thinking about standards and telecom, or more specifically, the process of standardization. Recently, I read an article by Timmermans and Epstein that tries to advance a “Sociology of Standards and Standardization”. As the authors explain, there is a great deal of sociology that deals with standards in different domains, or as part of other processes, such as classification, quantification, and regulation. But “relatively few scholars analyze standards directly” (p. 74) — for instance, by studying standardization as a social phenomenon.

Drawing significantly from Bowker and Star, Timmermans and Epstein define standardization as “a process of constructing uniformities across time and space, through the generation of agreed-upon rules… [making] things work together over distance or heterogeneous metrics” (p. 71). Standards can coordinate “people and things in ways that would be difficult to achieve on an ad hoc basis, they may allow communication between incompatible systems, and they may create specific kinds of mobility, uniformity, precision, objectivity, universality, and calculability” (p. 83). While we often aspire to (have) standards, the authors point out how standardization typically carries negative connotations of uniformity and “dull sameness” (p. 71). And yet, we only have to look to the internet to see the vast creativity and heterogeneity that has been enabled through standardization. Diverse networks, systems and devices can communicate with one another because they agree on basic standards and protocols. If the experiences we have through the internet are trending towards uniformity and sameness, this says more about the concentration of power in certain platforms, algorithms, and service providers than about standardization.
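The enabling side of standardization is easy to demonstrate. In the sketch below, a few lines of Python retrieve a page from a server the author has never coordinated with, simply because both ends implement the same standards (TCP, and HTTP/1.1 as specified in the IETF’s RFCs):

```python
import socket

# TCP gives us a reliable byte stream; HTTP/1.1 tells both ends what the
# bytes mean. No bilateral agreement between the parties is required.
sock = socket.create_connection(("example.com", 80))
sock.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
response = b""
while chunk := sock.recv(4096):
    response += chunk
sock.close()
print(response.split(b"\r\n")[0].decode())  # e.g. "HTTP/1.1 200 OK"
```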

Timmermans and Epstein’s article doesn’t discuss the internet, but scholars of internet governance have often focused on standards as the internet’s core. Inspired by Deleuze’s (1992) Postscript on the Societies of Control, Galloway’s (2004) Protocol grapples with the contradictions of internet standards being forms of power and control, while also facilitating autonomy, decentralization, and local decision-making. The protocols Galloway is interested in are the “standards governing the implementation of specific technologies” (p. 7), and he holds up DNS as the “most heroic of human projects” (p. 50). Galloway goes some way in advancing social theory on the basis of standards, painting a complex picture, but one in which the “full potential” (p. 122) of protocol is restricted, and channelled instead by law, government and corporate power. He ultimately envisions an even darker future where open-source standards and TCP/IP are replaced by something more proprietary, under the control of either states or a corporation (which in 2004 was naturally assumed to be Microsoft).

While corporate and state power over internet policy has intensified in the intervening years, and old principles like end-to-end architecture sound increasingly idealistic, the internet’s established standards-making bodies continue their work, and often do so in the open. In general, internet standards are voluntary, and internet protocols work because networks agree to use them. The IETF acts as a key standards-making organization, where individuals (sometimes employees of rival companies) collaborate to develop new proposals for improving how the internet operates. Galloway (2004, p. 122) paints a picture of a “technocratic elite [that] toils away, mostly voluntarily, in an effort to hammer out solutions to advancements in technology”.

The technocrats of the IETF toiling away

Anyone can participate in this “technocratic elite”, and at the IETF your membership is defined by your participation. But because of the technical understanding required, membership tends to be limited to a particular social class. Meaningful participation also requires a time commitment, and so bodies like the IETF often see the greatest participation from individuals with employers who are willing to support their activities. Organizations can benefit from being part of the standards-making process, but participation in the IETF is on an individual basis (with individuals often disclosing their organizational affiliations).

The IETF produces RFCs, but has no power to compel anyone to adopt these standards. Many are ignored, or (like IPv6, which Laura DeNardis wrote about in 2009) are only slowly implemented long after the problem they are meant to solve is well-known. So the actual making of standards is just one aspect of standardization, and achieving voluntary adoption is actually a bigger challenge, particularly when things seem to work ‘good enough’ as they are.
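You can get a rough sense of this slow, voluntary adoption yourself by checking which hosts publish IPv6 addresses. A small sketch (the host list is arbitrary, and results will vary by network and date):

```python
import socket

def has_ipv6(hostname: str) -> bool:
    """Return True if the host publishes at least one IPv6 (AAAA) record."""
    try:
        socket.getaddrinfo(hostname, None, socket.AF_INET6)
        return True
    except socket.gaierror:
        return False

for host in ["google.com", "example.com", "canada.ca"]:
    print(f"{host}: {'IPv6' if has_ipv6(host) else 'IPv4 only'}")
```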

While government agencies have relatively little impact on internet standards, they do produce various kinds of standards for service providers operating within their territory. These may be voluntary ‘best practices’ or backed by regulatory law. Sometimes companies ‘voluntarily’ standardize their conduct, under threat of government regulation. Standardization can even be pursued through criminal law, as when previous Liberal and Conservative governments in Canada tried to pass lawful access legislation, standardizing surveillance and disclosure responsibilities for intermediaries. More recently, standardization has become a frequently-suggested means of addressing cyber security, but I’ll save these topics for a subsequent post.

Canada’s Cyber Security Seeks Public Input — Here’s Mine

The Government of Canada is carrying out a public consultation on cyber security. Specifically, the consultation is being administered by Public Safety Canada’s National Cyber Security Directorate (NCSD). NCSD’s role is sometimes described as cyber policy and coordination, such as designing and implementing Canada’s Cyber Security Strategy, and the consultation asks for the public’s help in addressing some really thorny cyber security challenges.

On its face, it’s hard to know what to make of this consultation. PSC/NCSD wants to hear from “experts, academics, business leaders, and provincial, territorial and municipal governments” on the topic, but they also want “all citizens to get involved in a discussion about the security and economic dimensions of Canada’s digital future.” There are four main topics the government is consulting on, and a workbook has also been created to accompany the process. The workbook breaks the consultation down into trends, themes, and related questions for consideration, but the contents seem designed to steer answers in particular directions, and the one topic that doesn’t include any specific questions is Canada’s “way forward”, the outlines of which seem to have already been decided.

Some of the questions in the workbook are ones that I imagine Government would love an innovative answer for (How can public and private sector organizations help protect themselves from cybercrime… and what tools do they need to do so?), while others seem loaded to produce a particular response (with “example” answers provided). I only hope that the responses to this consultation won’t be quantified as statistics (since this isn’t a methodologically-sound survey), or used to support decisions that have already been made. So let’s give them the benefit of the doubt and assume that NCSD really does want some help from Canadians in dealing with one of society’s most important challenges, and they’re open to all sorts of ideas.

To that end, I’ve provided my response to the consultation’s four “topic areas”:

The Evolution of the Cyber Threat
I think a lot of this has been covered in broad strokes by Canada’s Cyber Security Strategy and related documents. The threat has certainly evolved, in terms of actors, motives, and potential harm. State actors are increasingly involved around the world, and there are dedicated industries of criminals profiting from vulnerabilities. The most interesting way the cyber threat has evolved in recent years is the recognition of the Five Eyes (Canada’s alliance with the US, UK, Australia and New Zealand) as a security threat. This recognition has certainly not come from the Canadian government, or even much of the Canadian population (as we have yet to really talk about this issue). Instead, the changing nature of the threat has been expressed most publicly by the likes of Microsoft and Google, after they learned through the Snowden documents that the Five Eyes were compromising their infrastructure and the relationships of trust these companies have established with their users.

The Increasing Economic Significance of Cyber Security
I don’t consider this to be much of a topic in need of public consultation, since it seems like Public Safety is already aware that cyber security is vital to the economy. It’s hard to put a dollar value on security, but it’s pretty obvious that the value of maintaining information security and the “losses” that result from various kinds of threats are enormous. Huge numbers are estimated and cited to justify the need for cyber security, and I’m not sure that we need more accurate numbers (since we know they’re big), or that bigger numbers will compel action. We can talk about how better to communicate the seriousness of the issue, but I’m more interested in finding perspectives other than the economic lens to talk about threats. Government ideas about the value of the internet in Canada too often lapse into talk of the “digital economy”, and harms that don’t involve children are often expressed in economic terms. As people like Ron Deibert point out, we need to think more about the democratic/political dimensions of cyber security. This means articulating the value of connectivity in a way that doesn’t translate into dollars, but instead relates to our values as Canadians (like those “rights and freedoms” mentioned at the end of the workbook).

The Expanding Frontiers of Cyber Security
While the workbook discusses this in terms of the need for “cyber security [to] evolve at the same rate as new technologies” (p. 17), I want to use this topic to discuss the expanding scope of cyber security.

The workbook defines cyber security as “the protection of digital information and the infrastructure on which it resides. Cyber security addresses the challenges and threats of cyberspace in order to secure the benefits and opportunities of digital life” (p. 5). The first part of this definition is relatively straightforward, and encompasses the domain of IT security. However, cyber security is not limited to these concerns, and Canada’s closest allies have used the language of cyber security to justify creating and preserving technological vulnerabilities in the service of strategic objectives. Meanwhile, it seems that Public Safety Canada considers “threats of cyberspace” to include more than just threats to digital information and infrastructure.

Internationally, cyber security now includes a variety of concerns, including concerns over public order and morality. For instance, in Canada cyberbullying is sometimes listed as a cyber security threat alongside phishing and malware (particularly in Get Cyber Safe resources). Cyberbullying can certainly involve personal information being compromised, but it can also refer to the hateful and abusive comments found in many online media. The danger is that cyber security can be equated with online “safety”, which can mean safety from content that might insult, harm, or disturb.

The more concerning expansion of cyber security is as a justification for whatever actions serve national security or the priorities of state agencies. This is a worry because the goal of some state “partners” in cyber security is not to provide the public with the most secure technologies. In the US, for instance, secret efforts to make commercial technologies (the same technologies widely used by Canadians) more vulnerable and less secure were justified as part of an ostensibly defensive cyber security program (the CNCI). As discussed below, there is no reason to believe that Canadian agencies are an exception to the tendencies demonstrated by their closest international allies in cyber security.

One of the few things that all cyber security threats have in common is that they involve a computer or digital networks. Since we are supposedly moving towards a world covered in networked computers, the potential for cyber security’s expansion is a major cause for concern. I feel a lot more comfortable talking about information (IT), network, or computer security, because at least there the subject matter is relatively well defined. Cyber security is more of a mixed bag, and I hope that the Government of Canada will keep the expansionist tendency of cyber security in check. Focus on the threats we know and are having difficulty defending against, don’t go looking for new forms of troublesome conduct involving a computer that can be listed as a cyber security threat, and let’s talk about whether the government’s idea of cyber security includes purposefully maintaining certain kinds of insecurity.

Canada’s Way Forward on Cyber Security
As part of Canada’s way forward, we need to take an explicit position on the extent to which we want to promote information/IT security at the expense of other conceptions of security, particularly those favored by police and national security agencies. It seems disingenuous to promote the security of information and infrastructure without acknowledging the limits government agencies would place on such developments. Police in Canada and around the world are well aware of this conflict, particularly after the Snowden revelations led to widespread adoption of more secure technologies, which are now an obstacle to their ability to investigate crime. The showdown between Apple and the FBI is a recent manifestation of this tension, and Canada should not simply sit on the sidelines and wait for these new “crypto wars” to play out in the US and Europe.

We also need to discuss our membership in the Five Eyes, because Canadians have never had a real opportunity to do so. Predicated on a secret treaty, the Five Eyes often acts as a coordinated group and an exclusive club, supposedly based on its members’ “common Anglo-Saxon culture, accepted liberal democratic values and complementary national interests”. Originally formed to further intelligence collection and the sharing of information in the interests of national security, today the Five Eyes also includes collaboration of a more defensive nature in the realm of cyber security. We know that Canada’s membership in the Five Eyes can be a privacy threat to Canadians, because of last year’s revelation that CSE had for years violated the law by sharing Canadians’ personal information with these allies. We know that the Five Eyes can pose a security threat to our information infrastructure, because of documents revealed by Edward Snowden showing how the NSA worked to weaken the security of commonly-used systems in order to more easily obtain intelligence (efforts in which Canada appears to have been complicit).

In the US, the Snowden disclosures resulted in the President’s Review Group on Intelligence and Communications Technologies recommending the separation of the NSA’s offensive and defensive roles, through the creation of a new agency to take over the NSA’s defensive “information assurance” mission. Canada has yet to acknowledge the contradiction at the heart of the Five Eyes – where government agencies work simultaneously (or at cross-purposes) to both secure infrastructure and make it more vulnerable. In the US, the NSA is currently merging its offensive and defensive capabilities. This reorganization contradicts the recommendations of the President’s Review Group and strains trust with non-government partners, but it is at least being openly acknowledged and discussed. In Canada, a similar process of merging offensive and defensive capabilities may very well be underway at CSE, but this is just what we can deduce from five-year-old Snowden documents, and the government’s position on this topic is limited to CSE’s statements about the same news story.

Can the Canadian government be a trusted partner in cyber security when it has never even acknowledged its role (or the conduct of its closest allies) in making information infrastructure less secure? Is it permissible to have one cyber security agency (CCIRC) responding to threats and vulnerabilities, some of which may have been created or kept secret by CSE and its Five Eyes allies? These are not hypothetical questions — just last week CCIRC issued an advisory to correct a vulnerability that the NSA had likely exploited for over a decade. If the attributions of security experts are correct, this means that the Canadian public is being notified about a security vulnerability that was kept secret and exploited by our closest cyber security ally, and we are learning about it through foreign actors whose motivations are unknown, but presumably do not include a desire to make our infrastructure more resilient.

Certainly, most Canadians have more to fear from more mundane threats, like phishing, ransomware, and others listed as part of the government’s consultation. But I wanted to focus on the Five Eyes because these are precisely the sorts of blind spots that need to be uncovered through public consultation. If government agencies will not acknowledge this threat, either because of secrecy or the failure to recognize what those outside government perceive, then it becomes the responsibility of Canadians to point out how the government’s version of reality is different than the one we are reading about in the news. However, at that point we are no longer having a shared discussion of cyber security, but two parallel discussions, with very different ideas of what constitutes a cyber threat.

These tensions at the heart of cyber security are not going anywhere, but by acknowledging them, the Government of Canada can at least take an explicit policy position, rather than the implicit one we can deduce from its former conduct. The Government of Canada has already taken the historic step of suspending metadata sharing with the Five Eyes until it is confident that this no longer threatens the privacy of Canadians. Before Canada resumes its full participation in a secretive alliance that works to both strengthen and weaken the security of systems we depend on, we need a stated position on such conduct. Specifically, are security vulnerabilities ever acceptable or desirable? Is it ever appropriate for government agencies such as CSE and the RCMP to use vulnerabilities that might otherwise be disclosed and corrected? What should we do when our closest cyber security allies are repeatedly found exploiting vulnerabilities and weakening security?

In response to the last of these questions, I would answer that Canada needs to either openly declare its support for government efforts to compromise security, including any limits or conditions, or it needs to publicly oppose these efforts. Only by working to strengthen IT security against all threats can the Government of Canada be a trusted partner in cyber security. To take no position at all by failing to acknowledge the issue is untenable, will weaken trust in government, and will continue the post-Snowden bifurcation of security into two separate discussions — one that includes government as a partner and one that does not.

Watching Six Years of the Regulatory Blockbuster

I spent a weekend re-watching videos of the “Regulatory Blockbuster” from the yearly Telecom Summit. The Summit is a major industry get-together, taking place over a couple of days in a Toronto convention center, with presentations, networking and deal-making opportunities for significant players in the telecom industry. I’ve only been able to attend once, but luckily if you’re interested in public policy the most interesting event at the Summit can be streamed on CPAC, where you can watch the last six years of the Regulatory Blockbuster.

The Blockbuster features an hour and a half of telecom industry lawyers (typically from incumbents Bell, TELUS, Rogers, a smaller provider or two such as WIND or TekSavvy, and John Lawford from PIAC) discussing the regulatory issues of the day. In previous years, each participant would get a few minutes at the outset to present what they thought were the most important regulatory topics, followed by questions from the moderator (cartt.ca‘s Greg O’Brien) or the audience. Sometimes the discussion gets a little heated, and it’s worth remembering that the people on the stage can be embroiled in disputes with one another at the CRTC or before the courts. It’s common for participants to point out the self-serving nature of rivals’ arguments, to allege hypocrisy or inconsistency, and to present themselves as disadvantaged victims of government regulation (or lack thereof).

As an observer, it helps to understand the underlying conflicts and regulatory proceedings being discussed. However, even without knowing the nuances of CRTC procedure or regulations, the Blockbuster provides a sense of what kinds of issues are keeping industry lawyers occupied. It’s also an opportunity for participants to air their complaints with the existing regulatory regime.

The Mandated Access Regime

Over the previous five years, the dominant issue at the Blockbuster has been how government regulates relationships between competitors in the industry, specifically through mandated access to incumbent facilities and wholesale connectivity. Other regulatory issues come and go as they pass on and off the federal government’s and CRTC’s agendas — lawful access, spectrum auctions, reviews of basic services. But mandated access has endured and expanded since the late 1990s, causing no end of complaints from both incumbents and the smaller competitors it is intended to benefit. In 2010, participants in the Blockbuster offered some analogies for how we might understand obligations under the regime. One likened it to a system in which airlines must reserve a certain number of seats for passengers of competing airlines, or parcel delivery companies are obliged to deliver the parcels of smaller competitors. In Canadian telecom, these obligations generally mean that the large incumbents (including Rogers, Bell, Shaw, TELUS) must allow “independents” (TekSavvy, Distributel, and many other smaller players) to use incumbent infrastructure and to purchase wholesale connectivity at set rates. These rates are meant to ensure that incumbents can profit from this arrangement, but the result is a system where small providers depend on large providers, and both compete for the same customers.

The conflicts that result are quite predictable. Small players argue that wholesale rates are too high for them to compete or expand their business, while large players argue the rates are just right, too low, or that mandated wholesale should be eliminated. Because the so-called independents are actually highly dependent on incumbent infrastructure, they must rely on their larger competitors to connect customers and resolve technical issues, such as network outages. Incumbents are therefore obliged to help their smaller competitors address customer concerns, and complainants at the CRTC have argued that incumbents treat competitors’ customers differently than their own.

From the outside, the whole setup looks ridiculous — as if it was designed to impose contradictory pressures and inevitable conflict amongst industry players (as well as endless proceedings before the CRTC). But to understand this regulatory regime, we need to consider that it was intended as a temporary framework to deliver us to the mythical land of facilities-based competition.

This image from the City of Calgary’s November 28, 2014 presentation to the CRTC speculates what a future of competing fibre facilities would look like

Facilities-based competition remains a myth because the world it envisions has never been clearly spelled out. Instead, facilities-based competition reflects both the persistent drive to create something resembling a competitive market in Canadian telecom following the monopoly era, and a rejection of the sort of structural (and functional) separation practiced in other parts of the world (most notably, large parts of Europe). Facilities-based competition means a telecom marketplace populated by competing networks (facilities): the Bell network competing with Rogers, TELUS, Shaw, and whoever else can afford to build telecom infrastructure. It has never been clear just how many competing networks there should be (with the exception of wireless, where the previous government seemed committed to bringing about four national competitors). However, while incumbent participants at the Blockbuster love to emphasize just how hard they compete with one another, the CRTC has repeatedly indicated that the current state of competition leaves a lot to be desired. Although Canada has hundreds of service providers, their facilities often do not overlap. Incumbents are sometimes classified as operating either inside or outside of their “territory”, and are reluctant to “overbuild” facilities where these already exist in a competitor’s territory (hence, Bell and TELUS have been repeatedly criticized at the Blockbuster for “sharing” facilities in their respective territories). Smaller competitors have sometimes wondered just how many competing wires the world of facilities-based competition imagines going into each home, and where the money to build all of these competing wires is meant to come from.

The CRTC has tried to address the inadequate state of competition in Canadian telecom through the mandated wholesale regime. The original idea (known as the stepping-stone or ladder-of-investment theory) was that small competitors could use the facilities of incumbents until they grew to have competing facilities of their own. Once some adequate number of competing facilities had flowered, the hand of regulation could fall away, and the market would take care of the rest. However, this never happened.

Instead, mandated access seems to be here to stay, and regulators talk a lot less about facilities-based competition than they used to.

The 2016 Telecom Summit

You can see the changing view of the mandated access regime through the past six years of the Blockbuster. By this year’s event (concluded earlier this month), the legitimacy of the regime was hardly raised as an issue (although Ted Woodhead from TELUS did remind everyone that the job of the CRTC had been to promote facilities-based competition, and that’s what “got us to being a leading broadband nation in the world”). A somewhat bigger concern was whether the CRTC was flouting the “law of the land” by effectively ignoring the 2006 Policy Direction — a document that was in many ways the high-water mark for the idea of facilities-based competition. There’s some dissonance in a regulator that has to justify its actions with reference to a document from a previous era in policy. Since the Policy Direction still stands, every decision the Commission takes is haunted by the ghost of Maxime Bernier reminding Canadians that they live in “a capitalist country, a country of freedom, and that regulation must be as limited as possible, to allow market forces to play out, particularly in telecommunications.”

Since 2006, we’ve seen a decade of continued mandated access, and a gradual acceptance of the fact that this regulatory approach is here to stay, even if we’re not clear on what the outcome is meant to look like. The recent expansion of mandated access to fibre seems to aim for a world of competing “middle-mile” networks, since the CRTC recognized that competitors “cannot feasibly or practically duplicate” last-mile wired networks (the part of the network that physically runs into your home).

I should note that the ghost of Maxime Bernier haunting the CRTC is just the imprint of his time as Minister of Industry between 2006 and 2007. The man himself is very much alive, seeking the leadership of the Conservative Party, and also spoke at the 2016 Telecom Summit. There, he lamented that the CRTC “seemed to take the Policy Direction seriously for a few years” before it “reverted back to its old ways”. Echoing incumbent positions at the Blockbuster (and deploying the wisdom of Ronald Reagan), Bernier asserted that the CRTC had failed to recognize just how much competition there was in Canadian telecom, which led him to conclude that the Commission should get out of telecom regulation altogether.

At the 2016 Regulatory Blockbuster, there were no calls for the CRTC to get out of regulating telecom competition and wholesale access, but incumbent participants gave their usual warnings about the harms of regulation, and much of the discussion was about what the role of the CRTC should be in these times. The first set of opinions concerned Chairman Blais’s remarkable statements about digital strategy during the Basic Services hearing. Then (after a suggestion for CRTC procedural reform floated by Mirko Bibic), discussion turned to the Commission’s relationship to industry and the public. Incumbents expressed the desire for a better way to sit down and talk with the CRTC, and even PIAC’s John Lawford voiced agreement that things had gotten out of hand in recent hearings, with so many diverse voices pulling the discussion every which way. The Commission has tried to do a better job of including the public, and recently numerous people have been engaging with the process for the first time. Admittedly, hearings would run more smoothly if there were a single voice speaking for the public interest, but that’s not the direction things are headed.

The rest of the time was spent discussing those topics that come to the fore depending on the regulatory cycle and the whims of politicians. The biggest of these was the Basic Services review (and how to fill various gaps in connectivity), but Quebec’s Bill 74 also came up for discussion. While most Canadians haven’t heard of this issue, telecom lawyers are seriously worried about what it means for a province to block websites in order to maintain control over gambling.

Conclusion

So what do you learn from watching close to ten hours of Canadian telecom lawyers on a stage? First, for someone who tries to study changes in telecom policy, the archive of these videos is a very valuable resource, and I’m grateful to the Summit organizers, CPAC, and the participants who put themselves up there each year.

Second, some new regulatory issues come into play at each Blockbuster, and some things stay the same. Facilities-based incumbents are going to keep advocating for facilities-based competition, but in 2016 this means pointing back to a previous era in telecom policy. Incumbent representatives at the Blockbuster like to fondly remember previous iterations of telecom regulation (remember when government said it would let the market sort things out?), because today’s regulatory environment seems more hostile and just plain confusing.

What was once meant to be a temporary scaffold (mandated access) has become an enduring regime. Facilities-based competition was once the goal of regulatory liberalization, but at the CRTC it has now either shifted in meaning (from the last mile to the middle mile), or describes some competitive ideal that will always be out of reach. Since there seems to be no appetite for getting rid of mandated access regulation on the one hand, or for doing away with the goal of competing private networks on the other, this ambiguity seems set to continue for a long time.

Cabinet Rejects Bell’s Wholesale Appeal

Today, we learned what the Government of Canada thinks about Bell’s petition to overturn Telecom Regulatory Policy CRTC 2015-326, which will open fibre networks to wholesale access. I’m not sure anyone is surprised by this decision, since there were no indications that the Liberal cabinet (namely, Navdeep Bains, Minister of Innovation, Science and Economic Development) was predisposed to favour Bell’s position. In fact, there hasn’t been much indication of what the Liberal government’s stance is on telecom policy, or how it differs from that of the previous government. As a result, many are looking at this decision as a “first hint” of what to expect.

So, let me join the speculation about what this 200-word government statement really means:

First, cabinet recognizes that “wholesale broadband is a proven regulatory tool for enabling retail competition in the Internet service market”. This aligns with the increased legitimacy granted to wholesale access by the previous government, along with the CRTC’s decisions in recent years. The wholesale access regime is no longer imagined as some temporary stepping stone to facilities-based competition; mandated wholesale is here to stay. If the CRTC wants to focus the scope of facilities-based competition on the middle-mile, that’s fine, but this government values retail competition and consumer choice.

This government also seems to be playing it safe and leaving its options open. Supporting the CRTC is the default choice for cabinet, and there’s no strong reason or principled policy here for doing otherwise. The language used by the Minister echoes the Conservatives’ consumer-focused telecom populism, but it also indicates that the government’s telecom policy boat is maintaining its current heading. If this continues, the Liberals could simply avoid leaving their mark on telecom policy and manage the file according to a familiar pattern: espousing the importance of competition, supporting access to incumbent facilities, and distributing one-time injections of funding to individual broadband projects.

The other option would be for the Liberals to do something distinctive, which is probably what CRTC Chairman Blais was hoping for when he brought up the lack of a broadband policy in this country. There’s still no reason for me to believe that any distinctive digital policy is in the works, and if one is, it will likely be a long time coming, as the Liberals already have plenty on their plate. In the short term, the Bell-MTS deal could be another opportunity for the government to spell out what its vision of a competitive telecom industry looks like. However, my guess is that we will learn more from the government’s decision in that deal than from whatever brief statement accompanies it.


Essential Broadband


The CRTC is currently in the late stages of its review of basic telecom services, intended to “examine which telecommunications services Canadians require to participate meaningfully in the digital economy and the Commission’s role in ensuring the availability of affordable basic telecommunications services to all Canadians”. This review has been proceeding through written submissions for the past year, but is currently wrapping up the public hearings phase. You can watch these on CPAC through the video archives, or read transcripts of the presentations and the back-and-forth with the Commissioners.

What is all of this about?

Given the scope of the review, this is not an easy question to answer. First of all, it has become blindingly obvious that some level of internet access is required to “participate meaningfully” in society. This “self-evident truth” was expressed by CRTC Chairman Blais early on in the hearings. The question of whether broadband is a “want” or a “need” has shifted to more detailed questions around what sorts of minimum speeds (or other performance indicators) are needed, or what kinds of networks Canadians require. Should obligations to provide a certain level of connectivity be imposed on some intermediaries, or can we make do with “aspirational targets”? If obligations are imposed, who should be obliged, where, and to what standard? How much will it cost, and who should pay for it?

There’s been a lot of talk during the hearings about reaching those populations who face persistent challenges, including rural pockets that have been bypassed by the spread of connectivity. Connectivity for low-income populations has also been discussed repeatedly, since the digital divide carves through urban areas as well as the countryside. Surprisingly, digital literacy keeps coming up in questions from the Commissioners, an area that has rarely been a focus for providers beyond their support for MediaSmarts. All of this is interesting because the long-standing criticism of the digital divide concept was that it was overly concerned with the technical provision of access, and failed to consider the social obstacles, such as skills (digital literacy) and affordability (cost). Well, the CRTC is certainly thinking about these things, but actually regulating in these areas would be something new for the Commission.

Perhaps the most remarkable thing about the hearings has been the diversity of the participants. Speakers have included major and minor connectivity providers, as well as other stakeholders. Since the ultimate stake is connectivity for the nation, the entirety of Canadian society is effectively a stakeholder, and written submissions have come from far and wide. The CRTC has agreed to hear presentations from advocacy groups, consumers, campaigners, policy wonks, not-for-profits, and populations at the thin edges of our networks. Some of these participants have appealed for very broad government interventions, and been pressed by Commissioners to comment on specific broadband targets or implementation strategies that the CRTC might actually have a role in.

Given my Alberta roots, it was especially interesting to see Axia’s Art Price present his regulatory vision, which understandably coincides with the business model the company is already pursuing in Alberta. Alberta’s SuperNet was held up as a model for the sort of “community interconnect grid” that could be pursued elsewhere. During the question-and-answer, Price noted the provincial government’s current lack of attention to the issue, and sidestepped the question of what happens when a backbone is built but no one steps up for the last mile. Cybera’s presentation earlier today led to a more mixed view of SuperNet through the questioning of Commissioner Vennard, who has some experience with the history of this project.

It’s also been good to get a chance to hear from some of the hundreds of intermediaries scattered across the country, including ILECs, SILECs, IISPs, WISPs, cablecos, satellitecos, non-profits, regional networks, and co-operatives. I’ve tried to get a good sense of the diversity of these institutions through my research, but there are still plenty of smaller ones out there that I’m obviously not aware of (like Chebucto Community Net). The incumbents and their facilities may be key to anything that results from this proceeding (because that is where new targets and obligations really matter), but it’s important not to overlook these more local institutions that have their own particular perspectives.

One remarkable part of the hearings was CRTC Chairman Blais’s address on April 18, in which he stated that the review might be the “last best chance to get it right – a chance to create, together, a coherent national broadband strategy”, and that the CRTC would be “taking some leadership on defining the strategy”. This is the sort of leadership that has long been lacking from the federal government, and indicates a role for the CRTC beyond simply tweaking existing rates, incentives, and obligations.

So where will all of this lead?

The range of actions the CRTC could decide to take (after the Commissioners have time to digest the whole process) is nearly as broad as the scope of the review. There has been some discussion online about what authority the CRTC could use to impose obligations for new networks, but various models for a way forward have been proposed by participants in the process, and any decision by the CRTC can generate years of dispute about its basis in regulatory law. The CRTC could also do nothing at all, and may feel like it has little ability to address these problems. After all, the Commission can’t fund the infrastructure itself, or ask the federal government to do so. The CRTC gets to set the rules under which intermediaries operate, through obligations and incentives, and it has never been the role of the Chairman to develop a “digital strategy” for the nation.

While we probably won’t end up with a government-funded open-access national fibre backbone, a new crown corporation, or obligations for incumbents to extend fibre across Canada’s north, it does seem that the CRTC will at least do something that looks significant. Given the comments of the Chair, and the Commissioners’ demonstrated understanding and recognition of connectivity problems, continuing with the status quo doesn’t seem to be an option. There will have to be a move that promises to address at least some of the remaining technical (territorial) gaps in connectivity. However, any truly ambitious action here will mean the CRTC carving out a new role for itself, and without Cabinet support, a new national strategy or new leadership role just doesn’t seem that likely.

Telecom Responsibilization: Internet Governance, Surveillance, and New Roles for Intermediaries

I’ve just had my most recent article published in the Canadian Journal of Communication. From the abstract:

This article foregrounds internet intermediaries as a class of actors central to many governance and surveillance strategies, and provides an overview of their emerging roles and responsibilities. While the growth of the internet has created challenges for state actors, state priorities have been unfolded onto the private institutions that provide many of the internet’s services. This article elaborates responsibilization strategies implicating internet intermediaries, and the goals that these actors can be aligned toward. These include enrolling telecom service providers in law enforcement and national security-oriented surveillance programs, as well as strategies to responsibilize service providers as copyright enforcers. But state interests are also responsive to pressures from civil society, so that “internet values” are increasingly channelled through the formal political processes shaping internet governance.

This particular work took more time and revision than anything else I’ve had appear in print. I began working on it prior to my PhD research (and before Snowden); it germinated in a conversation I had with my supervisor. I was trying to explain some of my interests in how intermediaries end up serving state surveillance and security objectives, and how “deputization” didn’t seem to be an adequate way of describing the process. He proposed I look at the notion of “responsibilization”, even if what I was describing ran counter to some of the neoliberal logic often associated with the concept.

In the end, the article became a way for me to engage and disengage with different theoretical commitments, while working through some particular cases of intermediary obligations that I was interested in (graduated response, lawful access, interconnection). I’m using the piece as a way to talk about something that many people have pointed out: the importance of intermediaries in contemporary power relations. However, my focus is not just on the power that these companies have over our lives, but the potential for intermediaries to become instruments of power. This leads numerous actors (state and non-state), with particular visions of how to shape or order society, to treat intermediaries as “points of control” (Zittrain, 2003).

The idea of responsibilization is a useful way to understand certain relationships between state and private actors, but it is a concept that deserves some elaboration and careful qualification. Responsibilization has frequently been presented as an aspect of neoliberal governance, corresponding with an emphasis on individual responsibility for one’s conduct and well-being, and the increased involvement of private actors in domains that were previously a responsibility of the state (Burchell, 1996, p. 29). Under this definition, the state’s enlistment, partnering with, or outright deputizing of intermediaries can be seen as a way to devolve state responsibilities and regulatory powers onto private actors. Yet there is nothing particularly new about telecom providers being aligned toward state goals, or accepting obligations towards some sort of public good (security, surveillance, universal service). Also, rather than a shrinking neoliberal state transferring responsibilities to the private sector, responsibilization can actually represent an extension of state power — reaching deeper into civil society by enlisting key network nodes.

Responsibilization and Social Theory

If we understand responsibilization as a technique of government that can be independent of neoliberalism, we can think about how it might be compatible with more generalizable social theories. Originally, I was interested in exploring how the responsibilization of intermediaries could be treated as a combination of Castells’s “programming power” and “switching power”. Abandoning Castells, I then moved further in the direction of governmentality literature and the work of Mitchell Dean. Dean’s work became invaluable as I was thinking through the role of state power and its relationship to all that we now sometimes refer to as civil society. In particular, I was strongly influenced by Dean’s analysis of what he calls “liberal police”, which operates (in part) through an “unfolding” of governmental programs into civil society.

With regard to surveillance studies, responsibilization seems quite compatible with Haggerty and Ericson’s (2000) well-known idea of “the surveillant assemblage”, referring to the “disconnected and semi-coordinated character of [contemporary] surveillance” that allows actors to “combine and coordinate different monitoring systems that have diverse capabilities and purposes” (Haggerty and Ericson, 2006, p. 4). Responsibilization describes one important means by which the surveillant assemblage can become coordinated, and while Haggerty and Ericson tend to emphasize the decentralized and diffuse character of contemporary surveillance, they also recognize that “powerful institutions” can remain “relatively hegemonic” to the extent that they can “harness the surveillance efforts of otherwise disparate technologies and organizations” (Haggerty and Ericson, 2006, p. 5). The state remains in a privileged position to coordinate various aspects of the surveillant assemblage, whether through the force of law or less coercive means (such as moral suasion and appeals to patriotic duty).

Where else might the idea of responsibilization bear fruit? The distinctions I make about different types of responsibilization in the published article may certainly be applicable beyond telecom, and I think we can find plenty of examples of responsibilization operating as a technique of governance if we detach the concept from certain presumptions about neoliberalism.

In summary…

Our daily experiences are increasingly being governed through intermediaries, often in ways that we don’t appreciate. Proposed solutions to social problems, threats, immorality, and disorder now often argue for better governance of intermediaries. Battles over the shape of digital society often come in the form of battles over the responsibilities we should impose on intermediaries, or debates about the responsibilities that intermediaries should willingly accept.


Still sorting out the post-Snowden balance

The ongoing fight between Apple and the FBI, in which a growing number of companies have declared their own interest and support, is the latest constitutive moment for what it means to live in the “post-Snowden” era. This is because the fight is a direct consequence of changes made by Apple following the Snowden disclosures, and because it is now being used as a way to stabilize some sort of “balance” between government and industry, after the massive shake-up of this relationship in late 2013/early 2014. The shift that occurred included major tech companies treating their own government as an adversary to defend against. Now, Apple has reportedly decided that its own engineers must also be part of this threat model. After Snowden, the company decided that it no longer wanted to be able to unlock phones for the government; the challenge now is to develop security that the company cannot help the government break, even through indirect means.

The term “post-Snowden” has gotten a lot of use in the last couple of years, but the Apple-FBI battle demonstrates the real shift to which it refers. Perhaps in a few years, the impact of the Snowden disclosures will be forgotten, in much the same way as the crypto war of the 1990s faded from memory as the relationship between industry and government got cosy after 9/11. But the world did change in a variety of substantial ways as a consequence of Edward Snowden’s actions, and we are still grappling with the legacy of those changes.

The Snowden disclosures were a truly international story with many local manifestations. Just as NSA-affiliated surveillance infrastructure had been extended around the globe, scandal touched the various nations implicated in the documents, and opened the door to local investigations. News stories broke one after another, with governments as either targets or practitioners of surveillance. Canada, as a member of the exclusive “Five Eyes” surveillance club, was reminded that it too had an agency with a mandate similar to the NSA (CSEC, now CSE). More clearly than ever, citizens understood that the surveillance infrastructures of intelligence agencies had global reach. Canada hasn’t seen public battles between government and industry like the one currently involving Apple, and discussions of government surveillance have been more muted than in the US, but a series of Snowden-related stories in this country have also fed into long-standing concerns about surveillance and privacy.

I want to spend more time on how the Snowden disclosures impacted Canada in a later post, but for now I’ll just briefly reflect on my own experiences studying the telecom industry during this period.

I began attending meetings of network operators and engineers in 2012. The first of Snowden’s revelations hit in June 2013, and by the fall of 2013, the topic of state surveillance was a regular part of conference conversations and presentations, if not the actual topic of presentations themselves. At the October 2013 NANOG conference, the internet’s North American engineers cheered the resistance of Snowden’s email provider to disclosure demands by the US government (Ladar Levison had built what was meant to be a secure email provider, but the FBI ordered him to hand over the encryption keys. Attendees applauded his efforts to make the FBI’s job as difficult as possible). At the IETF in Vancouver the following month, participants overwhelmingly voted to treat pervasive surveillance by state intelligence agencies as a technical attack on the internet, and debated how to protect against it. At a Canadian industry conference in April 2014, an executive with an incumbent ISP argued that service providers had an opportunity to gain a competitive advantage by offering better security, and showed a photo of Snowden as an answer to the question of why we care about privacy and security. Interestingly, Canadian government agencies reportedly joined Canadian companies in touting the country’s privacy and security advantages to customers concerned by surveillance in the US.

After Snowden, corporate management and operational decisions took time to shift, but the change in discussions and governance forums was more immediate. It wasn’t just that private intermediaries suddenly had a new threat to worry about, but that the nature of their role, and their relationship to their users/customers, had changed. Snowden’s revelations included the fact that the NSA had been undermining the very internet infrastructure that the agency had been tasked with protecting, but also the suggestion that it had done so with intermediaries acting as private partners. As early reports of the PRISM program best exemplified, some intermediaries were now seen as complicit in this global spying apparatus. As a consequence, companies began limiting cooperation with government agencies and issuing transparency reports about the nature and extent of their information disclosures.

The Snowden disclosures contributed to cynicism and distrust of both government and private industry, and trust is key for companies that have built a business model around securing personal information. Companies such as Apple are positioning themselves as trusted stewards of personal information, with the recognition that customers often do not trust government assurances that they will only access such data in limited and justified circumstances. The most recent moves by Apple are an attempt to move data even further out of the reach of these providers themselves. Such an approach will not be possible for companies that depend on access to this data as part of their business model (for advertising purposes), but for those selling hardware and online services, building walls against governments is now often more desirable than negotiating access.
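To make the design logic concrete, here is a minimal sketch of the general technique at issue, written in Python with the widely used cryptography library. Everything in it (the passcode, the sample data) is hypothetical, and it illustrates client-side encryption in general rather than Apple’s actual implementation, which involves dedicated hardware and much else besides. The point is simply that when the key is derived on the device from a secret only the user knows, the provider holds nothing it could be compelled to hand over.

```python
# A minimal, hypothetical sketch of client-side encryption: the key is
# derived on the user's device from a passcode, so the provider stores
# only ciphertext it cannot decrypt. This illustrates the general
# technique, not Apple's actual design.
import base64
import os

from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC


def derive_key(passcode: bytes, salt: bytes) -> bytes:
    """Stretch a user passcode into a 32-byte encryption key, on-device."""
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                     salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passcode))


# Everything below happens on the user's device.
salt = os.urandom(16)                              # stored with the ciphertext
key = derive_key(b"hypothetical-passcode", salt)   # never leaves the device
ciphertext = Fernet(key).encrypt(b"personal data")

# The provider sees only (salt, ciphertext). Without the passcode,
# neither the provider nor anyone compelling it can recover the data.
assert Fernet(key).decrypt(ciphertext) == b"personal data"
```

Whether any given provider actually works this way is beside the point; the design choice the Apple-FBI fight is testing is precisely whether key custody can be moved to the user, so that “negotiating access” is no longer something the provider is in a position to do.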

From one perspective, the Apple-FBI fight is about setting a precedent for government power in the post-Snowden era. But I would say that it is an indicator of a loss of government power, a shift in the orientation of the US tech industry to the state, and one of the continuing consequences of Snowden’s decision to shake up the world.