Another federal government consultation has recently wrapped up, this time with Public Safety asking about national security. Like other ongoing consultations, this one was criticized (for example, by Christopher Parsons and Tamir Israel) for framing the policy issue in the way the government prefers, and for trying to legitimate ideas that should have been discredited by now. I would say that the consultation framed the issue very much as Public Safety agencies (such as the RCMP) would prefer: repeating old rationales, and seeing the world from a perspective in which the ability to exercise sovereign will over information flows is paramount. The Green Paper provided for background reading foregrounds the concerns of law enforcement and security agencies, is peppered with the words “must” and “should”, and advances some dubious assumptions. Public Safety asked for feedback on terrorism-related provisions (including C-51), oversight, intelligence as evidence, and lawful access. The last of these has seen a number of previous consultations, but is back in the news as police make their case about “going dark” (which has become part of the RCMP’s “new public narrative” for a set of concerns that were once broadly talked about as lawful access).
I let this one get away from me, so I didn’t have anything ready for Dec. 15, when the online submission closed. Regardless, I’ve decided to answer most of the questions related to the topic of Investigative Capabilities in a Digital World as a blog post. I don’t feel particularly bad about missing the deadline, since several of these questions border on the ridiculous. For a true public consultation on what has long been a very contentious issue, the questions would need to be informed by the arguments on both sides. Privacy experts would have asked very different questions about privacy and state power, and on a number of topics Public Safety seems to be trying to avoid mentioning the specific policies that are actually at stake.
How can the Government address challenges to law enforcement and national security investigations posed by the evolving technological landscape in a manner that is consistent with Canadian values, including respect for privacy, provision of security and the protection of economic interests?
When I think of Canadian values, “privacy, provision of security and the protection of economic interests” are not what comes to mind. When I ask my students what they associate with Canada, these particular values have never come up in an answer. I think we should treat democracy as a fundamental value, and recognize that state secrecy is antithetical to democracy. When it comes to the relationship between citizens and the state, Canadian values are enshrined in the Charter, and the Supreme Court is ultimately responsible for interpreting what is consistent with the Charter. If we are to have a meaningful democracy, Canadians deserve to understand what is being done in their name, and that includes having an informed, independent judiciary to decide which government actions are consistent with Canadian values.
In the physical world, if the police obtain a search warrant from a judge to enter your home to conduct an investigation, they are authorized to access your home. Should investigative agencies operate any differently in the digital world?
If we accept the digital/physical distinction, the answer is a definite yes: investigations carried out today operate differently than they did in the simpler, more “physical” 1980s. But it is important to keep in mind that analogies between the digital and physical environments can be misleading and dangerous. When it comes to the “digital world”, I prefer to talk about it in digital terms. The stakes are different, as are the meanings of terms like “to enter”. If we must make these comparisons, here is what treating these two “worlds” as analogous would mean:
The police can enter my home with authorization, and seize my computer with authorization. I am not required to make my computer insecure enough for the police to easily access, just as I am not required to keep my home insecure enough for the police to easily access. I am not required to help the police with a search of my home, and so I should not be required to help police search my computer. If I have a safe with a combination lock in my home, I cannot be compelled by police to divulge the combination, so by analogy, I should not be compelled to divulge a password for an encrypted disk.
But analogies can only take us so far. A computer is not a home. Metadata is not like the address on a physical envelope. We need to understand digital information in its own terms. To that end, some of the more specific questions found later in this consultation can produce more helpful answers. Before we get to those, however, the consultation requires me to answer a couple more questions based on the presumption of digital dualism.
This question is hard to answer without knowing what it means to “update these tools”, and seems to be intended to produce a “yes” response to a vague statement. Once again, digital/physical comparisons confuse more than they clarify — these are not separate worlds when we are talking about production orders and mandating the installation of hardware. We can talk about these topics in their own terms, and take up these topics one at a time (see further below).
Is your expectation of privacy different in the digital world than in the physical world?
My answer to this question has to be both yes and no.
No, because I fundamentally reject the notion that these are separate worlds. I do not somehow enter the “digital world” when I check my phone messages, or when I interact with the many digitally-networked physical devices that are part of my lived reality. Privacy law should not be based on trying to find a digital equivalent for the trunk of a car, because no such thing exists.
Yes, expectations of privacy differ when it comes to “informational privacy” (the language of Spencer), because the privacy implications of digital information need to be considered in their own terms. Governments and public servants do Canadians a disservice when they rely on phonebook analogies and license plate analogies, or hold up envelopes to explain how unconcerned we should be about government access to metadata (all recurring arguments in the surveillance/privacy debate). In many cases, the privacy implications of access to digital information are much more significant than anything we could imagine in a world without digital networks and databases of our digital records.
Basic Subscriber Information (BSI)
Interception Capability
Encryption
I think the answer to this (the question of when investigators should be able to compel assistance with decryption) has to be never. People cannot literally be forced to divulge their passwords; in our society, they can only be imprisoned for very long periods for refusing to do so. In other cases, assisting with decryption means forcing Apple to break through its own security (security designed to keep even Apple out), or driving companies out of business unless they build products with weak security. None of this works in a world where a single individual can create an encryption app.
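To illustrate how low that bar is, here is a minimal sketch of my own (not part of the consultation materials) showing that strong, authenticated encryption is only a few lines of code away for anyone, using the open-source Python cryptography library. The file names and the simplified key handling are hypothetical, for illustration only.

```python
# A minimal sketch of an "encryption app" built on the open-source Python
# "cryptography" package (Fernet: AES with an HMAC for integrity).
# File names are hypothetical; key storage is simplified for illustration.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # 32 bytes of key material, base64-encoded
cipher = Fernet(key)

with open("notes.txt", "rb") as f:             # hypothetical plaintext file
    ciphertext = cipher.encrypt(f.read())      # encrypt-then-MAC under the hood

with open("notes.enc", "wb") as f:
    f.write(ciphertext)

# Without the key the ciphertext is useless; with it, recovery is one call:
plaintext = cipher.decrypt(ciphertext)
```

The point is not this particular library, but that this kind of capability is freely available to anyone with a laptop, which is why mandates aimed at commercial products cannot make strong encryption go away.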
How can law enforcement and national security agencies reduce the effectiveness of encryption for individuals and organizations involved in crime or threats to the security of Canada, yet not limit the beneficial uses of encryption by those not involved in illegal activities?
By doing anything other than mandating insecurity for everyone. The answer cannot be to make technology insecure enough for the state to exploit, because this makes everyone insecure, except for those who use good encryption (which has become too commonplace to stamp out).
The final two questions deal with data retention, a topic I’ll leave for a later time…