On regulating AI, or failing at it

I wrote a new opinion piece published in Policy Options here. It comes out of some of the latest warnings of AI doom, and Canada’s current approach to AI regulation in C-27. I’ve copied the text below.


Recently, at a time when fear and hype around artificial intelligence (AI) were already running high, another notable voice added further alarm regarding these rapidly emerging technologies. One of the world’s most influential AI scientists, the University of Toronto’s Geoffrey Hinton, announced he was parting ways with Google so he could add his voice to the chorus of warnings about the dangers these technologies pose. Whether or not one agrees with his perspective, the move was relevant and telling.

You can be forgiven if you’re experiencing a feeling of déjà vu. Many of the arguments about the existential threat posed by AI and its promise to automate away entire sectors of human work have been heard before. These fears are often exaggerated, but technologies based on large language models (LLMs), such as ChatGPT, are changing how work is done in a number of fields. They have also created a backlash in ways that previous deployments of machine-learning technologies did not.

Some have called for immediate action to prevent the development of “AI that does not do what we want, and does not care for us.” Yet, there are plenty of potential harms that may arise from AI doing exactly what it was designed to do by corporations that are hopelessly short-sighted about the greater welfare of humanity. The risks posed by AI are often presented as an “alignment problem,” meaning the real issue is finding a way to “align” AI with human values or best interests. However, framing the potential dangers this way ignores one critical factor: AI is already aligned with a particular set of human values and interests – those of the profit-seeking corporations driving AI development.

An example of this came in 2021, when the publication of a paper warning of the harms already emerging from LLMs led Google to acrimoniously expel Timnit Gebru from the company and its Ethical AI team. More recently, Google and its competitors have been cutting back their AI ethics teams even as they charge ahead in response to ChatGPT.

According to Hinton, Google was initially cautious with these technologies because “it knew there could be bad consequences” if released to the public. Once Google was seen to be falling behind OpenAI, however, it “didn’t have really much choice” but to change its position, because “it is just inevitable in the capitalist system… that this stuff will be developed.”

Meanwhile, Yoshua Bengio, the scientific director of the Montreal Institute for Learning Algorithms (Mila) and co-winner (with Hinton) of the 2018 Turing Award, has echoed Hinton’s existential concerns and argued that we need immediate action to prohibit “AI-driven bots” from impersonating humans. In the longer term, to mitigate potential harms due to the rapid acceleration of AI development currently underway, Bengio has stated that “we should be open to the possibility of fairly different models for the social organization of our planet.”

However, all is not lost. We have many examples of government regulation that could help bring about greater alignment between AI development and the public interest. That said, Canada’s proposed Artificial Intelligence and Data Act (or AIDA, which is part of Bill C-27) currently before the Commons standing committee on industry and technology, does not address many of these issues.

As many have noted, a key problem with AIDA is that it leaves the details of how to govern AI up to future regulations, the details and enforcement of which will be largely up to Innovation, Science and Economic Development Canada (ISED), the agency that drafted the legislation without public consultation. ISED’s pro-industry mandate and commitment to promoting AI commercialization sets up potentially conflicting functions when it drafts the regulations. AIDA already seems to be an outdated response – not because it fails to address existential risks of AI takeover, but because it would likely be unable to address many of the more immediate examples of harms involving algorithmic systems.

In addition, AIDA’s focus on addressing individual harm excludes collective harms such as threats to democracy and the environment, and the reinforcement of existing inequalities. AIDA effectively regulates only “high-impact” systems, while leaving this term undefined in the legislation. ISED’s companion document does set out some criteria it will use as part of its process for determining whether an AI system is “high-impact,” but these would probably exclude most of the pressing concerns around LLMs. The EU’s proposed Artificial Intelligence Act, which takes a related approach in classifying “high-risk” systems for additional regulation, has recently been amended to add separate obligations for “foundation models” such as the LLMs underlying generative AI, bringing these into scope.

It is hard to see how legislation focused on high-impact individual harms would address the more diffuse harm of unattributed LLM-generated text, which is now being commercialized to draw clicks, serve ads and potentially misinform. An inherent characteristic of these systems is that they end up hallucinating outputs that seem plausible, whether they are true or not. Many recent proposals to regulate machine-learning models (including LLMs) argue for greater accountability in how these systems are “trained.” Arguably, AIDA would impose such measures only in certain high-impact cases.


Finally, many of the most “high-impact” applications of AI are those carried out by governments, including immigration decisions, benefits claims, policing and military operations. Yet, AIDA applies specifically to the private sector. (A small number of federal government systems have received algorithmic impact assessments, which have their own limitations.)

In short, while many of the risks and existential harms around AI are overblown and serve mainly to fuel further hype for these technologies, there are actual risks and harms being produced by existing algorithmic systems. The solution is better regulation. Unfortunately, Canada’s approach through AIDA has been arguably undemocratic and will likely result in a permissive, pro-industry regulatory regime that will be unable to address many of these existing challenges. Rather than spending the coming years fleshing out this skeletal attempt at AI and data regulation, we should be heeding those who argue for a more fundamental rethink.

Any technological development with broad social impacts requires regulatory reform. Unfortunately, the Canadian government has unwisely bundled AIDA with reforms to outdated private-sector privacy law in C-27 instead of ensuring the two proceed on separate tracks. While those promoting AI hype and existential gloom look primarily to a speculative future, we need to start again with a policy approach that more effectively addresses our present circumstances.


CRTC Picks up the Pace of Wholesale Reform (to save the non-incumbents that remain)

Big news at the CRTC, as the Commission moves to throw a lifeline to the remaining non-incumbent ISPs. New CRTC Chair Vicky Eatrides has her marching orders, which include addressing the “perception that the CRTC is taking too long to make decisions”. Also in play is the 2023 Policy Direction, which finally repeals the earlier one from 2006 (and 2019), officially recognizes that mandated wholesale is here to stay, and commits to an aggregated approach to network access. All of this is reflected in the CRTC’s newly announced review of internet competition and wholesale. Here is a quick summary of the more remarkable aspects:

A market dominated by incumbents and FTTP

Telecom Notice of Consultation CRTC 2023-56 provides some important context for where things currently stand when it comes to wireline internet access in Canada. Incumbents “hold dominant positions” and their dominance is growing, as “many competitors have begun losing subscribers” in addition to incumbents acquiring these competitors. Furthermore, incumbents have rolled out FTTP across much of their territories and are decommissioning their old copper networks, making access to FTTP crucial for the future of mandated wholesale. The Commission recognizes that rapid and significant changes are required to make wholesale access to FTTP viable for non-incumbents (IISPs).

Expedited access to FTTP on an aggregated basis

The first thing the CRTC needs to do is to stop the bleeding. Debates over wholesale rates and configurations (aggregated/disaggregated) have dragged on for many years, but given where things are headed there will be no actual “competitors” to the incumbents years from now unless something happens more quickly. So the priority for the CRTC is to get some kind of “expedited and temporary” framework in place, “at least until the conclusion of this proceeding”. The CRTC warns that “procedural requests by interveners that the Commission finds will unduly delay or impede this process will not be granted”, recognizing that it is in the incumbents’ interest to draw out the process as long as possible until they are the only ones left standing. So we’re going to see a “multi-stage” process, with expedited access sorted out in the coming months and the first tariff filings due from incumbents at the end of next week (to address the immediate 10% reduction for some “traffic-sensitive components”). The CRTC is clearly indicating where it wants to go on this, and is departing from its previous ponderousness to set the ball in motion, while warning incumbents not to get in the way. This is commendable. All of these expedited and temporary moves are meant to get something in place while the CRTC follows a more conventional process for determining what a long-term regulatory regime looks like.

The return of price regulation?

Perhaps the most surprising part of CRTC Notice 2023-56 is the indication that retail price regulation is on the table as part of its broader approach to competition policy. As the CRTC notes, this is something it has avoided since the 1990s wave of telecom liberalization. Price regulation was imposed on wholesale access so that incumbents wouldn’t strangle their competitors (economically of course), but everyone was free to set whatever prices they wanted for regular customers. Government and industry have both claimed credit for programs to provide low-income access, but this has not been enough to prevent prices that the CRTC says “remain high relative to international peers” for “mid-range and top-range plans”. At this point retail regulation is just a “potential” step and the CRTC wants to hear input about what might trigger such a move. Furthermore, the CRTC instructs those participating in its process to comment on what other forms of retail regulation could be pursued (under section 24 of the Telecommunications Act), prior to considering regulating rates (sections 25 and 27). However, this could be read as an attempt to provide more concrete means for opening a policy door that would ordinarily be shut, and a warning of just how far the CRTC is willing to go.

All in all, the CRTC has taken immediate steps to save what remains of the players that depend on mandated access to incumbent networks, but it will take more to encourage new competitors to replace those that have been lost in recent years. For the moment, this adds to what is certain to continue to be a very busy agenda for internet policy in Canada more generally, and at the CRTC specifically.

End-stage Mandated Wholesale

When I moved to BC in 2018, I decided I wouldn’t go back to an incumbent ISP. I signed up with an “independent” ISP (or IISP) with a local presence, and even though it took ten days for me to get connected (because independent ISPs still depend on incumbent technicians to set up their subscribers) I was very happy with my choice. Then a couple years later my ISP was purchased by the incumbent telco, and I had to go looking for another option. More recently on the other side of the country, Fenwick McKelvey had signed up with EBOX, and when they were bought by Bell (in Feb. 2022), he switched to Distributel. Well, the news from earlier in the month is that Bell is now also buying Distributel. It turns out you can run from the incumbents in Canada, but they will catch up with you eventually.

The IISPs that developed under the mandated wholesale regime have been getting rolled up in the West and East, with VMedia acquired by Québecor over the summer, and Bell buying EBOX and Distributel. Is this wave of consolidation a “death blow” for IISPs? I suppose we can hold the funeral if TekSavvy finally gets swallowed by the incumbents it’s been fighting all these years, but no matter what, it is clear there has been a big market shift. Whether or not the Rogers-Shaw merger is completed, the mandated wholesale regime has turned out to be a policy failure. It used to be possible for regulators to maintain the status quo, preserving the existence of IISPs as a source of competitive tension, even if these smaller operators were never going to truly compete with the incumbents in terms of scale. This is no longer an option. With each acquisition there is less for regulators to preserve. It becomes impossible to pretend the policy is working, unless we want to pretend that the goal of telecom liberalization was always to end up with a telco/cableco duopoly.

No, the half-baked neoliberal dream of the 1990s is finally dead. The idea that the CRTC’s role in competition policy was just part of the “transition to competitive markets” is ridiculous now. People stopped writing such things years ago. We never arrived at this promised land of “competitive markets”, but policymakers seem to have been of the opinion that where things ended up was good enough. Now, consolidation (or reconsolidation) has gone so far that it puts government in a difficult position. Is this still a ‘good enough’ outcome of liberalization? We’ve gone from most Canadians having a monopoly telco and a monopoly cableco provider in the 1990s, to the descendants of these two companies being the choices Canadians still have in the 2020s. So thirty years after this bold shift in telecom policy that put an end to the monopoly era, after all of the fights over mandated wholesale at the CRTC, we’re in some ways back to where we started, with two big companies, each controlling one wired network going into your house (unless you live somewhere less fortunate, in which case it might be a monopoly for you). The big difference brought about by liberalization and the shift to data networks is that these two former monopolies can provide competing services over those wires, and ideally this means government doesn’t have to regulate prices the way it did in the monopoly era.

None of this was inevitable – it was up to the CRTC to decide how viable mandated wholesale would be for IISPs, but Distributel was never going to neutralize the advantage of the incumbents by spending billions ‘overbuilding’ their infrastructure. The vague policy fantasy that such a thing might happen was sometimes useful for the incumbents, in that they could point to the existence of all of these apparent competitors, but incumbents fought against mandated wholesale at every turn, and it must bring them some satisfaction to see it crumble. However, this does potentially put the spotlight squarely on the telecom giants, complicating any further attempts for them to gobble one another up.

In my book I wrote about three possibilities for the future of competition policy: something radical (like structural separation), the status quo, or embracing incumbent dominance. Well, the status quo was largely a duopoly anyway, but it came with the notion (hanging on from the 1990s) that things might yet become more competitive. That’s gone now, so government inaction would mean accepting that the terminal stage of liberalization is incumbent duopoly.

For many in Canada, there are still options besides the incumbents – non-incumbent ISPs are still managing to exist, or succeeding by doing important work in regions that incumbents have historically neglected. This is not a call to throw in the towel on mandated wholesale. There can and should be a variety of ISPs serving local needs across Canada, and it needs to be possible for alternative providers of connectivity to exist. But we also need to confront the failure of imagination that led us here, and to learn from it by actually trying to imagine the future that our competition policy is set to deliver.

“Community broadband” success stories: Kaslo’s KiN

What happens when a community decides to take connectivity into its own hands and builds itself a broadband network? This question has been raised in past months with the pursuit of ConnectTO in Toronto, which may be the topic of a future post. The “poster child” of successful community broadband in Canada has often been identified as Olds, AB, but as I discussed in a previous post, the future of that particular project is in considerable doubt. In the United States, recent years have indeed produced numerous community broadband success stories, so how about Canada?

First of all, we need to spend a bit of time defining what we are talking about. Unlike a municipal network, which is typically owned by a municipality, a “community network” can encompass this and many other possibilities, including co-operatives, not-for-profits, or other ‘community-based’ organizations. The word ‘community’ tends to be used more for its positive connotations than for its descriptive accuracy, and can suggest a “spurious unity” of similarly-thinking people (Calhoun, 1998, p. 34). Rather than representing the will of some larger community, these broadband projects are often the result of a small number of champions who successfully mobilize resources and obtain the support of key political actors — despite differences and disagreements among the local population.

For me, community broadband is defined by the fact that it is (1) locally-based — rooted in the needs of people in a particular place or region, and that it (2) serves some articulation of the public good or collective interest of these people. Typically what this means is that residents, businesses, or government actors in some location feel ill-served by incumbent providers, and rather than waiting for these incumbents to improve the situation, they take matters into their own hands. This situation often arises in rural regions that are less profitable, and hence less of a priority for incumbents (which is why ConnectTO is so unusual). Unfortunately, rural areas are also the sorts of places where there tends to be a shortage of people with the skills required to build and operate a broadband network.

This is why I was so intrigued by a presentation that Tim Ryan gave to the BC Broadband Association in 2019, describing the do-it-yourself approach of the Kaslo infoNet Society (KiN), which involved locally-trained staff installing fibre and wireless for far less than is typically budgeted for such projects. Later that summer, I visited Kaslo to talk further with Tim and see things for myself. As is typically the case in Canada, the construction season is weather dependent, and KiN was busy digging and laying fibre before the freeze hit.

They’ve continued this work ever since, laying more fibre underwater to connect shoreline communities along North Kootenay Lake, as well as within Kaslo itself.

Area connected by submarine fibre

This is now effectively a regional network, connecting residents on both sides of a narrow mountain lake, over roughly 50km. Over the years, the technology underpinning the network has shifted from largely wireless to largely fibre-based, enabled by some innovative underwater deployments in the lake. Laying fibre in water poses some challenges, but it eliminates many of the land-based headaches over rights-of-way and infrastructure access (roadways, poles etc.).

One of the lessons here is that the specific technologies used to provide internet access should depend on the local circumstances. Fibre can be installed in different ways, and there are still situations where other technologies make better sense. KiN’s approach gives you a sense of just how low fibre deployment costs can be pushed in Canada. The cost of the actual cable tends to be relatively minor — it’s getting that cable into the ground (or up on poles) that creates the expense. Aerial fibre involves getting approval from whoever (telco or power utility) owns the utility poles, and paying for rent or upgrades to the poles. Burying fibre can create different headaches in terms of approvals and overlapping infrastructure, plus the major expense of digging up the ground. On land, Ryan has claimed KiN can get trenching costs down to $7 [CAD] / meter. In lakewater, this cost can be a lot less, but most regions do not have the benefit of a long lakeshore geography in which to deploy backhaul.
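
To put that figure in perspective, here is a back-of-envelope sketch. The $7/metre trenching rate is the one Ryan cites; the cable cost and the use of the lake network's rough span as a route length are illustrative assumptions of mine, not KiN's actual budget.

```python
# Back-of-envelope buried-fibre cost, using the $7 CAD/metre trenching
# figure cited above. The cable cost is an illustrative assumption;
# as noted, the cable itself tends to be the minor expense.

TRENCH_COST_PER_M = 7.00   # CAD/m, Ryan's claimed on-land trenching cost
CABLE_COST_PER_M = 1.50    # CAD/m, assumed for illustration

def buried_fibre_cost(route_metres: float) -> float:
    """Estimate the cost of trenching and laying fibre along a route."""
    return route_metres * (TRENCH_COST_PER_M + CABLE_COST_PER_M)

route_km = 50  # roughly the span of KiN's lake network, used here only for scale
print(f"{route_km} km: ~${buried_fibre_cost(route_km * 1000):,.0f} CAD")
# -> 50 km: ~$425,000 CAD, before electronics, subscriber drops, and overhead
```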

While there is no one-size-fits-all approach to building a community network, it is possible to generalize from success in a particular case. Below are some key determinants of success for a community broadband project, building on some of Tim Ryan’s personal views about what has contributed to the success of KiN.

1. Backhaul (backbone connectivity)

The other elements below are all important, but if they are missing they can be developed. Without a backbone, however, the network is a non-starter. Any community network is going to need to be plugged into the global internet somewhere, and its bandwidth is going to depend on the bandwidth of that connection to a fibre backbone. In an ideal situation, the community network finds a way to reach an internet exchange where it can connect with content providers and world-spanning networks. However, for much of rural Canada this is simply not possible. Satellites will open some possibilities in future years, but for the foreseeable future the viability of a new network is going to depend on access to existing infrastructure that can carry traffic regionally and globally. This may be the local telco incumbent(s), who will likely be competing with any new network for customers. While incumbents are mandated to provide wholesale network access in Canada, the terms of this mandated access do not provide a lucrative opportunity for community networks. The most successful approaches for community networks involve alternative backbones, or leasing a fibre path directly to an exchange if possible.
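
As a rough illustration of why that single upstream link is decisive, here is the standard contention (oversubscription) arithmetic. All of the numbers are hypothetical; none describe KiN or any real network.

```python
# Hypothetical contention arithmetic for a community network's backhaul.
# Every number here is made up for illustration.

backhaul_mbps = 1_000      # a single 1 Gbps link to a fibre backbone
subscribers = 400
contention_ratio = 25      # assume ~1 in 25 subscribers are active at peak

active_at_peak = subscribers / contention_ratio
per_subscriber_mbps = backhaul_mbps / active_at_peak
print(f"~{per_subscriber_mbps:.0f} Mbps per active subscriber at peak")
# -> ~63 Mbps: whatever the access network can carry, every speed tier
#    the ISP sells is ultimately capped by this one upstream connection.
```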

In Kaslo, KiN had been relying on incumbent TELUS for connectivity until 2014, when CBBC (a subsidiary of Columbia Basin Trust) extended its backbone to the town. This shifted the economics of broadband in the region significantly, and KiN then proceeded with its ongoing fibre deployments.

2. People and skills

Having the right people, with practical skills, experience, and personal connections ends up being valuable in a project that is going to be difficult in a best-case scenario, and which will likely depend on a small number of bodies to keep things moving. There is a path forward for community-based groups who want to build a network but don’t know how to go about it, but this can involve hiring telecom consultants and a steep or expensive learning curve. Effective project champions and relevant technical skills among community members are enormous assets. Local governments that have in-house networking expertise and experience managing infrastructure can draw on those for broadband projects, while groups outside of government are fortunate if they involve members with an understanding of telecom networks.

In the early 2000s, Tim Ryan was trying to carry out business from Kaslo over a limited internet connection, and his background in IT helped him imagine how things could be better. He joined KiN in 2012, where other members also had technical skills, but as Ryan stated, “There is a perception that fibre connectivity is complex … (but) it’s 90% ditch-digging and 10% technology”. In terms of critical skills, in his view they are: “Critical skill number one: ditch digging. Critical skill number two: network management. That was contributed by the existing ISP [KiN]. Critical skill number three: an understanding of how optical networking technology is assembled, and we had two people in the community that knew how”. These skills gave the project’s participants the confidence to proceed — knowing that building a fibre network was possible: “Optical technology on the face of it, is as opaque as brain surgery and rocket science. In fact, it’s not that difficult, but most people don’t know that.” Having the knowledge to imagine what is possible helps to get a project started, but being able to train local staff in the required skills ends up being important to keep costs down (and circulate money locally) while maintaining local autonomy.

3. Organization

There are examples of local networks built by a committed individual who wants better connectivity and extends this to their neighbors, but beyond this scale, some sort of larger organization is required. An ISP could be organized along many different lines: for-profits, not-for-profits, co-operatives, or public utilities have all had some success meeting local needs. If possible, an advantage comes from being able to work through already-established organizations. In Kaslo, KiN had been incorporated as a not-for-profit since 1996, when it had offered dial-up service. While network technology had moved on, KiN retained a couple hundred subscribers and an organizational structure that could be built upon for a future network. In Olds, the Olds Institute for Community and Regional Development (OICRD) became the vehicle for envisioning, developing, and eventually owning its fibre network. Organizations coordinate individual action into collective efforts, and as legal entities they can do things that individuals cannot — such as applying for funding. But organizations also require internal governance and maintenance (the non-technical kind), and can always benefit from a shared vision and collective enthusiasm.

4. Political support

To some extent (often a large one) an ISP is a political actor, and must by necessity maintain relationships with different levels of government: the CRTC and ISED, provincial ministries (land use, transportation), public utilities, local councils and municipal departments. ISPs can be enlisted towards public policy goals, such as connecting public facilities (a valuable “anchor tenant” for a community network), tackling the digital divide, or facilitating economic development. In Kaslo, KiN connects the Village Hall, Langham Cultural Centre, and Library, and has made the most of being a local actor in a small place. As Tim Ryan put it, KiN’s early expansions benefitted from being in “a small village with a village government that knows who you are, that you can have a face-to-face conversation at pretty much any point, and business can be done around a coffee table, first thing in the morning, and you can carry on and get on with it”.

Construction and maintenance depend on municipal rights-of-way. If the ISP is municipally-owned, its fortunes are tied to the decisions of municipal government. In Olds, the Town didn’t own the network but helped secure the loans needed to build it, and was able to exercise control by calling in these loans when political attitudes towards the network changed. While KiN has not been dependent on its municipality for funding, it has benefited from access to rights-of-way by maintaining a working relationship with local government (a municipal access agreement is key) — so that when the Village is expanding its sewer system, it’s possible to extend fibre as part of the process.

Incumbent resistance can also play out at the political level, and while Canada has not seen the sorts of lawmaking that US incumbents have lobbied for to ban community networks, other kinds of political battles do play out. A 2014 article from the Valley Voice recounts how “Telus tried to negotiate a secret deal with [Kaslo] Village council. The corporation promised a $500,000 upgrade to existing copper lines… Former Mayor Greg Lay says he fought for KiN’s right to be the point-of-presence provider rather than having to open it up to bidding from contractors outside the community. ‘I had to say, we support local people, I don’t care what Telus is offering,’ says Lay”. As much as political involvement can be a liability for telecom networks, it can also be a shield for local actors against larger interests.

5. Material resources

Community networks of significant scale don’t come cheap — millions may be required. KiN has found ways of doing it for less, but even in Kaslo securing funding is a major part of the challenge. There are sound economic reasons why incumbents “underserve” certain populations — without government support for construction many rural connectivity projects would either never be profitable, or only profitable in the long term. Writing and managing grant applications can be a full-time job, and navigating the funding landscape is complex (in BC, there are resources available online and funding is also available through federal programs). It helps to have some funds available early, so that work can begin without waiting for big grants to come in. And because of the large amounts of money involved, it can make a huge difference if a project can find ways to source cheaper materials, effectively adapt to the terrain, and spend less on labor (this includes using local workers or having supporters willing to volunteer their time/equipment). I think all of these factors have helped achieve successes for KiN, and could make a difference elsewhere.

Additional links:

Kaslo infoNet Society

Metaviews/Future Fibre profile 

Bill C-10 Leaves Far Too Much to the Imagination

Internet policy in Canada is set for a major update should Bill C-10 pass – but it’s the sort of update that will take ages to install and may break more than it will fix. It is odd to have a piece of legislation spoken about in such starkly different terms. Some opponents have described it as an attack on ‘free speech’, while proponents see it as getting our due from the ‘web giants’. This contrast is made possible by the details missing from the legislation, or what it leaves for regulators to determine in the future, when they struggle with implementing cultural policy across internet platforms.

Supporters working in the Canadian media industries imagine that they will benefit from the eventual outcome, although this is less true of content creators that have historically been excluded from Canadian cultural policy, such as users uploading videos to streaming sites and smaller online services. Those who feel well-served by the aims of pre-internet broadcasting policy can expect to be well-served by C-10, which essentially extends the Broadcasting Act to regulate where it previously has not. C-10 is first and foremost about regulating “online undertakings” as broadcasters – treating today’s internet platforms like the television and radio stations that dominated media when the Broadcasting Act was written. What this will actually mean for Netflix or YouTube will be decided by the CRTC, but it’s entirely reasonable to expect some system to promote the “discoverability of Canadian programs” that commands these services to modify their menus or algorithms if they want to operate in Canada. This raises the question of what counts as a Canadian program, and what platforms the regulations will apply to. Until now, the CRTC has largely regulated the internet through telecom policy (the Telecommunications Act) rather than cultural policy (the Broadcasting Act). Where telecom policy requires non-discrimination and net neutrality (telecom companies are not supposed to mess with the content of what they are delivering to your screen), cultural policy includes the promotion of content that serves our shared interest as Canadians.
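
To make concrete what commanding services to “modify their menus or algorithms” might look like, here is a purely hypothetical sketch of a discoverability boost inside a ranking function. C-10 specifies no such mechanism, and every name and weight below is invented for illustration.

```python
# Purely hypothetical sketch of a "discoverability" obligation inside a
# platform's ranking code. C-10 specifies no mechanism; this only
# illustrates the kind of intervention the CRTC could end up requiring.

from dataclasses import dataclass

@dataclass
class Title:
    name: str
    relevance: float   # the platform's own predicted appeal, 0..1
    is_cancon: bool    # certified Canadian program, however that gets defined

CANCON_BOOST = 0.2     # arbitrary weight; the real dispute is who sets this

def rank(catalogue: list[Title]) -> list[Title]:
    """Order titles by predicted appeal plus a flat boost for Canadian programs."""
    return sorted(
        catalogue,
        key=lambda t: t.relevance + (CANCON_BOOST if t.is_cancon else 0.0),
        reverse=True,
    )

shelf = rank([Title("Foreign Hit", 0.90, False), Title("Canadian Drama", 0.75, True)])
print([t.name for t in shelf])  # the boost moves "Canadian Drama" to the top
```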

This is where concerns about free speech and the open internet have come into play, but there are numerous other issues that have been raised in recent weeks by legal scholars and internet policy experts such as Michael Geist, Dwayne Winseck, Emily Laidlaw, and Fenwick McKelvey, as well as former CRTC Chair Konrad von Finckenstein. These experts broadly agree that it is indeed important to regulate the big platforms, just not in this way, and all of them have been concerned with the vague scope of the bill. C-10 so far has been a squandered opportunity to actually consider what we want from internet and cultural policy in Canada, and instead seeks to simply extend the obligations of 20th century broadcasters (to fund Canadian content, and to show a certain amount of it) to the internet. Unfortunately, doing so with this particular piece of legislation will be anything but simple, and in trusting the CRTC to sort out the details we are setting up years of political battles, regulatory proceedings, and court cases, with some very uncertain outcomes.

The key questions to ask are: who will be regulated by this law and in what manner? The government has done a terrible job of answering these questions, including explaining why it has eliminated the section (4.1) of the bill that explicitly stated it would not regulate user-generated content, or explaining what kinds of online services won’t be regulated by the bill. The collateral damage from the bill could be significant, and implementing it would require asking the CRTC to do a lot more as a regulatory body in new domains, where it often seems challenged by its current responsibilities. C-10 is deliberately unimaginative, but remains open to various interpretations, including imagined windfalls from Silicon Valley and dark visions of government control.

Last week the government took the rare step of cutting short committee review of the bill and pushing ahead towards a House of Commons vote in the near future. Then came the strange sight at the end of the week of the heritage committee wrapping up its work by plowing through votes on a long list of amendments without explaining what was in them. As a result we have C-10 heading back to the House, without a clear idea of what’s in it beyond what Michael Geist pieced together. Apparently we will find out what the amended text is in the next few days. C-10 has been a badly-written and poorly justified piece of legislation, which has left far too much to the imagination (including what is actually in the current text of the bill).

Blocking Bad Traffic

Canadian internet policy is currently dominated by discussions of blocking – what to block and how. Blocking copyright infringement has been debated in a number of forms in recent years, was implemented in a limited sense to deal with GoldTV, and is now the subject of a new Government consultation (see here). Cabinet is also promising new measures to deal with illegal pornography, which may include blocking access (or building on existing mechanisms, such as Cleanfeed). And then there is the CRTC’s proposal to mandate the blocking of cyber security threats – specifically botnets – though as a number of interveners have argued (including the RCMP, CSE, and M3AAWG), the focus on botnets is too narrow and should be expanded or kept flexible enough to adapt to future threats. Internationally the focus has been on how social media companies moderate their platforms, including blocking mis/disinformation and taking down accounts.

The common thread here is the understanding that some categories of internet traffic generate significant harm, and that there are centralized points of control where this traffic can be filtered out. As documented in Telecom Tensions (forthcoming May 2021), many of these debates are more than twenty years old (copyright owners have been arguing that ISPs should be responsible for infringing traffic since the mid-1990s), but they regularly re-emerge to focus on new methods, intermediaries, and targets. ISPs are often at the center of these controversies and participate as interested parties. Because they can exercise centralized control over their networks (DPI, DNS, BGP etc.), this is where filtering is deemed most effective (mandating end-user controls is also generally off-limits), and this means they are most likely to be saddled with new responsibilities and liabilities. There are already many existing processes in Canada for filtering at the ISP level (particularly when it comes to cyber security) and the question is often how to better coordinate and standardize what ISPs are doing on their own. Incumbents and government agencies have pointed to the work of CSTAC and CTCP in regard to the botnet-blocking proposal, and the work of the CCAICE goes back to the mid-2000s.
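
For a concrete sense of what filtering at one of these control points involves, here is a minimal sketch of DNS-level blocking, the simplest of the techniques listed above. The blocklist entries and domains are hypothetical; real regimes (Cleanfeed, the GoldTV order) work from curated or court-ordered lists.

```python
# Minimal sketch of DNS-level blocking at an ISP resolver. Blocklist
# entries and domains are hypothetical. Real deployments may answer
# with a sinkhole or warning page instead of refusing outright, and a
# subscriber can sidestep this entirely by switching resolvers.

BLOCKLIST = {"infringing-stream.example", "botnet-c2.example"}

def resolve(domain: str) -> str | None:
    """Answer a DNS query, or refuse if the name is on the blocklist."""
    name = domain.lower().rstrip(".")
    if name in BLOCKLIST:
        return None  # appears as NXDOMAIN from the subscriber's side
    return "192.0.2.10"  # placeholder answer; a real resolver does a lookup

for d in ("news.example", "botnet-c2.example"):
    ip = resolve(d)
    print(f"{d} -> {ip or 'blocked'}")
```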

Network neutrality or the “open internet” is often implicated in these discussions, as blocking traffic generally contradicts the principle that users should be able to connect to whatever they want (or that ISPs should not discriminate among traffic). Those in favor of blocking will typically focus on the illegality of specific kinds of traffic as a justification, even where they may have significant financial interests in engaging in blocking (intellectual property, gambling revenues, bandwidth recovery). Opposition to blocking is less of an issue the more universally the traffic is recognized to be ‘bad’, which is why the earliest blocking regimes focused on child abuse imagery and spam. Botnets are an example where the users of ‘infected’ devices may not be aware that they are sending and receiving such traffic, and this can take us to questions of personal autonomy/responsibility for our devices (the Australian iCode system involved ISPs notifying users who were infected, placing the responsibility on users to take action). Even when there is broad agreement that a category of traffic is undesirable and should be stopped, there are always questions over effectiveness and overblocking (some bad traffic will get through, and some legitimate traffic will be blocked inadvertently). Finally, debates over internet filtering are inherently also about internet surveillance – since to discriminate between categories of traffic, ISPs and other actors need to monitor flows and have some sense of their contents.

Underlying all of this are two different views of what an ISP is and should be: whether our networks should be “dumb pipes”, limited to being just a conduit for packets as imagined in the end-to-end internet, or whether the pipes should be “smart” and exercise agency over traffic. The smart pipe’s “intelligence is rooted in its ability to inspect and discriminate between traffic, to decide what should be permitted, prioritized, or cleaned”. Clearly, the current rounds of proposals envision even broader roles for intermediaries in governing traffic — what I describe as the process of intermediation. However, the specifics in each case matter, involving different business interests and public consequences (including unintended consequences). Any action needs to be informed by an understanding of the actors and processes already in place to deal with bad traffic, previous experience with filtering elsewhere, and the harms and risks of traffic discrimination. We need to be mindful of the fact that some actors will always try to mandate blocking in order to safeguard their narrow interests. Political actors will also recurrently feel the need to ‘do something’ about internet threats, and imposing additional surveillance and filtering responsibilities on intermediaries often seems like the easiest solution. Policies for copyright filtering, pornography, and child abuse imagery have a long history in internet governance, but cyber security remains in many ways a frontier of internet policy, and one worth watching closely given the expansive possibilities for what might be considered a cyber threat in the future.

Rogers + Shaw — What are the policy options?

So, the big news from last week is that Canada’s two biggest cablecos want to become Canada’s biggest cableco.

This shouldn’t be surprising – Canada’s two biggest telcos have thought about merging for years, but the political winds have apparently not been favorable. While you can generally (but not always) satisfy shareholders with a big enough bucket of cash, governments are another matter. Since telecom liberalization in the 1990s, public policy has tried to move the country away from monopoly and towards a competitive marketplace for telecom services. Incumbent telephone and cable companies were unmoored to compete with one another across the country, but the practical consequence has been a gradual consolidation of the industry through mergers and acquisitions. Because it makes little sense to build infrastructure in a ‘territory’ that is home to a well-established incumbent, the big companies stuck to their domains, and the biggest grew their territories by gobbling up potential competitors in other regions. People have been saying that there isn’t much left to gobble – or not much room for further consolidation in the industry – since the 2000s. But deals that further reduced the number of incumbents have continued to be approved by regulators. Eastern cableco Rogers is still hungry, and it’s confident government will agree to its planned purchase of Western cableco Shaw. Perhaps they will get what they want, but this is not a done deal and will be debated for some time to come. So, what are the options?

Option 1: Let the capitalists have their way

Rogers acquiring Shaw wouldn’t change much of the status quo anyway. The two companies already don’t compete for wireline customers (although they have previously squabbled over whether there is actually a non-compete agreement). The big change is going to come in wireless, where Shaw does compete (as Freedom, formerly WIND) with Rogers and the rest of the Big Three (Bell & TELUS), and access to spectrum is a critical factor. Given how hard it has been to develop a fourth wireless competitor, we could just embrace the reality of the Big Three. But why stop there? Maybe two companies is all you need for competition? Much of the country already operates as a duopoly for wired internet access (where Canadians can choose between their local cableco and telco). Regulatory proceedings at the CRTC would be greatly simplified in a world with only two players, as long as the regulator can assure itself that this is what competition looks like and there’s no need to set rates. Canadians will pay more – shareholders will profit. However, this would in many ways give up on what liberalization was meant to accomplish. Sure, governments care about shareholder interests, but they also care about other things, and will have to draw the line somewhere. Apparently that line wasn’t reached when Bell acquired MTS (followed by former MTS customers paying more). Is Rogers-Shaw going to be the line?

Option 2: Release the Americans

If no facilities-based competitors are going to organically develop from within the Canadian telecom industry (by climbing the mythical “ladder of investment”), then one option is to attract competitive interest from outside the country, and U.S. companies offer the best potential given their infrastructure south of the border. Of course, it’s not as easy as simply relaxing foreign ownership rules. A new foreign-owned competitor would have to spend billions to build infrastructure to compete against the comfortably-situated incumbents. More realistically, foreign investors would buy an existing Canadian incumbent where they saw an opportunity. This does have the potential to shake things up, just as WIND mobile did with its foreign backers (or when Verizon seemed interested in acquiring WIND and Mobilicity). Ultimately, WIND was bought by Shaw, and now Rogers is buying Shaw. As a new entrant, WIND required favorable regulatory treatment to stand any chance against the incumbents. New entrants need access to existing infrastructure (towers, poles, rights of way) and wireless spectrum. Long-time telecom journalist Rita Trichur (who was a correspondent during the ‘telecom war’ of 2013, when it seemed that Verizon might move into Canada) has come out advocating that government drop the foreign ownership rules as a way of getting fresh competition into the market. She imagines that Verizon or AT&T could buy up the infrastructure that Bell and/or TELUS have built. Perhaps customers would see lower costs, and she points out that we already coordinate with the Americans on spectrum and security. After all, Canada and the U.S. are both members of the Five Eyes security alliance. Those AT&T/NSA splitter rooms were just for foreign traffic, right? And Canada’s not really foreign, because the Five Eyes have a “gentleman’s agreement” not to spy on each other. Right?

Option 3: Service-based competition/structural separation

The “European” model – recurrently floated as a possibility in Canadian telecom, but never seriously considered as a broad approach given our commitment to facilities-based competition. If we can call facilities-based competition a failure, then how about we accept a leading alternative? Let service providers compete over some kind of shared infrastructure (with the infrastructure owner “separated”, so that they don’t compete with these service providers). Canada’s rejection of this approach in favor of facilities-based competition was once justified by the fact that the country had overlapping cableco and telco infrastructure in urban areas, but being stuck with just two competitors was not imagined as the final outcome (in wireless, there are the Big Three, but Bell and TELUS split the country between them and share facilities, rather than build competing networks in the same territories). If we don’t like the outcome of duopoly, let’s deal with the policy failure by pushing in a different “separated” direction. There are many different models such an approach could follow; however, the path will be difficult, and many people will be very upset by a change in the status quo.

Option 3b: MVNOs

Given the implications of the deal for wireless competition specifically, this could be the time to extend mandated access to wireless facilities to MVNOs (Mobile Virtual Network Operators). MVNOs have been very controversial in Canada, as another way of mandating access to incumbent facilities for new competitors to provide (wireless) services over — a specific kind of service-based competition. Canada has historically expanded mandated-access regimes for other forms of connectivity to deal with the flaws of facilities-based competition, but incumbents have opposed the introduction of MVNOs as they opposed other forms of mandated access. This opposition will be harder to maintain in the wake of a Rogers-Shaw deal.

Option 4: Just say No

In some ways, this is the conservative approach – reject the deal as bad for consumers, bad for competition, and keep things the way they are. In other ways, this would be a departure from Canada’s post-liberalization approach to competition policy, because some arm of government would have to finally signal that enough is enough. Perhaps this would just kick the can down the road and maintain regulatory uncertainty, but rejecting the deal provides an opportunity to articulate a new policy approach that the industry can orient to.

Are there other options? Sure – the deal can be approved with various conditions (as Rogers apparently expects), but these are not likely to make a fundamental difference to the outcome. Even if Shaw hands over Freedom and its spectrum to a different player, this won’t translate to a national competitor. And of course, we could go all-in on monopoly consolidation (let the biggest fish eat the rest) — which takes us back to the utility model and erases what liberalization was supposed to accomplish.

I do not want to weigh in on what is likely to happen, but to highlight the fact that this presents an opportunity to do something differently in Canada. My worry is that regulators will focus too much on how they might somehow make the Rogers-Shaw deal seem acceptable, and not enough on how they might steer policy in a different direction. The Competition Bureau is already getting an earful about the deal, and you can let them know your own thoughts.

Trouble in Olds

Canada’s best-known community network has met its existential threat — municipal politics. O-NET, based in Olds, achieved national fame back in 2013 as a community-owned ISP offering gigabit internet, in competition with local incumbents (TELUS and Shaw). If you are interested in community/municipal broadband in Canada, you will at some point have come across a version of the Olds story or heard reference to the “Olds model“. In general, O-NET has been presented as a success story, but this is a “success” that needs to be qualified. It’s remarkable that a community network was able to wire up a town of Olds’ size with fibre and establish itself as a full-service ISP. But the long-term financial picture has never been clear, given the expense of building the network. O-NET successfully achieved more than many community broadband projects in Canada, but like many such projects, its future depends on local political support. Instead, the last several months have seen political conflict and uncertainty.

I haven’t been to Olds since 2014 (when I helped with a “citizen planning circle” about broadband) or spoken with anyone affiliated with O-NET since 2017. Local media (Olds Albertan/Mountain View Today) have been my way of keeping in touch, and based on the news it appears that after years of fading into the background of local politics, O-NET has recently become a major object of contention. On May 22, 2020, Olds Town Council issued a statement calling in $14 million in loans related to the build, and pushing for increased oversight. Subsequent discussions took place in closed Council sessions as well as through public allegations. Most recently, Town Council has created a broadband investment committee to decide the project’s future. So far the committee has met behind closed doors, and there has been public disagreement over its makeup (specifically, the exclusion of Coun. Mitch Thomson, who has long played an important role in the Olds fibre project).

It wasn’t meant to be like this. In fact, a key reason why O-NET’s relationship with the Town of Olds is so difficult to explain is because of the way that the organizational/ownership structure was meant to provide some independence from Town politics. O-NET is not a municipal ISP in the straightforward sense — it (and its owners) have been a separate organization from the Town. However, the funding required to build the network depended on the support of the Town, which has since cut annual funding and called in loans given to the Olds Institute for Community and Regional Development (OICRD) or Olds Institute (OI). This non-profit organization “oversees Olds Fibre Ltd., a for-profit business that owns O-NET“. You see, the for-profit O-NET is really a creation of the Olds Institute (specifically, its Technology Committee), a “non-profit community and economic development organization” that had the Town of Olds as one of its four founding members. The project champions and volunteers that worked on establishing the community network benefited from the support of municipal politicians, but until now the Town limited itself to a supporting role. This included supporting O-NET in 2010 and 2014 when it required an additional $14 million to complete the fibre build — the Town was able to borrow the money from the Alberta Capital Finance Authority and then loan it to OI. O-NET used the funding to get its network built, but thereby became deeply indebted to the Town, which has recently pulled on the purse strings. It’s hard to know where all of this will lead, but there is certainly a political effort in Olds to do something different with how the fibre project has been managed. To the extent that the Town’s CAO Michael Merritt would comment on whether there might be a future divestment from O-NET, he blandly stated: “The town is working with Olds Fibre/O-NET to determine the next steps“.

Regardless of what the future holds, anyone bringing up fibre in Olds as a community network “success story” should probably add a significant footnote (which is essentially what this blog post should be considered for a chapter in my forthcoming book).

End-to-end versus end-to-end

If you’ve spent any amount of time reading about what the internet is, you will have heard it described as an end-to-end network.  End-to-end architecture is said to be a founding principle of the internet. It has often been invoked in discussions of net neutrality, or network management. It is typically understood to mean that the internet is decentralized: to the extent possible, control is best exercised by the computers at the ‘ends’ of the network, and everything in between (the internet ‘pipes’ and the networking hardware owned by the telecom companies) should be kept as simple (or ‘dumb’) as possible — limited to passing along data packets.

Now, it is possible to argue that the internet has never been decentralized. There are points of control in its architecture, through state institutions, and the ISPs that manage our experience of the internet. But this is only surprising if we lose sight of what the end-to-end argument was meant to be — not a statement of what the internet is, but what it ought to be.  This normative argument was premised on the notion of an authentic, original internet, but it was a fragile notion in danger of being lost, or intermediated out of existence. While the end-to-end principle was not intended as a description of reality, it did come to shape reality through its influence, as many people responsible for designing technologies, running networks, and governing connectivity took it seriously. However, countervailing pressures to move control from the edges to the center of the network will ensure that this will remain a constant point of tension.

The above is an argument largely based on Tarleton Gillespie’s article, Engineering a Principle: ‘End-to-End’ in the Design of the Internet. This article provides some STS-informed discourse analysis of the end-to-end principle, tracing where the term came from and how it has shifted over time. But there was another kind of end-to-end principle that preceded the internet as we know it today, and became an obstacle to creative uses of telecom networks – an obstacle whose removal helped clear the path that led us to the internet.

Gillespie doesn’t dive into this earlier notion of end-to-end, although he does reference some (Bell) sources from the early 1980s that use it to discuss virtual circuits and the need to provide end-to-end digital networking over an infrastructure composed of both analog and digital parts (or over the old telephone system). What is missing from the account is the history of the term before the 1980s, which is the background that these Bell authors have in mind when they discuss end-to-end. This alternate meaning of end-to-end persists today in numerous contexts, to denote roughly: ‘from one end to the other, and everything in between’. In the history of telecom, this meaning has been intimately tied to the notion of system integrity in the monopoly era.

The old end-to-end principle argued that the network operator must have full control of the network, from one end to the other, or that the carrier had end-to-end responsibility for communications. This meant control throughout the network, but also of the endpoints, so that only approved devices could be connected to it. The principle was also sometimes cited to prevent interconnection with other networks. In the US, regulations prohibiting “foreign” attachments to the telephone system were challenged by the 1968 Carterfone decision, but in Canada it took until later in the 1970s to displace the notion of end-to-end control and systemic integrity in telecom regulation.

It is interesting how the internet’s end-to-end principle is not just different from this earlier notion, but in many ways its mirror image. In the mid-20th century the telephone was seen as a “natural monopoly”, because it was best managed by a single provider with centralized, end-to-end control, protecting the system’s “integrity” from any unauthorized elements. The internet, in contrast, is cobbled together from various networks and bits of hardware, its protocols allowing diverse autonomous systems to exchange traffic, and users can plug whatever devices they want into the endpoints.

What happened in between the fall of natural monopoly arguments, and the rise of the packet-switched commercial internet was the liberalization of telecom, effectively opening up the monopoly telcos to competition. The loss of end-to-end control made it possible for the internet to emerge. The old monopolies became incumbents, and had to give up some control over what happened over their wires.

In recent years, with a flatter internet, online intermediaries maintaining more control by deploying their own transport infrastructure, and “middleware” in the network making more decisions, the internet’s end-to-end principle is arguably under threat. But these concerns have been around for most of the history of the commercial internet. Fundamentally, these are issues about who and what exercises control over communication, rather than what the nature of our communication infrastructure dictates. Because of this, tensions about the relative power of the edges versus the middle will continue, but this also means that the history of these arguments continues to be relevant.

Connected across Distance

I’ve recently seen television segments air on news channels that were clearly recorded before the pandemic, but broadcast after, with people happily mingling together and discussing things which no longer seem to matter. While I’m as tired as anyone of the fact that the news has become nearly-24/7 COVID-19, I do find myself feeling partly offended by these portals into a different world (how dare they cheerily show me this reality which no longer exists?). I then find myself wondering if these things will once again matter, one day. Like many of us, I know that things will be forever changed by this pandemic, but it’s still unclear how far those changes will go. To what extent will the previous order be transformed, and what has merely been suspended?

I ask this question as I edit a book manuscript that was written before the pandemic, and which will hopefully be published at a time when we have some answers to the above. What of Canadian telecom policy? What of the Canadian telecom industry? How will all of this change what we think about our intermediaries, how they are governed, and how they conduct themselves?

The telecom industry is among the more fortunate ones, given how essential this infrastructure is to anything else that is still operating. Being essential has other consequences however; right now voluntary actions by telecom companies have forestalled additional regulation, but this may not be enough in the medium-term. Wireless companies have suspended data caps, TELUS is giving students in Alberta and BC a deep discount, and no ISP wants to be in the news for disconnecting someone who is cash-strapped and under quarantine, but will government need to step in to guarantee some sort of connectivity safety net? Then there is the ongoing discussion over the role of telecom data for pandemic surveillance, and whether this can serve as a foundation for what ‘normal’ might look like several months from now.

Could the changes be even more fundamental? All of this presumes that capitalism will survive, that the political economy of telecom will largely be preserved, that various foundations of social order will be maintained. What I’m hoping we realize in all of this is how vital infrastructures like utilities, health care, supply chains are, the importance of the people doing the truly essential work we rely upon, along with the value of long-term planning. But we are also being shaken out of our taken-for-grantedness, learning just how fragile social life can be. Many Canadians have had the privilege of assuming certain social structures are permanent, that certain institutions would be there for us. Residents of other countries (along with some Canadians and First Nations) have not shared this sense of security. Now we are all experiencing the same upheaval, manifesting differently as each locale follows its asynchronous path.

As the experience of watching media that was recorded pre-pandemic becomes more rare in the coming months, I wonder if watching it will feel even weirder. Will seeing strangers relax in close proximity fill us with nostalgia or dread? In the meantime, I have a lot to be thankful for, and there’s only so much guessing about the future that is helpful for the here and now.