“Community broadband” success stories: Kaslo’s KiN

What happens when a community decides to take connectivity into its own hands and builds itself a broadband network? This question has been raised in recent months with the pursuit of ConnectTO in Toronto, which may be the topic of a future post. The “poster child” of successful community broadband in Canada has often been identified as Olds, AB, but as I discussed in a previous post, the future of that particular project is in considerable doubt. In the United States, recent years have produced numerous community broadband success stories, so how about Canada?

First of all, we need to spend a bit of time defining what we are talking about. Unlike a municipal network, which is typically owned by a municipality, a “community network” can encompass this and many other possibilities, including co-operatives, not-for-profits, or other ‘community-based’ organizations. The word ‘community’ tends to be used more for its positive connotations than for its descriptive accuracy, and can suggest a “spurious unity” of similarly-thinking people (Calhoun, 1998, p. 34). Rather than representing the will of some larger community, these broadband projects are often the result of a small number of champions who successfully mobilize resources and obtain the support of key political actors — despite differences and disagreements among the local population.

For me, community broadband is defined by the fact that it is (1) locally-based — rooted in the needs of people in a particular place or region, and that it (2) serves some articulation of the public good or collective interest of these people. Typically what this means is that residents, businesses, or government actors in some location feel ill-served by incumbent providers, and rather than waiting for these incumbents to improve the situation, they take matters into their own hands. This situation often arises in rural regions that are less profitable, and hence less of a priority for incumbents (which is why ConnectTO is so unusual). Unfortunately, rural areas are also the sorts of places where there tends to be a shortage of people with the skills required to build and operate a broadband network.

This is why I was so intrigued by a presentation that Tim Ryan gave to the BC Broadband Association in 2019, describing the do-it-yourself approach of the Kaslo infoNet Society (KiN), which involved locally-trained staff installing fibre and wireless for far less than is typically budgeted for such projects. Later that summer, I visited Kaslo to talk further with Tim and see things for myself. As is typically the case in Canada, the construction season is weather dependent, and KiN was busy digging and laying fibre before the freeze hit.

They’ve continued this work ever since, laying more fibre underwater to connect shoreline communities along North Kootenay Lake, as well as within Kaslo itself.

Area connected by submarine fibre

This is now effectively a regional network, connecting residents on both sides of a narrow mountain lake, over roughly 50km. Over the years, the technology underpinning the network has shifted from largely wireless to largely fibre-based, enabled by some innovative underwater deployments in the lake. Laying fibre in water poses some challenges, but it eliminates many of the land-based headaches over rights-of-way and infrastructure access (roadways, poles etc.).

One of the lessons here is that the specific technologies used to provide internet access should depend on the local circumstances. Fibre can be installed in different ways, and there are still situations where other technologies make better sense. KiN’s approach gives you a sense of just how low fibre deployment costs can be pushed in Canada. The cost of the actual cable tends to be relatively minor — it’s getting that cable into the ground (or up on poles) that creates the expense. Aerial fibre involves getting approval from whoever (telco or power utility) owns the utility poles, and paying for rent or upgrades to the poles. Burying fibre can create different headaches in terms of approvals and overlapping infrastructure, plus the major expense of digging up the ground. On land, Ryan has claimed KiN can get trenching costs down to $7 [CAD] / meter. In lakewater, this cost can be a lot less, but most regions do not have the benefit of a long lakeshore geography in which to deploy backhaul.
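To put those numbers in perspective, here's a quick back-of-the-envelope comparison. Only the $7/metre trenching figure comes from Ryan; the other unit costs below are placeholders I've invented for illustration, not quoted prices:

```python
# Rough fibre deployment cost comparison (illustrative only).
# The $7/m trenching figure is Ryan's claim for KiN; every other
# number here is an invented placeholder, not a quoted price.

ROUTE_KM = 10       # hypothetical route length
CABLE_PER_M = 1.50  # assumed cable cost; the cable itself is the minor expense

install_per_m = {
    "KiN-style local trenching": 7.00,     # Ryan's quoted figure (CAD/m)
    "contracted rural trenching": 40.00,   # invented placeholder
    "aerial (pole rent/upgrades)": 25.00,  # invented placeholder
}

for method, cost in install_per_m.items():
    total = (cost + CABLE_PER_M) * ROUTE_KM * 1000
    print(f"{method:30s} ${total:>9,.0f} CAD for {ROUTE_KM} km")
```

Even with made-up numbers, the point stands: the installation method, not the cable, dominates the budget.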

While there is no one-size-fits-all approach to building a community network, it is still possible to generalize from success in one particular case. Below are some key determinants of success for a community broadband project, building on some of Tim Ryan’s personal views about what has contributed to the success of KiN.

1. Backhaul (backbone connectivity)

The other elements below are all important, but if they are missing they can be developed. Without a backbone, however, the network is a non-starter. Any community network is going to need to be plugged into the global internet somewhere, and its bandwidth is going to depend on the bandwidth of that connection to a fibre backbone. In an ideal situation, the community network finds a way to reach an internet exchange where it can connect with content providers and world-spanning networks. However, for much of rural Canada this is simply not possible. Satellites will open some possibilities in future years, but for the foreseeable future the viability of a new network is going to depend on access to existing infrastructure that can carry traffic regionally and globally. This may be the local telco incumbent(s), who will likely be competing with any new network for customers. While incumbents are mandated to provide wholesale network access in Canada, the terms of this mandated access do not provide a lucrative opportunity for community networks. The most successful approaches for community networks involve alternative backbones, or leasing a fibre path directly to an exchange if possible.
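The dependence on backhaul can be made concrete with some simple oversubscription arithmetic. All of the numbers in this sketch are invented for illustration:

```python
# Oversubscription arithmetic: what a shared backhaul link means for
# subscribers. Every number here is an illustrative assumption.

backhaul_mbps = 10_000  # e.g. a leased 10 Gbps path to an exchange
subscribers = 800
plan_mbps = 100         # advertised speed sold to each subscriber

# If every subscriber transmitted at full rate simultaneously:
worst_case = backhaul_mbps / subscribers

# Networks rely on statistical multiplexing: only a fraction of
# subscribers are active at any moment, even at peak times.
peak_active_fraction = 0.2  # assumed
peak_per_active = backhaul_mbps / (subscribers * peak_active_fraction)

print(f"worst case: {worst_case:.1f} Mbps per subscriber")
print(f"typical peak: {peak_per_active:.1f} Mbps per active subscriber")
print(f"oversubscription ratio: {subscribers * plan_mbps / backhaul_mbps:.0f}:1")
```

However fast the last mile, the backhaul link sets the ceiling that every subscriber shares.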

In Kaslo, KiN had been relying on incumbent TELUS for connectivity until 2014, when CBBC (a subsidiary of Columbia Basin Trust) extended its backbone to the town. This shifted the economics of broadband in the region significantly, and KiN then proceeded with its ongoing fibre deployments.

2. People and skills

Having the right people, with practical skills, experience, and personal connections, ends up being valuable in a project that is going to be difficult in a best-case scenario, and which will likely depend on a small number of bodies to keep things moving. There is a path forward for community-based groups who want to build a network but don’t know how to go about it, but this can involve hiring telecom consultants and a steep or expensive learning curve. Effective project champions and relevant technical skills among community members are enormous assets. Local governments that have in-house networking expertise and experience managing infrastructure can draw on those for broadband projects, while groups outside of government are fortunate if they involve members with an understanding of telecom networks.

In the early 2000s, Tim Ryan was trying to carry out business from Kaslo over a limited internet connection, and his background in IT helped him imagine how things could be better. He joined KiN in 2012, where other members also had technical skills, but as Ryan stated, “There is a perception that fibre connectivity is complex … (but) it’s 90% ditch-digging and 10% technology”. In his view, the critical skills are: “Critical skill number one: ditch digging. Critical skill number two: network management. That was contributed by the existing ISP [KiN]. Critical skill number three: an understanding of how optical networking technology is assembled, and we had two people in the community that knew how”. These skills gave the project’s participants the confidence to proceed — knowing that building a fibre network was possible: “Optical technology on the face of it, is as opaque as brain surgery and rocket science. In fact, it’s not that difficult, but most people don’t know that.” Having the knowledge to imagine what is possible helps to get a project started, but being able to train local staff in the required skills ends up being important to keep costs down (and circulate money locally) while maintaining local autonomy.

3. Organization

There are examples of local networks built by a committed individual who wants better connectivity and extends this to their neighbors, but beyond this scale, some sort of larger organization is required. An ISP could be organized along many different lines: for-profits, not-for-profits, co-operatives, or public utilities have all had some success meeting local needs. If possible, an advantage comes from being able to work through already-established organizations. In Kaslo, KiN had been incorporated as a not-for-profit since 1996, when it had offered dial-up service. While network technology had moved on, KiN retained a couple hundred subscribers and an organizational structure that could be built upon for a future network. In Olds, the OICRD became the vehicle for envisioning, developing, and eventually owning its fibre network. Organizations coordinate individual action into collective efforts, and as legal entities they can do things that individuals cannot — such as applying for funding. But organizations also require internal governance and maintenance (the non-technical kind), and can always benefit from a shared vision and collective enthusiasm.

4. Political support

To some extent (often a large one) an ISP is a political actor, and must by necessity maintain relationships with different levels of government: the CRTC and ISED, provincial ministries (land use, transportation), public utilities, local councils and municipal departments. ISPs can be enlisted towards public policy goals, such as connecting public facilities (a valuable “anchor tenant” for a community network), tackling the digital divide, or facilitating economic development. In Kaslo, KiN connects the Village Hall, Langham Cultural Centre, and Library, and has made the most of being a local actor in a small place. As Tim Ryan put it, KiN’s early expansions benefitted from being in “a small village with a village government that knows who you are, that you can have a face-to-face conversation at pretty much any point, and business can be done around a coffee table, first thing in the morning, and you can carry on and get on with it”.

Construction and maintenance depend on municipal rights-of-way. If the ISP is municipally-owned, its fortunes are tied to the decisions of municipal government. In Olds, the Town didn’t own the network but helped secure the loans needed to build it, and was able to exercise control by calling in these loans when political attitudes towards the network changed. While KiN has not been dependent on its municipality for funding, it has benefited from access to rights-of-way by maintaining a working relationship with local government (a municipal access agreement is key) — so that when the Village is expanding its sewer system, it’s possible to extend fibre as part of the process.

Incumbent resistance can also play out at the political level, and while Canada has not seen the sorts of lawmaking that US incumbents have lobbied for to ban community networks, other kinds of political battles do play out. A 2014 article from the Valley Voice recounts how “Telus tried to negotiate a secret deal with [Kaslo] Village council. The corporation promised a $500,000 upgrade to existing copper lines… Former Mayor Greg Lay says he fought for KiN’s right to be the point-of-presence provider rather than having to open it up to bidding from contractors outside the community. ‘I had to say, we support local people, I don’t care what Telus is offering,’ says Lay”. As much as political involvement can be a liability for telecom networks, it can also be a shield for local actors against larger interests.

5. Material resources

Community networks of significant scale don’t come cheap — millions may be required. KiN has found ways of doing it for less, but even in Kaslo securing funding is a major part of the challenge. There are sound economic reasons why incumbents “underserve” certain populations — without government support for construction, many rural connectivity projects would either never be profitable, or only profitable in the long term. Writing and managing grant applications can be a full-time job, and navigating the funding landscape is complex (in BC, there are resources available online and funding is also available through federal programs). It helps to have some funds available early, so that work can begin without waiting for big grants to come in. And because of the large amounts of money involved, it can make a huge difference if a project can find ways to source cheaper materials, effectively adapt to the terrain, and spend less on labor (this includes using local workers or having supporters willing to volunteer their time/equipment). I think all of these factors have contributed to KiN’s success, and could make a difference elsewhere.

Additional links:

Kaslo infoNet Society

Metaviews/Future Fibre profile 

Bill C-10 Leaves Far Too Much to the Imagination

Internet policy in Canada is set for a major update should Bill C-10 pass – but it’s the sort of update that will take ages to install and may break more than it will fix. It is odd to have a piece of legislation spoken about in such starkly different terms. Some opponents have described it as an attack on ‘free speech’, while proponents see it as getting our due from the ‘web giants’. This contrast is made possible by the details missing from the legislation, or what it leaves for regulators to determine in the future, when they struggle with implementing cultural policy across internet platforms.

Supporters working in the Canadian media industries imagine that they will benefit from the eventual outcome, although this is less true of content creators that have historically been excluded from Canadian cultural policy, such as users uploading videos to streaming sites and smaller online services. Those who feel well-served by the aims of pre-internet broadcasting policy can expect to be well-served by C-10, which essentially extends the Broadcasting Act to regulate where it previously has not. C-10 is first and foremost about regulating “online undertakings” as broadcasters – treating today’s internet platforms like the television and radio stations that dominated media when the Broadcasting Act was written. What this will actually mean for Netflix or YouTube will be decided by the CRTC, but it’s entirely reasonable to expect some system to promote the “discoverability of Canadian programs” that commands these services to modify their menus or algorithms if they want to operate in Canada. This raises the question of what counts as a Canadian program, and what platforms the regulations will apply to. The background here is that the CRTC has until now largely regulated the internet through telecom policy (the Telecommunications Act) rather than cultural policy (the Broadcasting Act). Where telecom policy requires non-discrimination and net neutrality (telecom companies are not supposed to mess with the content of what they are delivering to your screen), cultural policy includes the promotion of content that serves our shared interest as Canadians.
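To make “discoverability” less abstract: one plausible, and entirely hypothetical, implementation would be a re-ranking rule applied to a platform's recommendations. Nothing in C-10 specifies a mechanism like this; the sketch below is my own invention to show the kind of algorithmic change that could be commanded:

```python
# Hypothetical 'discoverability' rule: re-rank a recommendation list so
# Canadian programs surface higher. C-10 specifies no such mechanism;
# this is purely an illustration of what a regulator *could* require.

catalogue = [
    {"title": "Foreign Hit A", "canadian": False, "score": 0.95},
    {"title": "Canadian Drama", "canadian": True, "score": 0.80},
    {"title": "Foreign Hit B", "canadian": False, "score": 0.90},
    {"title": "Canadian Doc", "canadian": True, "score": 0.60},
]

CANCON_BOOST = 0.20  # invented knob: how strongly to favor Canadian programs

ranked = sorted(
    catalogue,
    key=lambda p: p["score"] + (CANCON_BOOST if p["canadian"] else 0.0),
    reverse=True,
)
for program in ranked:
    print(program["title"])
```

Of course, even this toy version immediately raises the questions above: who decides which titles get the Canadian flag, and how big should the boost be?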

This is where concerns about free speech and the open internet have come into play, but there are numerous other issues that have been raised in recent weeks by legal scholars and internet policy experts such as Michael Geist, Dwayne Winseck, Emily Laidlaw, and Fenwick McKelvey, as well as former CRTC Chair Konrad von Finckenstein. These experts broadly agree that it is indeed important to regulate the big platforms, just not in this way, and all of them have been concerned with the vague scope of the bill. C-10 so far has been a squandered opportunity to actually consider what we want from internet and cultural policy in Canada, and instead seeks to simply extend the obligations of 20th century broadcasters (to fund Canadian content, and to show a certain amount of it) to the internet. Unfortunately, doing so with this particular piece of legislation will be anything but simple, and in trusting the CRTC to sort out the details we are setting up years of political battles, regulatory proceedings, and court cases, with some very uncertain outcomes.

The key questions to ask are: who will be regulated by this law and in what manner? The government has done a terrible job of answering these questions, including explaining why it eliminated section 4.1 of the bill, which explicitly stated that user-generated content would not be regulated, or explaining what kinds of online services won’t be covered by the bill. The collateral damage from the bill could be significant, and implementing it would require asking the CRTC to do a lot more as a regulatory body in new domains, where it often seems challenged by its current responsibilities. C-10 is deliberately unimaginative, but remains open to various interpretations, including imagined windfalls from Silicon Valley and dark visions of government control.

Last week the government took the rare step of cutting short committee review of the bill and pushing ahead towards a House of Commons vote in the near future. Then came the strange sight at the end of the week of the heritage committee wrapping up its work by plowing through votes on a long list of amendments without explaining what was in them. As a result we have C-10 heading back to the House, without a clear idea of what’s in it beyond what Michael Geist pieced together. Apparently we will find out what the amended text is in the next few days. C-10 has been a badly-written and poorly justified piece of legislation, which has left far too much to the imagination (including what is actually in the current text of the bill).

Blocking Bad Traffic

Canadian internet policy is currently dominated by discussions of blocking – what to block and how. Blocking copyright infringement has been debated in a number of forms in recent years, was implemented in a limited sense to deal with GoldTV, and is now the subject of a new Government consultation (see here). Cabinet is also promising new measures to deal with illegal pornography, which may include new measures to block access (or building on existing ones, such as Cleanfeed). And then there is the CRTC’s proposal to mandate blocking cyber security threats – specifically botnets – though as a number of interveners have argued (including RCMP, CSE, M3AAWG) the focus on botnets is too narrow and should be expanded or kept flexible enough to adapt to future threats. Internationally the focus has been on how social media companies moderate their platforms, including blocking mis/disinformation and taking down accounts.

The common thread here is the understanding that some categories of internet traffic generate significant harm, and that there are centralized points of control where this traffic can be filtered out. As documented in Telecom Tensions (forthcoming May 2021), many of these debates are more than twenty years old (copyright owners have been arguing that ISPs should be responsible for infringing traffic since the mid-1990s), but they regularly re-emerge to focus on new methods, intermediaries, and targets. ISPs are often at the center of these controversies and participate as interested parties. Because they can exercise centralized control over their networks (DPI, DNS, BGP etc.), this is where filtering is deemed most effective (mandating end-user controls is also generally off-limits), and this means they are most likely to be saddled with new responsibilities and liabilities. There are already many existing processes in Canada for filtering at the ISP level (particularly when it comes to cyber security) and the question is often how to better coordinate and standardize what ISPs are doing on their own. Incumbents and government agencies have pointed to the work of CSTAC and CTCP in regard to the botnet-blocking proposal, and the work of the CCAICE goes back to the mid-2000s.

Network neutrality or the “open internet” is often implicated in these discussions, as blocking traffic generally contradicts the principle that users should be able to connect to whatever they want (or that ISPs should not discriminate among traffic). Those in favor of blocking will typically focus on the illegality of specific kinds of traffic as a justification, even where they may have significant financial interests in engaging in blocking (intellectual property, gambling revenues, bandwidth recovery). Opposition to blocking is less of an issue the more universally the traffic is recognized to be ‘bad’, which is why the earliest blocking regimes focused on child abuse imagery and spam. Botnets are an example where the users of ‘infected’ devices may not be aware that they are sending and receiving such traffic, and this can take us to questions of personal autonomy/responsibility for our devices (the Australian iCode system involved ISPs notifying users who were infected, placing the responsibility on users to take action). Even when there is broad agreement that a category of traffic is undesirable and should be stopped, there are always questions over effectiveness and overblocking (some bad traffic will get through, and some legitimate traffic will be blocked inadvertently). Finally, debates over internet filtering are inherently also about internet surveillance – since to discriminate between categories of traffic, ISPs and other actors need to monitor flows and have some sense of their contents.
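As a concrete example of both the mechanics and the overblocking problem, here is a toy sketch of DNS-level filtering, one of the ISP control points mentioned above. The blocklist and hostnames are invented, and a real resolver obviously forwards queries upstream rather than returning canned answers:

```python
# Toy sketch of DNS-level blocking. Blocklist entries and all
# addresses here are invented examples, not real targets.

BLOCKLIST = {"botnet-c2.example", "piracy.example"}

def resolve(hostname: str) -> str:
    """Return a sinkhole address for blocked names, else a fake upstream answer."""
    if hostname in BLOCKLIST or any(
        hostname.endswith("." + blocked) for blocked in BLOCKLIST
    ):
        return "0.0.0.0"  # sinkhole: the lookup 'fails' for the user
    return "203.0.113.10"  # pretend upstream answer (documentation address)

# Overblocking in miniature: everything under a blocked domain goes down
# with it, whether or not each subdomain hosts anything objectionable.
for name in ["news.example", "piracy.example", "innocent.piracy.example"]:
    print(f"{name} -> {resolve(name)}")
```

And as the sketch implies, the resolver has to examine every query in order to decide what to block, which is the filtering-equals-surveillance point above in miniature.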

Underlying all of this are two different views of what an ISP is and should be: whether our networks should be “dumb pipes”, limited to being just a conduit for packets as imagined in the end-to-end internet, or whether the pipes should be “smart” and exercise agency over traffic. The smart pipe’s “intelligence is rooted in its ability to inspect and discriminate between traffic, to decide what should be permitted, prioritized, or cleaned”. Clearly, the current rounds of proposals envision even broader roles for intermediaries in governing traffic — what I describe as the process of intermediation. However, the specifics in each case matter, involving different business interests and public consequences (including unintended consequences). Any action needs to be informed by an understanding of the actors and processes already in place to deal with bad traffic, previous experience with filtering elsewhere, and the harms and risks of traffic discrimination. We need to be mindful of the fact that some actors will always try to mandate blocking in order to safeguard their narrow interests. Political actors will also recurrently feel the need to ‘do something’ about internet threats, and imposing additional surveillance and filtering responsibilities on intermediaries often seems like the easiest solution. Policies for copyright filtering, pornography, and child abuse imagery have a long history in internet governance, but cyber security remains in many ways a frontier of internet policy, and one worth watching closely given the expansive possibilities for what might be considered a cyber threat in the future.

Rogers + Shaw — What are the policy options?

So, the big news from last week is that Canada’s two biggest cablecos want to become Canada’s biggest cableco.

This shouldn’t be surprising – Canada’s two biggest telcos have thought about merging for years, but the political winds have apparently not been favorable. While you can generally (but not always) satisfy shareholders with a big enough bucket of cash, governments are another matter. Since telecom liberalization in the 1990s, public policy has tried to move the country away from monopoly and towards a competitive marketplace for telecom services. Incumbent telephone and cable companies were set loose to compete with one another across the country, but the practical consequence has been a gradual consolidation of the industry through mergers and acquisitions. Because it makes little sense to build infrastructure in a ‘territory’ that is home to a well-established incumbent, the big companies stuck to their domains, and the biggest grew their territories by gobbling up potential competitors in other regions. People have been saying that there isn’t much left to gobble – or not much room for further consolidation in the industry – since the 2000s. But deals that further reduced the number of incumbents have continued to be approved by regulators. Eastern cableco Rogers is still hungry, and it’s confident government will agree to its planned purchase of Western cableco Shaw. Perhaps they will get what they want, but this is not a done deal and will be debated for some time to come. So, what are the options?

Option 1: Let the capitalists have their way

Rogers acquiring Shaw wouldn’t change much of the status quo anyway. The two companies already don’t compete for wireline customers (although they have previously squabbled over whether there is actually a non-compete agreement). The big change is going to come in wireless, where Shaw does compete (as Freedom, formerly WIND) with Rogers and the rest of the Big Three (Bell & TELUS), and access to spectrum is a critical factor. Given how hard it has been to develop a fourth wireless competitor, we could just embrace the reality of the Big Three. But why stop there? Maybe two companies is all you need for competition? Much of the country already operates as a duopoly for wired internet access (where Canadians can choose between their local cableco and telco). Regulatory proceedings at the CRTC would be greatly simplified in a world with only two players, as long as the regulator can assure itself that this is what competition looks like and there’s no need to set rates. Canadians will pay more – shareholders will profit. However, this would in many ways give up on what liberalization was meant to accomplish. Sure, governments care about shareholder interests, but they also care about other things, and will have to draw the line somewhere. Apparently that line wasn’t reached when Bell acquired MTS (followed by former MTS customers paying more). Is Rogers-Shaw going to be the line?

Option 2: Release the Americans

If no facilities-based competitors are going to organically develop from within the Canadian telecom industry (by climbing the mythical “ladder of investment”), then one option is to attract competitive interest from outside the country, and U.S. companies offer the best potential given their infrastructure south of the border. Of course, it’s not as easy as simply relaxing foreign ownership rules. A new foreign-owned competitor would have to spend billions to build infrastructure to compete against the comfortably-situated incumbents. More realistically, foreign investors would buy an existing Canadian incumbent where they saw an opportunity. This does have the potential to shake things up, just as WIND Mobile did with its foreign backers (or when Verizon seemed interested in acquiring WIND and Mobilicity). Ultimately, WIND was bought by Shaw, and now Rogers is buying Shaw. As a new entrant, WIND required favorable regulatory treatment to stand any chance against the incumbents. New entrants need access to existing infrastructure (towers, poles, rights of way) and wireless spectrum. Long-time telecom journalist Rita Trichur (who was a correspondent during the ‘telecom war’ of 2013, when it seemed that Verizon might move into Canada) has come out advocating that government drop the foreign ownership rules as a way of getting fresh competition into the market. She imagines that Verizon or AT&T could buy up the infrastructure that Bell and/or TELUS have built. Perhaps customers would see lower costs, and she points out that we already coordinate with the Americans on spectrum and security. After all, Canada and the U.S. are both members of the Five Eyes security alliance. Those AT&T/NSA splitter rooms were just for foreign traffic right? And Canada’s not really foreign, because the Five Eyes have a “gentleman’s agreement” not to spy on each other. Right?

Option 3: Service-based competition/structural separation:

The “European” model – recurrently floated as a possibility in Canadian telecom, but never seriously considered as a broad approach given our commitment to facilities-based competition. If we can call facilities-based competition a failure, then how about we accept a leading alternative? Let service providers compete over some kind of shared infrastructure (with the infrastructure owner “separated”, so that they don’t compete with these service providers). Canada’s rejection of this approach in favor of facilities-based competition was once justified by the fact that the country had overlapping cableco and telco infrastructure in urban areas, but being stuck with just two competitors was not imagined as the final outcome (in wireless, there are the Big Three, but Bell and TELUS split the country between them and share facilities, rather than build competing networks in the same territories). If we don’t like the outcome of duopoly, let’s deal with the policy failure by pushing in a different “separated” direction. There are many different models such an approach could follow; however, the path will be difficult, and many people will be very upset by a change in the status quo.

Option 3b: MVNOs

Given the implications of the deal for wireless competition specifically, this could be the time to extend mandated access to wireless facilities to MVNOs (Mobile Virtual Network Operators). MVNOs have been very controversial in Canada, as another way of mandating access to incumbent facilities for new competitors to provide (wireless) services over — a specific kind of service-based competition. Canada has historically expanded mandated-access regimes for other forms of connectivity to deal with the flaws of facilities-based competition, but incumbents have opposed the introduction of MVNOs as they opposed other forms of mandated access. This opposition will be harder to maintain in the wake of a Rogers-Shaw deal.

Option 4: Just say No

In some ways, this is the conservative approach – reject the deal as bad for consumers, bad for competition, and keep things the way they are. In other ways, this would be a departure from Canada’s post-liberalization approach to competition policy, because some arm of government would have to finally signal that enough is enough. Perhaps this would just kick the can down the road and maintain regulatory uncertainty, but rejecting the deal provides an opportunity to articulate a new policy approach that the industry can orient to.

Are there other options? Sure – the deal can be approved with various conditions (as Rogers apparently expects), but these are not likely to make a fundamental difference to the outcome. Even if Shaw hands over Freedom and its spectrum to a different player, this won’t translate to a national competitor. And of course, we could go all-in on monopoly consolidation (let the biggest fish eat the rest) — which takes us back to the utility model and erases what liberalization was supposed to accomplish.

I do not want to weigh in on what is likely to happen, but to highlight the fact that this presents an opportunity to do something differently in Canada. My worry is that regulators will focus too much on how they might somehow make the Rogers-Shaw deal seem acceptable, and not enough on how they might steer policy in a different direction. The Competition Bureau is already getting an earful about the deal, and you can let them know your own thoughts.

Trouble in Olds

Canada’s best-known community network has met its existential threat — municipal politics. O-NET, based in Olds, achieved national fame back in 2013 as a community-owned ISP offering gigabit internet, in competition with local incumbents (TELUS and Shaw). If you are interested in community/municipal broadband in Canada, you will at some point have come across a version of the Olds story or heard reference to the “Olds model”. In general, O-NET has been presented as a success story, but this is a “success” that needs to be qualified. It’s remarkable that a community network was able to wire up a town of Olds’ size with fibre and establish itself as a full-service ISP. But the long-term financial picture has never been clear, given the expense of building the network. O-NET successfully achieved more than many community broadband projects in Canada, but like many such projects, its future depends on local political support. Instead, the last several months have seen political conflict and uncertainty.

I haven’t been to Olds since 2014 (when I helped with a “citizen planning circle” about broadband) or spoken with anyone affiliated with O-NET since 2017. Local media (Olds Albertan/Mountain View Today) have been my way of keeping in touch, and based on the news it appears that after years of fading into the background of local politics, O-NET has recently become a major object of contention. On May 22, 2020, Olds Town Council issued a statement calling in $14 million in loans related to the build, and pushing for increased oversight. Subsequent discussions took place in closed Council sessions as well as through public allegations. Most recently, Town Council has created a broadband investment committee to decide the project’s future. So far the committee has met behind closed doors, and there has been public disagreement over its makeup (specifically, the exclusion of Coun. Mitch Thomson, who has long played an important role in the Olds fibre project).

It wasn’t meant to be like this. In fact, a key reason why O-NET’s relationship with the Town of Olds is so difficult to explain is because of the way that the organizational/ownership structure was meant to provide some independence from Town politics. O-NET is not a municipal ISP in the straightforward sense — it (and its owners) have been a separate organization from the Town. However, the funding required to build the network depended on the support of the Town, which has since cut annual funding and called in loans given to the Olds Institute for Community and Regional Development (OICRD) or Olds Institute (OI). This non-profit organization “oversees Olds Fibre Ltd., a for-profit business that owns O-NET“. You see, the for-profit O-NET is really a creation of the Olds Institute (specifically, its Technology Committee), a “non-profit community and economic development organization” that had the Town of Olds as one of its four founding members. The project champions and volunteers that worked on establishing the community network benefited from the support of municipal politicians, but until now the Town limited itself to a supporting role. This included supporting O-NET in 2010 and 2014 when it required an additional $14 million to complete the fibre build — the Town was able to borrow the money from the Alberta Capital Finance Authority and then loan it to OI. O-NET used the funding to get its network built, but thereby became deeply indebted to the Town, which has recently pulled on the purse strings. It’s hard to know where all of this will lead, but there is certainly a political effort in Olds to do something different with how the fibre project has been managed. To the extent that the Town’s CAO Michael Merritt would comment on whether there might be a future divestment from O-NET, he blandly stated: “The town is working with Olds Fibre/O-NET to determine the next steps“.

Regardless of what the future holds, anyone bringing up fibre in Olds as community network “success story” should probably add a significant footnote (which is essentially what this blog post should be considered for a chapter in my forthcoming book).

End-to-end versus end-to-end

If you’ve spent any amount of time reading about what the internet is, you will have heard it described as an end-to-end network.  End-to-end architecture is said to be a founding principle of the internet. It has often been invoked in discussions of net neutrality, or network management. It is typically understood to mean that the internet is decentralized: to the extent possible, control is best exercised by the computers at the ‘ends’ of the network, and everything in between (the internet ‘pipes’ and the networking hardware owned by the telecom companies) should be kept as simple (or ‘dumb’) as possible — limited to passing along data packets.

Now, it is possible to argue that the internet has never been decentralized. There are points of control in its architecture, through state institutions, and the ISPs that manage our experience of the internet. But this is only surprising if we lose sight of what the end-to-end argument was meant to be — not a statement of what the internet is, but what it ought to be.  This normative argument was premised on the notion of an authentic, original internet, but it was a fragile notion in danger of being lost, or intermediated out of existence. While the end-to-end principle was not intended as a description of reality, it did come to shape reality through its influence, as many people responsible for designing technologies, running networks, and governing connectivity took it seriously. However, countervailing pressures to move control from the edges to the center of the network will ensure that this will remain a constant point of tension.

The above is an argument largely based on Tarleton Gillespie’s article, Engineering a Principle: ‘End-to-End’ in the Design of the Internet. This article provides some STS-informed discourse analysis of the end-to-end principle, tracing where the term came from and how it has shifted over time. But there was another kind of end-to-end principle that preceded the internet as we know it today, and became an obstacle to creative uses of telecom networks on the path that led us to the internet.

Gillespie doesn’t dive into this earlier notion of end-to-end, although he does reference some (Bell) sources from the early 1980s that use it to discuss virtual circuits and the need to provide end-to-end digital networking over an infrastructure composed of both analog and digital parts (or over the old telephone system). What is missing from the account is the history of the term before the 1980s, which is the background that these Bell authors have in mind when they discuss end-to-end. This alternate meaning of end-to-end persists today in numerous contexts, to denote roughly: ‘from one end to the other, and everything in between’. In the history of telecom, this meaning has been intimately tied to the notion of system integrity in the monopoly era.

The old end-to-end principle argued that the network operator must have full control of the network, from one end to the other, or that the carrier had end-to-end responsibility for communications. This meant control throughout the network, but also of the endpoints, so that only approved devices could be connected to it. The principle was also sometimes cited to prevent interconnection with other networks. In the US, regulations prohibiting “foreign” attachments to the telephone system were challenged by the 1968 Carterfone decision, but in Canada it took until later in the 1970s to displace the notion of end-to-end control and systemic integrity in telecom regulation.

It is interesting how the internet’s end-to-end principle is not just different from this earlier notion, but in many ways its mirror image. In the mid-20th century the telephone was seen as a “natural monopoly”, because it was best managed by a single provider with centralized, end-to-end control, protecting the system’s “integrity” from any unauthorized elements. The internet in contrast is cobbled together from various networks and bits of hardware, its protocols allowing diverse autonomous systems to exchange traffic, and users can plug whatever devices they want into the endpoints.

What happened in between the fall of natural monopoly arguments and the rise of the packet-switched commercial internet was the liberalization of telecom, effectively opening up the monopoly telcos to competition. The loss of end-to-end control made it possible for the internet to emerge. The old monopolies became incumbents, and had to give up some control over what happened over their wires.

In recent years, with a flatter internet, online intermediaries maintaining more control by deploying their own transport infrastructure, and “middleware” in the network making more decisions, the internet’s end-to-end principle is arguably under threat. But these concerns have been around for most of the history of the commercial internet. Fundamentally, these are issues about who and what exercises control over communication, rather than what the nature of our communication infrastructure dictates. Because of this, tensions about the relative power of the edges versus the middle will continue, but this also means that the history of these arguments continues to be relevant.

Connected across Distance

I’ve recently seen television segments air on news channels that were clearly recorded before the pandemic, but broadcast after, with people happily mingling together and discussing things which no longer seem to matter. While I’m as tired as anyone of the fact that the news has become nearly-24/7 COVID-19, I do find myself feeling partly offended by these portals into a different world (how dare they cheerily show me this reality which no longer exists?). I then find myself wondering if these things will once again matter, one day. Like many of us, I know that things will be forever changed by this pandemic, but it’s still unclear how far those changes will go. To what extent will the previous order be transformed, and what has merely been suspended?

I ask this question as I edit a book manuscript that was written before the pandemic, and which will hopefully be published at a time when we have some answers to the above. What of Canadian telecom policy? What of the Canadian telecom industry? How will all of this change what we think about our intermediaries, how they are governed, and how they conduct themselves?

The telecom industry is among the more fortunate ones, given how essential this infrastructure is to anything else that is still operating. Being essential has other consequences however; right now voluntary actions by telecom companies have forestalled additional regulation, but this may not be enough in the medium-term. Wireless companies have suspended data caps, TELUS is giving students in Alberta and BC a deep discount, and no ISP wants to be in the news for disconnecting someone who is cash-strapped and under quarantine, but will government need to step in to guarantee some sort of connectivity safety net? Then there is the ongoing discussion over the role of telecom data for pandemic surveillance, and whether this can serve as a foundation for what ‘normal’ might look like several months from now.

Could the changes be even more fundamental? All of this presumes that capitalism will survive, that the political economy of telecom will largely be preserved, that various foundations of social order will be maintained. What I’m hoping we realize in all of this is how vital infrastructures like utilities, health care, and supply chains are, the importance of the people doing the truly essential work we rely upon, along with the value of long-term planning. But we are also being shaken out of our taken-for-grantedness, learning just how fragile social life can be. Many Canadians have had the privilege of assuming certain social structures are permanent, that certain institutions would be there for us. Residents of other countries (along with some Canadians and First Nations) have not shared this sense of security. Now we are all experiencing the same upheaval, manifesting differently as each locale follows its asynchronous path.

As the experience of watching media that was recorded pre-pandemic becomes more rare in the coming months, I wonder if watching it will feel even weirder. Will seeing strangers relax in close proximity fill us with nostalgia or dread? In the meantime, I have a lot to be thankful for, and there’s only so much guessing about the future that is helpful for the here and now.

Connected Coast

Where the subsea fibre is meant to meet the last mile

For the past few years, one of the most intriguing and exciting public connectivity projects has been underway along Canada’s Pacific Coast. Born of the fusion of two proposals  — one from Prince Rupert’s municipally-owned telco CityWest, and the other from the Strathcona Regional District (the SRD, administered from Campbell River, Vancouver Island) – the Connected Coast project will lay fibre from Haida Gwaii and Prince Rupert, down the Pacific Coast, and around Vancouver Island (map). This is an ambitious build for the parties involved to undertake, with the potential to make a big difference in a lot of places that would otherwise be waiting for broadband. Because of funding constraints, it is on a strict timeline for completion (2021). Assuming everything goes well with the subsea portion, the success will ultimately depend on what happens when the cable reaches the shore.

I visited Vancouver Island in June to attend the final three presentations and meetings the SRD was conducting within its territory: in Tahsis, Gold River, and Cortes Island. Of the three, Cortes is the only area that currently has cell coverage, and availability of wireline depends on where one lives. Right now, internet access is often provided across a backbone of microwave links, or over satellite.

How TELUS keeps Cortes Island connected

As explained at the community meetings, on the SRD’s side the Connected Coast project began from the expressed need for connectivity from local representatives across the District. It became an infrastructure project after the SRD was informed by a telecom consultant that improvements in the last mile could only help insofar as there was a backbone available to connect to. Because the funding is limited to building this backbone, the SRD can only facilitate and instigate the last mile, leaving this to other local parties (TELUS was not a participant at the meetings).

Conuma Cable operates in Gold River and Tahsis, where TELUS also has a network.

Basically, the plan is for an open-access, middle-mile network. Open access is a condition of the funding (as it is for many publicly-funded backbones), so in theory, any last-mile ISP will be able to connect to the Connected Coast backbone. The extent to which this happens will depend a lot on local conditions, and given that these are ‘underserved’ communities, this isn’t a retail market with many competitors. If incumbents choose to pursue their own builds independently of the new backbone, independent ISPs (or community networks) will need to step up and find the resources to bridge the last mile.

The Uptown Cafe in Gold River. Great food, and connectivity with your meal!

In addition to talking specifics with local ISPs and letting residents know about the new cable arriving at their shores, this was also an opportunity for local residents to ask questions, imagine the possibilities that greater access to broadband could bring, and to voice concerns. This was especially the case on Cortes, where the meeting included a number of people strongly supportive of what Connected Coast could provide, as well as opposition from those who don’t see connectivity as an unbridled good (including over fears that cell towers and radiation might follow the wired infrastructure). All in all, it was great to have a reason to visit these communities, and development of the Connected Coast project remains one to watch.

Connected Coast Comes To Cortes

BCBA Conference and Artificial Intelligence in Telecom

I’m currently at the British Columbia Broadband Association (BCBA) Conference, reconnecting with perspectives from the industry and from the provincial government in BC. Of the Canadian events I’ve attended, this is definitely one of my favorites – the local context keeps it at a manageable size, affordable, but still with quality speakers. BC stands out as a province in Canada because of how non-incumbent ISPs have historically organized through the BCBA (for other industry conferences one generally has to travel to Toronto). BC is also remarkable in the attention that connectivity has received from the provincial government. The province is involved primarily through Network BC, which has in recent years focused more on local solutions and supporting communities directly, rather than just partnering with the incumbent. While their materials/resources are intended for a BC audience, communities around Canada interested in improving connectivity can find some ideas and inspiration in what Network BC has recently been involved in.

One of the new developments since I was a regular at these industry conferences is that we are now firmly in the midst of the artificial intelligence (AI) wave, where it is basically impossible to attend any policy or industry/technology conference without hearing at least one presentation about the promise of AI. At the BCBA, this presentation was provided by Jim Radzicki from TELUS International, who spoke on AI and the customer experience, including a broad overview of approaches and technologies under the AI “umbrella”.

Here the focus was really on how to better provide “customer care” – addressing customer questions, troubleshooting, and so forth. The application of AI in handling customer calls (including helping human agents handle calls) and acting as chatbots is a natural extension of these technologies, especially if we define AI with reference to the Turing Test (as Radzicki did), which connects the high-tech hype of AI with the incredibly dumb chatbots of an earlier era, like ELIZA. But AI is far more than just ‘fooling’ someone that they are talking to an intelligent being. As Radzicki also explained, what distinguishes these new technologies is their ability to organize and work with (or ‘learn’ from) vast data sets. And this is where AI already impacts our lives in a variety of ways, whether we know it or not – powering our Google searches, recommendations, or handling various functions behind the scenes. Elsewhere in the telecom industry, AI is touted as a solution for cybersecurity, network management, and traffic management, with companies offering products in these domains (beware, however: if it says “AI” on the box, you may just be being sold some clever marketing and little else).
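Since the Turing Test framing invites the comparison, it's worth remembering how little machinery a rule-based chatbot actually needs. Here is a toy ELIZA-style “customer care” bot; the patterns and responses are entirely my own invention:

```python
import re

# Toy ELIZA-style customer-care bot: canned pattern -> response rules.
# No learning and no data; this is the 'dumb chatbot' end of the spectrum.
RULES = [
    (re.compile(r"\b(internet|connection)\b.*\b(down|slow|not working)\b", re.I),
     "Have you tried restarting your modem?"),
    (re.compile(r"\bbill(ing)?\b", re.I),
     "I can transfer you to our billing department."),
    (re.compile(r"\bcancel\b", re.I),
     "I'm sorry to hear that! Let me connect you with a retention agent."),
]

def reply(message: str) -> str:
    for pattern, response in RULES:
        if pattern.search(message):
            return response
    return "Could you tell me a bit more about your issue?"

print(reply("My internet is really slow today"))
print(reply("I want to cancel my plan"))
```

What separates this from the systems Radzicki described is exactly the point above: modern approaches learn their responses from vast data sets rather than from a hand-written rule list.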

It’s great to be at a broadband conference again, and I will continue working on these topics and writing publications (particularly about community networks and public connectivity projects). Internet governance remains a fascinating domain, but this year I am also beginning a new long-term research project into AI governance: both how AI is governed, and how AI is used to deliver public services, or in the service of public policy. Governments at all levels in Canada are working to implement these technologies (machine learning, automation) with agencies such as TBS trying to guide developments across departments. I look forward to attending more AI-themed talks over the summer, getting a broad perspective of this fascinating set of social transformations, and seeing what stock images of humanoid robots the presenters have selected as illustrations.

All Forms of Competition

Well that didn’t take long. The CRTC just launched a review of mobile services, indicating that it was now in favor of allowing MVNOs. This comes just two days after receiving a new policy direction from ISED Minister Bains, and a period of strain in the CRTC’s arm’s length relationship with the federal Cabinet, which had different ideas about competition policy. After sending two displeasing CRTC decisions back for review, ISED Minister Bains pulled the CRTC’s arm on Feb. 26 and gave the Commission a course correction.

Much to the dismay of former Industry Minister Bernier, the CRTC had come to effectively ignore the old 2006 policy direction, apart from the obligation to mention it in telecom decisions. It had applied further regulation to incumbents and accorded more legitimacy to leased access. The Commission extended mandated access to last-mile fibre networks in 2015, and claimed that forcing incumbents to share these facilities would promote facilities-based competition (somewhere else — the middle mile). The fact that the Commission was required to tie arguments into knots to explain itself, or could redefine the scope of facilities-based competition to justify shared facilities was not particularly bothersome. The only body that could hold it accountable generally approved of these moves. Once the CRTC (under the direction of its new Chairman Ian Scott) began to behave in ways that made Cabinet uncomfortable, revisiting the policy direction became a way to instruct the Commission on how it was expected to behave.

The 2019 policy direction is more open-ended than the small government directive of 2006, signaling that the CRTC can intervene to promote “all forms of competition”. In effect, Cabinet is telling the regulator that it has not been regulating enough, and that competition is still valuable even if it isn’t between competing facilities. Certainly, the CRTC could try to flout the policy direction or interpret it narrowly as it had previously done, but this would escalate the conflict with Cabinet. When sovereignty is contested between the two, Cabinet will eventually have its way. In 2019, the rationality of facilities-based competition has been officially demoted.

The move by Bains has the effect of transforming an inconvenient policy failure into a horizon of possibilities. The vague fantasy that competition would lead to overlapping wires running into every home can be abandoned. In its place, consumer interests and competition have been reaffirmed as core values, but competition can be defined differently than before. Facilities-based competition is still an option, and one evidently favored by the CRTC even as it course-corrected after receiving its policy direction. In the Notice of Consultation for the new review, the CRTC is open to a mix of facilities-based and service-based competitors in wireless, but mandated access is still presented as a temporary stepping stone until “market forces take hold”. But according to Cabinet, all forms of competition are on the table, and facilities-based is not the only officially recognized option.

There’s still no reason to believe this is the first step to the ‘nuclear option’ of structural separation. More likely, the goal of competition policy will end up being something closer to the status quo than the elusive dream of a competitive facilities-based market. This would mean that mandated access won’t end up being presented as a temporary detour, but as the final destination.

2021 UPDATE: The MVNO decision in CRTC 2021-130 did go some ways to establishing a wholesale regime for wireless, but it limits MVNOs to those providers that already operate wireless networks. This effectively limits this form of service-based competition to facilities-based companies — therefore likely having minimal impact on the status quo.