Connected Coast

Where the subsea fibre is meant to meet the last mile

For the past few years, one of the most intriguing and exciting public connectivity projects has been underway along Canada’s Pacific Coast. Born of the fusion of two proposals — one from Prince Rupert’s municipally owned telco CityWest, and the other from the Strathcona Regional District (the SRD, administered from Campbell River, Vancouver Island) — the Connected Coast project will lay fibre from Haida Gwaii and Prince Rupert, down the Pacific Coast, and around Vancouver Island (map). This is an ambitious build for the parties involved to undertake, with the potential to make a big difference in a lot of places that would otherwise be waiting for broadband. Because of funding constraints, it is on a strict timeline for completion (2021). Assuming everything goes well with the subsea portion, success will ultimately depend on what happens when the cable reaches the shore.

I visited Vancouver Island in June to attend the final three presentations and meetings the SRD was conducting within its territory: in Tahsis, Gold River, and Cortes Island. Of the three, Cortes is the only area that currently has cell coverage, and availability of wireline depends on where one lives. Right now, internet access is often provided across a backbone of microwave links, or over satellite.

How TELUS keeps Cortes Island connected

As explained at the community meetings, on the SRD’s side the Connected Coast project began from the expressed need for connectivity from local representatives across the District. It became an infrastructure project after the SRD was informed by a telecom consultant that improvements in the last mile could only help as far as there was a backbone available to connect to. Because the funding is limited to building this backbone, the SRD can only facilitate and encourage last-mile efforts, leaving these to other local parties (TELUS was not a participant at the meetings).

Conuma Cable operates in Gold River and Tahsis, where TELUS also has a network.

Basically, the plan is for an open-access, middle-mile network. Open access is a condition of the funding (as it is for many publicly-funded backbones), so in theory, any last-mile ISP will be able to connect to the Connected Coast backbone. The extent to which this happens will depend a lot on local conditions, and given that these are ‘underserved’ communities, this isn’t a retail market with many competitors. If incumbents choose to pursue their own builds independently, independent ISPs (or community networks) will need to step up and find the resources to bridge the last mile.

The Uptown Cafe in Gold River. Great food, and connectivity with your meal!

In addition to talking specifics with local ISPs and letting residents know about the new cable arriving at their shores, this was also an opportunity for local residents to ask questions, imagine the possibilities that greater access to broadband could bring, and to voice concerns. This was especially the case on Cortes, where the meeting included a number of people strongly supportive of what Connected Coast could provide, as well as opposition from those who don’t see connectivity as an unbridled good (including over fears that cell towers and radiation might follow the wired infrastructure). All in all, it was great to have a reason to visit these communities, and development of the Connected Coast project remains one to watch.

Connected Coast Comes To Cortes

BCBA Conference and Artificial Intelligence in Telecom

I’m currently at the British Columbia Broadband Association (BCBA) Conference, reconnecting with industry and provincial perspectives in BC. Of the Canadian events I’ve attended, this is definitely one of my favorites – the local context keeps it at a manageable size, affordable, but still with quality speakers. BC stands out as a province in Canada because of how non-incumbent ISPs have historically organized through the BCBA (for other industry conferences one generally has to travel to Toronto). BC is also remarkable in the attention that connectivity has received from the provincial government. The province is involved primarily through Network BC, which has in recent years focused more on supporting local solutions and supporting communities directly, rather than just partnering with the incumbent. While their materials/resources are intended for a BC audience, communities around Canada interested in improving connectivity can find some ideas and inspiration in what Network BC has recently been involved in.

One of the new developments since I was a regular at these industry conferences is that we are now firmly in the midst of the artificial intelligence (AI) wave, where it is basically impossible to attend any policy or industry/technology conference without hearing at least one presentation about the promise of AI. At the BCBA, this presentation was provided by Jim Radzicki from TELUS International, who spoke on AI and the customer experience, including a broad overview of approaches and technologies under the AI “umbrella”.

Here the focus was really on how to better provide “customer care” – addressing customer questions, troubleshooting, and so forth. The application of AI in handling customer calls (including helping human agents handle calls) and acting as chatbots is a natural extension of these technologies, especially if we define AI with reference to the Turing Test (as Radzicki did), which connects the high-tech hype of AI with the comparatively crude chatbots of an earlier era, like ELIZA. But AI is far more than just ‘fooling’ someone into believing they are talking to an intelligent being. As Radzicki also explained, what distinguishes these new technologies is their ability to organize and work with (or ‘learn’ from) vast data sets. And this is where AI already impacts our lives in a variety of ways, whether we know it or not – powering our Google searches, recommendations, or handling various functions behind the scenes. Elsewhere in the telecom industry, AI is touted as a solution for cybersecurity, network management, and traffic management, with companies offering products in these domains (beware however, if it says “AI” on the box, you may just be being sold some clever marketing and little else).
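To make the ELIZA comparison concrete: the entire “intelligence” of a rule-based chatbot of that era was pattern matching on the user’s words, with no learning or understanding involved. The following is an illustrative toy in that style (my own sketch, not any system discussed at the conference):

```python
import re

# ELIZA-style rules: a regex pattern paired with a response template.
# The {0} placeholder is filled with whatever the pattern captured.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)\?", "What do you think?"),
]
DEFAULT = "Please, go on."

def respond(message: str) -> str:
    """Return a canned response by matching the first applicable rule."""
    text = message.lower().strip()
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return DEFAULT
```

A script this short can sustain a surprisingly human-feeling exchange (`respond("I need a vacation")` echoes back “Why do you need a vacation?”), which is exactly why “passing as human” is such a weak bar for intelligence compared with learning from data.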

It’s great to be at a broadband conference again, and I will continue working on these topics and writing publications (particularly about community networks and public connectivity projects). Internet governance remains a fascinating domain, but this year I am also beginning a new long-term research project into AI governance: both how AI is governed, and how AI is used to deliver public services, or in the service of public policy. Governments at all levels in Canada are working to implement these technologies (machine learning, automation) with agencies such as TBS trying to guide developments across departments. I look forward to attending more AI-themed talks over the summer, getting a broad perspective of this fascinating set of social transformations, and seeing what stock images of humanoid robots the presenters have selected as illustrations.

All Forms of Competition

Well that didn’t take long. The CRTC just launched a review of mobile services, indicating that it was now in favor of allowing MVNOs. This comes just two days after receiving a new policy direction from ISED Minister Bains, and a period of strain in the CRTC’s arm’s length relationship with the federal Cabinet, which had different ideas over competition policy. After sending two displeasing CRTC decisions back for review, ISED Minister Bains pulled the CRTC’s arm on Feb. 26 and gave the Commission a course correction.

Much to the dismay of former Industry Minister Bernier, the CRTC had come to effectively ignore the old 2006 policy direction, apart from the obligation to mention it in telecom decisions. It had applied further regulation to incumbents and accorded more legitimacy to leased access. The Commission extended mandated access to last-mile fibre networks in 2015, and claimed that forcing incumbents to share these facilities would promote facilities-based competition (somewhere else — the middle mile). The fact that the Commission was required to tie arguments into knots to explain itself, or could redefine the scope of facilities-based competition to justify shared facilities was not particularly bothersome. The only body that could hold it accountable generally approved of these moves. Once the CRTC (under the direction of its new Chairman Ian Scott) began to behave in ways that made Cabinet uncomfortable, revisiting the policy direction became a way to instruct the Commission on how it was expected to behave.

The 2019 policy direction is more open-ended than the small government directive of 2006, signaling that the CRTC can intervene to promote “all forms of competition”. In effect, Cabinet is telling the regulator that it has not been regulating enough, and that competition is still valuable even if it isn’t between competing facilities. Certainly, the CRTC could try to flout the policy direction or interpret it narrowly as it had previously done, but this would escalate the conflict with Cabinet. When sovereignty is contested between the two, Cabinet will eventually have its way. In 2019, the rationality of facilities-based competition has been officially demoted.

The move by Bains has the effect of transforming an inconvenient policy failure into a horizon of possibilities. The vague fantasy that competition would lead to overlapping wires running into every home can be abandoned. In its place, consumer interests and competition have been reaffirmed as core values, but competition can be defined differently than before. Facilities-based competition is still an option, and one evidently favored by the CRTC even as it course-corrected after receiving its policy direction. In the Notice of Consultation for the new review, the CRTC is open to a mix of facilities-based and service-based competitors in wireless, but mandated access is still presented as a temporary stepping stone until “market forces take hold”. But according to Cabinet, all forms of competition are on the table, and facilities-based is not the only officially recognized option.

There’s still no reason to believe this is the first step to the ‘nuclear option’ of structural separation. More likely, the goal of competition policy will end up being something closer to the status quo than the elusive dream of a competitive facilities-based market. This would mean that mandated access won’t end up being presented as a temporary detour, but as the final destination.

2021 UPDATE: The MVNO decision in CRTC 2021-130 did go some ways to establishing a wholesale regime for wireless, but it limits MVNOs to those providers that already operate wireless networks. This effectively limits this form of service-based competition to facilities-based companies — therefore likely having minimal impact on the status quo.

ISPs as Privacy Custodians

Just published in the Canadian Journal of Law and Society (CJLS) is my article on Internet Service Providers as Privacy Custodians (a pre-print version is available here). The content is adapted (and updated) from a chapter of my PhD dissertation, wherein different chapters dealt with different social responsibilities of ISPs in Canada. The focus of this piece is on privacy responsibilities, but these are interrelated with ISPs’ other responsibilities and social roles, such as surveillance. For example, Canada’s Protection of Privacy Act was referred to as the “wiretap bill” while it was being debated in 1973, because while it criminalized the invasion of privacy, it also provided a formal legal route through which the police could obtain wiretaps (I particularly enjoyed studying the murky history of wiretapping in Canada for this piece, which I could only include in summary form).

The responsibilities of ISPs to protect privacy directly shape how police can carry out investigations involving subscriber information, how copyright enforcement operates, and the sorts of commercial relationships ISPs can enter into when monetizing subscriber information. I settled on the term “privacy custodians” to describe the role of ISPs in governing privacy for lack of a better one (the term is used in health care, and here I conceive of privacy governance as being broader than managing the personal information of users, encompassing a broader relationship to the public including policy advocacy, public accountability, and privacy education). I’ve been interested in how different ISPs approach the role of privacy custodian, at times through differing interpretations of legal obligations, but also through different kinds of voluntary efforts to go beyond legal obligations. I discuss these by distinguishing positive responsibilities (the responsibility to do something) and negative responsibilities (the responsibility to not do something). I argue that we should pay attention to the ways that ISPs are distinguishing themselves by carving out and asserting new positive responsibilities, while remaining mindful of the discretion with which they do so, and the pressures to compromise privacy given the growing value of the data that these intermediaries can collect.

The abstract reads:

This article examines the role of internet service providers (ISPs) as guardians of personal information and protectors of privacy, with a particular focus on how telecom companies in Canada have historically negotiated these responsibilities. Communications intermediaries have long been expected to act as privacy custodians by their users, while simultaneously being subject to pressures to collect, utilize, and disclose personal information. As service providers gain custody over increasing volumes of highly-sensitive information, their importance as privacy custodians has been brought into starker relief and explicitly recognized as a core responsibility.

Some ISPs have adopted a more positive orientation to this responsibility, actively taking steps to advance it, rather than treating privacy protection as a set of limitations on conduct. However, commitments to privacy stewardship are often neutralized through contradictory legal obligations (such as mandated surveillance access) and are recurrently threatened by commercial pressures to monetize personal information.

While tensions over privacy and state surveillance have been long-lasting and persistent, in recent years the most interesting developments have been related to the monetization of personal information. Recent news has included the re-launch of Bell’s targeted ads program, and another U.S. privacy scandal involving the resale of location information. Canadian incumbents collaborate on location and identification services through EnStream, which has so far remained relatively quiet and scandal-free, but which has also introduced a new role into the subscriber-provider relationship. We pay service providers to give us connectivity and some extent of privacy, but these companies are also serving the needs of customers who want information about us.

In short, the internet’s gatekeepers are also the gatekeepers of our identities and activities.

SuperNet 1.0 Post-Mortem

Earlier this month the Auditor General of Alberta released a major report, with a section on contract management at Service Alberta devoted to SuperNet. Several news outlets covered the release, and decided that the section dealing with SuperNet was the most newsworthy, summarizing it as mismanagement of a $1 billion contract.

The report is a bit of a strange read, since it is on the topic of the original SuperNet, discusses it in the present tense, but the relationships and contracts that defined SuperNet 1.0 belong to the past. The Auditor General’s office effectively studied the period between 2005 and 2017, carrying out interviews and analyzing documents in 2017, and completing the audit in January 2018 — long before the last-minute hand-off of the network last summer. One of the report’s findings was that the Government of Alberta had identified risks in the 2018 transition and “incorporated mitigation strategies” into its planning, but the audit did not look at the procurement process, how the transition (to Bell) actually took place, or what is in the new contract.

Service Alberta Minister Brian Malkinson took the opportunity to reassure that GoA has already learned the report’s lessons, and will post the new contract online once Axia’s assets are transferred to Bell. Since Bell reported that it had completed acquisition of Axia back in September, I’m not sure what the reason for the hold-up is. This is the sort of public accountability that was badly missing from the last SuperNet deal. This time, the public is being asked to accept the Minister’s reassurance that it’s all good, while we wait to learn what sort of deal was struck with Bell back in June.

So what did the Auditor General learn about SuperNet 1.0? A lot of the report puts an official stamp on what we already knew — lax oversight, a badly-written contract, Axia doing things GoA thought were inappropriate, but GoA feeling largely powerless to stop them:

“In 2011 the department sought legal advice on potential non-compliance with operator independence requirements. The department then sought additional information from the operator on services provided to third parties… The operator has asserted it is compliant with contract terms and obligations. As a result, again, the parties to the contract did not consistently interpret the terms and conditions”


“The department attempted to exercise these audit rights in 2015 as a result of a number of contract disputes, including those identified above. The department has not been successful in exercising that right… again because of differing interpretations of contract terms”

The report does offer a kind of explanation for something I’ve wondered since reading the SuperNet 1.0 contracts — what about all the regular reporting that Axia was supposed to provide GoA about how the SuperNet was running?

“We found that, since 2006, the department has not always received this reporting from all contract parties. We also found no evidence of the department routinely requesting this information from the parties. We asked department management why they have not obtained this information as required under the contract. Management stated that they considered the reporting to be more relevant to the initial construction of the network rather than ongoing operations.”

Right. I’m guessing that the various ministries responsible for SuperNet over much of this period really weren’t interested in or capable of monitoring what was happening with the network. What about when GoA started to have more serious concerns about the contract after 2011? Maybe no one at Service Alberta saw this information as valuable enough to ask for. Maybe some people weren’t aware this was in the contract. Whatever the reasons, I’ll be interested in what kind of accountability is written into SuperNet 2.0, and whether accountability on paper translates to accountability in practice.

So, we now have some more specifics about fundamental issues that Service Alberta was quite up-front about last year before the contract expired. The SuperNet seems to be chugging along in continuity mode for the time being, and the communities Axia fibred up are still getting the internet they agreed to for the same price. But last summer did not exactly inspire confidence about the SuperNet 2.0 transfer. What was all that last-minute contract negotiation about? What exactly is Alberta paying for right now? Here, the Auditor General can tell us nothing — these sorts of audits can take many months to pull together.

Maybe next year?

SuperNet 2.0 (update)

With the transition to Bell’s operation of the SuperNet a smooth one so far, it’s worth revisiting the topic and my previous blog post. Any anticipated squabbles between Bell, Axia, and GoA have been rendered moot by Axia ceasing to exist as an independent entity in Alberta, now that it is being absorbed into the big body of Bell. This means Bell now owns anything that Axia might have wanted to claim as its property, and it inherits all of the corporate infrastructure responsible for keeping the SuperNet running. I wonder if this was the plan all along ever since GoA started favoring Bell for the contract, or if it’s what led to the last-minute nature of the announcement, as Bell and GoA belatedly realized they would face some serious disruption if they tried to go around Axia.

The remaining question is what happens to the communities where Axia (Connect) has extended fibre networks over the past few years. The SuperNet 2.0 contract should really have done away with the possibility for such an arrangement, which violated the “level playing field” intent of the SuperNet (with the wholesaler also competing in the retail market). But so far we have indeed witnessed continuity for places like Nanton and Stavely, which makes me wonder, is this continuity just a temporary arrangement? Is there a plan to sell these networks off to another party, or does everyone previously served by Axia Connect just become a Bell customer? If the latter, this legitimates what Axia had done in recent years and would be a big win for Bell, which would not be confined to the role of middle-mile intermediary. It could open up the possibility for Bell to use the SuperNet to great competitive advantage against rural Alberta ISPs, whether they rely on SuperNet or not. It would also be another example of the GoA choosing to maintain the SuperNet status quo, rather than making difficult decisions and much-needed changes to the contract. I hope this is not the case, but it’s certainly a possibility given previous history.

2019 UPDATE: In keeping with its practice to only offer retail internet service in Eastern Canada, Bell has sold the Axia retail operation to TELUS.

SuperNet 2.0 Deadline Hits

The clock almost ran out.

In a decision that should have been announced months ago, the Government of Alberta (GoA) has just declared it was handing the SuperNet’s management over to Bell – the company that had pulled together the original consortium which had won the contract back in 2000 and funded a great deal of the network, but which had played a more marginal role as its once-partner Axia assumed operational control. The next stage for Alberta’s province-spanning fibre-optic network is about to begin, and if the transition continues to generate bad press, more Albertans might actually become aware that this urban-rural infrastructure exists.

Whatever is happening this long weekend at the offices of Bell, Axia, and the GoA, it wasn’t supposed to be this way. The province was working on a long-term fix for the situation last spring, when Service Alberta’s Stephen Bull gave assurances that the file had the government’s attention at the highest level. Then something got bogged down, someone lost the script, or there were too many cooks in the kitchen. All that’s known publicly is that the Minister responsible for Service Alberta was shuffled a couple of weeks ago, and the GoA seemed ill-prepared to comment on what was happening, leaving Axia to mount a last-minute PR campaign to defend their interests and encourage Albertans to write their MLAs to keep the SuperNet “managed from Alberta, by Albertans.” It all seemed reminiscent of how the province has neglected the SuperNet since it was built, leaving the day-to-day to Axia, the network’s former operator, with the GoA’s responsibilities shuffled between the province’s shifting ministries and only occasionally receiving higher-level attention.

As we approach the July 1st hand-off, it’s worth reviewing what’s at stake:

Axia was small, new, and unknown in 2000, with no experience building and running a network like SuperNet. Today, the company has experience around the world, but its core operations in the province do seem to be in jeopardy. Presumably, there are contingency plans, but GoA didn’t give Axia much time to work things out, and everything the company does in Alberta seems tied to the SuperNet contract.

Bell invested significantly in the SuperNet during the early 2000s, which was meant to give the company more of a Western presence, but the project ended up being more of a money-pit than anticipated and I can imagine the subsequent legal disputes left the company with some regrets. Now the company gets to gain control over infrastructure it has actually owned since 2005.

We don’t know what the contract looks like, and it may be some time before we do. The original SuperNet contract was treated as confidential business information rather than public policy and had to be eventually obtained through a FOIP (freedom-of-information) request. Last week, Axia was promoting the idea that its much-touted open-access model would be under threat if Bell got the contract. But the GoA has promised “continuity”, including for the ISPs that get wholesale access to SuperNet, so it’s unlikely there will be a fundamental change in that regard. Also, Axia abandoned a key principle of the open-access model years ago, when it effectively dropped structural separation by becoming a retail ISP through SuperNet. Sure, they did so without blocking other ISPs from accessing the network, and they were arguably meeting a market need that was ill-served, but in an open-access middle-mile network the operator should not also be competing with last-mile ISPs.

GoA tacitly approved all of this by staying out of the way as Axia offered fibre networks to municipalities across the province, through an entity legally separate from its SuperNet operations, but which advertised connectivity through the SuperNet as its competitive advantage. This is something that always gave me (and others) trouble, since Axia SuperNet and Axia Connect were supposed to be separate entities, but the dependencies between the two were quite explicit (with Axia Connect communities used to promote SuperNet, and SuperNet used to promote Axia Connect). Did municipalities understand the risk they were entering into when they inked their contracts with Axia Connect? I hope so, since Axia Connect’s dependence on the SuperNet contract was not a secret, but some are understandably confused that the next-generation network they had secured for their town could now be a stranded asset in need of a long extension cable. I’m guessing that Axia will lease that cable if they have to and pass the cost on to consumers, or if things get really bad they could sell these fibre networks. Still, whatever happens this will only affect a relatively small number of Alberta towns with Axia Connect (Nanton, Vulcan, Nobleford, Stirling, Barnwell, Fairview, Pincher Creek, Fort Macleod, Raymond, Magrath, and Hanna). The rest of rural Alberta relies on the traditional SuperNet arrangement that connects public facilities and allows access to private ISPs.

Something else I anticipate will become an issue is the question of who-owns-what. This has always been one of SuperNet’s most confusing aspects, since at the end of the 1990s the GoA really did not want to become owner of a telecom network. Government ownership is what the SuperNet contract stipulates could happen in 2035 – at least for the Extended Area Network that Axia was responsible for, but by then it might not be worth much. The Base Area Network, which connects Alberta’s cities, is owned by Bell. Ownership of the Extended Area Network is also technically Bell’s, but the network has not been static since it was built, with Axia building upgrades and integrating the network with its own business endeavors. I can imagine it may be complicated to sort out which bits of equipment belong to Axia, and which belong to the Extended Area Network being transferred over to Bell’s control. Both companies are contractually obligated to ensure a smooth and orderly transition, and in a perfect world the priority would be to maintain connectivity to all SuperNet clients while these things get sorted out, but everything related to SuperNet so far has been far from perfect.

What happens on Canada Day remains uncertain — more to follow.

Security Versus Surveillance After Snowden

Just published in the latest issue of the open-access Surveillance & Society journal is a piece I originally wrote while attending the Surveillance Studies Network conference in Barcelona in 2014. By that point, nearly a year after the first Snowden disclosures, the most significant revelations had come out and it was possible to take stock of their impact. I was studying Canadian telecom policy at the time, attending industry conferences and international events like NANOG and the IETF. At both kinds of meetings, discussions of privacy, surveillance and Snowden were unavoidable that year. We had entered the post-Snowden era, and this was evident beyond conferences’ discussion topics.

When the first Snowden disclosures happened in June 2013, conflicts between the NSA and private industry had cooled (and relations warmed), following mid-1990s fights over the Clipper chip. Many information security practitioners in 2013 had not been involved in these political battles from twenty years ago. Some infosec professionals had started out as troublesome hackers, but the NSA now saw domestic hackers as less of a threat and more of a recruitment opportunity, with the head of the NSA (Gen. Alexander) giving a keynote at Def Con in 2012. Individuals from the NSA had also participated at the IETF, and many in the private sphere had come to see themselves as essentially fighting on the same side as government. The biggest enemies were foreign state-backed hackers (“advanced persistent threats”), concern over which had reached an all-time high in 2013, particularly through the threat emanating from China. Snowden changed all that; Chinese hackers dropped from the headlines, the IETF took a public stand, and the NSA took a “time out” from hacker conferences. It wasn’t just that the Five Eyes were carrying out mass surveillance — they were doing so by compromising the security of technologies, institutions, and people they claimed to protect.

As many (including Snowden) argued, secret government surveillance in a democracy is a political issue, and the disclosures brought secret programs to public attention to make an informed policy debate possible. But other than the USA Freedom Act, meaningful political action did not materialize, and in the United States the debate largely centered on the question of whether Americans were illegitimately spied upon by their own government (as opposed to larger questions of international mass surveillance and governments compromising technologies used by their own citizens). But some institutional relationships and technologies were immediately altered because of Snowden, and the practical consequences of changes undertaken in the private sector and civil society have been more significant than political reforms.

Post-Snowden security responses include Google securing its own international links, a wider shift toward encrypting web traffic (through HTTPS), and Apple’s post-Snowden security upgrade, which set off a massive legal fight with the FBI over an iPhone in 2016. It’s not that mass surveillance has become more difficult across the board — Apple now faces new concerns about iPhone security and the privacy compromises it has made to enter into the Chinese market, but the company’s pre-Snowden cooperation with U.S. authorities is over.

More broadly I hope this piece will be useful in distinguishing between different kinds of security: cyber, national, and information technology (IT), and how these relate to privacy and surveillance. Before Snowden, many in Five Eyes nations saw national, cyber, and IT security as working together. After Snowden, IT security has become a form of resistance against surveillance tied to national security and cyber security projects.

All good things…

Since 2012, I’ve been working on my PhD dissertation research into Canadian internet policy at the University of Alberta’s Department of Sociology. This month I successfully defended the dissertation (pdf), which addresses the theme of this blog — intermediation. This includes an analysis of the political economy of Canadian telecom, competition regulation, public connectivity, privacy, security and lawful access, copyright, net neutrality, and alternative or public approaches to connectivity.

An enormous thanks to all those who have helped me get to this point by sharing what you know about these topics. Many people have told me things that do not appear in the final thesis, but rest assured that every interaction I’ve had over the years has helped inform my understanding. It’s been really great hearing from internet pioneers, Canadian telecom professionals, public servants, policy experts, and all those who help make this connected world what it is.

So what’s next? I plan to continue pursuing all the topics that have animated this research. We’re still talking through many of the same telecom and internet policy debates as when I started, and ISPs are still crucial gatekeepers and mediators of connectivity. The blog will keep its focus, though there may be some changes in frequency as I move on to new professional responsibilities at UBC. However, I imagine in the future I will be thinking more about Silicon Valley companies and the business model we might call platform capitalism, so the nature of the intermediaries I focus on may change. I will also be keenly looking for approaches to connectivity that are more locally-oriented, and alternatives to the giant firms that currently dominate connectivity and our online experiences.

ISPs as Providers of Equitable Connectivity

Recently in the news — Canadians love connectivity and they want it cheaper. We can see this either as an indicator of increasing competition in the sector (thanks to Freedom Mobile), or a sign of how high rates and data caps make Canadians scramble for a deal when it’s offered.

The focus now is on mobile plans, but we’re not having the discussion about an affordable option for residential broadband. As announced in last year’s federal budget, affordable government-approved broadband for low-income Canadians may eventually become available. While there are strong parallels between this approach and 20th century efforts to achieve universal service through cross-subsidization, this will likely not be a universal program. Rather than imposing some sort of “skinny basic” for the internet, the federal Cabinet has made affordable internet a priority, allocated money, and left us waiting on the details.

In a previous post, I wrote about the CRTC’s universal service objective, and how the Commission likes to stay out of setting retail prices for broadband (unless we’re talking about an IPTV service). The CRTC does regulate wholesale internet rates to promote competition, and this is supposed to control prices, but part of the rationale for not intervening directly on retail pricing was to avoid doing something that would “inadvertently hinder the development of further private and public sector initiatives” on affordability. Well, the federal government’s $2.6 million annual program, announced last March, can be seen as a public effort to nudge private sector initiatives along. The money is meant to help support ISPs that offer low-priced connectivity to low-income families, who will also receive refurbished computers.

This is similar to what Rogers and TELUS have been doing already in select markets, and these companies may be able to roll their existing programs into whatever is finalized as the government’s plan with little effort. But if other providers do join (or are compelled to participate in a mandatory program), then this becomes more of an industry norm than a distinguishing virtue. Rogers and TELUS have been trying to stand out as good corporate citizens (Bell’s distinctive efforts in this regard have been championing the issue of mental health).

The discussion is understandably focused on the incumbents here, but let’s not forget there are a host of organizations and ISPs that have long been devoted to a more equitable distribution of connectivity in society: FreeNets & community networks (NCF, VCN, ViFA, Chebucto), publicly-funded rural broadband (like SuperNet, or one-time funding programs like Connecting Canadians and Connect to Innovate), First Nations initiatives, as well as public internet access sites. The federal government’s affordable access program for low-income households was criticized for being developed independently of groups that have been advocating for affordable connectivity in recent years, and following this criticism the proposal was sent back to the design stage to gestate further.

Personally, I love to see programs targeted at the low-income Canadians who need them most, but the shelved affordable access proposal was a feather-light welfare policy. This was not the state using the market to achieve a public good — this was the state trying to achieve a public good without imposing any undue burdens on the market, with the private sector invited to participate. It would have encouraged a form of cross-subsidization, where ISPs use wealthier subscribers to subsidize poorer ones. In the monopoly era, cross-subsidization was how universal service (a phone in every home) was achieved. The telephone companies had their regional monopolies, and one justification for this monopoly power was that you could take profits from urban areas to subsidize connectivity for more expensive (or less profitable) rural areas. After the monopoly era ended, we shifted to the cultivation of competition and deference to market forces. The societal benefits of internet access for everyone are clear, but the distribution of connectivity is still treated as a corporate responsibility.
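The monopoly-era arithmetic can be sketched with a quick example. All of the figures below are invented for illustration (they aren’t drawn from any actual tariff), but they show how a single flat rate lets profitable urban lines quietly cover the cost of expensive rural ones:

```python
# Hypothetical illustration of monopoly-era cross-subsidization:
# one flat retail rate recovers total cost (plus a margin) across all
# subscribers, regardless of what each line actually costs to serve.

def flat_rate(urban_lines, rural_lines, urban_cost, rural_cost, margin=0.10):
    """Single monthly price that recovers total cost plus a margin,
    spread evenly over every line."""
    total_cost = urban_lines * urban_cost + rural_lines * rural_cost
    return total_cost * (1 + margin) / (urban_lines + rural_lines)

# Invented numbers: 900k urban lines at $20/month to serve,
# 100k rural lines at $80/month to serve.
rate = flat_rate(900_000, 100_000, 20.0, 80.0)

urban_surplus = rate - 20.0    # what each urban line pays above its cost
rural_shortfall = 80.0 - rate  # what each rural line pays below its cost
```

With these made-up numbers the flat rate works out to about $28.60: each urban subscriber overpays their own cost of service by roughly $8.60, and each rural subscriber underpays by about $51.40 — that gap is the cross-subsidy the regional monopoly made possible.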

This Liberal government is taking its time on this issue — perhaps they see flaws in the previous approach but are reluctant to push a more robust policy. In the meantime, telecom companies may be less willing to develop their own affordable access programs, knowing they may have to adjust to whatever shape government policy takes.