Speakers
- Dr. Emily Laidlaw, University of Calgary
- Michael Lee-Murphy, The Wire Report (Moderator)
Key Issues
- What should a regulatory framework for online safety look like?
- How does a crisis of trust in institutions and media affect efforts to legislate the internet?
- Which intermediaries should be regulated and how?
- How can a new digital regulator be set up for success?
Discussion Overview
The Online Safety Expert Advisory Group, commissioned by the Department of Canadian Heritage and co-chaired by Dr. Laidlaw, concluded its work in June 2022. The Advisory Group met throughout the spring to discuss the many difficult dimensions of online safety regulation. While the mandate of the group was not to reach consensus, its members did agree on two points: (1) companies should have a duty to act responsibly, and (2) there should be a regulator. The former was informed by the fact that these platforms sit at the intersection of speech and human rights, by the notion of product safety, and by the approaches of both the United Kingdom (UK) and the European Union (EU).
Elon Musk’s takeover of Twitter was largely inspired by a backlash against the notion of content moderation. It serves as an excellent case study of the current transnational crisis of trust in government and media institutions, and of the risks that arise without legal oversight and a legal mandate to regulate these companies. It also demonstrates how a lack of trust can complicate attempts to regulate online intermediaries, and it shows that the global internet governance regime is not settled and needs a rethink. However, several mechanisms at work have regulated Musk’s Twitter to varying degrees. The marketplace and user choice have been powerful in this scenario, particularly as advertisers and users abandon the platform. Complex regulatory regimes are also at work, such as the pushback Musk has faced from the EU. Moreover, intermediary service providers, like app stores and content delivery networks, can exert considerable pressure, as they can remove companies from their services if they wish.
The pressure from infrastructure players opens up the question of intermediary liability and their role in an online safety regulatory framework: essentially, how far down the internet stack should a regime go? Baking in the principles of transparency, necessity, proportionality, and responsibility helps answer this question. While all companies should act responsibly, their obligations should be proportionate to the risk they pose to the public, as in the EU Digital Services Act. Platform companies pose the greatest risk to Canadians online, and they are the players that can take the most targeted, thoughtful, and minimally invasive action to make Canadians safer online. The further down the stack you venture, the more opaque and blunt the options for content moderation become. Thus, the most burdensome obligations, such as auditing and content requirements, should fall on platforms, while those lower down the stack should carry lighter obligations, such as transparency requirements. Players lower in the stack have been turned to thus far only because of failures of law, policy, oversight, and of the players closer to the content. For example, site-blocking injunctions have come out of the Federal Court in part because content and hosting intermediaries exercised remedies ineffectively. Intermediaries that do not host content should never be required by a regulator to action content: doing so could infringe human rights, as it would amount to government interference in speech without the proper legal tests applied by the judiciary.
The Canadian Radio-television and Telecommunications Commission (CRTC) should not be responsible for the implementation, oversight, and enforcement of the online safety regulatory framework, as its counterpart is in the United Kingdom. Following Australia’s lead, a new digital regulator should be created. It should be responsible for enforcement on the one hand, and for education and collaboration on the other. While it is important to include small platforms in the online safety regime, these companies should not be burdened to the point of being regulated out of existence. A regulator should help develop industry codes of practice and educate companies so they can succeed in the landscape, rather than fearing fines and viewing regulation as an irritant. Moreover, a dedicated digital regulator could collaborate with similar international bodies, such as Australia’s eSafety Commissioner, which would help ensure that global big tech companies comply with Canadian regulations instead of skirting compliance because of the small size of the Canadian market.
Key Insights
- An online safety framework in Canada should be based on the premise that all companies should act responsibly because they operate at the intersection of speech and human rights.
- The events transpiring under Elon Musk’s leadership of Twitter serve as a case study for why legal oversight and a legal mandate are needed for platforms. The current controversy shows how the various mechanisms at work - the marketplace, complex global regulatory regimes, infrastructure providers - are often at odds with each other.
- While all companies should act responsibly, the obligations of platform providers versus intermediaries should be different. Infrastructure players that do not host content should not be forced to action content.
- There should be a new digital regulator responsible for enforcement, as well as for education and the development of codes of practice for small and medium-sized enterprises. The regulation of online harms should not be left to the CRTC, which does not have experience in internet governance or speech.