Canada: Canadian Commission on Democratic Expression calls for new federal body to regulate social media

In depth

The Public Policy Forum (PPF) is a prominent independent, non-profit Canadian think tank for public-private dialogue. In April 2020, the PPF established the Canadian Commission on Democratic Expression ("Commission") to study how to reduce harmful speech on the internet and to provide informed advice on the issue. The Commission issued a report recommending six practical steps that place responsibility for hateful and harmful content on the shoulders of technology platforms and their creators. The recommendations are summarized below.

Recommendation #1: A new legislated duty on platforms to act responsibly

The Commission believes that platform companies must serve a greater public interest by assuming responsibility for harmful content that appears within their domains. A new legal standard to act responsibly would impose an affirmative requirement on platforms, including social media companies, large messaging groups, search engines and other internet operators involved in the dissemination of user-generated and third-party content.

Recommendation #2: A new regulator to oversee and enforce the Duty to Act Responsibly

To oversee and enforce the new Duty to Act Responsibly, the Commission calls for the creation of a new regulatory body ("the Regulator") that would move content moderation and platform governance beyond the exclusive control of private sector companies. Regulatory decisions would be made judicially, grounded in the rule of law and subject to a process of review. The Regulator would also be responsible for publishing and enforcing a Code of Conduct for regulated parties, underpinning the Duty to Act Responsibly.

Recommendation #3: A Social Media Council to serve as an accessible forum for reducing harms and improving democratic expression on the internet

Creating an independent, stakeholder-based Social Media Council would provide an institutional forum for platforms, civil society, citizens and other interested parties to engage in inclusive dialogue about ongoing platform governance policies and practices. Importantly, it would play a consultative role for the Regulator, providing broad-based input into the Code of Conduct and into how changing technology, business models and user experience affect policy.

Recommendation #4: A world-leading transparency regime to provide the flow of necessary information to the Regulator and Social Media Council

One of the central challenges faced by researchers, journalists, policy communities, social media users and, soon, regulators is the opacity of the platform ecosystem. Embedding significant transparency mechanisms at the core of the mandates of the Regulator and the Social Media Council would provide greater access to information and create a more publicly accountable system.

Recommendation #5: Avenues to enable individuals and groups to resolve complaints of harmful content expeditiously, including an e-tribunal to facilitate and expedite dispute resolution and a process for addressing complaints swiftly and lightly before they become disputes

The Commission believes that a new e-tribunal for online content disputes could rebalance the asymmetry in the digital sphere, shifting dispute resolution from private sector processes within the platform companies to a public institution dedicated to due process and transparency. An e-tribunal would provide rapid and accessible recourse for settling content-based disputes.

Recommendation #6: A mechanism to quickly remove content that presents an imminent threat to a person

Given the instantaneous nature of the internet, the Commission recommends that the Regulator be empowered to issue cease and desist orders requiring takedown within 24 hours in cases judged to contain a "credible and imminent threat to safety". These orders would be challengeable in court and would constitute an exception to the Commission's general rule that the Regulator refrain from individual content decisions and instead address systemic issues.

Other Considerations

The Commission considered imposing reactive takedown requirements on platforms, which would require companies to remove "offending categories" of content in as little as 24 hours or face heavy fines. Although such mechanisms exist in other jurisdictions, the Commission rejected this approach for fear of over-censorship.

Overall, the Commission believes that to be effective, the Regulator must have the power to impose penalties, such as significant fines and possible jail time for executives. The Commission intends for the affirmative requirements on the platforms to be developed under legislation and regulation.


Further details are available by consulting the Canadian Commission on Democratic Expression's final report and the report of the Citizens' Assembly on Democratic Expression.


Contact Information
Theo Ling
Partner at Baker McKenzie

© 2021 Baker & McKenzie. Ownership: This site (Site) is a proprietary resource owned exclusively by Baker McKenzie (meaning Baker & McKenzie International and its member firms, including Baker & McKenzie LLP). Use of this site does not of itself create a contractual relationship, nor any attorney/client relationship, between Baker McKenzie and any person. Non-reliance and exclusion: All information on this Site is of general comment and for informational purposes only and may not reflect the most current legal and regulatory developments. All summaries of the laws, regulation and practice are subject to change. The information on this Site is not offered as legal or any other advice on any particular matter, whether it be legal, procedural or otherwise. It is not intended to be a substitute for reference to (and compliance with) the detailed provisions of applicable laws, rules, regulations or forms. Legal advice should always be sought before taking any action or refraining from taking any action based on any information provided in this Site. Baker McKenzie, the editors and the contributing authors do not guarantee the accuracy of the contents and expressly disclaim any and all liability to any person in respect of the consequences of anything done or permitted to be done or omitted to be done wholly or partly in reliance upon the whole or any part of the contents of this Site. Attorney Advertising: This Site may qualify as "Attorney Advertising" requiring notice in some jurisdictions. To the extent that this Site may qualify as Attorney Advertising, PRIOR RESULTS DO NOT GUARANTEE A SIMILAR OUTCOME. All rights reserved. The content of this Site is protected under international copyright conventions. Reproduction of the content of this Site without express written authorization is strictly prohibited.