Public Policy to the “Digital” Rescue!

March 2023 - Anyone who uses social media is familiar with recent discussions about misinformation and data privacy. Even those who are not active on these platforms have heard about these issues in the news or from friends.

To combat these issues, some developed countries have introduced legislation that regulates how online platforms manage their users’ information and content. Regulators in Austria, Germany, and the United Kingdom have already taken bold steps to close the gap in social media regulation, while the Biden Administration is considering a slew of similar new rules. Canada, however, lags behind. The country has considered, but not approved, legislation in this area, leaving the management of misinformation and the protection of its citizens’ data up to the companies’ own standards.

This could change if the House of Commons passes the Digital Charter Implementation Act, 2022. The Charter, proposed as Bill C-27, could shift the burden of data privacy and the management of misleading content from consumers to organizations, increasing transparency and introducing a system of enforcement through financial penalties.

To those skeptical about whether the Charter would be effective, social media platforms have already shown that they can moderate content. Faced with mounting pressure from the public and from lawmakers, social networks have begun to act against misinformation and misleading content. Twitter and Facebook banning former US President Donald Trump from their platforms for allegedly sharing misleading information and inflaming millions of his followers is a clear indication that they can act when pressed.

In Canada, Twitter’s decision to block 30 accounts to prevent the archiving of Canadian politicians’ deleted tweets is yet another signal that social media giants can intervene when required. As well, a survey conducted in Canada by the Canadian Race Relations Foundation and Abacus Data found that at least 60 per cent of respondents believed “the federal government has an obligation to put forward regulation to prevent the spread of hateful and racist rhetoric and behaviour online.”


But what is the government’s role in regulating social media?

The US government has already set its sights on Section 230 of the Communications Decency Act, which protects internet companies from liability for user-generated content disseminated on their platforms. Legislation to protect the general public from harmful content has also been mooted.

Some of the legislation under consideration in the US could require Facebook and Twitter to meet certain standards of transparency and data protection. Tighter regulations could help clamp down on misinformation and misleading content on social networks by holding social media companies responsible for misinformation or content shared on their platforms with the intent of causing harm.

Meanwhile, Canada has not kept pace. Natasha Tusikov, an assistant professor at York University and author of Chokepoints: Global Private Regulation on the Internet, says the Canadian government has been slower than other developed countries to tighten regulations on social media platforms. While Austria, Germany, and the United Kingdom have recently introduced new measures to tackle hate speech and disinformation online, Canada is still considering, rather than implementing, new rules.

The current federal government has brought forward two bills in this arena. Bill C-10 was structured to modernize the Broadcasting Act by accounting for new ways of consuming content (streaming) and better reflecting marginalized segments of society (the 2SLGBTQ+ community, racialized communities, persons with disabilities, and Indigenous peoples). Meanwhile, Bill C-36 was designed to provide recourse for users who faced online bias, hate, and prejudice. Both bills were the product of years of committee investigations and studies, but both died on the Order Paper when Parliament was dissolved for the 2021 federal election.

While the government promises to revive them, Tusikov has said that even these measures don’t address the root of the problem: the business model of social media.

Social platforms thrive on user engagement, which drives retention, viewership, and conversions for the platforms’ true customers: advertisers. While mainstream advertisers have pulled back from overtly harmful online content, engagement rates remain the key metric for most campaigns. Put simply, social media’s revenue relies on retaining attention, which indirectly rewards extremist views and sensational content. Fact-checking and moderation are disincentivized by this model.

Tusikov’s sentiment that the fundamental business model of social media is broken has been echoed by numerous lawmakers such as Representatives Anna Eshoo and Jan Schakowsky as well as MIT Sloan professor Sinan Aral.    

Unless governments can seriously regulate social platforms and put guardrails on their business model, the creation and dissemination of harmful content are not likely to decline.


So, what is this Digital Charter again?

On June 16, 2022, the federal government introduced Bill C-27, the Digital Charter Implementation Act, 2022. If passed, the Digital Charter could greatly enhance the security of internet users across the country while streamlining the flow of information across digital platforms.

Besides regulating the way corporations handle personal information and how enterprises can use artificial intelligence, the Charter includes the following proposals:

  • Greater transparency about how personal information is stored and managed by corporations. Canadians can expect significantly more control over their personal data and will be free to share or transfer it without an undue burden;
  • Freedom to move personal data from one corporation to another with ease;
  • The right to withdraw consent or request the destruction of personal data when it is no longer necessary for use by the corporation;
  • Fines of up to 5 per cent of revenue or $25 million, whichever is greater, when an organization is found liable for the most serious offences under the new Bill - the most stringent fines in the G7; and
  • Standards for the development and management of AI-enabled systems.

If implemented, these standards and protocols could vastly improve the digital landscape in Canada. No longer would tech giants from across the border dictate how Canadians’ data is handled or how they access vital information.

In the US, for example, 24 per cent of privacy officers at large companies (those with more than 1,000 employees) report handling 10 to 50 privacy requests a week under the California Consumer Privacy Act. Nine per cent report handling as many as 500 requests every week.

The negative impacts of social platforms could be mitigated and emerging threats from the misuse of AI and data-driven business models reduced.

To enforce these standards, Bill C-27 greatly enhances the powers of the Office of the Privacy Commissioner (OPC). If passed in its current form, the bill could allow the OPC to order organizations to change their practices and to make those changes public, offering greater transparency. It could also give the OPC the power to approve or reject an organization’s Codes of Practice or Certification Program for meeting data compliance requirements.

To process penalties, Bill C-27 proposes the creation of a new Data Protection Tribunal. The tribunal could have the power to act on recommendations from the OPC and impose penalties of up to $10 million or 3 per cent of gross global revenue, whichever is higher, on non-compliant organizations.
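
For a rough sense of how these two ceilings scale with a company’s size, here is a minimal illustrative sketch in Python. The thresholds are the ones described in this article; the revenue figure is hypothetical, chosen only for illustration.

    # Illustrative sketch only: penalty ceilings as described in this article.
    def admin_penalty_cap(gross_global_revenue):
        # Administrative penalty: $10 million or 3% of gross global revenue, whichever is higher
        return max(10_000_000, 0.03 * gross_global_revenue)

    def serious_offence_cap(gross_global_revenue):
        # Most serious offences: $25 million or 5% of revenue, whichever is greater
        return max(25_000_000, 0.05 * gross_global_revenue)

    revenue = 2_000_000_000  # hypothetical: a platform with $2 billion in gross global revenue
    print(admin_penalty_cap(revenue))    # 60000000.0 (3% of revenue exceeds the $10M floor)
    print(serious_offence_cap(revenue))  # 100000000.0 (5% of revenue exceeds the $25M floor)

In other words, for any large platform it is the percentage-based caps, not the fixed dollar floors, that would determine the maximum penalty.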

In summary, the federal government’s proposed bill would enhance data protection by making consumer consent the primary driver of data collection. At the same time, to reduce the burden on individual consumers, it makes organizations responsible for transparency and compliance while instituting a new tribunal to enforce penalties.

It remains to be seen whether the bill will pass into law. If it does, it could have a substantial impact on the way Canadians use digital platforms and on the business model behind them. Until such bills become law, Canadian internet users remain at a disadvantage relative to their global peers.

 

About David Bruno: As founder and former CEO of a global cyber security firm, David specialises in anti-fraud and anti-corporate-espionage systems worldwide. He provides financial-sector solutions for the digital and interactive fintech industry, and for over 20 years he has worked to bring security protections to the public, including investing his own money in a free end-to-end encrypted (E2EE) email server for public use. He is a contributing member of the Electronic Frontier Foundation (EFF), which advocates for digital civil liberties. He is a Tech Policy Analyst for the Washington, DC-based Global Foundation For Cyber Studies & Research (gfcyber.org) and a contributor to the Northern Policy Institute, where he focuses on educating the public about email surveillance and the importance of encryption, especially for vulnerable populations such as refugees. He was also a policy contributor to Canada’s new Digital Charter, re-introduced in 2022 by Minister François-Philippe Champagne.
