Question - Ministry of Public Safety and Emergency Preparedness

The Honourable Ralph Goodale, P.C., M.P., Minister of Public Safety and Emergency Preparedness

Social Media

Hon. Serge Joyal: I’ll try to follow the admonition from Your Honour.

Mr. Minister, I would like to come back to an issue before your department, which is the role of social media companies in the control of hate speech, violent content and extremism. This is part of your responsibility. It’s quite obvious from the past months and years that social media companies — and I’m thinking of Facebook among them — are ineffective in their will and capacity to control the spread of hate speech and violent content.

Why is the government waiting and stalling instead of tabling a bill to show its will to use its responsibility and powers to stop that? That’s what Canadians are expecting. It seems this government, in relation to controlling and regulating social media, is very reluctant to do anything and is simply waiting for some of Canada’s partners to act and then following suit.

Why are you not exercising leadership in that domain?

Hon. Ralph Goodale, P.C., M.P., Minister of Public Safety and Emergency Preparedness: Thank you for the question. It’s a very important one. This topic has been under discussion, particularly for the last two to three years, at every meeting I have attended of the security ministers of the G7 and the security ministers of the Five Eyes. While interest in the topic has ebbed and flowed a bit over time, depending on the particular circumstances at any given moment, it’s fair to say that there has been a steady rise in interest and concern among all of our closest allies about the use of social media in ways that disseminate harm.

The first manifestation of that is undoubtedly terrorist activity that is harboured on some platforms on the Internet, but there are also the very serious issues of child sexual exploitation, human trafficking and, in the last year and a half, the concern about foreign interference in democracies.

The discussions with the social media companies have intensified both bilaterally in Canada’s discussions with each one of them individually and collectively through an organization called GIFCT, the Global Internet Forum to Counter Terrorism.

In the last conversation, which was about a month or so ago at a meeting of the G7 in Paris, we made it very clear what the expectations of these countries are — the G7 in that case, but it would also include all of the Five Eyes: the concern is rising, patience is running out and we expect to see firmer, better, more effective action by the social media companies. While it wasn’t unanimous among these countries, certainly the overwhelming majority were of the view that if the response from the social media companies was not adequate to protect society from these social harms, then the companies could expect regulation. Around the time we were having that meeting, or a little before it, Mr. Zuckerberg made a public comment to suggest that his company might, in fact, welcome regulation.

So attention around the world is now turning to exactly what form that regulation should take. Some countries are imposing penalties.

(1550)

Others have taken an interesting approach, and I’d be interested in the Senate’s view on this: Do you create, in law, a new tort that would effectively say that these companies assume the financial responsibility for the damage they do if their platforms are misused for purposes such as terrorism, child sexual exploitation, human trafficking or interference in democracy?

I would be interested in your thoughts on which of the various techniques available would, in your view, be the most effective.