JUST Committee Meeting
Standing Committee on Justice and Human Rights
EVIDENCE
Tuesday, May 28, 2019
[Recorded by Electronic Apparatus]
[English]
I call the meeting to order.
Good morning, everyone. Welcome to this meeting of the Standing Committee on Justice and Human Rights, as we continue with our study of online hate.
[Translation]
It's a great pleasure to welcome Mr. Rioux to this committee for the first time.
Welcome, Mr. Rioux.
It's also a great pleasure to welcome all the witnesses here today.
[English]
We have a very distinguished panel, and quite a lot of witnesses in this first panel.
I'm going to start with those here by video conference. We have both the Windsor Islamic Council and the Windsor Islamic Association, represented by Ms. Lina Chaker and Mr. Sinan Yasarlar.
Are you both hearing me?
Perfect.
You're going to testify first, after I introduce all the other witnesses, because we do not want to lose the video conference connection.
In the room with us, as an individual, we have Ms. Elizabeth Moore, educator and advisory board member of the Canadian Anti-Hate Network and Parents for Peace. Welcome.
From the Alberta Muslim Public Affairs Council, we have Mr. Faisal Khan Suri, president, and Mr. Mohammed Hussain, vice-president of outreach. Welcome.
From the Friends of Simon Wiesenthal Center for Holocaust Studies, we have Mr. Avi Benlolo, president and CEO. Welcome.
The rules are eight minutes per group.
We're going to start with the Windsor Islamic Council and the Windsor Islamic Association. I understand they are splitting their time.
Please go ahead. The floor is yours.
Good morning, honourable MPs. We would like to thank the members of Parliament for allowing us to give our perspectives on online hate on behalf of the Windsor Islamic Council and the Windsor Islamic Association, of which I am the public relations director. Lina Chaker is from the Windsor Islamic Council.
Good morning to everyone.
At the heart of this problem are the victims of online hate. Internet use is growing year by year and will continue to do so in the generations to come. Just as we have regulated other technologies, including television, radio, movies, magazines and other communication platforms, we cannot ignore the Internet. The harm of online conversations transcends the digital world. We don't need to cite violent events, or even the most recent attack in New Zealand, to prove that online hate has real-world consequences.
Our community centres are filled with troubled youth facing negative peer pressure, social anxiety, and mental health issues. The overall international Muslim community has been shaken twice over the past couple of years by terrorism, just as other communities have been. These terrorists clearly built their Islamic knowledge from misinformed online sources that spew hate.
We have our own Canadian example from January 29, 2017, in Quebec, where the evidence showed that the attacker's motivation was driven by online hate sites.
To prevent and respond to online hate, we believe there are three important actions the Government of Canada can take.
Number one is to set strict standards and guidelines for social media companies to self-regulate their content. Number two is to more readily enforce legislation that criminalizes both online and off-line hate speech. Number three is to increase awareness about public reporting and responding to this type of behaviour online.
The first action is to impose strict self-regulation standards and penalties for social media companies. Other countries have developed strategies to impose regulations and protocols for social media companies to self-regulate hate speech content on their sites. For example, Australia and Germany now penalize social media sites that fail to remove hateful content, imposing fines or even imprisonment.
Alternatively, some countries such as Sri Lanka...[Technical difficulty—Editor] ...social media to stop the spread of misinformation and hate. Canada should consider policies of the kind that have been adopted in Australia, Germany and even Sri Lanka to enforce the removal of hateful content and combat terrorism.
We recognize that there may be difficulties in regulating online content. However, our country currently regulates other forms of online content such as child pornography, and anti-spam legislation does exist.
Similar to this, there has to be an effort to combat online hate. For the individuals who try to bypass such regulations, we should combat that by not allowing companies to provide individuals with VPNs or other IP-blocking programs.
Number two is to introduce effective legislation to penalize those who incite hatred. In addition to penalizing social media companies for not taking down hateful content, we must penalize Canadians who spread hateful messages, whether online or off-line. Although we currently have tools to do so, such as section 319 of the Criminal Code, our community feels that they are not adequately utilized and thus cannot encompass online hate crimes.
In fact, we had an unfortunate local example here in Windsor, Ontario. An individual was spraying graffiti all over the city, on the posts and bus stop signs, inciting hatred and harm to Muslims specifically.
These acts weren't recognized as hate crimes under section 319, which makes our community pessimistic about the prospects of encompassing online hate speech. This individual had a misdemeanour and no other charges were pressed against him.
Recognizing this, we believe that section 13 of the Canadian Human Rights Act was a vital piece of legislation dedicated to online speech. However, it can be amended or restructured to be more effective. We recognize that section 13 was not heavily utilized before it was repealed. However, we do not find this a convincing reason not to reintroduce it.
Online hate can be responsible for other types of actions in our society, including verbal attacks against women wearing hijabs, attempts to harm people who are visible minorities, and incitement of the physical confrontations that have happened in several supermarkets, shopping areas and malls in our country.
Thus, we are not limiting the discussion to section 13, but hope that any legislation introduced to combat hate will readily be enforced for the betterment of our multicultural Canadian society. The frequency with which a piece of legislation is used should not be the basis on which we decide whether it should exist. Rather, it should highlight to us that most people still do not know what to do when faced with online hate.
We recommend that there be more education on the consequences of promoting hate. While recognizing that education tends to be a provincial mandate, it is our belief that the Government of Canada can play a vital role. This leads us into our third and final point: educating the public on how to report incidents of hate.
My colleague went over the first two action points that we believe the Government of Canada can take by introducing regulations for social media companies and legislation to regulate those who are spreading online hate. I will cover the third point, which is that we believe that victims of online hate need to be more educated so that they know what to do when they are faced with it.
We grew up with teachers telling us how to respond to bullying on the playground. That's not really effective for the online world. They taught us that sticks and stones can break your bones, but words don't really hurt you. Unfortunately, in today's world, we learn that words can not only hurt you psychologically but can also lead to criminal activity and even terrorism.
I want you to think about the last time you tried to report an online hateful comment. Assuming that the process for reporting the post was user-friendly and noticeable—that is, you actually saw the button that says “report”—where did it lead? Did you have to personally follow up and check to see if it was taken down? How many times? Did you have to forward it to your friends and convince them to also try to report it? How many of us continue to experience and see online hate, despite the continued reports?
We have a couple of recommendations for the government to enforce so that social media companies will better create mechanisms for us to be able to help them regulate the content.
The first is to make it easier to report hateful content. Currently, for example, Facebook doesn't have a “report” button; it has a “give feedback” button. It's not as visible.
Second, hasten the time between the reporting of a post and its examination. As we know, time moves much faster in the virtual space than it does off-line. These processes should be receptive to that.
Third, social media companies should provide the person who reported the harm with an update and provide them with information about other resources, including law enforcement, and such resources as the human rights commission.
Fourth, social media companies should examine software and other algorithms that direct users to violent content and share that with government authorities so that the government can also help find and eliminate violent extremist material.
Finally, social media companies should produce tools that help us, and help users, differentiate between credible information and fake news.
As we have been talking about, there are two kinds of content online that can lead to a lot of violence. One is actual hate and the other is misinformation. We believe the Government of Canada can support and fund community initiatives of digital media literacy to help youth and adults alike be able to differentiate between misinformation and credible information as a method of responding to hate. There is a variety of programming that successfully teaches both generations how to differentiate between real and fake news, making them less susceptible to being influenced by hateful messages. This is essential, given the industry of hate and fake news. Moreover, teaching media literacy skills empowers youth to control their own narrative of their identity and to respond to the negative messages with positive ones.
In conclusion, as Prime Minister Jacinda Ardern said, freedom of speech is not advocating murder, and it's also not spreading false or hateful content. We thank the Government of Canada for considering the important consequences of online hate and applaud the right honourable Prime Minister Justin Trudeau for recently signing the Christchurch Call in Paris, where he committed to tackling this issue of violent online content. However, there is more to be done.
To summarize, we urge the government to combat online hate in three ways: first, by setting strict standards and guidelines for social media companies to regulate their content; second, by more readily enforcing hate speech legislation, be it online or off-line; and last, by increasing the public's awareness about how to report and respond to online hate.
Thank you for your time and consideration.
Thank you very much.
A voice: May I just—
The Chair: You guys have gone a little bit beyond your time, so I will go to the next speaker.
We'll now follow the order that's on the agenda.
Ms. Moore, the floor is yours.
I want to start by thanking the committee for the opportunity to speak today. It is certainly a privilege that I never thought would be afforded to me as a former extremist. I really appreciate the opportunity to be here.
I would like to provide a bit more context about who I am and how my views about online hate have been informed.
In the early nineties, I was a member of and spokesperson for the extremist group the Heritage Front, which at the time was the largest hate group in Canada. They acted as an umbrella organization for the racist right at the time. They brought in the Church of the Creator and the KKK, among other organizations. Most troubling, they were trying to do what the so-called alt-right is trying to do today, which is to make inroads into the mainstream and to try to have a veneer of legitimacy on top of the hatred.
I should add that Wolfgang Droege, who was the leader of the Heritage Front, was convicted of many offences prior to starting the organization, including air piracy, attempting to overthrow a friendly nation and drug offences, which I believe included possession. He managed to influence people, despite this veneer of wanting to be more mainstream and trying to make connections with the Reform Party. His followers committed a wide array of offences of their own, including hate crime offences, assault, and targeted and unrelenting harassment of anti-racists.
I feel very fortunate that I was able to leave that terrible world of hatred behind. Since 1995, I've been working with non-profits, educators and law enforcement to raise awareness about the dangers of hate groups. I'm currently on the advisory boards for the Canadian Anti-Hate Network and Parents for Peace, which is an American organization that provides support for families of radicalized individuals.
Back in the nineties, when I was an extremist myself, I quite literally communicated hate by telephone. Also, prior to leaving, I helped prepare materials for the Internet. They were back-issue articles from the Heritage Front's magazine, and they ended up posted on what would become Freedom-Site, one of Canada's first white supremacist websites. That website, which was run by Marc Lemire, was found in 2006 to contain material that violated section 13 of the Canadian Human Rights Act.
I feel fortunate that I never personally got in trouble with the law, but I do realize that it was a very real possibility. I understand that a sample size of one has limited value, but I should say that section 13 did moderate my behaviour. When I was working on the hotline, I was very aware of the fact that friends who were working on hotlines very similar to the Heritage Front hotlines were facing charges under section 13, and it made me more careful. I did not engage in or indulge the unrestrained hatred that I certainly felt inside. I do understand, with the benefit of hindsight, that what I was communicating was still hateful, but it was definitely not as hateful as it would have been in the absence of such legislation.
The methods that are used today to communicate hatred are definitely more sophisticated and exceptionally more accessible than what we had available to us in the nineties. As an analog kid, I have to say that it frightens me that young people today could have their life trajectories altered by watching one YouTube video or interacting with one Twitter account or one Reddit account.
Racist extremists have always networked with like-minded individuals across borders in other countries, but we now have an environment where the transmission of hate knows no borders or language barriers—or even time differences, frankly.
To fully understand what is at stake, I think it's imperative to consider not just the words and images that are put in front of you but the emotions that created those words. Hatred is intoxicating, it's all-consuming and, in my opinion, it's a contagion that when embraced crowds out not only other moderating emotions but also any sense of reason and connection to one's fellow human beings.
I want to read a quote from R. v. Keegstra from the Supreme Court in 1990. Hatred is:
an emotion of an intense and extreme nature that is clearly associated with vilification and detestation...
Hatred...against identifiable groups...thrives on insensitivity, bigotry and destruction of both the target group and of the values of our society. Hatred...is...an emotion that, if exercised against members of an identifiable group, implies that those individuals are to be despised, scorned, denied respect and made subject to ill-treatment on the basis of group affiliation.
...hate propaganda legislation and trials are a means by which the values beneficial to a free and democratic society can be publicized.
With more people being exposed to hateful ideas and emotions than ever before through social media and online content, and with the very troubling rise of hate-motivated crime in Canada, I'm quite heartened that the government is revisiting the inclusion of incitement of hatred in either the Canadian Human Rights Act or the Criminal Code.
The introduction of Canada's digital charter shows promise in developing a thoughtful and measured template for how Canadians can expect to be treated as digital innovation continues to expand. However, I wish to challenge the committee to consider that the government's responsibility to Canadians should not end with the adoption of these measures. Unless effective and ongoing training is provided to everyone responsible for implementing these laws, including judges, Crown prosecutors and police, victims will continue to feel that they are not heard and that justice remains elusive.
As an example, just last week I heard from a member of my local community who wanted to report anti-Semitic graffiti that they found. The responding officer was not at all sympathetic, and because the swastika that was found was misshapen, he wrote it off as a collection of L's. That is not a responsible response to the community.
Speaking as a former extremist and as a woman and a mother who is raising a child in an interfaith Jewish-Christian family, I think Canadians urgently need you to respond boldly and to lead us into an era in which we can expect that our children will be treated with respect and dignity, both online and in the real world. I think we also have a responsibility to the international community to do what we can to limit hatred that may impact identifiable groups in other nations because, as I said, borders mean nothing in the digital world. It is unfortunately no accident that the Christchurch shooter had Alexandre Bissonnette's name on one of his weapons.
The endgame of hatred is always violence and death, and that game starts with incitement, words and images that we find on the Internet.
The introduction of legislation to address the early stages in the progression of hate is both right and necessary. Canada's values of peace, diversity and inclusion are being eroded by the unrelenting promotion and communication of hate online. It is time, if not past time, to send a strong message to racist extremists that their hatred and targeting of identifiable groups is not just unacceptable but unlawful.
As I stated earlier, I have experienced first-hand the moderating effects of such laws and regulations. I think it's time that we do the right thing to rein extremists in before anyone gets hurt or loses their life.
I would add, if I have a moment, very briefly in response to what the earlier speakers had mentioned, that when it comes to reporting online hate, I think platforms need to have more transparency when they respond to people. I have experienced myself being targeted as a former extremist online and receiving hatred, and when I report it, if I get any response back at all, it is, “We have found that they did not violate terms of service” or “We have found that they have violated terms of service”, but there's no additional information to say in what ways they've precisely violated terms of service. There is no mention of what measures have been taken, whether the account has been suspended or whether that suspension is temporary or permanent. I think online platforms owe it to the people who are victims to have more transparency in what they are doing and in saying whether this account is going to be monitored, going forward, for any additional infractions.
Thank you very much for your time today.
Thank you, Mr. Chair, for having us.
My name is Faisal Khan Suri. I'm the president of the Alberta Muslim Public Affairs Council, or AMPAC. I'm joined here by my colleague Mohammed Hussain, who is VP of outreach.
Today's topic of discussion is not only an important one but an absolutely necessary one. With all the events we are seeing in Canada, throughout the world, and especially within Alberta, it definitely warrants our being here, collaborating on this effort and sharing our thoughts. Thank you again to this committee for inviting us and allowing us to share our thoughts.
I'll just give you a snapshot of AMPAC.
We're dedicated to championing civic engagement and anti-racism efforts within the province of Alberta. We focus on advocacy work, implementing strategies around media relations, community bridge-building, education, policy development and cultural sensitivity training.
AMPAC envisions a province where deep equality exists for all Albertans, including Muslims, in a political and social landscape that is respectful and harmonious for people of all faiths and backgrounds.
To get to the gist of things, to state it quite mildly, online hate influences real-life hate. I could be quite blunt about this. Online hate is an enabler, a precursor and a deep contributor to not just real-life hate but also to murder.
We've seen a lot of recent tragedies happen across the world. In January 2017, the Quebec City mosque killer, Alexandre Bissonnette, gunned down six Muslim men in execution style when he came into the mosque with two guns and fired more than 800 rounds. The evidence from Bissonnette's computer showed he repetitively sought content about anti-immigrant, alt-right and conservative commentators; mass murderers; U.S. President Donald Trump; and the arrival of Muslim immigrants in Quebec.
In October 2018, white nationalist Robert Bowers murdered 11 people and injured seven more at the shooting inside the Tree of Life synagogue in Pittsburgh. This was an attack that appeared to have been motivated by anti-Semitism and inspired by his extensive involvement in white supremacy and alt-right online networks.
In March 2019, a lone gunman armed with semi-automatic weapons burst into the mosque in Christchurch, New Zealand. This white nationalist, in what was a gruesome terrorist attack, was broadcasting live on Facebook and Twitter, and 51 worshippers were killed.
There are so many more examples we could provide that show the accessibility of online hate and how it's affecting the real-life hate we are witnessing today.
I think it's absolutely critical, if not fundamental, to embark on such studies as this and to look a lot further into this issue with a deep thought process in place.
Online hate is a key factor in reinforcing hate in all its forms—Islamophobia, anti-Semitism, radicalization, violence, extremism and potentially death. This is why we must take immediate action to work on prevention, monitoring and enforcement.
In order to combat online hate, AMPAC has come up with three recommendations. Number one is to employ artificial intelligence on online materials to identify any form of hate speech. Number two is to reopen the Canadian Human Rights Act for a comprehensive review. Number three is to have transparency and accountability for social media platforms.
Allow me to delve a little further into the first recommendation, employing artificial intelligence on online materials to identify any form of hate speech.
Right-wing extremist groups are using social media platforms such as Facebook and Twitter to create and promote their groups, share messages, assemble people and more. The question is, how can we remove their access, block IP addresses or even discover these types of groups? There are some tools being used today, such as text analysis, to combat online hate, but these groups are becoming much smarter, and they're using images such as JPEGs to evade that monitoring.
While we are happy to see that the new digital charter incorporates elements of an approach that involves industry, we believe that the government must itself fund innovative technological solutions to track online hate and aid in developing artificial intelligence that can combat it.
The AI technology needs to be comprehensive so as to encompass text analysis and languages, and so as to cover all forms of social media that are used to facilitate online hate. We believe that there is space in Canada, especially within Alberta, to build that capacity.
Our second recommendation, to reopen the Canadian Human Rights Act for a comprehensive review, is quite near and dear to our hearts.
The moment freedom of speech or freedom of expression puts another group, organization or individual in any form of danger, it can no longer be justified as freedom of speech or expression. This is now freedom of hate, which has no place in the Canadian Charter of Rights and Freedoms, nor in the pluralistic society that we live in. It has been far too long since the Canadian Human Rights Act was last revisited.
Keep in mind the following: For the last few years, hate has led to the murder of innocent civilians. Also keep in mind the importance of reviewing how online access and other media have been used to propel such hate and extremist perceptions.
AMPAC recommends not simply revisiting and reviving section 13, but reviewing the Canadian Human Rights Act in its entirety. The review itself needs to consider facts on the rise of Islamophobia, anti-Semitism, xenophobia and all other forms of hate. Questions need to be asked in terms of what determines hate and how we can bring enforcement into the picture with respect to the Charter of Rights and Freedoms.
Part of our third recommendation that we talked about is transparency and accountability for social media platforms. While we're pleased with the signing of the digital charter, we think that there is a lot more to be done in terms of regulating social media companies. We recognize that social media platforms have been trying to curtail hate speech through reporting options, but there is a lack of accountability in what follows that reporting, which in turn minimizes any sort of enforcement. Social media platforms such as Facebook, Twitter and YouTube must be held accountable by government authorities for reporting the data and for any follow-up measures.
We're quite aware of the challenges to freedom of expression that such regulations can bring, but we believe in a statement that New Zealand's Prime Minister Jacinda Ardern gave. Her insistence on controlling the amplification of online hate is not about curbing freedom of expression. I will quote some of her words. She says, “...that right does not include the freedom to broadcast mass murder.” She also says, “This is not about undermining or limiting freedom of speech. It is about these companies and how they operate.”
Working alongside social media companies, holding them accountable, and imposing some form of financial repercussions or other necessary measures are part of this recommendation. We hope to see a requirement for online platforms to be transparent in their reporting come to light with this initiative.
To end, I'll go back to the key factors that are priorities for us: to look at prevention, monitoring and enforcement. Today the recommendations that we've talked about—implementing a comprehensive artificial intelligence tool that spans major social media platforms, implementing language-text-image analysis, reopening the Canadian Human Rights Act for an extensive review, reviving section 13 and holding social media platforms accountable for sharing data—are just the initial steps that we believe can help to curb online hate.
With a 600% increase in the amount of intolerant hate speech in social media posts from November 2015 to November 2016, I can only try to fathom or understand where those statistics are today.
Additionally, with the clear evidence of online hate, including the horrific killing of innocent people, there is absolutely no greater time than the present to action immediate government-legislated change. We cannot allow hate to inflate any further. We most certainly cannot allow any more lives to be taken.
I'd like to end this by echoing the statement of Prime Minister Justin Trudeau: “Canadians expect us to keep them safe, whether it’s in real life or online....”
Thank you so much.
Thank you very much.
Now we'll go to the Friends of Simon Wiesenthal Center for Holocaust Studies.
Go ahead, Mr. Benlolo.
Good morning, everyone. Thank you very much for having us here today and for actually doing this. This is very important work that you're all doing.
I'd like to begin my statement by first telling you a little bit about our institution. We're an international human rights organization. We have a network of offices worldwide, monitoring and responding to anti-Semitism, fighting hate and discrimination and promoting human rights. The organization has status with the United Nations, UNESCO, the OSCE and many other notable global organizations. Additionally, the Simon Wiesenthal Center has won Academy Awards and developed museums. We are currently building a human rights museum in Jerusalem.
In Canada, we have won the Canadian Race Relations Foundation's award for our tolerance training workshops on the Tour for Humanity and in the classroom. We educate about 50,000 students each year, as well as those in law enforcement, faith leaders and teachers.
The organization has been tracking online hate for more than two decades. Twenty years ago, online hate was primarily found on websites. They were fairly easy to track, document and, in some cases, bring down through the help of Internet service providers. In fact, we used to produce an annual report called “Digital Hate” in the early days.
Section 13 of the Canadian Human Rights Act allowed us to bring down several online hate sites simply by bringing them to the attention of the ISP. Our ability to sanction hate sites became limited when section 13 was repealed in 2013. We lost an invaluable tool that provided a red line for the public. If that tool were in existence today, it's unlikely that anti-Semitic websites based in Canada, like the Canadian Association for Free Expression or Your Ward News and others, would so easily find a home on Canadian servers.
The advent of social networking sites like Facebook, Instagram, Twitter and the like introduced a tsunami of hate into the social sphere. According to one study, roughly 4.2 million anti-Semitic tweets were posted and reposted on Twitter between January 2017 and January 2018. By contrast, according to Statistics Canada's 2017 hate crime report, there were 364 police-reported cyber-hate crimes in Canada between 2010 and 2017. Of those, 14% were aimed at the Jewish community.
I'm telling you this because this number is actually really low. You might be surprised to hear it, but it's low. I think it's low, given a recent Leger Marketing poll that showed that 60% of Canadians report seeing hate speech on social media. That would mean something like 20 million Canadians have witnessed hate online.
Moreover, through our own polling, the Friends of Simon Wiesenthal Center found that on average across the country, 15% of Canadians hold anti-Semitic attitudes. That represents about five million Canadians. That's kind of the low end of that threshold; in Quebec, that number surges to an incomprehensible 27%.
Social networking platforms must be held to account for allowing online hate to proliferate. We note that these platforms have begun banning white supremacist and extreme terror groups. This is certainly one step forward. However, since they are operating in Canada, we must demand that platforms conform to our Criminal Code, specifically section 318 on advocating genocide, subsection 319(1) on publicly inciting hatred, and subsection 319(2) on wilfully promoting hatred.
It's possible that Canada requires a CRTC-like office with a mandate to regulate online content and specifically ensure that online hate is curtailed. Indeed, one CRTC mandate is to “protect” Canadians. The CRTC says, “We engage in activities that enhance the safety and interests of Canadians by promoting compliance with and enforcement of its regulations, including those relating to unsolicited communications.” It's in their mandate.
That appears to be consistent with our interest here to limit the proliferation of hate online in accordance with Canadian law.
The Christchurch Call to Action to eliminate terrorist and violent extremist content online is a positive step forward. However, it must be implemented by Canada with concrete tools. Friends of Simon Wiesenthal Center recommends the following actions that could help stem the promulgation of hateful acts against all communities through online platforms.
One, reinstitute section 13 of the Canadian Human Rights Act to make it illegal to utilize communications platforms to discriminate against a person and/or an identifiable group.
Two, the section should also make platforms and service providers liable for ensuring they are not hosting hate websites and for moderating their online social networking feeds. Fines should be imposed and criminal sanctions should be placed on violators.
Three, expand Statistics Canada's mandate to collect and share hate crime statistics from across the country. At the moment, Canadian policy-makers and organizations are mostly guessing. This is where I get back to those police numbers. We really are guessing at the extent of hate online and beyond. We need better information collected across the country to make better policy.
On that point, I held a hate crimes conference last fall and I invited Statistics Canada. It was the first time they attended a hate crimes conference with police units from across the country. I was shocked that this hadn't happened before.
Fourth is to improve police capacity and ability to track and respond to hate crime. Through our research, we discovered an inconsistency in hate crime units across the country. Some cities lack the resources to implement and deploy hate crime investigators, as you just heard. Last fall, we initiated the hate crimes conference. I'm repeating myself.
This country is lacking a best-practices model for policing hate crimes, understanding hate crimes and the law around them, and collecting and delivering that information to Stats Canada, which will in turn deliver it to the policy-makers.
Number five is to improve communication between the provincial attorneys general as well as police when it comes to investigating and prosecuting hate crime and hate speech offenders. This will require additional training for prosecutors and police officers so that victims of hate speech crime feel their needs are addressed.
We have specific examples that I can get into later about the mishandling of how the prosecutors are working with the police and the disjointed communication between them in finding hate crime criminals and prosecuting them.
Number six is education. This is, for us institutionally, one of the most important elements. Education on responsible usage of social networking sites and websites is required now more than ever. We dedicate literally millions of dollars a year to deploying our educational programs to bring that to students. We have, for example, cyber-hate and cyber-bullying workshops, where we aim to educate students.
Even going to a website about the Holocaust is one example. How do you know which website is legitimate? How do you know which one is fake? Further education needs to happen in schools across the country so the students, the young people, the next generation will understand what hate speech and hate crime really are and be able to differentiate.
Finding a balance between protecting free speech and protecting victims of hate is essential. Our freedom and democracy must be protected. At the same time, we must recognize that there are victimized groups that need protection too, and leaving the issue to the marketplace will bring about unpredictable consequences.
Even The Globe and Mail admitted in an editorial last week that times have changed since the Supreme Court of Canada struck down a law in 1992 that made it a crime to “spread false news”. The Globe says, “Much has changed since then. Mr. Zundel printed and handed out crude pamphlets”, whereas today the same hateful message can be viewed by millions of people at once and inspire violent action.
We know this. The recent terror attacks in New Zealand, Sri Lanka, San Diego, Pittsburgh, etc., must motivate government and civil society to take immediate action. Terrorism can be prevented with the right placement of instruments, instruments that include a combination of enhanced legal measures, advanced monitoring and prevention, increased resources for law enforcement and hate crime units, and broader educational programs that promote tolerance, compassion and good citizenship.
We hope the committee makes recommendations for immediate amendments to the Canadian Human Rights Act to end incitement of hatred on online platforms.
Thank you.
Thank you, Mr. Chair.
First of all, Mr. Suri, I take great umbrage with your defamatory comments to try to link conservatism with violent and extremist attacks. They have no foundation, they're defamatory and they diminish your credibility as a witness.
Let me, Mr. Chair, read into the record the statement of [Pursuant to a motion passed on June 13, 2019, a portion of this testimony has been deleted. See Minutes of Proceedings of June 13, 2019], who is responsible for the Christchurch massacre. He left a 74-page manifesto in which he stated, [Pursuant to a motion passed on June 13, 2019, a portion of this testimony has been deleted. See Minutes of Proceedings of June 13, 2019]. I certainly wouldn't attempt to link Bernie Sanders to the individual who shot up Republican members of Congress and nearly killed Congressman Scalise, so you should be ashamed.
Now, with respect—
On a point of order, Mr. Chair, that is unacceptable behaviour of a member of Parliament toward witnesses at our committee. I have the speech in front of me, Mr. Chair, and there is nothing linking conservatism to that movement. If the alt-right is linked to conservatism, that's conservatism's issue—
Mr. Chair, you cannot have a member of this committee calling for witnesses to be ashamed. That's unacceptable.
Guys, what I will do is I'm going to allow Mr. Suri to respond. I certainly don't agree with the comments made, but at the same point in time, this is Mr. Cooper's time.
There's no ruling here right now. There's nothing to challenge. I haven't made a ruling.
I didn't agree with the comments made by Mr. Cooper. I'm sure Mr. Suri doesn't agree with the comments and I see that all the other members don't agree with the comments. I think Mr. Suri has a right to respond, if he wishes to respond.
—him in a position of vulnerability to have to respond to these attacks. Quite frankly, that's unacceptable. He didn't come here today to defend himself. He came here to present on behalf of his organization.
It is unfair, Mr. Chair, if you attempt to have him respond and legitimize what has been said. I will call for us to go in camera immediately. I think I'll have support from my colleagues.
I don't think there's a question of legitimizing it at all.
I think, again, members have their time to make their comments. I don't agree with Mr. Cooper's comments, but this is part of his time.
I am going to ask, and I will give time for Mr. Suri to respond—
The explanation you have provided really provides little comfort. You are saying that a member can say anything they'd like to say. That is completely unacceptable.
I put a motion on the floor that we go in camera, please, to have this conversation. I'm not comfortable putting our witnesses in this very difficult position of listening to this disagreement that we're having.
That's fair.
There's a motion to go in camera. All those in favour?
Some hon. members: Agreed.
The Chair: We're going to move in camera, with apologies to everyone.
We're going to ask everyone who is not part of the committee to leave the room for a very brief period of time. We will have you come back in as soon as possible.
[Proceedings continue in camera]
We will resume.
Our meeting was suspended. We apologize for taking away your time. It was important for the committee to discuss and deal with the issue.
I'm going to give the floor back to Mr. Cooper.
Thank you, Mr. Chair.
While I certainly find the comments made by Mr. Suri to be deeply offensive and objectionable and vehemently disagree with them, I will withdraw saying that he should be ashamed. That was not unparliamentary, but I understand it made some members of the committee uncomfortable, so in the spirit of moving forward, I withdraw those specific comments, but certainly not the rest of what I said.
Mr. Benlolo, you cited sections 318 and 319. What about section 320.1?
I raise that because you did cite section 13, but section 320.1 is an interesting section of the Criminal Code, one that, for whatever reason, has been completely underutilized. It provides a judge with the authority, on reasonable grounds, to order the removal of something on a computer system that constitutes hate propaganda.
Intent isn't required. All that is required is that, on reasonable grounds, a judge is satisfied that it constitutes hate propaganda. Would that not be a tool that's already there that has been overlooked or underutilized?
Fair enough, and that's why I cited these provisions in the Criminal Code in general. It's because there really are tools in the Criminal Code. The problem is they are not being effectively utilized by the legal system and police services.
When section 13 was there, we were able to use that as a tool to essentially call up Internet service providers. It was really that simple to say, “Look, this is actually illegal according to section 13. Please remove it.” They did. They often complied. We didn't have to go through legal channels.
Those kinds of provisions help. In my experience, it's a fact that section 13 worked. That's why I would like something like that, perhaps a little updated, corresponding as well to the Criminal Code, to make our system a little bit more robust in addressing these things.
Thank you, Chair, and thank you to the witnesses for coming today.
I want to talk a little bit about section 13. We've heard before, in this committee and elsewhere, very opposing views as to whether we should have a section 13, whether we should have an amended form of it, or whether we should find other mechanisms of enforcing more safety online when it comes to hate speech.
I've heard from witnesses here today. Mr. Suri said that we should amend or open up the Canadian Human Rights Act. Ms. Moore mentioned bringing back section 13. Mr. Benlolo mentioned bringing back section 13 as a tool.
If section 13 were to be brought back into the Canadian Human Rights Act, what kind of amendments would you like to see to its old form?
Mr. Benlolo, would you comment?
The framework of section 13 essentially addressed old formats of digital communication. To my recollection, it did not address the modern-day issues that we're dealing with, so it needs to be updated, essentially extended to websites and social networking sites, and anything else we can project into the future, because everything is evolving. That's really where I believe the update is required. It's not, as was previously mentioned, the telephone so much anymore, as we all know.
I will definitely second what Avi is saying. It goes further back, I think. The CHRA has not been reviewed since 1977, I believe. Back in 2001, a panel was created for its review, but nothing ever came out of that.
Hate has evolved. It has become more modernized. We see the media that are being utilized. Now it's ever more important to make sure we consider the use of technology we have today that was not present back in 2001. That would merit a review. Looking at some of the language used to eliminate section 13, I believe it should be brought back in.
I certainly second what Faisal and Avi have mentioned. I would like to see the addition of something very specific to deal with false news within section 13. If you have something that is inciting violence or inciting hatred and if a person is, let's say, using altered or doctored videos or photographs of a Jewish politician, for example, that would be given specific consideration. You're not just promoting hatred; you're also, literally, putting words in someone else's mouth, potentially. That needs to be addressed as well.
On top of what Avi mentioned....
Avi, you talked about Zundel and how he essentially took something from print media and then was able to take it online and expand its reach very much.
If you were looking at section 13, a quick addition would be applying it to telecommunications as well, anything to do with the Internet. Let's face it: 41% of hate is on YouTube.
Thank you.
Thank you very much to all of our witnesses today.
We're looking at the digital charter, and all of you have mentioned strong enforcement and real accountability when it comes to social media platforms. Today on the Hill we have an international grand committee that's looking into citizen rights and big data. We have a real challenge, because we have Mark Zuckerberg refusing to even come to the committee. He's in contempt of Parliament, essentially, because he refuses to come before this international committee, and therein lies the biggest part of our challenge.
If the big digital players don't respect what we're trying to do in our respective legislatures around the world, how can this end up being meaningful? It's a very significant challenge. It's really going to require, I think, all of our countries to call them on the carpet and tell them that they are responsible.
When you hear about the numbers online, the percentages that you've all raised here, it's just mind-blowing. That, in and of itself, is a very serious challenge when we can't even hold them accountable to what we're trying to put forward. We can put forward what we think will be important legislation, but if they don't adhere to it, where are we?
I want to go back to something.
Lina and Sinan, thank you for being here from my local community of Windsor-Essex. I appreciate your being here by video conference today.
I want to go to something that Lina said when talking about that real-life experience. I wonder what this looks like on the ground when you're trying to combat online hate or you see something and you think, “Is this hate? What is this?” You start to have those conversations among others to try to get them to stop it as well.
I also wonder if you can speak to the impact of having that burden on you and your community and in particular on young people. I know you do a lot of work with youth. What is the impact of this responsibility that's now on their shoulders to battle this every day when they're seeing things online?
Thank you for the opportunity.
Honestly, I referenced it as a popularity contest. I feel that's the best way to picture it.
When somebody feels attacked online, the way they get that recognition and get a post taken down or content removed is really by just texting all their friends and telling them to report it so that Facebook or whatever platform takes it seriously. I think that's part of the....
As you were saying, it's not only a burden, but it's not uniform in the way that people who face those kinds of online hate messages.... Not everybody is uniformly receptive to helping to take those messages away. It's really how many friends you know, and it's almost telling people that you have to share and you have to pity yourself more and more. It turns into this cycle of everybody trying to gain sympathy for what's happened to them in order to feel that something is going to happen. I feel that's not good in and of itself. That's more a symptom of the problem of it not being recognized in the first place.
Thank you so much.
I'll ask this more broadly. How can we hold social media companies accountable? We are all, essentially, their customers, right? We're in that space. It's a very big question, but I have to ask. I feel that you, obviously, are looking at ways to combat online hate. The biggest partners in that are the social media platforms. What are your suggestions for holding them accountable for what's happening on those platforms?
I'll go down the line.
Look, the question you raise is obviously the biggest question, because they're becoming bigger and the landscape is changing. They're essentially running countries now.
There isn't a simple answer. Obviously, I've suggested penalizing them and imposing fines and sanctions, etc. CRTC perhaps has an answer in terms of how it deals with broadcasting in general.
A lot of these social network providers also have offices here in Canada. They're running their business here in Canada, and there are also heads and CEOs here in Canada, so that may be the angle for enforcing our laws.
I'm sorry, but I have to get to the next questioner. Sorry about that, everyone, but we're running short on time.
Mr. Boissonnault and Mr. Virani are splitting these four minutes.
Go ahead, Mr. Boissonnault.
Thanks very much.
I'm going to go to Mr. Suri and Mr. Hussain right away, and then to Ms. Moore. I have a minute each for you, Mohammed and Faisal.
We heard about what you want changed in the Criminal Code and the CHRA.
Mohammed, what's the effect of online hate on you, your family, and the people you represent, the Albertans you represent?
I actually think the consequences of that hate.... Let's face it: I'm someone who goes to the mosque regularly. I'm there in the evenings. I'm there on Fridays as well. After January 29 happened, after Christchurch happened, and the youngest victim of Christchurch was actually a three-year-old boy... I take my kids to the mosque with me. Did I think about it? Was I worried about it before I went in? Yes, I thought about it. Did I quiver? Totally.
This is me I'm talking about. I constantly live in fear of this because I have a young daughter who plays hockey, who may or may not decide to wear a hijab and so on, but she's a visible minority. She has a very visible last name as well, so how is she going to be treated? What adversity is she going to face?
Let's face it: men don't bear the brunt of it. A woman wearing a hijab does. A Sikh person with a turban does see consequences of this online hate. It's very real.
I definitely second what Mohammed was saying. The hate we see now is so prevalent, and social media are being used to propel that hate and get it into the hands of people. We have WhatsApp group messages coming through about not only the Quebec shooting incident but also the Christchurch incident and the synagogue incident that happened in Pittsburgh. Before it even hits Facebook or Twitter, word of mouth automatically goes out and it's viral because you get a message on WhatsApp. People start living in this fear: “Should I go to the mosque? Should I go to the playground? Am I going to be pulled over to the side by so-and-so?”
There was an incident that came to our attention in which a mother was dropping her kids off at school in one of the areas in Edmonton. Somebody drove by in a red pickup truck, and the driver's actual intent was to ask for directions to get to someplace. However, the mother, along with the two kids, was caught up in this gamut of emotions from what they saw at Christchurch, and when they saw the truck pull over, they didn't know what to do. It had absolutely nothing to do with this.
That's where we come in, to educate people to the fact that they can't live like that. That is not the intent. It is not the intent of Alberta or Canada at large. We can't let fear take over our own personal sanctuary that we live in.
I need to pause you there, Faisal. Thank you both for the leadership you demonstrate in Alberta and across the country.
Ms. Moore, what helped you go from overt hate to realizing that what you were doing was wrong? What helped you make that conversion?
It was a slow process for me to come to the realization that what I was doing was wrong. I wish I could give a cookie-cutter thing and say, “This will work for everybody”, but I think everybody comes into a hate group as an individual and they leave as an individual with their own unique experiences and terms.
In my case, I ended up connecting with some filmmakers who were making a documentary about the racist right, and I was part of that film. When they'd interview me, they were starting to ask me questions like “Well, how did you feel about this? What did you think about this? Do you agree with what this other person said?” The questions were about this other scene that we'd filmed. Having a camera in my face and having these people who were not part of the racist movement actually being nice to me, for lack of a better term, and listening to what I had to say made me feel some accountability finally. They made me actually stop and think. Because I was so busy doing things, just having a moment of pause, having a moment to stop and think, really made a difference for me.
No, we're doing four-minute rounds. We are late. The good news is that Mr. Virani has time in the second panel.
Thank you very much, everyone. I appreciate your leadership in your communities. You all make a difference for the people you speak for and you were very helpful to the committee.
I'm sorry to have interrupted the panel, but I really appreciate your testimony.
I'm going to ask the members of the next panel to come up quickly, because we're running very late.
Thank you. The meeting is suspended.
We will reconvene this meeting of the Standing Committee on Justice and Human Rights as we deal with our next panel on the topic of online hate.
We welcome, from the Federation of Black Canadians, Ms. Dahabo Ahmed Omer, who is a board member in charge of stakeholder relations.
From the Organization for the Prevention of Violence, we have Mr. Bradley Galloway, research and intervention specialist. Welcome.
As well, returning from the South Asian Legal Clinic of Ontario, we have Ms. Shalini Konanur, who is the executive director and attorney, and Ms. Sukhpreet Sangha, who is a staff lawyer. Welcome.
We're going to go in the order of the agenda, so we'll start with the Federation of Black Canadians.
Ms. Omer, the floor is yours. You have eight minutes.
Thank you. Good morning.
Please allow me to acknowledge that we are gathered here this morning on land held by the Algonquin people, who are the original stewards of this territory, which they never ceded. As representatives of over one million Canadians of African descent, many of whom were displaced by the transatlantic slave trade and colonialism, the Federation of Black Canadians believes that Canadians must continue to make such land acknowledgments as part of the national reconciliation with indigenous sisters and brothers.
Allow me to begin by thanking the Standing Committee on Justice and Human Rights for inviting me to address you this morning. My name is Dahabo Ahmed Omer, and I'm the stakeholders' lead on the board of directors of the Federation of Black Canadians. I'm here to speak to you in favour of amending the Canadian Human Rights Act. This is to provide legislators, law enforcement and marginalized communities with more effective instruments and mechanisms to stem the explosion of hate crimes and terrorism.
As you're probably aware, there has been a horrendous spike in hate crimes in Canada. Stats Canada just recently released its latest report on police-reported hate crimes, which shows a 47% increase in reported hate crimes. Black Canadians not only constitute the group most targeted by hate crimes by race and ethnicity, but the recent increase in hate crimes has been largely, although not exclusively, a consequence of more hate crimes targeting people of African descent.
If you're a black Canadian and you happen to be a Muslim and a woman and a member of the LGBTQ+ community, there is an even greater risk of being targeted by hate crimes. This intersectionality of hate is poorly understood and is also a very important part of the equation. Based on the federal government's 2014 “General Social Survey: Canadians' Safety”, we now know that over two-thirds of people targeted by hate crimes do not report them to the police. The most often reported explanation for this is that they get a sense that if they do report the crime to the police, the report will either not be taken seriously or the accused will not be punished.
From the perspective of black Canadians, whose communities suffer from over-policing, carding and other forms of racial profiling, that fear is even more heightened. Even right here in the nation's capital, there was recently confusion within the Ottawa Police Service over whether or not the municipality has an actual hate crime unit. This feeds into the perception of law enforcement's indifference.
It is important for the federation to stress that the explosion of hatred described so far mirrors its proliferation online. CBC's Marketplace recently revealed a whopping 600% increase in hate speech by Canadians online. We also know that over 300 white supremacist groups are operating in Canada, using the web not only to promote hate and concoct deadly attacks but also to infiltrate our trusted public institutions.
It should therefore come as no surprise that a 2018 Angus Reid poll showed that 40% of Canadians feel that white supremacist attitudes are a cause of great concern.
Hate is currently undermining public safety for marginalized communities such as mine while also threatening national security. This is made clear by a recent report by the military police criminal intelligence section that reveals white supremacist infiltration of the Canadian Forces by paramilitary groups that use the web to recruit and spread hate.
With terrorist attacks on the Centre Culturel Islamique de Québec; recent vandalization of black, Muslim, Jewish and Sikh places of worship; and the global context of coordination among white supremacist groups worldwide, more and more Canadians of all backgrounds believe that the time is now for Parliament to act more forcefully and deliberately against hate, which undermines public safety and transnational security.
[Translation]
Canadians expect their Parliament to take stronger action to prevent hate crimes that threaten public safety across the country.
[English]
The Federation of Black Canadians is aware that there is a tension between respecting freedom of expression, as protected under section 2(b) of the charter, and regulating hate speech online, as well as the prospect of technical solutions for reporting and monitoring hate speech or for designating legitimate news sources. Yet based on the lived experience of so many people across Canada who look like me, the federation believes that the lack of civil restrictions on the dissemination of hate over the Internet, the most prevalent and easily accessible mechanism of public communication, is a matter of grave concern.
The Canadian Human Rights Act, stripped of section 13, is not a tool for the 21st century. Given that almost all Canadians under the age of 44 communicate online, it is imperative that all political parties and independents come together in the spirit of consensus to restore section 13 of the act, which constituted the only civil hate speech provision in Canada explicitly protecting Canadians from hate speech broadcast on the Internet.
The burden of proof required by section 319 of the Criminal Code is so high that, in and of itself, it leaves the most vulnerable populations, including black Canadians, subject to the proven harms associated with hate speech without providing a viable mechanism for recourse.
This becomes yet another systemic barrier to the inclusion, well-being and safety of black Canadians, among so many other groups targeted by hate. While the right to freely express oneself is fundamentally essential to a functional democracy—and trust me when I say this, because my country of origin is Somalia—the protection of the minority communities from the real harms associated with hate speech and online hate is demonstrably justified in a free and democratic society. It is only when Canadians feel safe, protected and respected within our society that Canada can flourish and advance as a democracy.
Thank you.
[Translation]
Thank you, Mr. Chair.
Committee members, the Mosaic Institute is grateful for the opportunity to participate in your deliberations on online hate. We recognize that your time is limited, and that you must be selective about the organizations you invite to appear. Thank you for including us.
[English]
Mosaic is a Canadian charitable institute that advances pluralism in societies and peace among nations. It operates through track two diplomacy and brings together people, communities and states to foster mutual understanding and to resolve conflict.
Over the years, we have convened Chinese and Tibetan youth leaders on peaceful co-existence on the Tibetan Plateau, we have assembled Sinhalese and Tamil representatives on reconciliation after the Sri Lankan civil war, and we have called together survivors of genocides to combat future global atrocities.
Fundamentally, our mission is to break cycles of hatred and violence by building empathy and common ground between peoples at strife. We have therefore seen first-hand how the speed and reach of social media have made it both a means of bringing us all together and a weapon to set us all at one another's throats.
The stakes are unutterably high. In our work with the Rohingya people, it has become clear to us that social media played a determinative role in spreading disinformation, fomenting hatred and coordinating mass slaughter, ending with the deaths of at least 10,000 innocent people and the ethnic cleansing of at least a million more. Canada is not Myanmar. Nevertheless, the ability of Parliament to contain and combat online hatred and incitement will quite literally decide whether people live or die.
It should go without saying that in a just and democratic society, there is no higher ideal, no greater ethic, no more sacrosanct imperative than freedom of expression. Peace, order and good government; liberté, égalité, fraternité; life, liberty and the pursuit of happiness—all are impossible without free public discourse. Freedom of expression becomes meaningless if it does not include freedom to offend, freedom to outrage and quite frankly, freedom to make an ass of oneself, although I'm sure that never happens in Parliament.
Voices: Oh, oh!
Mr. Akaash Maharaj: Any abridgement of freedom of expression must, therefore, be only the barest minimum necessary to preserve the dignity and security of citizens.
We believe that Canadian laws defining illicit hate speech are sufficient for that purpose, and the scope of proscribed speech need not and should not be expanded further. Legal, regulatory and social media frameworks fall short, not in defining hate but in identifying it and quarantining it before the virus spreads and wreaks its damage.
We do not underestimate the scale of the challenge that legislators and social media firms face. During the two and a half hours set aside for this hearing, there will be 54 million new tweets and 4.7 billion new Facebook posts, comments and messages.
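These totals follow directly from the per-second rates the witness cites later in this meeting, roughly 6,000 tweets and 521,000 Facebook posts, comments and messages every second. Here is a minimal sketch of the arithmetic, with those two rates as the only assumptions:

```python
# Back-of-the-envelope check of the volumes cited above, using the
# approximate per-second rates quoted later in this meeting.
TWEETS_PER_SECOND = 6_000
FACEBOOK_ITEMS_PER_SECOND = 521_000  # posts, comments and messages

hearing_seconds = 2.5 * 60 * 60  # a two-and-a-half-hour hearing

tweets = TWEETS_PER_SECOND * hearing_seconds
facebook_items = FACEBOOK_ITEMS_PER_SECOND * hearing_seconds

print(f"Tweets during the hearing: {tweets / 1e6:.0f} million")                  # ~54 million
print(f"Facebook items during the hearing: {facebook_items / 1e9:.1f} billion")  # ~4.7 billion
```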
For your consideration, here are our recommendations.
First, social media firms must, either voluntarily or under legal compulsion, adhere to a set of industry standards on the speed with which they review reports that posts violate Canadian anti-hate laws or their platforms' own terms of service. For example, the European Union standards require firms to review a majority of reports within one day.
Second, social media firms should be required to have specific conduits to prioritize complaints from trusted institutions about offending content. A complaint from a children's aid society, for one, should be treated with immediate concern.
Third, there must be financial consequences for firms that fail to remove illegal content within a set period—penalties severe enough to make the costs of inaction greater than the costs of action. Germany's Network Enforcement Act sets fines as high as 50 million euros when illegal posts stay up for more than 24 hours.
Fourth, social media firms should be required to publish regular transparency reports providing anonymized information on, among other issues, the performance of their machine-learning systems at automatically intercepting proscribed posts; the speed with which firms respond to complaints from victims, trusted institutions and the public at large; and the accuracy of their responses to complaints, as measured by a system of third-party random sampling of posts that have been removed and posts that have been allowed to stand. A sketch of such sampling follows these recommendations.
Fifth, social media firms must be more forthcoming in revealing the factors and weightings they use to decide what posts are prioritized to their users. They must give users greater and easier control to adjust those settings. Too often social media platforms privilege content that engages users by stoking fear and hatred. A business model based on dividing our communities should be no more acceptable than one based on burning down our cities.
Sixth, Parliament should enact the necessary appropriations and regulations to ensure that CSIS and the Communications Security Establishment have both the mandate and the means to identify and disrupt organized efforts by hostile state and transnational actors who exploit social media to sow hatred and polarization amongst Canadians in an effort to destabilize our nation.
Seventh, Parliament should consider legislative instruments to ensure that individuals and organizations that engage in incitement to hatred bear vicarious civil liability for any violent and harassing acts committed by third parties influenced by their posts.
Eighth, the federal government should fund school programs to build young Canadians' abilities to resist polarization and hatred, and to cultivate critical thinking and empathy. The best defence against hatred is a population determined not to hate.
Finally, especially in this election year, I would put it to you that parliamentarians must lead by example. Everyone in this room knows that the guardians of our democracy are not ministers but legislators. We look to you to stand between our leaders and the levers of power to ensure that public office and public resources are used only in the public interest. More than that, we look to you to be the mirror of our better selves and to broker the mutual understanding that makes it possible for a vast and pluralistic society to thrive together as one people and one country.
During the upcoming campaign, you and your parties will face your own choices on social media: whether to campaign by degrading your opponents, whether to animate your supporters through appeals to anger or whether to summon the better angels of our natures. Your choices will set the tone of social media this summer more decisively than any piece of legislation or any regulation you might enact. I hope you will rise to the occasion.
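Returning to the fourth recommendation's third-party random sampling: here is a minimal sketch, assuming an auditor is handed a random sample of a platform's moderation decisions together with an independent judgment of each one. The data structure and names are illustrative assumptions, not a description of any existing audit system:

```python
import math
import random

def audit_moderation(decisions, reviewer, sample_size=400, z=1.96):
    """Estimate a platform's moderation error rate by independent review
    of a random sample of its decisions, with an approximate 95%
    confidence interval.

    decisions -- list of (post, action) pairs; action is "removed" or
                 "allowed" (an assumed, illustrative structure)
    reviewer  -- function returning the action an independent third
                 party judges correct for a given post
    """
    sample = random.sample(decisions, min(sample_size, len(decisions)))
    errors = sum(1 for post, action in sample if reviewer(post) != action)
    p = errors / len(sample)
    margin = z * math.sqrt(p * (1 - p) / len(sample))  # normal approximation
    return p, (max(0.0, p - margin), min(1.0, p + margin))
```

Sampling posts that were allowed to stand, and not only those removed, matters here: auditing removals alone measures over-enforcement but misses hateful content that was wrongly left up.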
Thank you.
Thank you very much.
We will now move to the Organization for the Prevention of Violence and Mr. Galloway.
Good morning. Thank you very much for having me here today. I appreciate the invitation.
My name is Brad Galloway. I'm working as a research and intervention specialist with the Organization for the Prevention of Violence, which is located in Edmonton, Alberta. My main goals there are to take part in emerging research on the far-right extremist movement in Canada and, more recently, on the online dynamics of far-right extremism.
I often weave in my own lived experience with the far right, as I spent 13 years in that movement in Canada, mostly offline at the beginning. However, I also spent about 10 years operating in the online context, so I know a lot about this online activity from an insider's perspective. I've drawn on those experiences in recent academic research.
I'm also working with Life After Hate, a group similar to the Organization for the Prevention of Violence. We're looking at doing interventions and helping people leave extremist movements. Some of those initiatives will include building online intervention strategies and providing resources for people who want to leave these types of movements.
It is my belief that communities are formed on shared ideas, experiences and cultures. In order to distinguish and define themselves, groups compare themselves to others in positive and negative ways. It is in the latter that problems might arise.
A healthy, culturally diverse society is one that respects, accords dignity to and even celebrates the differences between cultures and communities. However, when groups start to distinguish and compare themselves in a negative manner to other groups on grounds such as race, religion, culture, ethnicity and so on, there is a potential for destructive and abiding conflicts. This leads to an us-versus-them mentality.
It is in this sense that hate and extremism are interrelated phenomena that exist along a continuum of behaviours and beliefs that are grounded in this us-versus-them mindset. The perpetuation of associated rhetoric can create an environment where discrimination, harassment and violence are viewed by individuals as not only a reasonable response or reaction but also as a necessary one. When this is left unchecked, deepening, sometimes violent divides within society can undermine who we all are as Canadians and fray the social fabric of the country.
For the last 30 years, technology—first telephones and later the Internet—has played a crucial role in the growth of the white supremacist movement throughout Canada. In the 1990s and 2000s, early hate speech was distributed through automated phone calls and websites. For example, the Heritage Front, a white supremacist group, ran automated, computerized calls spouting racist information. Other examples included the Freedom-Site network and the white civil rights hotline.
Beginning in 1996, we saw the emergence of online discussion forums such as Stormfront, which was notably one of the first white supremacist websites and is still very active today. Stormfront was the first in a series of online far-right platforms used to communicate and organize.
Today we see more activity on social media sites such as Facebook, Twitter and Gab, though most of the older forums still exist and are often used in conjunction with the new platforms, including apps. Content removal and regulation are often suggested as ways to mitigate such sites and platforms. Both have their upsides, but both face many challenges, legal and ethical.
Turning to the present, extremist groups and individual influencers promote social polarization and hate through whatever technology is available, and they are highly adaptive to pressure from law enforcement, governments and private social media companies.
Further, online hate speech is highly mobile. I would argue that these hate groups, and organized hate groups specifically, are using this mobility to further their transnational hate movements. Even when this content is removed, it finds expression elsewhere. Individual influencers are adaptive at finding new spaces.
If content is removed, it often re-emerges on another platform or under the name of a different user. Often the rhetoric and the networks move from established networks, where counter-speech can occur and where journalists and law enforcement are able to easily track their activity, onto platforms where detection is more challenging and where what are often termed “counter-narratives” are harder to deploy.
There are a multitude of examples, both domestic and international, of individuals promoting hate being kicked off one major platform—for instance, Facebook—only to move to another major platform, such as Twitter, or to any of a host of smaller platforms, such as Gab or Telegram. Today's online space is more dynamic, immersive and interactive, and spans more platforms, than anything that previously existed, when there were only a few forums or a few telephone lines.
Influencers and propagators of hate distribute through multiple interlinked platforms. This new dynamic has demonstrably been able to mobilize hate-based activism and extremism, especially among lone-actor violent extremists such as those who perpetrated the Tree of Life synagogue and Quebec City mosque attacks. The individuals who carried out these attacks did not necessarily engage directly with ideological influencers or a networked group, but they were mobilized by the hate they felt and the sense of crisis they saw stemming from an opposing group.
What is the solution? I don't think there's any golden-ticket solution. However, we believe that the first step in preventing and countering the propagation of hate speech and extremism is awareness, beginning with a better understanding of the nature of hate crimes and hate incidents online and off-line. We need better data on who is most targeted by hate, on the intersectional dimensions of targeting—for example, a black Muslim woman who wears the hijab—and on where these things take place. We need data on whether certain public spaces, like public transit, or certain platforms, such as Facebook or Twitter, are more conducive to hate speech and harassment.
In order to do this, there needs to be more incentive for victims of hate crimes to come forward. Often there is stigmatization, fear and skepticism around reporting a hate incident to the police. These issues need to be constructively challenged and mitigated through a multisectoral approach.
A recent example I found is proposed bill SB 577 in the state of Oregon, where they are also dealing with a rapid increase in hate crimes. The bill requires law enforcement agencies to refer alleged hate crime victims to a new state hotline, staffed by the Oregon Department of Justice, that connects callers to local mental health services, advocacy groups and other useful resources for crime victims. It allows victims to move forward in a safe, understanding environment, with a multitude of resources to address their experiences of hate, and it could increase reporting.
Online parallels are easy to imagine. Already some American non-profits are creating online resource hubs for people who have been doxed and had their personal information exposed. These resources could be repurposed and redeployed to address the issue we’re talking about today.
Many witnesses have likely discussed the legal challenges associated with changes to legislation. With the time I have left, I would instead like to touch upon some efforts that could occur further upstream of hate speech that don’t require legislative change.
Mr. Galloway, I'll just let you know that you already exceeded the eight minutes. I'll ask you to wrap up relatively quickly. Thank you.
Sure. I'll do that right away.
I think that implementing coherent, standardized, broad-based digital literacy campaigns—addressing an array of issues—in the education sector will help students and broader society manage the obviously very complicated issues that we're seeing today.
Again, thank you very much for having me here today. I look forward to any questions you may have.
Thank you to the committee for having us here. I'm here with my colleague Sukhpreet Sangha, and we'll be speaking interchangeably today.
Very quickly, the South Asian Legal Clinic of Ontario is a not-for-profit legal aid clinic that works with low-income South Asians across the province. We do poverty law, which includes a large volume of immigration, human rights, employment, housing and income security law.
From that casework, we also look at trends in what we hear from our clients and pick up on larger advocacy pieces, around Ontario and around the country, that affect the work we see on the ground. Our comments today are directly related to our front-line work.
Approximately 30% of SALCO's legal casework raises issues of systemic racism and discrimination. We've worked on cases involving access to services, housing, employment, policing and immigration, and we've worked within the larger justice system framework on these issues.
Our law reform work has addressed the growing inequities faced by racialized communities, inequities that intersect with multiple identities such as gender, faith and socio-economic status. Our work has addressed how those things intertwine in the world of online hate speech.
We've had a chance to speak on these issues at the United Nations. We are part of Ontario's anti-racism directorate consultation committee and worked on the legislation to embed that directorate in Ontario law. We sit on the Toronto Police Service's anti-racism advisory panel. We've worked with the federal government on a national anti-racism strategy. We've intervened in Quebec and in test cases at the Supreme Court on the ability to wear the niqab, and we currently sit on a coalition of community leaders in Ontario that is looking specifically at dealing with hate crime and the rise of white supremacy.
No doubt you've heard from everyone today that in recent years we've seen a definite rise in hate speech in public discourse. There is also no doubt that social media platforms and the Internet have played a significant role in spreading that hate speech.
Globally and domestically, we know that online hate has been a catalyst for violence against Muslim, Jewish, black and indigenous communities.
I want to say quickly that a lot of the discussion now is around Islamophobia and anti-Semitism, but the data shows that anti-black hatred is prolific in Canada, as is anti-indigenous hatred. I believe hatred against those two communities goes largely unreported, so it's important for this committee to take notice of those communities and the hatred they have faced historically and continue to face to this day.
Last week I met with a Muslim client who came to our office begging us to help get some of their remaining family members out of Sri Lanka. Why? It was because the others had been killed in an attack on the Muslim community in Sri Lanka, which was incited by online hate. The connection there is real. We have people here who are connected to people globally and we see the impact of online hate.
To be frank, I want to tell this committee that I have personally received a significant amount of hate by email and on social media, threatening me, and in one case threatening my family, my children and our organization.
Yesterday I spoke to a Sikh colleague who had received an open message on the website of his organization calling him a “towel head” and telling him that his community should be deported and that he deserves to die.
I don't bring these things to the committee to be shocking, but to tell you that this is what we feel and see. The audacity and frequency with which people now spew hate online show us that we have failed to control online hate. There is truly little in the way of real and accessible mechanisms in Canada to hold people accountable.
I'm going to turn it over to Sukhpreet.
I'll be speaking about combatting online hate, especially through the lens of the criminal law.
Of course, we all understand that Canada's response to the growth of online hate must be expeditious, bold and effective.
To that end, regarding the criminal law, the committee is aware and it has been mentioned earlier that the Criminal Code currently contains two main provisions that can be used to charge persons accused of committing online hate crimes: sections 318 and 319. While section 318 prohibits advocating genocide, section 319 more broadly prohibits public incitement of hatred and the wilful promotion of hatred and is thus the likelier charging section in cases of online hate crime. The use of section 318 is further limited by the fact that proceedings under it may not be instituted without the consent of the Attorney General.
Subsection 319(2) creates a hybrid offence that criminalizes:
Every one who, by communicating statements, other than in private conversation, wilfully promotes hatred against any identifiable group
The maximum punishment available, if proceeding by indictment, is two years of imprisonment.
However, it is troubling that the Attorney General's consent must also be obtained to institute proceedings under that section. As the popularity of using the Internet as a forum for spreading hate only continues to increase, Parliament should reconsider whether this requirement for the Attorney General's consent places undue limits on the prosecution of online hate crimes. Requiring this consent to proceed with online hate crime prosecutions creates an unnecessary additional barrier to these charges being pursued by legal authorities.
Alternatively, individuals committing certain types of online hate crimes can also be charged with more generic Criminal Code offences, as is sometimes done, under sections regarding uttering threats and criminal harassment, and the hate motivation of the crime can be considered an aggravating circumstance on sentencing under subparagraph 718.2(a)(i).
Another concern with the current use of the code to address online hate crime is the often fraught relations between some members of racialized communities and the police. It is now broadly acknowledged that systemic racism is a significant problem within our criminal justice system, and this creates an access-to-justice barrier for members of those same communities when they are subjected to hate crimes and their main avenue for recourse is the police. It can be understandably difficult for racialized persons who have experienced being targeted by police through programs such as carding to then have to seek assistance regarding hate crimes from members of that same police force.
Further, the historical and ongoing overreliance of the criminal justice system on punishment through penalties such as imprisonment and fines also produces a deficiency when dealing with hate crimes, both online and off. Punitive sanctions, such as those traditionally meted out by the criminal justice system, do little to confront or change the attitudes and beliefs that motivate hate crimes. As such, a more meaningful remedy could lie in community-based programs that seek to address the motivators and the thinking that underlie hate crimes, in a genuine attempt at anti-racism and anti-oppression education.
Further, problems with prosecuting hate crimes through the criminal justice system are revealed by the fact that police solved just 28% of hate crime incidents in 2017, as shown by new Statistics Canada analysis, which I'm sure most members have seen. By comparison, among all Criminal Code violations, excluding traffic offences, 40% were solved by police in that same year. Hence, even when victims of hate crime clear the hurdle of reporting to police, the chance of the case being solved is 12 percentage points lower than for other types of offences.
We'll make two recommendations regarding the criminal law.
First, as the use of the Internet as a forum for spreading hate only continues to grow, Parliament should reconsider whether the codified requirement for the Attorney General's consent places undue limits on the prosecution of online hate crimes; as noted, it creates an unnecessary additional barrier to these charges being pursued. I will note that this consent requirement also appears in section 320.1, which was raised earlier at this committee today.
The second recommendation is that the government should look at creating civil and community-based mechanisms to address online hate that do not engage the criminal justice system. In our view, the criminal law is not the most effective mechanism to address online hate, and systemic racism within the criminal justice system makes it disproportionately ineffective for racialized communities. A non-criminal administrative mechanism could provide a more accessible alternative.
I'll pass it back.
Okay. I'm going to go really quickly.
I won't repeat what everyone has said on section 13 of the Canadian Human Rights Act. What I will say is that I worked with that section, and it was effective in combatting online hate. The section, as it was written, recognized that hate speech includes computer and online communication.
We call for a re-enactment of that section. This committee can study what the procedural issues were and where the section fell short. There's a lot of nitty-gritty. We can take best practices from what we learned, discard what didn't work and keep what did. I had, I think, at least five successful section 13 cases; none went to a hearing. The mere use of section 13 resolved those issues.
Lastly, what we want to speak about most is that we cannot do any of this work piecemeal, such as by changing or adding a section in the Criminal Code. What we really need is some sort of national anti-hate strategy: a strategy that addresses online hate, social media platforms and the Internet, and how they collect data, and that makes such data collection mandatory.
A lot of data has been quoted here, but the reality is that the data we have does not even touch on the reality of online hate. Most people don't report it; most people don't go to the police, as my colleague just said, so we don't know the real picture in Canada. We continue to fail repeatedly on how we collect data on this issue. We continue not to push social media outlets to collect data.
We also need a strategy—and someone said it before—around education. A national strategy, directorate, secretariat or whatever you want to call it ensures commitment to and continuity of this work, regardless of what happens, and ensures that we're looking at online hate and education within the larger picture. We would call on this committee to look at doing that.
The other thing I want to say quickly is that I would urge this committee not to come back with a recommendation that we study this more. As was the case with forced marriage, we had people from the U.K. come in to say, “You don't need to study it. You have enough anecdotal information in front of you that you should act now.” I feel very much the same way on this issue.
Thank you.
Thank you very much. That is all very helpful.
Folks, as you know, we're running a little behind. May I suggest we do four-minute rounds? Is that okay with everyone?
We'll start with Mr. Barrett.
Thanks, Mr. Chair.
Thank you very much to all of the witnesses for coming today and sharing your perspectives, your experiences and your advice to the committee.
I'm very interested, Ms. Sangha, in the idea of education over enforcement and having conversations in the public square.
You referenced the clearance rate for police forces and online hate crimes. My colleague Mr. MacKenzie served as a police chief and reminded me that they clear 100% of the drunk-driving cases that get put in front of them. We could run a R.I.D.E. program every day and, unfortunately, we would still be catching impaired drivers.
I very much like the idea of education. When we had witnesses here a few weeks ago, I asked a few of them questions about having the conversation in the public square, and bringing groups together, so we're not dealing with this in silos within communities. We all share a common purpose in this, and that is to end hate. When hate migrates online, it seems to proliferate more quickly, but if we address it right at the root cause....
Do you have any examples of where this has been done—where different faith or ethnic groups have been brought together on a large scale, with some result, or perhaps your colleague does?
I think Mr. Galloway referenced one: Life After Hate. I would point to other similar groups that seem to have been very effective, in terms of having people on the ground who were in hate groups and were themselves converted away from that impulse. That's the kind of community-based education I'm speaking to, because it involves people with lived experience. Those people have a way of relating to people who might be pursuing hate, outwardly on the Internet or elsewhere, and advocating for hate to be acted upon.
Those groups are the best examples of tackling this issue. I want to see more of those.
Mr. Galloway, picking up on that, who are the stakeholders who are at the table or in the room when these groups meet?
What we're trying to build, at least with the OPV in Alberta, is partnerships with different groups doing work in this area. In British Columbia and Alberta, I've definitely come across people who are trying to propose K-to-12 digital literacy programming across Canada, federally speaking. It's about our willingness to teach young people about the content they're viewing at the initial stages, rather than waiting until much later, when we're adults already immersed in all of this content, right?
I think we lack digital literacy in Canada. I think that's one of the biggest problems. We've been trying to work as much as we can to do community engagement projects and to go into school districts throughout....
I've been doing it in B.C. for the better part of three years now and trying to work with Life After Hate to go out and speak with community organizations and others who have a stake in this discussion.
Something that could be very important is to introduce, earlier in the public education system and in other education systems, courses on world religions, not as electives but as mandatory. I can speak specifically to the fact that Sikhism is not really addressed in public education. People don't know that members of the Sikh faith are a whole different group. That's not to say that haters should therefore redirect towards whom they're actually meaning to target—of course, they shouldn't be targeting anyone—but ignorance promotes further hate and is a real problem.
Very quickly, I would just say that I think you've made my point very clearly. I think that involving targeted groups and bringing them all to the table, whether they are representatives from all religions or from any of the targeted groups such as LGBTQ2+ organizations, women's groups or any racialized groups, would very much serve us well in the form of education over enforcement.
Thank you very much for your answers.
Thank you, Mr. Chair.
I'll start with you, Ms. Konanur. Thank you so much for the amazing work that SALCO does in my riding of Willowdale.
You've mentioned that we've failed to control online hate. You also mentioned the unfortunate incident when you received hate emails. Could you explain to us how you dealt with it and whether that was an effective means? I know that you're there helping other people, so I can vouch as to how resourceful you are. How effective was reporting it?
It's a great question, because it's a question about accessibility. I dealt with it as a lawyer who is the ED of the South Asian Legal Clinic and has a whole heap of privilege behind me, so I can blast back, right? I can call the police and they'll take my call. I can do a number of things that I know my clients could never do.
I was able to counter it with my own commentary. Some of the comments, though, you just leave. We've become emotionally fatigued by it, to the point that you have to put your hands up and say, “I'm not going to engage”, because the engagement just leads to a snowball effect. The truth is that the emotional psyche of all of us who do this work with people from different communities is harmed. I see it with my kids in particular.
The reality is that what we wanted to talk to you about today is that we don't have accessible solutions. That's why, when people talk over and over about civil remedies, administrative remedies, services that people can access for free to combat this and moving things out of criminalization, we are talking really about accessibility and access to justice. We want to look at those back-end mechanisms for people to combat that individual hate, but we also want you to think about the front-end piece, right? Why is it okay? Just as quickly as we went in this direction of online hate being okay, we can go in the direction of it not being okay. That takes our will. It takes our will to do that, and it takes our will to stand up, but it is really difficult.
One thing that I didn't get to say was that in Ontario we had an incredible ruling two weeks ago against two anti-Muslim advocates who went after the founder of Paramount Fine Foods. He got an award of $2.5 million in civil court. When I think about that case, I think, “If the clients I'd seen had the resources to access civil remedies, imagine the message we could send around civil liability for these cases.” I think about how we do have test case funding now, and we do have the court challenges program. Is there an opportunity at the federal level to expand those programs to have this kind of work done?
Thank you very much for that.
Mr. Maharaj, you mentioned transparency reports and requiring social media organizations to file such reports. I've never heard of such things. Could you elaborate on that and tell us what such transparency reports would contain?
Different jurisdictions have different requirements, but probably the best known is the European Union's Code of Conduct on countering illegal hate speech online. It's modest in its scope because it is transnational, but it requires social media organizations to report how many requests they have received, how many reports they have received of alleged hate speech, how many of them were analyzed and addressed within the first 24 hours, how many were addressed within 48 hours, and what percentage of those posts were ultimately taken down.
I think that is an excellent initiative. One of the metrics of its success is that over time, the proportion of reports investigated by, for example, Facebook within the first 24 hours has doubled over just four or five years. It shows that public exposure of their record creates a tremendous incentive for social media organizations to increase their compliance with their own rules, as well as with national anti-hate legislation.
I would say, though, that if this is a path Parliament wishes to go down in Canada, more would be better. That means more transparency, more detailed information about the reports that social media firms receive and how they deal with them. The greatest weakness of the EU's approach is that it only requires social media to turn its mind to these reports quickly. It doesn't require them to turn their minds to these reports effectively. In other words, if they address 50% of the reports in the first 24 hours but they get all of their analyses wrong, they've still exceeded the European benchmark. An additional step of random sampling of those reports by an independent third party to assess not just how quickly they're dealing with them but how effectively, I think, would be an excellent idea.
There are privacy risks—
Thank you so much. There are so many great topics. I think we could dive into the education piece—it's critical—and the work, Mr. Galloway, that you've been doing.
You've been talking about deradicalization and the relationship of online hate to offline and how it becomes a real-world hate that impacts people's safety.
I want to talk about something that Ms. Sangha brought up. First of all, I thank you for your call to action. I absolutely echo that. We need to have something strong coming out of this committee that's not just aspirational but can actually make an impact on the ground.
You're talking about the fact that many people in our country have a poor relationship with the police. They don't go to the police because they mistrust them and the way they're treated. Online reporting, I think, is really critical in the sense that people wouldn't have to go in and be challenged, or experience the racism and other treatment they currently experience with the police. It's very challenging.
I want to ask Ms. Omer first, then Ms. Konanur, and finally Ms. Sangha: How should this reporting process be structured so that everyone can feel comfortable coming forward? Who should be handling it? What are your thoughts on that?
I am a part of this coalition called the Justice for Abdirahman Coalition. We've been doing work for almost three years now, and a lot of that work has been towards creating more transparency and accountability with our police services.
One of the things we found with our communities is that you can't have someone from within the police service be the point of contact for marginalized and vulnerable communities.
One of the recommendations we've often made as a group is to create a civilian body composed of members of the community: people who have been negatively impacted by police interactions, researchers in this field, and young black men who have faced discrimination and racial profiling by the police service. This council or entity would have a truly neutral and objective role and responsibility. It would allow the communities affected by these hate crimes to feel more comfortable coming forward. These would be advocates, such as myself: individuals you can relate to and connect with, who you know will not use the information you've given them against you.
A lot of times members of our community feel as though there will be a reprisal. They feel as though the police service has access to their information. They know their address and their licence plate numbers and things like this, so there is a fear in going to police services. A council or a committee made up of members of the community—grassroots organizations, really—would be the stakeholders and would be the ones reaching out to communities to say, “I can be your advocate. I can be your liaison, and you can trust me.” That neutral party would be the hand that would hold the community and the police and relay that information.
What this does, indirectly, is create trust between the community and the police, if they see that this neutral body is able to play that advocacy and balancing role. I think that would start to create some of the trust between the police and the community that's not currently there.
I would echo that. My last call was really for creating some sort of federal entity that can collect that kind of data. I think about the work of Stats Canada. We recently had a call with Statistics Canada around collecting information on hidden homelessness. While that seems like an extremely difficult area to collect data on, they had some really innovative ideas on how to do it across the country. The truth is if they can come up with ways to collect data on hidden homelessness, certainly the brilliant minds at Stats Canada, who are doing incredible things, can create mechanisms for collecting this data.
It has to go beyond reporting only on incidents reported to the police, because that is nowhere near a full picture of what's actually happening. I think some sort of national strategy, with an entity that has a subset dealing with data and reporting on that data, is a really critical piece.
I'm going to go very quickly.
You're all doing amazing work. Thank you.
Thank you particularly, Mr. Galloway, for bringing a voice that we haven't heard very often.
Shalini, we're very proud to always have SALCO at the committee.
Mr. Maharaj, I will pick up on where you left off and just say that it is critical, not just this year, but any year, that parliamentarians exercise discipline and appropriate behaviour. I will say that I am troubled when we have senators of this Parliament question white supremacy and its presence. I'm also troubled by reports today that we have elected officials potentially making announcements about immigration policy in front of hotels that were the site of arson attacks in Toronto. I'll leave it at that.
I have a question for all four of you that relates to section 13 of the CHRA. It's a bit specific because I'm a bit of a specific lawyer and we like to get into the weeds a bit.
The specific aspects are that the old version of section 13 had an exemption for the telecommunication provider. Do you think that should remain, or do you want more accountability for the telecommunication provider and the social media platform?
Second, can we quell the free speech antipathy by simply having a rider in there, which may be superfluous, saying that nothing in this clause is meant to derogate from the constitutionally protected right to freedom of expression?
Third, do we need a definition of “hatred” incorporated into it? This was the suggestion by Irwin Cotler, a previous attorney general, in a private member's bill.
Fourth, should we have some sort of threshold for what constitutes the type of hatred that would trigger section 13 so that we don't get single instances but more of a mass-orchestrated attack?
If all four of you could opine on all or any parts of those, that would be terrific. Thank you.
You know, I already went first. You guys are going to make me do it again. Okay, no worries.
You asked a lot of different questions, and I wrote them all down.
In terms of the exemption, I don't think there should be one. I think social media platforms and telecommunication companies should be just as responsible as individuals. We're putting so much responsibility and accountability on individuals who put messaging online, but it should also be on those who should be monitoring that and reporting it and who should also be doing that data collection, because according to a lot of the information we heard today, we don't even know sometimes what constitutes hate. I think the telecommunication companies should be doing a lot of that monitoring and should not be provided an exemption.
In terms of the definition of “hatred”, I have a definition. I don't think everyone would have that same definition. The human part of me would say that if someone looks at me and says that because I'm a Muslim woman and I'm black, I'm inferior to them, that constitutes hate for me, from just being human and what I feel, but if we're going to put it into terms that everyone understands, I would say—and I wrote this down—hatred is predicated on destruction. Hatred against identified groups, therefore, thrives on insensitivity, bigotry and destruction of both targeted groups and the values of our society. Hatred in this sense is the most dangerous emotion and contradicts reason, an emotion that if exercised against members of these identified groups, implies that those individuals are despised, scorned, denied respect and made subjects of ill treatment.
That would be my definition. I think it captures the human side of it, but also, I want to say, the legislation piece, because words matter, and when we talk about hatred and about hate crimes, they always start with words.
I think defining hatred is key, and I thank you for asking that question.
We have only about 15 seconds left on Mr. Virani's time, so if somebody wants to add a brief note, that would be great.
I will very briefly address one question: If section 13 is brought back, should there be a threshold? There absolutely has to be. I'll rephrase the numbers I gave you. Every second, there are 6,000 tweets. Every second, there are 521,000 Facebook posts. There is no court, no quasi-judicial body, no hearing process that could possibly deal with the avalanche of complaints that such a volume of activity generates.
If section 13 is brought back, it should be for precedent-setting or for cases that rise to a level that cannot and should not reasonably be dealt with by the social media platforms themselves.
On your point about putting a catch-all at the end to say that nothing in this section limits freedom of expression, I don't agree that we need that. Our Supreme Court has been clear that none of our freedoms are absolute; they are always subject to reasonable limits.
I don't think we should water down section 13 by saying that.
I want to thank all the witnesses. You've been incredibly helpful again. You represent a diversity of groups, a diversity of opinion, and your life experiences will really help shape our committee report. Thank you so much.
We are going to move to an in camera meeting. I would ask that everyone clear the room in the next couple of minutes. I'm going to briefly suspend.
[Proceedings continue in camera]