
Standing Committee on Justice and Human Rights


NUMBER 150 | 1st SESSION | 42nd PARLIAMENT

EVIDENCE

Thursday, May 16, 2019

[Recorded by Electronic Apparatus]

(0845)

[English]

     Good morning, everyone. Welcome to this meeting of the Standing Committee on Justice and Human Rights, as we resume our study on online hate.
    Today, we have two panels. In our first, we are joined by Egale Canada Human Rights Trust, represented by Ms. Jennifer Klinck, chair of the legal issues committee. Welcome.
    We are also joined by Equal Voice, represented by Ms. Eleanor Fast, executive director. Welcome. Also with us are Ms. Nancy Peckford, senior adviser to the Morgane Oger Foundation, and none other than Morgane Oger, the founder. Welcome. Finally, we have Ms. Ricki Justice, acting chair of the Pride Centre of Edmonton. Welcome.
    We'll go in the order set out in the agenda. As Ms. Klinck was here just two days ago, she knows the exact timeline she has, so she'll set an example for everyone.
    Ms. Klinck, the floor is yours.
    Thank you. On behalf of Egale Canada, I would like to thank the committee for the opportunity to speak today on this critical question of online governance.
    Ensuring that there are meaningful protections against online hate and harassment, while also maintaining our commitment to the fundamental Canadian value of freedom of expression, is both difficult and of utmost importance. As part of its mission, Egale works to improve the lives of LGBTQ2SI people in Canada, by promoting human rights and inclusion through research, education, community engagement and public policy contributions.
    I am the chair of Egale's legal issues committee, which is made up of LGBTQ2SI lawyers from across Canada. I am also a partner at Power Law, with a practice focused on constitutional law. I am grateful for the assistance of other members of the legal issues committee in preparing these remarks, particularly Professor Samuel Singer, Daniel Girlando and Melissa McKay.
    Online hate poses a significant threat and is therefore an issue of particular concern to the LGBTQ2SI community. According to a Statistics Canada report on police-reported hate crime in Canada for 2017, hate crimes in general and hate crimes targeting members of the LGBTQ2SI community in particular are on the rise.
    Police-reported hate crimes targeting sexual orientation rose 16% in 2017, compared with 2016. Crimes motivated by hatred of sexual orientation accounted for 10% of hate crimes. Police-reported data on trans-targeted hate crimes is suspect, as nearly half of reported incidents—15—occurred in 2017 alone, likely corresponding to the 2017 addition of gender identity and expression to the Criminal Code. We do know, however, from Trans Pulse, that 20% of trans people in Ontario have been physically or sexually assaulted for being trans. We also know that many survey respondents did not report these assaults to police. In fact, 24% reported having been assaulted by police.
    Further, a significant proportion—15%—of hate crimes that are also cybercrimes target members of the LGBTQ2SI community. Of particular concern is that hate crimes targeting members of the LGBTQ2SI community are marked by violence. Hate crimes targeting sexual orientation were more likely to be violent than non-violent. Victims of violent hate crimes targeting sexual orientation and aboriginal peoples were also most likely to have sustained injury. Similarly, hate crimes targeting trans or asexual people were very often violent, with 74% of incidents involving violence.
    In short, online hate is of significant concern to the LGBTQ2SI community, because people are committing ever more acts of hate against us, and, all too often, those who hate us want to hurt and kill us.
    The Supreme Court of Canada's unanimous decision in Whatcott, a case that specifically dealt with hate speech targeting homosexuals, and in which Egale intervened, succinctly summarized the real harms caused by hate speech. First, hate speech subjects individual members of the targeted group to humiliation and degradation, resulting in grave psychological and social consequences. Second, hate speech harms society at large, by increasing discord, and, even if only subtly and unconsciously, by convincing listeners of the inferiority of the targeted group.
    The regulatory response to online hate should also take into account how certain types of speech are fundamentally at odds with the values that underlie freedom of expression, including the search for truth and democratic participation in the marketplace of ideas.
    As the Supreme Court of Canada explained in Whatcott:
a particularly insidious aspect of hate speech is that it acts to cut off any path of reply by the group under attack. It does this not only by attempting to marginalize the group so that their reply will be ignored: it also forces the group to argue for their basic humanity or social standing, as a precondition to participating in the deliberative aspects of our democracy.
    This insight has considerable resonance for members of the LGBTQ2SI community, who have often been portrayed as morally depraved child abusers, as was the case with some of the flyers in Whatcott, or in debates concerning access by trans people to bathrooms corresponding to their lived gender.
    Beyond online hate speech, other forms of targeted online harassment are also of vital concern for the LGBTQ2SI community. Today, I will focus on two examples that cause serious harm.
    First, cyber-bullying poses a particular threat to LGBTQ2SI youth. According to a 2016 Statistics Canada report on cyber-bullying and cyberstalking among Internet users aged 15 to 29 in Canada, more than one-third of the young homosexual and bisexual population were cyber-bullied or cyberstalked, compared with 15% of the heterosexual population. Cyber-bullying and cyberstalking were also correlated with substantially higher rates of discrimination, as well as physical and sexual assault.
    According to a 2015 Canada-wide survey by UBC's Stigma and Resilience Among Vulnerable Youth Centre, 50% of older trans youth experienced cyber-bullying.
(0850)
    The effects of cyber-bullying on LGBTQ2SI youth are serious. A 2018 systematic literature review by Abreu and Kenny found that these included suicidal ideation and attempt, depression, lower self-esteem, physical aggression, body image issues, isolation and reduced academic performance.
    Second, aggressive trolling of members of the trans community has become a serious problem. Media reports indicate a growing trend, with members of the trans community who engage in public discourse online being targeted by an overwhelming volume of transphobic messages on online platforms. This form of harassment is marked by both the volume and the vitriol of the material, which has included alt-right memes and Nazi propaganda.
    Further, the practice of doxing (collecting personal information on a person’s legal identity or Internet activities and publishing it to hostile publics) exposes members of the trans community to specific harms, such as revealing their deadnames, and to broader discrimination.
    Such practices chill free expression, as trans people avoid participating in public discourse out of fear of reprisal.
    A Norwegian study released in March “found that those who participate in online debates and comment sections, are more likely to receive hate speech than those who don’t participate online to the same extent.” It also found that members of the LGBTQ community are more likely than others to withdraw from political debate as a result.
    While online hate and harassment are issues of particular concern to the LGBTQ2SI community, restrictions on online speech can also disproportionately affect that community. We know from the Little Sisters saga, when Canadian border officials equated representations of homosexuality with prohibited obscenity, that the policing of restrictions on speech can wrongly discriminate against unpopular viewpoints and groups. We also know that the Internet has become an important part of helping LGBTQ2SI individuals find or construct their identities.
    In short, the issues are complex, and the stakes are high. A federal government response is needed. That response should be informed by careful study and will almost certainly require action on many fronts.
    At this stage, it is evident that better regulation of online platforms is needed, but we cannot simply transpose old ideas onto this new forum. Requiring content monitoring by online platforms may be appropriate. However, there is a need to balance making platforms responsible for content from which they profit against the risk of incentivizing sweeping censorship. Creative solutions should also be explored to prevent online platforms from using algorithms that magnify and direct users towards ever more hateful and extreme content.
    Additionally, more can be done through public education and information campaigns to strengthen online media literacy; to ensure a better understanding of what amounts to hate and harassment, since inflammatory and wrong understandings fuel distrust of initiatives to promote tolerance and inclusion; and to ensure broad public knowledge of the historically devastating effects of hate.
    Finally, in any government response, hateful speech directed towards members of the LGBTQ2SI community must not be treated less seriously than speech directed towards other groups.
    Egale Canada therefore calls upon the federal government to take a broad approach to developing a robust toolkit to combat online hate and harassment.
    Thank you.
(0855)
    Thank you very much.
    We will next go to Equal Voice.
    Thank you, Mr. Chair and members of the committee, for inviting Equal Voice to participate in your study on online hate. My name is Eleanor Fast, and I am the Executive Director of Equal Voice. I am joined by Her Worship Nancy Peckford, the recently elected Mayor of North Grenville.
    Hear, hear!
     She is the first ever female mayor of North Grenville and is here today as an adviser.
    Founded in 2001, Equal Voice is a national, bilingual, multi-partisan, not-for-profit organization dedicated to electing more women at all levels of government in Canada.
    We are very concerned about how online hate is negatively affecting women's participation in politics.
    I would like to begin by bringing to your attention a study commissioned by Equal Voice in November 2018 called “Votes to Victory”. The study, conducted by Abacus Data, examined barriers to women's participation in politics. This study was wide-ranging and, while not focused directly on online hate, had some relevant findings. For instance, the study found that 76% of men and 79% of women think that women politicians are treated differently than men, and 84% of women felt that politics is not friendly, which tied for the top reason women gave for not wanting to be involved in politics, along with time away from family. I believe that this perception of unfriendliness is in large part due to the online interactions involving women politicians that people observe far too often.
    This leads me to my second point—to highlight the online hate experienced by women elected officials every day. Many brave women from all parties have spoken about this openly, or posted about it, including MP Rempel, MP Caesar-Chavannes, MP Ashton and the Honourable Catherine McKenna, to name just a few. Unfortunately, the list gets longer every single day. The gender-based online hate they have experienced simply for doing their jobs is unacceptable. If we want more women in Parliament and in legislatures across Canada, which is what Equal Voice is working towards, then we need to strengthen protections for women politicians and for women candidates.
    The issue of women in power, or those running for office, being attacked online is not a new one. In politics, it is important to have online fora where people can have heated political debates, and places where people can disagree with one another.
    However, as social media evolves, so do the hateful attacks, bringing forth challenging times and a need for our laws and policies to evolve with them. There is no doubt that Canada needs to enact and enforce stronger consequences for initiating or participating in online hate.
    Mr. Chair, I would now like to discuss a few of the ways that Equal Voice is working to combat the issue of online hate directed at women politicians and those aspiring to be politicians.
    In 2014, Equal Voice launched its #respecther campaign to expose the everyday sexism experienced by women politicians across Canada. Events were held around the campaign to equip women to address these attacks and to discuss what can be done to eliminate them.
    Recently, in April 2019, Equal Voice launched a modern safety guide developed in partnership with Facebook Canada, available to everyone on our website. It is particularly relevant for all current and aspiring politicians. The guide provides practical advice on how to stay safer online by using existing tools that many of us are unaware of. We hope this guide will be particularly useful in the upcoming federal election.
    Earlier this year, we partnered with the Public Policy Forum on an event discussing online hate. Conclusions from that discussion were clear. We must work with governments and the social media industry to find better ways to reduce online hate.
    Finally, through our Systemic Change initiative, Equal Voice is working to change the culture within legislatures themselves. This project is focused on working with provincial legislatures across Canada to reduce barriers to women's participation. Many of the tools developed for this project, such as sample anti-harassment policies, are also relevant at other levels of government.
    We are proud of the steps that we have taken at Equal Voice, but the actions of small not-for-profit organizations like Equal Voice will never be enough. We need the government to act to combat online hate.
    Equal Voice thanks the Standing Committee on Justice and Human Rights for taking on this important study. We look forward to your report and to assisting you in whatever way we can.
(0900)
     Thank you, Mr. Chair, for this opportunity to give these opening remarks. I look forward to the committee's questions.
    Thank you very much, Ms. Fast.
    Congratulations, Ms. Peckford, on your election.

[Translation]

    We'll now move on to Ms. Oger.

[English]

     Thank you for inviting me to address this committee today.
    My name is Morgane Oger. My pronouns are she, her, and hers. I am the founder of the Morgane Oger Foundation. We work to reduce the gap between Canada’s human rights laws and the lived experience of persons facing systemic discrimination on the ground, through advocacy, education, and legal means.

[Translation]

    Hatred devastates.
    Although this presentation specifically addresses anti-transgender hate, we believe that the basis of our argument applies equally to all types of online hate, regardless of the motive.
    Hateful acts are devastating for the victim, who feels a rejection that is difficult to shake off and who often suffers a lasting psychological impact as a result of the trauma.
    Neither insults nor the expression of divergent points of view constitute online hatred. It's the harassment. It's the incitement to discriminate. It's the deliberate publication of misinformation in order to deceive the public by giving people a sense of misplaced indignation. Hatred is meant to “pathologize” or demonize members of a community because they are who they are.
    Hate propaganda acts by creating anger or disgust towards a person or group because of their identity. Hate speech incites discrimination or violence by any means available.

[English]

    Canadian websites, such as The Post Millennial, Feminist Current, Woman Means Something, Canadian Christian Lobby, Culture Guard and Transanity, publish incitements to discriminate through misinformation in articles aimed at turning public opinion against the transgender community. Twitter and Facebook are awash with anti-transgender misinformation intended to justify anti-transgender discrimination.
    During the 2017 B.C. general election, social conservative activist Bill Whatcott travelled to Vancouver with 1,500 flyers in hand, which urged people not to vote for me because I was transgender and for no other reason. He distributed them in the riding I was contesting. The flyers had a photo of me, described me as a biological male, and claimed that I was promoting homosexuality and transvestism. They stated that transsexuals were prone to sexually transmitted diseases and at risk of domestic violence, alcohol abuse and suicide.
     After the election, I complained to the BC Human Rights Tribunal, which ruled in my favour in its March 2019 decision. Since 2017, Bill Whatcott has continued to engage in transphobic and derogatory harassment campaigns against me and others, focusing on a claim that he is being prevented from telling the truth that a man cannot be a woman. Whatcott’s campaign includes blog posts, trips to Vancouver to distribute more flyers, audio and video interviews, a series of social media posts and a number of articles.
    Eventually the story was picked up on social and traditional media and took on a life of its own, combining with other ongoing issues. Derivative articles stray further and further from the truth, and accusations proliferate.
(0905)

[Translation]

    The effects of Bill Whatcott's campaign against me continue. Two days after the ruling, Bill Whatcott came to a church where I was speaking. His harassment is now mostly online and on the radio, but it doesn't end. It's never going to end. The truth is that what Mr. Whatcott did will never go away, because it was widely rebroadcast online.

[English]

    Because of Whatcott’s campaign, I had to teach my children to be wary of people. I had to ask them to keep an eye out for strangers. I had to explain to them why I had to do that. No mom wants to have to sit her children down and say to them that someone might want to hurt her or them because of who she is.
    Shortly after the first Whatcott flyers and resulting wave of social media interest, I was attacked by a man who lunged at me at a political event. He tried to crash through a stroller with a child in it to get at me. Luckily, an undercover officer handled him without injuries, because by then, I was already under police protection.
    Later in 2017, I was stopped in my back lane because of online commentary. A man, whom I didn't know, wanted to ask me about Whatcott. I was 20 metres from my home at the time, and the individual shared his displeasure. He expressed that what I was doing was wrong, that I should leave Whatcott alone, and that he and his church didn't like what I was doing.
    In 2018, Whatcott announced in a Facebook video shot while hunting that he was coming back to Vancouver to distribute more flyers. He boasted about his shooting skills in the video. Vancouver police warned me and my children, and we had to upgrade our security precautions. He was in Vancouver for two weeks.
    Due to the proliferation of claims made about me online, I now receive regular threats on the phone and countless threats online, some of them explicitly violent.
    Because our provincial courts consider online publications to be a federal matter, and because section 13 of the Canadian Human Rights Act was repealed in 2013, there are no human rights measures in Canada today governing hatred online. If Whatcott had confined himself to sharing his flyers only online through Facebook, Twitter, or his website, my complaint against him at the BC Human Rights Tribunal would have been impossible.
    However, in 2013 the Supreme Court of Canada unanimously affirmed the legitimacy of human rights legislation that restricts hate speech in its Saskatchewan v. Whatcott decision. Furthermore, in 2015, after section 13 had been repealed, the Federal Court of Appeal found it to be constitutionally sound in Canada v. Lemire.
    The current gap in Canadian human rights law at the federal level enables the publishing of material on websites and social media that is prohibited from being published in physical form. For online hatred, the only remedy is a criminal complaint, which has a very high bar for conviction and can require special approval from a province's attorney general. Canadians need a civil recourse that effectively deals with hate publications, which can reach wide audiences online.
    Bill Whatcott is quoted, in Oger v. Whatcott, as estimating that the online version of his flyer reached approximately 10,000 people. His future posts were widely distributed and cited in socially conservative circles in Canada and the U.S.
    Another anti-trans activist, Meghan Murphy, has had over 100,000 views on her anti-transgender videos filmed in Vancouver, a city where, had they been put to paper, they would have broken the law.
    Dozens of articles on the website Feminist Current get 1,000 shares each as they eviscerate transgender women, specifically using disinformation to advocate against our existing rights.
    Canada's gap in online hate legislation also has an impact outside of Canada. We have Canadian websites inciting anti-transgender hatred in other countries where legislation is being considered, for example, in the United Kingdom and Scotland right now, and this is originating from Vancouver. It is unbelievable that we are participating in preventing other people from accessing equality. Because of our legislative gaps in regard to online publications, Canada is exporting this anti-transgender hatred, inciting prohibited discrimination in other countries.
    The Morgane Oger Foundation has some recommendations. First, we recommend that the Canadian Human Rights Act be updated to address online hatred and incitement to discriminate on prohibited grounds; second, that any online material that can be produced and then retrieved on demand for display in a browser or device be considered in the same way as if it were published on paper. As we move away from paper, our laws need to adapt. Therefore, third, all social media platforms doing commerce in Canada should be required to meet or exceed Canada's human rights laws as they pertain to publications. Fourth, because display screens are the modern equivalent of paper when they fetch information stored on a medium for the purpose of displaying it, they should be treated as publications. Fifth, publications based on the storage of material on a medium for the purpose of displaying it on demand should be handled within the same jurisdiction, to keep the cost of enforcement low. Finally, when an individual or organization publishes material or allows it to be published, or when the consumer is in Canada, Canadian hate laws should apply.
    Thank you very much for your consideration today.
(0910)
    Thank you very much. I'm so sorry for what you and your family have gone through.
    Ms. Justice.
    My name is Ricki Justice. I am the Acting Chair of the Pride Centre of Edmonton.
    Our mission at the Pride Centre of Edmonton is to provide supports that respond to the needs of people with diverse sexual orientations, gender identities and gender expressions, and of the people in their lives. We really work with the most marginalized people in our community, especially with youth.
    What we are seeing right now is that youth are taking their own lives and that online bullying and hate play a significant role in youth suicide in our community. Many of our youth spend so much time in the online world that it becomes central to their social lives, so feeling hatred and anger directed against them through an online venue has a significant impact on their mental health.
    Many Albertans who live in rural or remote locations may not have structured LGBT communities or local support, so they rely heavily on online support groups that are affected by continued online hate.
    For one of our service users, the negativity that gets directed toward them through their online time, whether that's through video game chats, Facebook or other social media, has played a role in multiple suicide attempts, which they have thankfully survived. This is our daily reality.
    Mainstream media has a role in reinforcing negative messages about certain groups. In my day job I work at the Edmonton Mennonite Centre for Newcomers, where we have also seen online hate towards immigrants. Mainstream media plays a major role in reinforcing negative images of refugees, for example, and the LGBTQ2S+ community.
    An example of this was during the recent cancellation of the pride festival in Edmonton, when a group called Shades of Colour was blamed for the cancellation because they were protesting and asking for the pride festival to refocus on queer, trans, black, indigenous and people of colour who are still fighting for equity in our community.
    This group received.... Well, it was quite a horrible online hate campaign, including death threats, that resulted in their basically locking themselves up in their homes and feeling unsafe in their own community, which tells me that online hate really is real-world hate and that the two go hand in hand.
    We also realized through this example that there is racism within the LGBTQ2S+ community and that there is a general lack of understanding of intersectionality and diversity in our community. We also find that people within our community are hesitant to report online hate because of a fear of police and their historical systemic mistreatment, so they don't come forward.
    Basically, I am advocating that we address root causes of online hate in the real world, such as social isolation, poverty and lack of education, but the Canadian government also needs to set clear expectations for social media platforms to provide information to the public regarding harmful speech on their platforms and their policies to address it.
    I was very happy to hear Prime Minister Trudeau announce that there will be a digital charter coming out at the end of the month, and I look forward to seeing what actions will be taken as part of that.
    I would recommend that illegal content on these platforms be removed as quickly as possible, within 24 hours. I know that other countries have such regulations and that platforms take measures to dissuade users from repeatedly uploading illegal content, so it's not just taking the content down; it's making sure that the content isn't put back up again.
    In Canada there is also a lack of civil society research on harmful online speech, and I think we need more of that so that we can have good evidence-based policy.
    We also need public education about how to report online hate. The LGBTQ2S+ community needs to know they will be treated equitably if they report online hate, and police need to know how to handle these reports consistently.
    Digital literacy for youth is very important to help them develop understanding about the sources of news but also to help them recognize and reject racist, sexist, homophobic and religion-based hate content. We also need to foster inclusivity in schools.
    Last, we need to address the mental health impact created by harmful speech online through community-based mental health care supports.
(0915)
    Thank you very much.
    Thank you very much. It's much appreciated.
    Now we will go to questions. We'll start with Mr. Barrett.
    My thanks to the witnesses for your testimony this morning.
    Good morning, Mayor Peckford. How are you?
    Very well, thank you.
    It's great to have you here this morning.
    Ms. Fast, if it's okay, I'd like to ask some questions of Equal Voice if you wouldn't mind taking them.
    Absolutely.
    Ms. Peckford, I certainly appreciate the work of Equal Voice in getting more women elected to public office. I certainly recognize the overrepresentation of online hate against women in the political sphere. I recognize it by observation and not by experience, being a male politician.
    What specific recommendations do you have for the committee for the reporting of online hate directed at female politicians or women participating in political debates online?
    Thank you, MP Barrett. It's good to see you in this setting. I usually see you in our home community.
    I think it's twofold. We have a culture issue. Until very recently, Equal Voice was part of a conversation in which women accepted that the price of being in politics, and of being under-represented in politics, was that you would be the target of online hate and bullying. That just went along with the job.
    What I think we've seen recently amongst all parties in most legislatures is that we are at a point where we think this is unacceptable and that no person's rights, regardless of their gender, their cultural background, or their sexual orientation, should be subject to online hate, or analogous experiences of hate, as a consequence of basic identity considerations.
    What's good is that the conversation has evolved. What's challenging—and it's so great that this committee is taking on this work—is the reporting of these incidents. I recognize that social media companies are doing better at giving users control over how online hate is received. I always give this example. In my own recent election campaign, I ran a Facebook page, which is pretty common for a candidate at any level of government. I had far more control than I even understood.
    While I was being trolled—minimally, by the way—I actually had a remarkably positive experience as a candidate, not just because I won but also because the dialogue was largely respectful online and offline. I was putting a lot of focus on the online aspect. I had control when trolling began. These were things we would consider to be out of order in any regular political campaign. My status as a mother was being challenged. They said I couldn't be a mayor and a parent to three children. Some of these assertions were really ridiculous. They started to go in a direction that was challenging.
    Social media, Facebook in particular, gave me control over my platform. That was super-important—not for censoring but to take out comments that were unwarranted. It's a very frustrating experience for elected women to go beyond that mechanism, because reporting is very challenging. Social media companies are getting better at responding, but there is no standard.
    I think you've heard around this table that we need a standard. Whether it's a digital charter or a regulatory framework that stipulates how and when social media companies can take action, I think a standard is incredibly important. We also know that through the Canadian Human Rights Commission we used to have lots of mechanisms. The bar to demonstrate and prove hate language is now the criminal one. We have lost mechanisms that Canadians would have been able to utilize in the past. There's a loss there in terms of how you ultimately take it on.
    We were quite involved in Newfoundland's finance minister's journey as a woman who was the target of online hate. At the end of the day, as you might know, she left politics maybe earlier than expected. Part of that, or all of that, was because she experienced heightened degrees of frustration owing to excessive bullying and hate language directed her way, not because of policy but because of body size, gender, and familial status, which in the end made it untenable for her to serve in public life.
    Certainly, I think the reporting mechanisms have to be easier. The responsiveness has to be better. I think we need to set a standard in Canada, and that's what's really missing.
(0920)
     Would your suggestion be that section 13 be reinstated or that it be replaced and revised?
    I think you have experts around the table who might in fact suggest improvements to the committee. I think a reinstatement is very logical, because basically we have an online environment that is a free-for-all, apart from what social media companies have been doing. It's very, very difficult for most Canadians and most elected officials, men or women, to pursue the only recourse available to them.
    As you know, Michelle Rempel talks about it, and you can talk to her about it. She had to take a constituent to court because of online bullying that really never ended and began to spill over into real life. What's the difference between online and real life? As you know, that distinction is increasingly blurred.
    I think there needs to be better recourse, and section 13 is a good way to start. Could it change now because of how social media and our online participation have evolved? Possibly. I don't think EV is in a position to say one way or another what that language should look like, but I would strongly recommend, and I think Equal Voice would recommend, that this section be seriously looked at again.
    Thanks very much, Mayor Peckford. I appreciate your response.
    Thank you for your questions.
    Mr. Boissonnault.
    We have only six minutes in these things. I want to thank Jennifer for the work that she has done at Egale Canada and for the transformative role the organization has played in the LGBTQ2 community in Canada.
    Your worship, thank you, and congratulations.
    Ms. Fast, thank you. I'm a big fan of Equal Voice, women candidates and seeing more women in office. I voted for the funding to support more work from Equal Voice.
     Thank you.
    This is a gentle nudge to my Conservative colleagues, as hopefully they would vote for it in a future time when you need more money.
    Morgane, thanks for sharing your raw and real comments and for creating your foundation. It's important. It's not easy, but it is critical. Keep being a voice for the voiceless. We need you to do more, and I know you will.
    Ricki, thank you for being an outstanding leader and a voice in a time that has been very difficult for the LGBTQ2 community in Edmonton. It's not easy when a community has disagreement within itself. Your work at the Pride Centre of Edmonton has been exceptional, so thank you. Thank you for coming out from Edmonton today.
    Colleagues, tomorrow we and others will mark the International Day Against Homophobia, Transphobia and Biphobia. Fondation Émergence started this day 16 years ago. I can't believe we're here in 2019, 50 years after the decriminalization of homosexuality, with so much work left to do.
    I'm in a reflective mood. I'm 45% sad and 55% hopeful and resolved that we're going to get through this. I think we need to reflect on difference and diversity, and how difference leads to diversity, which is great. How does diversity get twisted into being the other?
    Just to be who you are, just to be who we are, we go through the fires of hell and we risk losing it all. It's about being different in a society that wants everybody to conform. Everybody on the panel today is linked, because the origins of biphobia, homophobia and transphobia are found in misogyny. As soon as somebody believes that being feminine or less masculine is somehow a bad thing, the phobias come up.
    I will get to some questions. I don't usually do this, but I'm in a mood.
    We have to figure this out. I don't know if it's progressives or people who don't hate, or I don't know what it is, but if we could just come together and get to the root of how people are othered, then I think we stand a chance. We shouldn't give the hate platforms any more oxygen, full stop.
    I want to ask you some questions. How do we stop the hate from having a platform? In the United States, if you take a look at privacy laws, you'll see that there is a $40,000 U.S. fine for every privacy breach. What if we held the platforms accountable every time they posted something hateful online? For every view, there could be a $25,000 fine. Don't you think they would move quickly? Would that kind of fine system work to actually move the platforms to do more, in your opinion?
    We'll have a quick yes-or-no round. Jennifer, go ahead.
(0925)
    That's actually really difficult. If we're talking about privacy concerns, I think it's maybe more appropriate.
    I just used privacy as an example. We could have heavy fines of the ISPs.
     I think we need to be really careful about that, because it can create an incentive to censor. If there are really hefty fines and a need for fast action, that can be an incentive to just take things off, and it can lead to the removal of important speech, political speech—
    What about report and take down, as we see in France? When there is a hate site, and it's confirmed that it is a hate site, it's reported and it's gone within 24 to 48 hours.
     Again, I think that we just need to be careful about who's making the determination of what is hate, what speech is being removed and that these decisions aren't necessarily being outsourced to private corporations that have a profit motive to potentially censor any unpopular views. That can negatively impact the LGBTQ community.
    Does Egale have a position on section 13?
    Our view is that, certainly, a non-criminal administrative law remedy needs to be examined, but the circumstances are so different now and the forms of hate have changed. There's a need to really examine what's going to be most effective, based on evidence-based policy. I'm not sure what the best approach will be, but I think there's room for that.
    Thank you.
    I'm going to stop you there.
    Ricki, you mentioned the “report and take down” model. Why do you like that model, and how do you think it could work?
    I think it can work when there's a very clear threat that needs to be addressed urgently. I agree with some of what Jennifer said, though. Who is going to do the policing? That's a big job. Right now, social media platforms use algorithms, and this is a very human-dimension thing. There would be major resources required to make this happen, but we've seen that it can be done.
    If it is a criminal offence, we have police that are responsible for enforcing the Criminal Code. If we provide the resources, they should be able to do the work. Then it goes to the prosecution to make it work. That is the logical flow. It does require more resources, and it happens in other countries.
    Morgane, in your situation that you talked about, would a “report and take down” model have helped you? What other mechanisms do you think we should be looking at as the federal government?
    The Crown prosecutor's office declined to prosecute on criminal charges for whatever reasons it had. Ontario prosecuted effectively the same flyer. The “report and take down” model doesn't work on platforms. It has to be done at the ISP level.
    The amount of doxing that has been done on me is fantastic. They find out things that I've forgotten, but that's all done out of the basement of some lonely guy's house in Florida, apparently.
    It's a global issue. We need to coordinate.
    Yes, it's someone's private little server somewhere, so it has to be the ISP that does that.
(0930)
    Thank you all very much for being here today.
    Ms. Ramsey.
    Thank you all so much for being here today.
    This panel is really critical, I think, as a diversity of voices, certainly in talking about how we tackle this in different ways.
    Morgane, the fact that you were successful and referencing the case, I think, is important. Talking about what you're putting online for women is important, as well as the services you provide in Edmonton.
    I thank you all for the work that you're doing. It's incredibly important. As the only woman politician currently sitting at the table, I certainly have experienced this. I've had my children threatened. I know what that feels like, and I know how that feels in your home.
    First of all, you're all courageous—and Morgane, certainly you for being here and sharing your very personal story. I thank you for that because it's going to take the courage of people to stand up and fight this together, to battle it by exposing themselves more than they already have. I thank you for that. Your efforts are incredibly important on behalf of all Canadians, so I thank you for that today.
    It really is shocking when you think about what you pointed out: that things are allowed online that are not allowed in print. If something was handed to us, we could challenge that. We have a way to do that. We know where to go. However, when it's online, things just seem to get lost. People attempt to report, and the reporting system is certainly something that we could study entirely on its own.
     Ricki, you highlighted newcomers and immigrants who are nervous to report, LGBTQ people who are nervous to report and women who are nervous to report because then it puts the spotlight on them. We see the horror stories of what happens when people put themselves out there.
    Morgane, you highlighted what your family has been through, which is unacceptable in our country.
    First of all, I want to congratulate you on receiving the meritorious service medal in 2018 for your service to Canada on the matter of LGBTQ2+ rights. Thank you for that and, specifically, your transgender human rights work for sure.
    I want to ask you all two questions—a little more about why you feel that the online publications are more harmful than the physical. What is the difference between the harms that people are experiencing online versus something that they would see in a publication? Second, how do you feel that limiting online hatred would help your work? I can imagine the work that you would all be able to do if you didn't have to focus so much of your efforts on combatting online hate.
     Maybe I'll open it with Morgane because I started with her, and then we'll work down the panel.
     Online hatred is more harmful because of the speed and the reach. I can get onto my computer and attack any one of you personally with lies, put it out there and then suddenly it's true. It's instant; it has worldwide reach.
    I can spend $1,000 and get 500,000 people to see it by midnight tonight if I really want to. That's just how effective online publications are.
    If I have 1,000 friends on Twitter and I get them to do it, that probably quadruples its exposure. Then somebody cuts and pastes it and sticks it somewhere else. It's really hard to cut and paste a book or an article, but it's really easy to cut and paste a little thing. You just take a screenshot and make your own article. The derivative articles come out really fast.
    Information runs away on the victim very fast. It's really hard to get in front of it. Everybody who has been in politics understands that need to get ahead of it. When it's driven by hatred, the obsession is so strong to “get” that person that it seems they'll do anything to do it. Therefore, the publications just get put out there, with multiple copies and things like this. It's the speed and the reach that's the problem.
    I agree with Morgane. I would just add that there's also a momentum built up with...when one person starts something, it helps other people feel entitled to also contribute to the conversation in a negative way and add things on, so misinformation gets even worse. As you said, the speed is just incredible. Also, it stays there. If you have a flyer, you crumple up the flyer and throw it away, but when it's online it's frozen there.
    I certainly agree with Morgane and Ricki. I don't know if Mayor Peckford has anything to add.
(0935)
    For people in public life, it's the degree to which you can be individually targeted. It's why social media companies are doing better—with the Twitter mute function, for example. On Facebook, if you run a campaign page, you can actually hide someone's comment. They still believe it's there. They still believe their hate is out there in the world, but in fact it's been hidden. You have more control. I think it's the degree to which it's individually targeted and it lasts and you can't counter it effectively. Clearly, the viral effect is significant.
    I'm EV's past executive director. I can't tell you how many calls I would field where this was among the top three questions: What will I do when—it's not even “if”—I am the target or my family is the target? It's all online. No one's thinking about a flyer in their community. People are wondering what do they do when they're the target.
    It's a little bit better now, but barely. There was very little we could offer. Women have internalized this notion that if they're going to run for office and if they have intersectional identities that are also subject to being targeted or vilified, they expect that this is part of public life.
    I think it's good to be realistic about public life. I don't think we ever want to say to women that this is all roses and they'll have a great time. It's incredibly satisfying and now that I'm serving in elected life I can say that.
    We are always up against reframing politics and the political journey because of all of the crap. So much of it now dominates your online engagement, which is absolutely required to get elected. In my own experience, I actually don't think my electoral campaign would have been viable without online engagement. I had a huge reach. It is so incredibly powerful, but then the capacity for it to be turned against you is equally, if not more, powerful. That's the dance you're doing.
    With better rigour and with better standards, at least we can say there's something to work with. The non-criminal administrative route to pursuing justice is also very helpful, I think.
    Thank you very much. We're way over time.
    I'm so sorry. That was me.
    Don't worry.
    Mr. Erskine-Smith.
    I want to start by asking Jennifer from Egale Canada a question. You had—rightly, I think—highlighted the importance of a balance between holding social media companies and platforms accountable, but also avoiding undue censorship.
     I ordinarily sit on a different committee that has looked at these problems. This is one of the recommendations we've made. I want to see if you take issue with it. We recommended that “the Government of Canada enact legislation imposing a duty on social media platforms to remove manifestly illegal content in a timely fashion, including hate speech [and] harassment...or risk monetary sanctions commensurate with the dominance and significance of the social platform, and allowing for judicial oversight of takedown decisions and a right of appeal.”
    Does that seem like a fair balance to you?
     Certainly social media companies have shown that they can be quite effective in dealing with the most obviously prohibited content. That has been effective against terrorist content and stuff like that, so there is a role for that, but more needs to be done.
    When it comes to that “more”, it seems to me there are really two avenues of attack here. One is large social media companies. Google made $9 billion in Q4 of 2018 and Facebook made $7 billion. They're 75% of digital ad revenue in Canada combined, so let's take those two as an example.
    One answer is to say, where there's obviously illegal content and we don't want you to be the final arbiter in any way, there is going to be judicial appeal as far as it goes, but you're going to be accountable and we're going to restrict safe harbour for hate speech in the same way we do for terrorism and child porn. We have to strike the right balance, but that's one avenue.
    In regard to the other answer, you highlighted the issue of usefulness. You said you weren't sure about section 13, because you didn't know how useful or effective it was, but you did highlight the need for a non-criminal administrative law remedy. By that, presumably, I take you to mean that it's not just about holding social media companies and platforms to account as the broadcasters or publisher hosts, as it were; it's also about holding accountable, in some fashion, the people themselves who are posting this hateful content. Is that right?
    Yes. There needs to be recourse against the platforms and the individuals responsible for the speech.
(0940)
    On the platforms piece, I generally understand the answer to that. There is financial accountability in the form of some type of sanction if they don't take down obviously illegal content, and there's basically a duty of care and some type of negligence there.
    What I struggle to understand is that we have hate speech laws under the Criminal Code and we have laws against threats and laws against harassment, but it's a very cumbersome process to apply those laws. We tend to apply those laws in extreme cases.
    I have a local case where a paper called Your Ward News was being delivered to thousands of households. It was shut down through Canada Post, and rightfully so. Then there was a criminal prosecution and a judge recently found that the publishers had broken the hate speech laws, and now there will be a sentence and punishment doled out. So it is with someone with egregious behaviour or with a Whatcott case.
    Egregious behaviour deserves a more appropriate and significant punishment, yet if someone makes a hateful remark online, posts a comment that is harassing toward a female, for example, whether a politician or not, the Criminal Code is not an effective instrument. I don't even think section 13 was a particularly effective instrument.
    Do you have any suggestions about what would be a more effective instrument?
    One of the issues on that is that there does need to be more study on how we can actually address this modern context. There are many distinctions and different types of approaches that can be adopted in different contexts, so distinctions need to be drawn for speech whose character is difficult to determine or is maybe on the line.
    Targeted harassment and privacy violations might have a stronger claim based on an individual. In situations where one person is the subject of an avalanche of attacks, there might be a need to act far more quickly and reactively to remove that content, although that might not be about personal responsibility. Again, these are all complex issues that have to be targeted to the different types of harms that exist.
    Another thing I would highlight about non-criminal administrative law fora is that they can allow for alternative dispute resolution mechanisms as well. We don't need to be talking about harsh penalties on individuals who post content that is perhaps on the line. We can actually look at more constructive models.
    One alternative model has occurred to me and I'd be interested in your feedback. If someone is at the extreme level of a Whatcott or Your Ward News, we bring to bear the criminal law or some more appropriate financial sanction through the Canadian Human Rights Act, if section 13 in some fashion were to be revisited. However, for someone who just posts a hateful comment online once, wouldn't it be more appropriate and efficient to have an administrative system that is flexible and efficient, that would say there's going to be a $30 to $50 fine and don't do it again?
    I hear about the education piece and I want to stop the big hate speech, obviously, but it's the people in their basements who post a one-off comment on Twitter in reply and there's no way to hold that person accountable. How do we effectively hold that person accountable?
     Again, I think that does need to be looked at, but examining the lower-level, administrative law, non-criminal approaches is certainly an important part of the tool kit.
    Has anyone considered a ticketing offence?
    I haven't considered that particular possibility, but it may be an appropriate response; I'm not sure. Again, I think it depends on what we are looking at in terms of the type of speech and how we're making these determinations. I don't think we want a situation where people are automatically receiving fines, but it's important to take into account a variety of tools in the tool box with lower forms of response than criminal sanction.
    I'm out of time, but I'll say one last thing. The problem is not about creating new laws that restrict speech. That's where we get sidetracked. We already have hate speech laws, and we have defamation laws, harassment laws and laws against threats. It's about making sure these laws are properly enforceable in an online context.
    Thanks very much.
    To all of the witnesses, thank you very much. You were all really helpful today. I very much appreciate your contributions to the committee.
    We're going to take a brief recess as the next panel comes up.
(0940)

(0950)
    We are now going to convene our second panel of the day.
    It is a pleasure to be joined by Ms. Cara Zwibel, the Director of the Fundamental Freedoms Program at the Canadian Civil Liberties Association.
    Welcome, Ms. Zwibel.
    From the Justice Centre for Constitutional Freedoms, we have Mr. Jay Cameron, barrister and solicitor.
    Welcome back, Mr. Cameron.
    You have both appeared before the committee as witnesses before, so you know exactly what we expect. You have about eight minutes, and then we're going to ask questions.
    Ms. Zwibel, I'm going to ask you to go first. We don't want to lose the video conference when we have it working properly.
     Thank you to the committee for inviting the Canadian Civil Liberties Association to participate in its study on online hate.
    As you all know, the CCLA is a national, non-profit and non-partisan public interest organization with over 50 years of experience in promoting respect for and observance of fundamental human rights and civil liberties. CCLA is deeply committed to protecting equality rights for all and has campaigned against discrimination in its many forms. Freedom of expression has also been a cornerstone of our work since the organization's inception.
    Any attempts to regulate online hate will inevitably bump up against freedom of expression, because, contrary to what some say, the precise contours of hate speech are not easily discerned. As a result, we have argued that the Criminal Code prohibition on the wilful promotion of hatred, and prohibitions on hate speech contained in human rights codes, are vague and unreasonably restrict freedom of expression. In our view, a mature democracy like Canada does not achieve equality by limiting freedom of expression.
    I'd like to start by addressing what was formerly section 13 of the Canadian Human Rights Act, as I understand it is a subject that the committee has a great deal of interest in.
    CCLA appeared before the Senate committee on the bill that ultimately repealed section 13. We supported the repeal, and continue to believe that asking human rights tribunals to play the role of censor does not fit well with the functions of tribunals.
    Human rights tribunals are focused on dealing with discriminatory acts in a variety of areas. In order to address issues of systemic discrimination and to help achieve substantive equality, they need to interpret human rights statutes liberally. However, when it comes to hate speech provisions, our Supreme Court has made clear that only a very narrow interpretation is appropriate, in recognition of the fact that a broad restriction on hateful content would unduly or unreasonably limit freedom of expression.
    As a result, only the very worst and most extreme forms of speech are caught, even though we know that many more subtle forms of offensive messaging may have harmful impacts.
    A human rights commission or tribunal charged with prosecuting hate speech is put in a situation of conflict. In their core anti-discrimination work, they seek to protect minority groups, but in addressing hate speech complaints, they may often have to tell such groups that a very offensive expression simply doesn't rise to the level of hate speech for the purposes of the act.
    In our view, section 13 was not an efficient or effective way of dealing with online hate. I'm aware that some witnesses you've heard from have suggested that section 13 should be reinstated in either its original form or modified in some way, but for the reasons I've just outlined, CCLA disagrees with this approach.
    More broadly, I want to emphasize that while the committee may be considering how the Canadian Human Rights Act or the Criminal Code could be amended to deal with online hate, it should recognize that these and other strictly legislative tools may not be well suited to addressing such a complex issue, because underlying online hate is, of course, the issue of hatred more broadly.
    Canada's experience with prosecuting those who are alleged to promote hatred shows that these individuals often use their prosecution to further promote their message, cast themselves as martyrs for free speech and gain a wider audience. Pursuing haters through our legal system can have counterproductive effects.
    CCLA believes that the government does have a role to play. The government should focus efforts on education and counter-speech. The Canadian Human Rights Commission currently has a relatively narrow public education mandate. That body or another entity could engage in much more robust education efforts, including programs that bring people from diverse communities and backgrounds together in ways that can help to address the root causes of hatred.
    There's also a need for education around digital literacy. We need to be focusing on ensuring that young people understand that content on the Internet can come from anywhere and everywhere, that not all sources are credible and that information can be easily manipulated. Organizations like MediaSmarts are already doing excellent work in this area, and I understand that some of their work on online hate is being done with support provided by Public Safety Canada. More work like this, and more support from the government on work like this, is what we recommend.
    The government also has a role to play in countering hateful content online with its own counter-speech that focuses on messages of inclusion and equality, and that provides resources and support to groups that engage in counter-speech.
(0955)
     Because it would be more interesting to try to answer your questions, I'm going to stop there. Thank you again for inviting us to appear.
    Thank you very much.
    Mr. Cameron.
     Thank you very much.
    Honourable members, thank you very much for the invite to appear here today.
    I'm with the Justice Centre for Constitutional Freedoms. We're a not-for-profit, non-political, non-religious organization. We're dedicated to the protection of the fundamental freedoms and constitutional rights of Canadians.
    I'm going to talk about three things this morning: first, the problem with setting out to censor hate without proper parameters; second, the reality on the ground with human rights tribunals in the context of this study; and third, the dangers of state censorship and big tech combined. I will then provide you with four recommendations.
    Like the Canadian Civil Liberties Association, we believe the starting point for this conversation should properly be the Constitution of this country. That is Canada's foundational document, but it is not mentioned anywhere in the outline for this committee study, and most of the witnesses before the committee made no mention of it except to urge you to infringe it as fast as possible.
    Set out in paragraph 2(b) of the Canadian Charter of Rights and Freedoms is the fundamental right to have an opinion and to express it. This committee is studying online hate and preventing online hate, but it has not established parameters or definitions as to what constitutes hate. It behooves the committee to ask, what is hate and what is the incitement of hate? The reality is that crying hate has become one of the favourite tools in some circles to prevent dialogue and discredit disagreement.
    You disagree with my religion, that's hate. You disagree with my politics, that's hate. You disagree with my gender identity, that's hate. You have concerns about immigration, resources and security, that's hate. If you're a single woman working out of your house as an aesthetician and you aren't comfortable waxing a pair of testicles, that's hate. You want to peacefully express your opinions on a university campus regarding abortion, you can't, because that's hate.
    You just heard from a previous witness who said Meghan Murphy is hate, Feminist Current is hate and The Post Millennial is hate, all without any examples whatsoever. Therein lies the problem.
    The same witness demonstrated in front of the Vancouver Public Library and compared the feminist talk going on inside to a Holocaust denial party, because the women were talking about the interests and rights of biological women.
    Last but not least, U.S. Senator Elizabeth Warren, within the last couple of days, described all of Fox News as hate.
    None of this is hate. It's a disagreement and it's a dialogue, but it's not hate. It's protected speech under the Constitution and it is entirely legal.
     I alluded to the woman in the waxing case. You've heard about this case. It made international news. The Justice Centre represented this woman. She's a single woman. She works out of her home. She has a small child and she provides aesthetician services to the community. She advertises on the Internet and tells the world that she provides waxing services to women.
    She's trying to make ends meet. She doesn't have the supplies to wax somebody's scrotum. She doesn't really want to work on somebody's scrotum. She didn't start out intending to work on somebody's penis. It was irrelevant to her whether that person thought they were a man or a woman, because it was about physiology.
    She had a human rights complaint made against her, which terrified her, and she told me that she went to 26 different lawyers before she found the Justice Centre. Every single one of the 26 lawyers refused to take her case. Why? Well, they gave a variety of reasons, according to my client. Some of them were afraid of activists; some were afraid of the different procedures at the Human Rights Tribunal. Some were afraid of representing somebody who had allegedly engaged in discrimination; they didn't want the stigma attached to representing somebody like that in that context.
(1000)
     There's also not much money in these cases, so they aren't particularly attractive to lawyers. That creates a significant access-to-justice problem that this committee needs to consider. It needs to consider people, like my client, who have a complaint made against them despite the fact they didn't do anything wrong.
    A lot of people who have complaints against them are common people. Many have limited means and are facing a bewildering process, and even worse, they're facing the stigma of a human rights complaint. In this day and age of hypersensitivity and social media, where gossip travels around the world in an instant, being accused of discrimination in many cases is worse than a criminal accusation. It's enough to destroy your reputation. Even the lawyers don't want to be involved in it because they're afraid of stigma. They don't want to hear that you represented that bigot, that racist, that misogynist, that homophobe, that Islamophobe. How could you, in good conscience, represent these disgusting, filthy human beings?
    Is the state going to appoint counsel and pay for it if people can't? In the complaint against my client, the complainant's name was withheld by the tribunal and kept private, but my client's name was publicized for the whole world to see. As a single mom, my client didn't need the complaint. She was trying to make ends meet. It caused her months of terror. Life was hard enough, and she told me that she wept when the complaint was withdrawn. I'm going to say that again: The complaint was withdrawn. It never made it to a hearing. There was never any vindication for her, simply the accusation that she had discriminated on the basis of gender identity or gender expression.
    There are 14 other cases before the BC Human Rights Tribunal from the same complainant. Every single one of them, to my knowledge, requests damages against the people who refused to wax the complainant. None of them has a lawyer, to my knowledge, so there's lots of pressure to settle. Indeed, some of them have. Only the tribunal knows who the parties are until a hearing date is set, and then the parties are publicized three months in advance.
    The Justice Centre offered to represent these respondents for free. Given the access-to-justice problem, we asked the BC Human Rights Tribunal to pass along that offer to all of the respondents. The BC Human Rights Tribunal refused to do so. That's something you need to consider as well. Human rights tribunals are not the saviours in these cases. Often they create more problems than they fix.
    I want to say a little bit about the fine under former section 13 of the Canadian Human Rights Act. It was $10,000. That fine was found to be unconstitutional at the first stage of hearings, a finding that was overturned by the Federal Court of Appeal; the question never made it to the Supreme Court of Canada. The fine for a conviction of drunk driving is $1,000. That is a crime under the Criminal Code, which is a grave social evil. What you have heard this morning is that people should be punished for a vague offence of transphobia or misgendering, with no specifics given, as in the case of Meghan Murphy, who is not here to defend herself. That's part of the problem you need to think about.
    How much time do I have left?
(1005)
    You're at nine minutes, so you're approaching the point where you might want to get to your recommendations.
     We recommend four things.
     First, we recommend that the Canadian Human Rights Act, if it is to be amended, be amended to define what is and is not hate speech. In Saskatchewan (Human Rights Commission) v. Whatcott, [2013] 1 SCR 467, at paragraphs 90 and 91, the Supreme Court of Canada sets out what constitutes hate speech. Most of what you've heard from the witnesses who are telling you something is hate speech doesn't even come close.
    Second, if there is any new legislation to be implemented, we say there ought to be defences to a complaint of hate speech mirroring the defences in subsection 319(3) of the Criminal Code, specifically that:
No person shall be convicted of an offence under subsection (2) [of 319]
(a) if he establishes that the statements communicated were true;
(b) if, in good faith, the person expressed or attempted to establish by an argument an opinion on a religious subject or an opinion based on a belief in a religious text;
    I'll pause here to note that the Bible, under the parameters that you've been asked to consider, and the Koran and other religious books could be considered hate speech just because verses from them are posted online saying things like, “God created male and female”. That's not hate; that's a statement and it's entirely permissible, but it would be protected under the defences that I'm outlining here.
    Subsection 319(3) continues:
(c) if the statements were relevant to any subject of public interest, the discussion of which was for the public benefit, and if on reasonable grounds he believed them to be true; or
(d) if, in good faith, he intended to point out, for the purpose of removal, matters producing or tending to produce feelings of hatred toward an identifiable group in Canada.
    Third, we recommend that the maximum fine for any finding of hate be capped at no more than the Criminal Code fine for drunk driving, $1,000.
    Fourth, we recommend that Parliament launch an initiative encouraging people to come forward with their stories of big-tech censorship, so that it can understand the extent of that problem, which is significant, and not embark on a mission of censorship without all of the facts.
    Those are my submissions. Thank you.
    Thank you very much, Mr. Cameron.
    We'll go to Mr. Cooper.
    Thank you very much to the witnesses.
    Mr. Cameron, your client, who endured a complaint through the Human Rights Commission that was ultimately withdrawn and who was subject to enormous cost in her life, would not be entitled to costs. Is that right?
    She applied for costs. Because the complaint was withdrawn precipitously once counsel became involved and submissions and evidence were filed, we sought costs based on a number of misrepresentations that we said the complainant had made. The complaint put this woman through a terrible five months of crisis, and then it was withdrawn. But what is she supposed to do? There's no recourse for her.
    To that end, could you confirm that under the framework of the Canadian Human Rights Act, if there were a frivolous and vexatious complaint made, a respondent would be statutorily barred from suing? Is that right?
    That's my understanding, and that's a problem because there's no disincentive for launching multiple complaints.
    The person involved in the waxing case made 16 of these complaints. Some of them are in various stages of settlement and some of them are proceeding to a hearing, I understand. But the point is that there's obviously a problem when people can just destroy somebody's reputation by a charge of discrimination and then nothing happens to them when it was done maliciously.
(1010)
    That's right.
    You touched on big tech and big government. I think we've seen steps taken in Europe by the European Commission, and I would suggest there is a real issue of censorship creep with some of them. You touched on it, but you didn't have an opportunity to elaborate, so I'd be interested in your thoughts on big tech and big government coming together and the dangers in that.
    Yes, it's not a myth that big tech is censoring views it disagrees with. Two days ago, Tulsi Gabbard, the Democratic representative for Hawaii, appeared on the Joe Rogan show. She voiced opposition to the censorship of Facebook users, arguing that "companies like Facebook have betrayed the longstanding American commitment to free expression by ousting unpopular political commentary from their platforms."
    Just listen to those words, what she's saying. She's saying that unpopular political commentary is what is being ousted; not hate, but simply stuff that the censors at Google and Facebook disagree with. Because they have the power and little oversight, they do whatever they want.
    It's a dangerous proposition for a government to consider teaming up with institutions and entities that are already engaged in gross censorship that is well documented. We'll be submitting a paper about that, but it's well established at this point. Google, as well, is routing search results away from certain media outlets and conservative voices that it disagrees with. It's routing traffic away from those entities, and that's unacceptable.
     In that vein, we've seen examples such as Antifa, which has expressly incited violence. Social media platforms have refused to take that content down, so we see the inconsistency.
    In 2016, the European Commission entered into an agreement with YouTube, Facebook, Twitter and Microsoft, wherein those platforms agreed to take down content that constituted hateful conduct or violent extremism—which, I would submit, are relatively vague terms—within a 24-hour time frame.
    Should we be concerned about ordering social media platforms to take down content within 24 hours? It seems to me there's not a lot of time for deliberation. Should we also be concerned when state actors make requests for social media platforms to take down certain content, given the fact that state actors might have their own agendas?
    Absolutely. Twitter is notorious for this type of thing. There's a movie that has come out called Unplanned, a true story about a director of Planned Parenthood. There was a U.S. Senate hearing at which a co-producer of the movie testified about how Google had refused to take their ad dollars. Twitter took down their account and deleted hundreds of thousands of followers from the Unplanned movie.
    We have the same problem in Canada, where theatres are refusing to screen the movie. Despite all of the questionable content that is in the theatre, they are refusing to show this true story, essentially censoring it for the public. Whether you agree with pro-life positions or not, that should still concern you as Canadians.
    Twitter is bizarre. It permanently banned Meghan Murphy for the crime of misgendering. She's not a conservative; she's quite far left on the feminist side of the spectrum. Twitter takes down accounts like this. I went looking for something on Twitter and accidentally stumbled onto a page with a man's penis in front of a woman's face. You can have all of this stuff on Twitter, but if you want to talk about conservative viewpoints or things that Jack Dorsey at Twitter disagrees with, Twitter takes them down. It's such a double standard, and it should, quite frankly, offend more of the people in government than it does.
(1015)
    Thank you very much.
    We're going to go to Mr. Ehsassi.
    You mentioned in your opening remarks that you were somewhat insulted that there was no mention of the Constitution in the motion we're examining here. Is that a correct—
    No, I didn't say I was insulted. I said that the starting point for the conversation needs to be paragraph 2(b) of the charter, because it protects the fundamental rights that the Supreme Court of Canada has said are the foundation of Canada's liberal democracy. It can't function without freedom of expression. The context of this conversation needs to be paragraph 2(b) at the start. I'm not offended, though, sir.
    Surely, you understand that in adopting any recommendations, we would obviously be well aware of paragraph 2(b) of the Constitution. Is that correct?
    Sir, I'm not sure of that at all. This government's track record regarding paragraph 2(b) of the charter is not good. There was the Canada summer jobs fiasco in 2018, in which people were compelled to make—
    No, but you understand, obviously, that members of this committee are mindful of that.
    I understand that people should be mindful of it, but as to whether or not this government takes paragraph 2(b) seriously, I'm not convinced of that at all, no.
    You were sitting here. You had the opportunity to listen to the previous witnesses. What were your thoughts on some of their concerns?
    The representative from Egale made an excellent point about the dangers of attempting to take down speech and to fine ISPs for content. That's a legitimate concern. People in Canada have a right not to be subjected to criminal hatred, so insofar as those concerns are based on the incitement of criminal hatred, I think they are legitimate and I support the prosecution of the incitement of violence or genocide against identifiable groups of people.
     The problem is that a lot of the concerns being expressed are couched in vagaries and—
     You constantly cite cases that you find to be extreme, but obviously you would agree with us that there is a public interest in making sure that hatred does not spread. You would agree with that objective, would you not?
    I think that for me to agree with that, you would have to define what hatred is. How are you defining it? If you're defining it like some of these witnesses, then no, I don't think the government has a legitimate objective.
    You heard from the witnesses that some of the witnesses had to deal with sexism on Facebook, correct?
    Is sexism hate? Has that been established? I don't know that it has been, sir. Is it sexism to argue against a woman's legal right to have an abortion because somebody holds a different perspective?
    No, but those weren't the examples that were provided. You were sitting here, but that didn't concern you in the least.
    Please don't put words in my mouth. I'm not here to argue.
    I asked you very direct questions, and the responses don't—
    I'm giving you direct answers. I don't know that it has been established that sexism, which is not defined for the purposes of this committee, is hate. If you're telling me that it is, then I think what we need to establish is what you mean by sexism and establish parameters.
    Would you not agree that they did face sexism, the previous witnesses who were before us?
    Which witnesses are you speaking of?
    It was one of the mayors who appeared; I'm referring to the examples she provided.
    Did she personally experience sexism?
    Correct.
    I don't know what she experienced. I'm not her. I can't give testimony.
    You were here. You were sitting here, and I take it you were listening. Are we supposed to be concerned about sexism online?
    Again, it depends on how you define sexism. What is sexism? Tell me what you're talking about, and I'll tell you whether I think you should be concerned about it online. Give me an example.
    Just to let you know, in my riding of Willowdale, we experienced the van attack. As I'm sure you're well aware, the suspect in that case was very much influenced by incels. For those who don't already know, incels are an online community of men who feel rejected by women. Right before the actual van attack, he posted, "the Incel Rebellion has already begun." Do you think it would have been irresponsible, in this particular instance, for Facebook, for example, to have eliminated that comment?
(1020)
    Do I think it would be irresponsible for Facebook to eliminate—
    Do you think anything should be done, as far as you're concerned? We see all these instances, and from your perspective, should governments be concerned?
    Yes, I think that governments are prosecuting offences under sections 318 and 319 of the Criminal Code, for example. You need permission from the Attorney General to prosecute that offence. Obviously, that's something serious, and when there is a breach of the Criminal Code, I think it should be prosecuted. I support that.
    My point, sir, is that not all the people who are charged with a human rights offence or are the subject of a human rights complaint are lunatics plotting a van attack against women, right? There are lots of common people who are innocent.
    I think we all understand that.
    I think it's important to clarify it, because I'm not sure that we do all understand it.
    All I've asked is whether there is a public interest for governments to be concerned about these types of things. You expressed to us that gossip can spread on the net in no time, and it spreads wide. Hatred, you would agree, actually spreads as well.
    Sure, communication spreads. That's the same for a speech that infringes the Criminal Code as well.
    Thanks very much.
    Ms. Ramsey.
    I'd like to start by saying that we're talking about people's lives. We heard a very personal story from a witness in the previous panel, backed up by a decision of the Human Rights Tribunal in her favour.
    There is a very important line around freedom of speech, making sure it's protected fiercely in our country while at the same time preventing...not just online hate speech. Yesterday the chair, the vice-chair and I participated in a panel where we heard from Facebook about how online hate builds and how it's really a systemic issue that ultimately results in hate speech on the Internet. We've mentioned Facebook and Twitter here, but there are so many applications, gaming chat rooms and corners of the Internet that we haven't been able to have proper conversations about, because we've focused on the larger web giants. This is systemic throughout the entire Internet.
    The challenge before us is very difficult: having conversations about reporting, about lived experience, which we heard earlier, and about the importance of protecting Canadians and making sure they feel safe in our country. It is a very difficult task to take all of these things and distill them into a report that reflects everything we've heard here.
    In the previous panel, we heard about the differences between the way online content and physical publications are treated in our country.
     Ms. Zwibel, how do we account for the differences in treatment of what you're able to put in print and what you're able to put online in our country, and how do we reconcile those two? How do we create something that is equal across those platforms?
    Unfortunately, I didn't hear the panel before, so I'm not sure of the differences that were discussed.
    At a very practical level, we have to recognize that the online space is not at all like the real world, where there are geographic boundaries that Canada can police and patrol. The online world doesn't have those kinds of boundaries. Let's for now confine ourselves to what would constitute criminal hate speech under the Criminal Code. It's much more difficult for the Canadian government to deal with that online. Even that kind of content, if it originates outside of the country online, is going to be very difficult for a Canadian court to do anything about.
    I don't think it's necessarily a problem of political will or enforcement. It's just the reality of the existing infrastructure, which allows both the good and the bad that come from democratizing the means of expression. It used to be that only people who could afford a printing press had a megaphone. Today everyone has a megaphone, and there are obviously good and bad aspects of that reality. I think that's what accounts for the difference, and I'm not sure it's a problem the law can really address. We wouldn't want a court in Russia deciding what Canadians can access on the Internet. By the same token, Canadian courts can't decide what the world can access, and we shouldn't be allowing our courts to make orders that would remove content from the Internet worldwide.
(1025)
    It's certainly a global issue, and that's something that was part of our panel discussion yesterday as well.
    There is a report by the House of Commons Standing Committee on Canadian Heritage titled “Taking Action Against Systemic Racism and Religious Discrimination Including Islamophobia”. It was an all-party report that was brought to the House of Commons.
    One of the recommendations they had was to establish uniform pan-Canadian guidelines and standards for the collection and handling of hate crime data and hate incident data, including efforts to standardize the definition and the interpretation by law enforcement of “hate crimes”.
    What are your thoughts on that particular recommendation?
     I do think there's a data problem; I think we don't have full information about what exists.
    At the very root, to go back to something Mr. Cameron said at the outset of his comments, we need to understand what we mean when we talk about a hate crime or a hate incident. Typically I think of a hate crime as a criminal act that's motivated by hatred, but it can be quite difficult to establish in a court of law the motivations of the perpetrator of a crime. It's more obvious in cases where someone sprays a swastika or assaults a person wearing a hijab. However, there are other cases that target particular individuals because of their race, religion or gender, and the motivation will not be obvious. I think there's a definitional problem that we need to address.
    I also think we shouldn't conflate hate crimes and hate incidents. An incident might be someone shouting a racial slur to a stranger in a grocery store. That's something we might want to know about, but it's not something that the criminal law should be dealing with. I think we need to address that definitional question and figure out how to gather the data.
    I know that some of the witnesses you've heard from have questioned whether government is in the best position to gather that data, and I'm not sure of the answer to that question. Some people feel that, in certain communities, incidents will not be reported if the only place to report them is to police. In some cases, we have faith groups or other organizations that are better suited to receiving those reports.
     I think there probably needs to be a collaborative approach on that.
    Mr. Fraser.
    Thank you to the witnesses for joining us today.
    I'll be splitting my time with Mr. Erskine-Smith.
    Mr. Cameron, I would like to pick up on a couple of comments you made in your presentation.
    I take what you're saying about paragraph 2(b) of the charter. Freedom of expression is a fundamental freedom, but section 2 also includes other fundamental freedoms, such as religion and association. Of course, all the rights in the charter are read together, and it's oftentimes a balancing of those fundamental freedoms, which can come into conflict and have to be balanced.
    I take issue with your suggestion that we should look at paragraph 2(b) first and that it is the paramount consideration among all the rights. I disagree with that. Also, section 1 of the charter makes it very clear that all of the rights, including the fundamental freedoms, are subject to reasonable limits. The court has ruled on that, and I think it's misleading to say that paragraph 2(b) is the paramount consideration.
    Another comment you made was in your third recommendation: that any fine for hate speech online should be capped at the Criminal Code fine for impaired driving, which is $1,000. First of all, that's the minimum fine, and second, you can go to jail for impaired driving. It is a serious offence, but of course the sentence depends on all of the circumstances.
    Third, you mentioned that the BC Human Rights Tribunal should, in some fashion or another, be promoting your legal services and giving you a platform in order to take on clients. That would help you get the word out there about your organization and what you stand for. I don't think it's the BC Human Rights Tribunal's role, at all, to be promoting any legal services over others.
    I want to move, though, to something you said: that there's a sentiment out there that disagreeing with someone's point of view is considered hate. You went through a list and said, "You disagree with that. That's hate." I don't think that's true. I think the essential point here is that spreading misinformation angers and riles people up online, and that disinformation turns members of a community against one another. That's the fundamental problem we're seeing: things online that are not true, propagated by people with insincere motives, motives that are outside the bounds of civil society, I would suggest.
    What I would like to ask you, sir, is, when we see the Toronto van attack or what happened in Christchurch or the Quebec City mosque shooting, does it trouble you that those terrible individuals have been inspired by provocative and hateful content on social media platforms?
(1030)
     Does it trouble me that they were inspired by social media?
    Does it trouble you that they were inspired by hateful content on social media platforms?
    I don't presume to know what inspired these people. I'm not them; I don't know what their childhood was like or what they were subjected to.
    It's been widely reported that hateful content on social media platforms was at least partially responsible for the ideologies they hold. Does that trouble you?
    Any time somebody commits a heinous act against people, it troubles me. I think any time they're motivated to do that, in part or in whole, by something somebody said, it is troublesome. Crimes happen every single day and people are influenced by what other people across the country say. That's troublesome.
    We'll leave it there.
    I'll turn my time to Mr. Erskine-Smith.
    Mr. Cameron, I share your love of paragraph 2(b).
    As a young law student, I once volunteered for the CCLA.
    Hi Cara, it's nice to see you again.
    I want to ask about different ways we already restrict speech.
    Do you agree with laws that restrict speech related to terror? Just be brief; just say yes or no.
    That is related to what, sorry?
    It is speech related to terrorism.
    That's a Criminal Code offence.
    Child porn.
    Absolutely.
    Defamation.
    The law of defamation is tried in civil court. It's punished. It's not censored prior to the defamation. There's a difference.
    I understand. It's still defamation.
    Harassment.
    Do you mean criminal harassment?
    Yes, criminal harassment.
    Absolutely. Do you mean a restraining order or something like that?
    Sure.
    Threats.
    Sure.
    On hate under the Criminal Code, I understood you do support the existing laws.
    It's the law.
    Now let's talk about how we enforce those laws.
    The problem online is that, in many ways, these existing laws—which you and I both support as restrictions on speech—are unenforceable in effect. The Criminal Code is a very cumbersome instrument and can't properly apply in so many instances when there's such a voluminous amount of hate online. Our law enforcement agencies and our courts can't possibly keep up with the comments, whether those comments are on Twitter, on Facebook or whatever the case may be. I'm not talking about censoring your favourite conservative commentator. I'm talking about what you and I agree with, which is enforcing existing laws under the Criminal Code.
    Do you think there should be liability for online platforms if they fail to take down, in a timely manner, content that is hate according to the Criminal Code?
    You're presuming, as a foundational premise, that there's a problem with the Criminal Code and the way it's enforced. That hasn't been established. I know that people are complaining—
    Do you think the Criminal Code is an effective instrument right now, based on everything we've seen, in enforcing the hate speech laws on content online? Your answer is that it's an effective instrument.
(1035)
    The problem with what is being proposed here is that you're contemplating taking the prosecution of hate speech away from a prosecutor or a Crown attorney—
    No, I'm not contemplating that. I'm contemplating a complementary method.
    I would like to answer the question, if I may.
    You're talking about the approval of the attorney general and giving it to a tribunal, which is an entirely different entity.
    You're inventing the suggestion. I'm not talking about the human rights tribunal; I'm talking about imposing, through new legislation, some liability on social media platforms that fail to take down hateful content—according to the law as it is—in a timely manner.
    Who determines if it's hateful?
    If it's obviously hateful, a government agency would identify it and, ultimately, there would be a judicial mechanism if Facebook or Google or whoever disagreed.
    What mechanism is there for somebody to determine that something on an online platform is hateful? For example, Facebook says that it's content-neutral, that it's a marketplace of ideas and that it doesn't police speech.
    Facebook already takes down terrorism content. They already take down child pornography. They already take down content according to existing laws, so we're asking them to apply those laws to hate speech as well. If they get it wrong and someone says Facebook got it wrong, then they take it to court.
    Why is that so hard?
    The criticism from pundits in the United States and elsewhere is that Facebook is violating its own premise by taking down speech when it represents to the public—
    You're raising distraction concerns that I'm not raising at all.
    I'm not; I'm answering your question. Maybe you don't understand the answer I'm trying to give you, but it is an answer to your question.
    Facebook says that it's a marketplace of ideas and that it's neutral, yet it is policing speech. They're taking down speech. My question for you is: Who is determining whether or not something is hateful?
    You're advocating that there should be liability for an organization like Facebook if it doesn't take down hate speech fast enough. My question is, who's deciding that? That's the problem.
     We already have laws—
    Thank you, Mr. Erskine-Smith, but your time is up. It was very interesting.
    Thank you to both of the witnesses. You have offered a different perspective from what some other witnesses have offered, and that's the importance of the marketplace of ideas. Whether or not we agree with everybody's views on everything, we have the right to express them in a forum like this at Parliament. We can have an exchange. That's the important thing.
    I just want to end by saying that there have been a lot of presumptions about what the committee is or is not going to do. This committee is looking at defining...looking at how to track and looking at how to educate. A lot of the things we're looking at go well beyond the question of what we're now calling “policing”. Hopefully, all parties on this committee will be able to deliver something that all sides might agree with.
    We really appreciate your testimony today. We have an in camera session for a few minutes after this.
    [Proceedings continue in camera]