:
Good morning, everyone.
I call this meeting to order.
[English]
Welcome to meeting 127 of the House of Commons Standing Committee on Justice and Human Rights.
[Translation]
Pursuant to Standing Order 108(2) and the motion adopted by the committee on December 2, 2024, the committee is meeting in public to continue its pre‑study of the subject matter of Bill .
[English]
Before I welcome the witnesses for the first panel, I have a few introductory remarks to make.
For those appearing in person, please use your microphone and your headset, and keep the headset away from the microphone so that we do not give our interpreters a hard time. It is also a matter of their health and safety. For those in the room and those appearing virtually, please wait to be recognized by the chair.
[Translation]
I'm speaking French right now. The English participants should be hearing the English interpretation.
[English]
If you did not understand what I just said in French, you do not have your device set to the proper channel. I would ask that you ensure your device is set to the language of your choice so that you understand and we are not interrupted midstream.
[Translation]
Please mute your electronic devices.
[English]
If you are appearing virtually, unmute yourself only when you are recognized by the chair.
I will now introduce our three panellists this morning.
[Translation]
First, we have Frances Haugen.
[English]
She is an advocate for social platform transparency and accountability. She is appearing by video conference.
Marni Panas is a Canadian certified inclusion professional.
From Connecting to Protect, we have Jocelyn Monsma Selby, chair, clinical therapist and forensic evaluator, by video conference.
I will give each of you up to five minutes to make your introductory remarks. I understand that it's a little bit difficult, particularly if you're appearing on screen. When you have 30 seconds left, I will let you know. When the time is up, I will interrupt you as softly and delicately as possible, whether during your five-minute remarks or during your answers to members' questions.
I want to let you know that we have Senator Kristopher Wells with us today. He will be here for the first hour. Welcome, Senator.
I will now ask Ms. Frances Haugen to please start.
You have up to five minutes.
:
Thank you for inviting me today.
You have probably had the opportunity to hear from a lot of people about the harms of social media, so I will not repeat the laundry list again. Instead, I'd like to focus on two topics that hopefully will give context to that testimony and provide urgency for action.
First, I want to emphasize that we are profoundly underestimating the severity of social media's impact on children, due to limitations in how we observe and measure these effects. When researchers and policy-makers discuss the harmful effects of social media, they typically point to studies of teenagers documenting rates of self-harm, eating disorders and declining mental health among 16-year-olds, but these studies are echoes of the past, capturing the aftermath of social media exposure that began years earlier, typically around 12 or 13.
What's alarming is that when we talk to today's 12-year-olds and 13-year-olds, we discover that they started on social media around age eight or nine. In 2022, 30% of American children between the ages of seven and nine were already active on social media platforms. That number is probably higher today. This creates what I call the telescope effect in our understanding of social media's impact. Like astronomers observing distant galaxies, we're always looking at information about the past: how social platforms were designed in the past, past usage patterns. This may be okay when we look at the stars, because the heavens change slowly, but when we examine the digital lives of teenagers, their rapidly changing world means that we end up continually surprised that rates of harm keep going up.
A seven-year-old is influenced and impacted in a meaningfully different way than a 13-year-old is. The children starting social media use today are doing so at ever younger ages, during even more crucial developmental periods and on even more sophisticated and engaging platforms than those used by the teenagers we're currently studying. If we don't act, we're on track to wake up in 10 years to realize that we've fundamentally altered a generation's development in ways that we failed to anticipate or prevent.
My second point concerns the emerging and under-reported threat of the rise of AI avatars and their impact on children's social development. These AI avatars are sophisticated virtual companions. They use artificial intelligence to engage in conversation, respond to emotions and build what feel like genuine relationships with users. They're designed to be always available, eternally patient and perfectly attuned to their users' interests and needs.
The leading provider of these AI avatars proudly announces that the average user—predominantly children under 18—spends two hours daily interacting with these virtual companions. This statistic should alarm us. Learning to navigate real human relationships is inherently challenging and sometimes uncomfortable. It requires compromise, patience and the ability to engage with others' interests and needs, not just our own. AI avatars, in contrast, offer a path of least resistance. They never disagree uncomfortably. They never have conflicting needs, and they never require the complex emotional labour that real friendships demand.
We need to expand our understanding of what constitutes social media. These AI-driven spaces represent a new frontier of potential harm, where the artificial ease of virtual relationships further erodes children's ability and motivation to build genuine human connections. If we don't act now to understand and regulate these technologies, we risk being blindsided by their effects, just as we were with social media platforms.
In conclusion, the problems we're seeing with social media are reflections of broader societal issues. The adults most negatively impacted by social media are often those already marginalized in our society, whether geographically, physically or economically. While in-person socialization carries real costs in terms of transportation, activities and time, online socialization appears free at first. The true cost is paid in terms of mental health, development and human connection.
Similarly, and perhaps most critically, the children most likely to become deeply enmeshed in these virtual worlds, whether traditional social media or AI-driven spaces, are likely to be our most vulnerable and marginalized youth. These are often the children with fewer opportunities for in-person social interaction, fewer resources for supervised activities and fewer adult mentors to guide them through the challenges of growing up or to provide context and support when they face online harm.
We must act now to ensure that children have appropriate and safe digital spaces, because their ability to meaningfully build relationships and connect will shape the world we all live in for decades to come.
Thank you.
:
I am Marni Panas. I use the pronouns “she” and “her”. I am a Canadian certified inclusion professional. I led the development of diversity and inclusion activities at Alberta Health Services, Canada's largest health care services provider. I am the director of DEI for one of Canada's most respected corporations, and I am the board chair for the Canadian Centre for Diversity and Inclusion.
Today, I am speaking on behalf of myself and my own experiences. I'm here to vehemently defend every Canadian's right to freedom of expression, the foundation of our democracy. However, I and millions like me do not have freedom of expression, because it is safer to be racist, homophobic, sexist and transphobic online than it is to be Black, gay, a woman or transgender online. Online hate is real hate. It descends into our streets. It endangers Canadians in real life.
In September 2021, I took the stage at a university in my hometown of Camrose, Alberta, to deliver a lecture on LGBTQ2S+ inclusion, a lecture I've delivered to thousands of students, medical professionals, and leaders around the world. While I was on stage, unbeknownst to me, a student, like many other youth who have been radicalized by online hate, was livestreaming my presentation on Facebook and several far-right online platforms. By the time I got off stage, thousands of people were commenting on my appearance, my identity and my family. The worst of the comments included threats to watch my back. My next lecture was cancelled. Police escorted me off campus for my own safety.
In March 2023, I was invited to participate on a panel celebrating International Women's Day to raise awareness for an organization in Calgary that works to protect women and children from domestic violence. Because of the many online threats of violence directed towards me, the Calgary Police Service and my employer's protective services unit had to escort me in and out of the Calgary Public Library, where the event was being held.
Last February, emboldened by the introduction of anti-trans legislation in Alberta, people harassed and threatened me and others online at levels I had never experienced before, even trying to intimidate me by contacting my employer. I'm grateful for the support of my current employer, who once again had to step in to have my back.
It is rarely the people spewing hate online who are the greatest threat, but words are never just words. It is the people who read, listen and believe in hate speech who become emboldened to act on what's been said. These words and the actions they fuel have followed me to my community, my workplace and even my doorstep. This relentless harassment, simply for living my life publicly, proudly and joyfully as me, has profoundly affected my mental health, my well-being and my sense of safety where I live and work. It has left me withdrawn from the communities I cherish, wondering every time someone recognizes me on the street whether this is the moment online hate turns into real physical violence. I feel far less safe in my community and in my country than I ever have before.
No, I don't have freedom of expression. There is a cost to being visible. There is a cost to speaking out. There is a cost to speaking before you today, knowing that this is being broadcast online. Most often, the cost just isn't worth it. The people all too often silenced are those who desperately need these online platforms the most to find community and support. This is made worse when the same platforms allow disinformation to be spread that aims to dehumanize and villainize LGBTQ2S+ people, contributing to the significant rise in anti-LGBTQ2S+ violence as highlighted by CSIS this past year.
The status quo is no longer acceptable. Platforms need to be held accountable for the hateful content they host and the disinformation they allow to spread. The federal government needs to act. We can't wait. I've been called brave, courageous and even resilient, but I'd rather simply just be safe. People have a right to freely exist without fear because of who they are and whom they love. This is needed in online spaces, too. In fact, our communities and our democracy depend on it.
Uphold freedom of expression. Pass Bill , and protect us all from online harms.
[Translation]
Thank you.
:
Honourable Chair Diab and all members of the Standing Committee on Justice and Human Rights, thank you for the opportunity to be here today.
My first point is that, in Canada, our current legal framework addresses child sexual abuse and exploitation via the Criminal Code and the law for the protection of children from sexual exploitation. However, we should not be relying on a broad duty of care by any Internet platform. There should be a law requiring the identification of illegal sexually explicit images and immediate action to report and take them down. We need regulation that is fit for purpose and safety by design.
My second point is this. Bill reads, “reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services...respect...their duties under that Act.” This is a glitch. All Internet platforms need accountability, not just social media sites. It takes just three clicks to find child sexual abuse imagery or child sexual exploitation material on the regular Net, and this includes images generated by artificial intelligence found through accessing many, many online platforms, including the dark web. These IPAs are disguised within websites and embedded in emojis and hidden links, requiring the viewer to follow a digital pathway that can disappear as quickly as the next link is clicked on.
In 2022, the IWF found a 360% increase in reports of self-generated child sexual abuse material involving seven- to 10-year-olds, now more prevalent than non-self-generated content. This trend continued into 2023, when the IWF hashed 2,401 self-generated sexually explicit images and videos of three- to six-year-olds. Of those images, 91% showed girls in sexual poses, displaying their genitals to the camera. It's normal for children to be curious, explore their bodies or experiment sexually, but that is not what the IWF found. What is shocking is children's unsupervised access to digital devices.
My third point concerns guidelines respecting the protection of children in relation to regulating services, the age of consent to data processing and the use of social media. There is a duty to make certain content inaccessible. Caution should be used in passing regulation based on precedents set in other countries; we need to look in turn at all the international laws, treaties and conventions. A single guiding principle is found in article 5 of the UNCRC, concerning the importance of having regard for an individual child's “evolving capacities” at any moment in time in their interactions with the online world.
My fourth point concerns the establishment of a digital safety office of Canada, a digital safety commission and a digital safety ombudsperson. Could Canada benefit from establishing an online safety office and a children's commissioner or ombudsperson? The answer is yes, and several countries have been blazing a trail for us. These countries are part of a global online safety regulators network that aims to create a coordinated approach to online safety issues. Canada, sadly, is not at the table.
Last week, I was invited to attend a global summit in Abu Dhabi, sponsored by WeProtect and the UAE government. I was the only child protection representative from Canada, and I'm a self-funded third party voice.
I have a few final thoughts.
It took 50 years from the development of the Gutenberg press to produce 20 million books. It took Ford 10 years to produce 10 million Model Ts. It took Playboy approximately two years to sell over a million copies each month. It took the global Internet of 1995 two years to reach 20 million users. It took Facebook 10 months to reach one million users. Today, Meta's ecosystem—including Instagram, WhatsApp and Messenger—has approximately 2.93 billion daily active users.
We need to close the gap between the rapid development of, and access to, the Internet and the regulation it needs. We cannot continue with a partisan approach, lacking civility, to developing the important regulations needed to protect children and vulnerable individuals.
:
Thank you so much, Madam Chair.
Thank you to our witnesses.
We're talking here about one of the most serious bills, I think, that have come before Parliament, certainly in my time and many others' time. That is Bill .
I want to start with you, Ms. Selby. This is on record from the Canadian Constitution Foundation:
“Bill C-63 combined things that have no reason to go together,” Van Geyn said. “The issue of the online sexual exploitation of children through pornography is urgent and serious and should not be lumped in together with the government’s controversial plans to criminalize all kinds of speech and allow for civil remedies through the Canadian Human Rights Commission for speech,” she added.
My question for you is this: Shouldn't we have a stand-alone bill or legislation that protects children from online perverts? Shouldn't that be its own legislation?
:
That's a great question. Having a positive duty of care is a really critical component of the Canadian bill, because it says you have to be actively thinking about how your product might be misused and be designing proactively for it.
Parental controls can be really powerful. They are one set of tools, but not all children have parents who understand technology well enough. Remember, most parents today didn't grow up as 10-year-olds with smartphones, or I hope not. It means that we need to make sure there is, at a minimum, a floor or a net that catches all children.
We also need to ask whether we should be putting the obligation on parents, when they have so much to deal with already, to also stay abreast of exactly what threat is coming from where and what setting and toggle they need to put on their phones.
:
That's a great question.
One of the challenges when writing any kind of Internet regulation law is that technology moves very quickly. For example, right now the Europeans are suggesting things like banning addictive features. In technology, it can be really hard to define what an addictive feature is. If I say, “This is the thing you're not allowed to do”, what usually happens is that either the definition is specific enough to be easy to understand, in which case the tech companies immediately just do a slight twist and say, “Well, it's not in anymore”, or you have a situation where you write them at such a high level that you have to ask what it means to have an addictive feature.
Duty of care is a nice, flexible in-between where you say, “Hey, you need to be demonstrating proactively that you're looking out for the needs of children and designing safety by design.”
:
I have so many privileges that I actually do okay. That's with all of my privileges. I can't imagine children and youth and people who don't have privileges like the support of my employer and the people around me. That's with all of those supports.
There is freedom of expression, but there are consequences. We all face consequences for speaking. You folks can't just say anything in the House of Commons without some consequence. That has to occur online, and it has to occur in all of our spaces.
Today, again, I do not have freedom of expression. Even just visibly posting a picture of my partner and me being happy, dancing at a concert, comes at a cost. That cost is often ridicule. That cost turns into harassment. Then that cost turns into people believing the disinformation that is spread online, which leads to policies that restrict my ability to even participate fully in society. It goes so far. This is for somebody who has all of the privileges that I have.
:
Yes, I'm not sacrificing the good for perfection. The good needs to happen now.
The fact is that if online platforms honoured their own standards of practice and the community standards they already have in place, we probably wouldn't be here. They all come out with these great standards of practice, but any time I report anybody not following those, they're ignored. We're ignored. We need something now. Lives are being lost to this.
What's important is that there are a lot of youth who find community in online platforms. That's essential when you think of rural populations and when you think of people like myself. The very first time I found somebody like me, when for the first time in my life I realized that I'm not alone and that there are other people like me, was a life-saving moment for me. That was 20 years ago, when the Internet first started. That saved my life.
We need to protect those environments for youth and people to find social connection in a healthy and meaningful way. That has been robbed from them. The impact of that is violence, death, isolation, loneliness and having to hide the most important parts of your identity. That needs to change now. We can no longer wait. Too many lives have been interrupted. Too many lives have been lost because of the harms experienced online.
:
Thank you very much, Madam Chair.
I'd like to thank all the witnesses who have joined us today to help guide the committee through this study.
Ms. Panas, I'd like to start with you.
Thank you for showing up today and explaining just how your previous experience—your life experience—makes this approach a very important thing for our committee to consider. Often, when a party comes forward with a policy idea on regulating the Internet or online spaces, the first charge levelled against policy-makers is that they're taking away freedom of speech and freedom of expression, but I think you have quite clearly explained how, by not doing anything.... The status quo is actually affecting your freedom of expression right now.
I want to talk about this concept of a public space or the public square. When we're in a room, like we are right now, everyone has an equal voice and we can all hear each other equally, but in an online space, especially on social media platforms, the platform itself is not a passive bystander. It can actively promote content or actively suppress it, and it can direct people to certain dark corners of the Internet.
My other committee is the public safety committee. We've been looking at how our foreign adversaries make use of online platforms to spread disinformation, and there's quite a lot of overlap with the subject matter we're dealing with today. We've had witnesses at that committee talking not only about whether we need to take a law approach or a regulatory approach, but also about trying to instill a digital literacy strategy.
Do you have any thoughts on equipping Canadians with the skills they may need to navigate the online space?
:
Thank you so much for the question and for the comments of support.
Yes, it is scary being here. It's not scary because of you folks—you folks are pretty friendly—but I know the moment I leave this space.... I know the people who are watching me right now, and what it will mean to me online. It's terrifying, quite honestly.
This is such a complex issue. The Internet is so complex. Literacy is part of it. We need a multi-faceted approach to supporting this. We need education supports, but certainly online accountability as well.
You know, when I think about literacy, it's a really interesting word. X, for example, has banned the word “cis”, as in “cisgender”. It's a Latin prefix, essentially a biological and chemical term rooted in science, and it has been banned because of the implications of denying transgender people's existence. That's the whole purpose.
Literacy relies on the platforms actually using appropriate language, rather than banning language, which serves to eliminate me from society. People need the literacy, but the platforms have to be held accountable for ensuring proper literacy.
Ms. Haugen, I'd like to turn to you for my next question. We're in a kind of legislative deadlock right now in the House of Commons. There's pretty much nothing getting done in our main chamber. It's been like that since the end of September. In fact, we don't even have Bill properly before this committee. This is a prestudy. It hasn't even passed second reading.
The fact of the matter is that this Parliament is rapidly running out of runway. Bill is still a long way away from the Governor General's desk. You have just talked about how rapidly this technology is evolving. It may be that we don't actually have a proper legislative approach to this problem for another two or three years.
What are some of the things a future Parliament has to take note of? We have this draft of Bill , but what are some of the other things we may need to think of in a future piece of draft legislation?
:
One of the reasons I'm so excited about the approach Canada took was that you guys did more rounds of citizen assemblies than anyone else in the world did. You actually had conversations. Groups of Canadians went and argued about trade-offs on how to approach the Internet. This is what came out, other than the hate speech attachments that have been added on at the end. As a result, I think the bill overall is pretty resilient. It addresses a bunch of core things that need to be addressed.
The place where I would encourage you guys to be a little more open-minded or to do a little more future-proofing would be to ensure that the concept of what is a social platform is able to evolve. For example, virtual reality is easy to laugh at right now. If you go walk around Meta Horizon Worlds, which is Facebook's virtual reality space, it's overwhelmingly full of people under the age of 12. Age assurance is important for that reason. Those who talk to AI chatbots are overwhelmingly under the age of 18.
Think a little more expansively about what it means to be social, because children are starting to say.... Games are another space that effectively functions as a social network. As long as you're thinking a little bit more expansively about what's under the tent and the structure overall, and saying that we need a proactive duty of care and we need to care about transparency and these issues, that is what's important.
I thank the witnesses for their attendance. I echo the commentary of my colleague Ms. Ferreri that this is such an important discussion we're having today.
Just to clarify, Ms. Haugen, I heard you say that you are not familiar with Bill , which ostensibly achieves the same result in terms of keeping kids safe online. We get to it in a vastly different way versus Bill . It's unfortunate that you haven't had a chance to review that.
Can the same be said for you, Ms. Selby, that you are not familiar with Bill ?
I'll start with you, Ms. Panas. I listened very carefully to your opening statement. You reiterated in some of the questions put to you that ultimately you feel safe in this environment, but the same cannot be said when you actually leave this building. You talked about various avenues of online harassment.
Let's face it: That's the reality Canadians are facing. It's not necessarily just children and teenagers. It's also adults. There is a legal definition of criminal harassment in the Criminal Code of Canada, but what's sadly lacking from the Code are provisions to deal with online harassment. Sadly—and this is a direct indictment against the Liberal government—Bill  contains no provisions at all that deal with online harassment. Bill  does. I don't know if you've had a chance to dive into Bill C-412 to take a look at the provisions that deal with online harassment.
The question I put to you, Ms. Panas, is this: Do you think law enforcement and judges should have more tools to provide “no contact” orders for criminal harassment online? Do you think that's a good idea?
:
Thank you, Madam Chair.
I want to thank all the witnesses for being here today. There's a lot to cover.
Ms. Panas, I'll start with something you said. You talked about feeling comfortable and feeling safe online. Last Christmas Day, I posted a video. I was standing in front of a Christmas tree at a community centre wishing everybody a merry Christmas. The first five or six or 10 comments were, “I hope you lose the next election”, “Rot in hell”—blah blah blah—and those were the nice ones. But I sloughed it off. I have big shoulders. It doesn't matter. That's not what this bill is about. This bill is about protecting people who don't have that ability and who are the most vulnerable.
I want to pick up on what Mr. Brock was trying to do. I want to thank you for your answers about the difference between Bill , which you support, and Bill , which I consider to be.... Well, it doesn't matter what I think. We've had witnesses who have said it's far too narrow and doesn't accomplish the goals we're trying to achieve here. One witness said that she thought it confused tort law with criminal law, which I agree with.
I want to deal with this right off the bat. If something is posted online that's offensive and that involves some of the things we're talking about—I won't use the examples—Bill provides a method to have it taken down from the Internet right away. Contrast that with the so-called solution of Bill , which would require somebody to go out and retain a lawyer, put together some sort of application or motion, go before a judge and try to convince him or her that this should be taken down.
First of all, you're dealing with people who are the most vulnerable, who don't know how to find a lawyer, who can't afford a lawyer, who have to find a lawyer who knows how to deal with this and appear before a judge who has no expertise in this. It's an insulting joke dressed up as policy. It's not effective. I'd like to get that off the table.
I'm assuming you agree with that, Ms. Panas. You've already highlighted the importance of having the ability to deal with this quickly.
:
Thank you, Madam Chair.
Ms. Haugen and Ms. Selby, I'd actually like to continue on that same subject. One of our previous witnesses, the Canadian Centre for Child Protection, is expressly calling for private messaging services and certain aspects of private messaging features to be subject to regulation.
It's hard. To give a personal example, I have 12-year-old twins. We have them on Messenger Kids. We started them off with iPads. We're not prepared to go to the cellphone yet. I'm sure I'm going through what a lot of parents are going through. This is the new frontier. When they get their own cellphones, how can I be sure that those messaging services will be protecting them?
Ms. Haugen, you cited Instagram, but are social media companies doing enough? Do we need to take this regulatory approach?
I'd just like to hear both of you—Ms. Haugen first, and then Ms. Selby—offer a little bit of context.
:
The only reason Instagram took those actions—they knew they could have taken those actions a decade ago—was that they were afraid of laws like this one. They were afraid of Australia banning access to social media for under-16s. They were afraid of the lawsuits that are happening in the United States.
You have to put them in situations where they are afraid of consequences, because of the amount of money to be made from cutting corners and from maximizing the number of connections for advertising dollars, no matter the risk to these kids and no matter how addictive it is. They have to face consequences if you want them to behave well.
To the second question, on how we keep encrypted messaging secure, we need to think a little more expansively. For example, if I am a child on an encrypted messenger and an adult sends me a lewd image—I did not ask for it and I do not want it—I should have the ability to report that adult. No encrypted messaging has been violated by me reporting that adult. Platforms should have an obligation to take people off their platforms who contact children in that way.
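As context for the point that reporting a received message need not break encryption, here is a simplified sketch, in Python, of one published approach known as “message franking”. This is a toy illustration under stated assumptions—the end-to-end encryption layer is omitted, and real schemes use committing authenticated encryption—and it is not what Bill C-63 mandates, nor any specific platform's implementation.

```python
# Toy sketch of "message franking": a recipient can prove to a platform
# what an abusive sender sent, without the platform reading messages in
# transit. Heavily simplified; real schemes use committing AEAD.

import hmac, hashlib, os

SERVER_KEY = os.urandom(32)  # held by the platform

def sender_prepare(message: bytes):
    # Sender commits to the plaintext with a fresh franking key.
    # (message, franking_key) would travel inside the E2E ciphertext;
    # only the opaque commitment is visible to the server.
    franking_key = os.urandom(32)
    commitment = hmac.new(franking_key, message, hashlib.sha256).digest()
    return message, franking_key, commitment

def server_stamp(commitment: bytes, sender_id: str) -> bytes:
    # Server binds the commitment (which it cannot open) to the sender.
    return hmac.new(SERVER_KEY, commitment + sender_id.encode(), hashlib.sha256).digest()

def verify_report(message: bytes, franking_key: bytes, commitment: bytes,
                  sender_id: str, stamp: bytes) -> bool:
    # On a report, the recipient reveals the plaintext and franking key;
    # the server checks both the commitment and its own stamp.
    ok_commit = hmac.compare_digest(
        hmac.new(franking_key, message, hashlib.sha256).digest(), commitment)
    ok_stamp = hmac.compare_digest(
        hmac.new(SERVER_KEY, commitment + sender_id.encode(), hashlib.sha256).digest(), stamp)
    return ok_commit and ok_stamp

# Usage: a received message can be reported and verified after the fact.
msg, fk, commit = sender_prepare(b"unsolicited abusive content")
stamp = server_stamp(commit, "sender-123")
assert verify_report(msg, fk, commit, "sender-123", stamp)
```

The design choice the sketch illustrates is that the recipient, who already has the plaintext, does the revealing; the server only ever verifies commitments, so messages in transit stay end-to-end encrypted.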
:
We will now resume for our second panel.
[Translation]
Appearing as an individual, we have Andrew Clement, professor emeritus, Faculty of Information, University of Toronto, by video conference.
[English]
I hope that everybody is able to understand both languages and that you've selected the language of your choice at the bottom of your screen.
[Translation]
We also have Guillaume Rousseau, full professor and director of Applied State Law and Policy Programs at the Université de Sherbrooke. He is participating in the meeting by video conference.
[English]
From the Canadian Constitution Foundation, we have Joanna Baron, executive director. She is here in person.
Please wait until I recognize you by name before speaking.
Each panellist will be allowed up to five minutes for opening remarks.
Mr. Clement, please commence with your opening remarks. You have up to five minutes.
:
Thank you, Madam Chair and committee members, for the opportunity to contribute to your important prestudy of Bill , the online harms act.
I'm Andrew Clement, a professor emeritus in the faculty of information at the University of Toronto, speaking on my own behalf. I'm a computer scientist by training and have long studied the social and policy implications of computerization. I'm also a grandfather of two young girls, so I bring both a professional and a personal interest to the complex issues you're having to grapple with.
I will confine my remarks to redressing a glaring absence in part 1 of the bill—a bill I generally support—which is the need for algorithmic transparency. Several witnesses have made a point about this. The work of Frances Haugen is particularly important in this respect.
Social media operators, broadly defined, provide their users with access to large quantities of various kinds of content, but they're not simply passive purveyors of information. They actively curate this content, making some content inaccessible while amplifying other content, based primarily on calculations of what users are most likely to respond to by clicking, liking, sharing, commenting on, etc.
An overriding priority for operators is to keep people on their site and exposed to revenue-producing advertising. In the blink of an eye, they select the specific content to display to an individual following precise instructions, based on a combination of the individual's characteristics—for example, demographics, behaviour and social network—and features of the content, such as keywords, income potential and assigned labels. This is referred to as an “algorithmic content curation practice”, or “algorithmic practice” for short.
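To make the mechanism Professor Clement describes concrete, here is a minimal illustrative sketch, in Python, of engagement-driven content curation. Everything in it—the feature names, weights and scoring formula—is hypothetical, a toy stand-in for the machine-learned ranking systems real platforms use; it is not any operator's actual algorithm.

```python
# Toy illustration of engagement-driven content curation.
# All names and weights are hypothetical; real platforms use far more
# signals and machine-learned models rather than hand-set weights.

from dataclasses import dataclass

@dataclass
class Item:
    topic: str
    ad_revenue_potential: float  # expected income if this item is shown

@dataclass
class User:
    interests: dict[str, float]  # topic -> affinity inferred from demographics, behaviour, network
    click_propensity: float      # overall tendency to click/like/share

def engagement_score(user: User, item: Item) -> float:
    """Predict how likely this user is to respond to this item."""
    return user.click_propensity * user.interests.get(item.topic, 0.0)

def curate(user: User, inventory: list[Item], slots: int) -> list[Item]:
    """Fill the limited display space with the items predicted to
    maximize engagement (and ad exposure), not accuracy or safety."""
    return sorted(
        inventory,
        key=lambda it: engagement_score(user, it) + 0.1 * it.ad_revenue_potential,
        reverse=True,
    )[:slots]
```

The point of the sketch is the objective function: nothing in it rewards accuracy, quality or safety, which is why transparency about the real versions of these scoring rules matters for any regulatory regime.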
These algorithmic practices determine what appears most prominently in the tiny display space of personal devices and thereby guide users through the vast array of content possibilities. In conjunction with carefully designed interactive features, such curation practices have become so compelling, even addictive, that they hold the attention of U.S. teens, among others, for nearly five hours a day. Disturbingly, their time spent on social media is strongly correlated with adverse mental health outcomes and with a rapid rise in suicide rates starting around 2012. We've heard vivid testimony about this from your other witnesses. Leading operators are aware of the adverse effects of their practices but resist reform, because it undermines their business models.
While we need multiple approaches to promote safety online, a much better understanding of algorithmic curation practices is surely one of the most important.
Canadians have begun calling for operators to be more transparent about their curation practices. The Citizens' Assembly on Democratic Expression recommended that digital service providers “be required to disclose...the...inner workings of their algorithms”. Respondents to the online consultation regarding this proposed online harms legislation noted “the importance of...algorithmic transparency when setting out a regulatory regime.” Your sister standing committee, the Standing Committee on Public Safety and National Security, has made a similar recommendation: “That the Government of Canada work with platforms to encourage algorithmic transparency...for better content moderation decisions.”
Internationally, the U.S., the EU and others have developed or are developing regulatory regimes that address online platforms' algorithmic practices. Most large social media services or online operators in Canada also operate in the EU, where they are already subject to algorithmic transparency requirements found in several laws, including the Digital Services Act. It requires that “online platforms...consistently ensure that recipients of their service are appropriately informed about how recommender systems impact the way information is displayed, and can influence how information is presented to them.”
While Bill requires operators to provide detailed information about the harmful content accessible on the service, it is surprisingly silent on the algorithmic practices that are vital for determining the accessibility, the reach and the effects of such content. This lapse is easily remedied through amendments—first, by adding a definition of “algorithmic content curation practice”, and second, by adding requirements for the inclusion of algorithmic content curation practices in the digital safety plans in clause 62 and in the electronic data accessible to accredited persons in clauses 73 and 74. I will offer specific amendment wording in a written submission.
Thank you for your attention, and I welcome your questions.
:
Good morning, everyone. Thank you for inviting me to speak to Bill .
I apologize for my appearance. I had surgery yesterday, which is why I'm wearing a bandage. Although I have a few scars on my head, my mind is working fine. I should be able to make this presentation and answer your questions.
As a constitutional lawyer, I mainly want to draw your attention to the issue of freedom of expression and, since I'm from Quebec, to the fact that Bill C‑63 is very similar to Bill 59, which was studied in Quebec in 2015 and 2016.
For those who, like me, fought against Bill 59, it's a bit like Groundhog Day, since Bill  contains extremely similar elements, including the prohibition on hate speech. This reminds us of the extent to which Quebec and federal jurisdictions are not always truly exclusive and how much they overlap. I will stop my digression on Canadian federalism here, but I would like to point out in passing that I have just tabled a report with the Quebec advisory committee on constitutional issues within the Canadian federation. If you're interested in this issue, you should know that a report has just been submitted to the Government of Quebec.
Bill 59, which was studied in 2015 and 2016, banned hate speech, and it was considered very problematic in terms of freedom of expression. In the end, the government of the day decided to set aside part of the bill and not adopt the hate speech component of the bill in order to keep the other part of the bill, which was much more consensual and dealt in particular with the regulation of underage marriages. With respect to Bill C‑63, I hope we are preparing for a similar outcome.
I think the bill contains a lot of interesting things about sexual victimization and “revenge porn”. I believe the equivalent term in French is “pornodivulgation”. I think this whole area of protecting minors and protecting them from sexual victimization is very important. However, everything to do with hate seems much more problematic to me.
Sometimes, people talk about splitting the bill, saying that part 1 isn't a problem, and that parts 2 and 3 are more problematic. For my part, I draw your attention to the fact that, even in part 1, the definition of harmful content includes content that promotes hatred. Even in part 1, there's this mix between the issue of protecting minors from certain elements of pornography and the issue of hate. In my opinion, if we want to rework the bill properly, we must not only not adopt parts 2 and 3, but also eliminate hate from part 1.
The problem with everything to do with hate in the bill is that the definition is very vague and very broad. Hate is defined in terms of detestation and vilification, but the definitions of detestation and vilification often include a reference to hate. It's all a bit circular. It's very vague and, for that reason, it's very difficult for litigants to know what their obligation is, to know what they can and cannot say.
I understand that this definition is inspired by the Supreme Court's Whatcott case, but there are two problems in this regard.
First, this definition was given in a human rights case, but here we want to use it as a model in criminal law. In terms of evidence, in particular, these two areas are very distinct. Second, I understand why we are taking our cues from the Supreme Court when it comes to definitions, because that means that the provision of the act is less likely to be struck down. I understand it on a technical level, but on the substance, a definition that isn't clear and isn't good isn't clear and isn't good, even if it comes from the Supreme Court.
I want to repeat this famous sentence: The Supreme Court is not final because it is infallible; it is infallible because it is final.
As legislators, you really have to ask yourselves whether the definition is clear, rather than just whether it is the Supreme Court's definition. Ultimately, if you absolutely want a definition inspired by the Supreme Court, I would recommend the definition in the Keegstra decision, which is more of a criminal decision. It's a little clearer and a little less problematic than the Whatcott-inspired definition.
That said, if you go along with what I'm proposing and remove the hate component from the bill, it will raise the following question: If we create a bill that is more targeted on sexual victimization and the protection of minors, will we need a commission, an ombudsperson, an office and all the bureaucracy that is planned when the purpose of the act is more limited? We will therefore have to rethink the bill so that it is less bureaucratic.
Finally, I draw your attention to the fact that the bill should include the abolition of exemptions that allow hate speech in the name of religion. We were talking earlier about Bill and Bill , but there's also Bill , which I invite you to study.
Thank you.
:
Good afternoon. Thank you for the opportunity to present before this committee.
I represent the Canadian Constitution Foundation, a national legal charity that defends fundamental freedoms. We have participated in Whatcott, Fleming, Ward and other seminal Supreme Court of Canada decisions on freedom of expression. We view this bill, Bill , as posing a grave threat to all Canadians' right to free speech and a flourishing democracy.
We welcome the minister's announcement that he intends to split the bill with regard to parts 1 and 4, but we remain concerned about the constitutionality of aspects of part 1, as well as parts 2 and 3 in their entirety.
First I'll address portions of the bill that expand sanctions for offences related to hate speech, including “harmful content” and “content that foments hatred”. I am referring to both the mandate of the new digital safety commissioner, created in part 1 of the bill, and the expanded penalties for hate crimes in part 2.
Part 1 of the bill imposes obligations on an operator to “implement measures that are adequate to mitigate the risk that users...will be exposed to harmful content”. This includes “content that foments hatred”. This office will cost around $200 million over five years and will impose fines on platforms running into the millions of dollars.
Part 2 of the bill, meanwhile, increases penalties for existing hate crimes, including promoting genocide, now punishable by up to life imprisonment. It also creates a new stand-alone offence, in proposed section 320.1001, for any federal offence motivated by hatred, likewise punishable by up to life imprisonment.
As the previous witness mentioned, and I agree with many of his comments, hate speech is an inherently subjective concept. These expanded penalties and regulatory obligations pose a risk of gross disproportionality and excessive chill of protected expression. In Whatcott, the Supreme Court of Canada said that hatred encompasses only the most “extreme manifestations [captured] by the words 'detestation' and 'vilification'”. Only that type of speech can be penalized without violating the charter.
Bill adopts this language in proposed subsection 319(7): “hatred means the emotion that involves detestation or vilification”. But “detestation” is really just a synonym for “hate”, and vilification is a highly subjective concept. We are in a present moment of passionate and often fraught disagreement in our society, where a lot of claims are made that are understood differently depending on context.
For example, calling someone a Zionist currently may land as vilification or, more dubiously, promotion of genocide, or as praise, depending on the speaker and the audience. Just a few days ago, a former CBC producer, Shenaz Kermalli, was accused of hateful expression for posing with an individual wearing an “F Hamas” sweatshirt on social media. That's the problem with criminalizing language. It's subjective. It shifts depending on context.
These concerns become pressing with the expanded sanctions proposed in part 2. Even if our judges can be relied upon to respect the principles of proportionality when sentencing an offender under section 320, for example, the range of available sentences in the law will now include life imprisonment. It's not a frivolous possibility that prosecutors can refer judges to a range of sentencing up to life imprisonment for a crime such as vandalism if it is alleged that the crime was motivated by hate.
The reality is that it's virtually impossible to identify in advance, predictably, a line that separates the merely “awful but lawful” from criminal hate speech. This lack of clarity poses an urgent threat to online discourse, which is our current town square and should brook this type of passionate and adversarial disagreement. When these types of sanctions are in play, everyone has an incentive to err on the side of caution. Platforms will flag and remove content that is actually protected expression, and individuals will self-censor.
Finally, I will briefly address part 3 of the bill. It brings back a civil remedy for online hate speech, which allows members of the public to bring complaints before the Canadian Human Rights Commission. This would be disastrous. You should not go forward with this proposal. Even if most alleged instances are dismissed for not meeting the threshold of hate speech, the penalties for individuals found liable—up to $50,000 paid to the government plus $20,000 to the victim—are severe enough that we can infer that the new regime will lead to large amounts of soft-pedalling of expression for fear of skirting the line. It will interfere severely with press freedom to publish controversial opinions, which are necessary for a flourishing civil society. Finally, the process is the punishment, even if a case does not proceed. We will see more people punished for protected expression.
Thank you. I welcome your questions.
:
Thank you, Madam Chair.
Good afternoon to all our witnesses.
Mr. Rousseau, it's a pleasure to have you here. I wish you a good and speedy recovery.
In your opening remarks, you said that creating the ombudsperson position and the commission would only increase bureaucracy. I don't know if you were here, but we just heard from Ms. Panas that her life is difficult every day, that she experiences hate online and on the street and that there is no process in place right now, which is a real problem. This process would make a big difference.
I'd like to hear your comments on that.
:
Thank you for your very good question. I appreciate it in particular, since it was asked by the member for Sherbrooke, the member who represents me.
If the committee accepted my recommendation and decided that the bill should focus only on the issue of sexual victimization and revenge porn, rather than also include the hate component, that's where the issue of bureaucracy arises. The fact that the issue of sexual victimization has been mixed up with the issue of hate is problematic, since we can agree on one part but not on the other. We are mixing up two debates that aren't necessarily related. However, this approach has the advantage of ensuring that there's a volume of cases that perhaps better justifies the creation of the commission, the ombudsperson and the office. That's what I wanted to bring to your attention.
Considering the different points of view and the challenges related to freedom of expression, you could justifiably focus on sexual victimization alone. But does that more targeted, very important question justify the creation of these three organizations? That's what I'm mainly drawing your attention to. If we look for avenues other than creating this bureaucracy, we can think of legal recourse by individuals, as legislation often allows. A person could initiate a lawsuit. If they've been a victim or if they've suffered damages, they may be inclined to use that kind of recourse. However, it raises other issues, such as access to justice.
Another possible avenue would be to imagine a fund dedicated to victims of revenge porn or, more broadly, hate speech. That could facilitate access to justice.
:
Thank you for your response.
We heard from a mom whose little girl was abused. The family is currently in court. As my colleague mentioned earlier, sometimes people don't know where to turn or don't necessarily have the money to start legal proceedings.
Don't you think that would be a way to help them? I think we underestimate the scope of the problem when we aren't caught up in these networks or platforms, or when our children aren't necessarily affected by everything that happens online.
The testimony we've heard so far has been horrific. It's heartbreaking. Our goal is to be there to protect our children, to help families and to reduce online hate, at the very least, if we can't eradicate it.
Help us find the right way to do that.
:
That's a very good point. If we create these three organizations, we will have to ensure that there is a certain volume of cases. If we deal with the more targeted issue of sexual victimization and set aside the issue of hateful content, will there be enough cases to justify the creation of these three organizations? That's the question I wanted to put to you.
At the other end of the spectrum, however, the danger is that too many cases will be filed. If we add hate speech to sexual victimization, since the definition of harmful content is very broad, we could end up with an extremely high number of people filing complaints. This could result in very long delays. Generally speaking, administrative tribunals offer slightly faster and less costly access to justice than the courts, but some administrative tribunals are still overwhelmed by cases and there are very long delays. So we shouldn't think that just because an administrative route is created, there will necessarily be access to justice. It's difficult, but it's important to try to anticipate the volume of cases we'll have and the resources we'll need.
They said it was going to cost about $200 million. I think that estimate comes from the Parliamentary Budget Officer. You would think that with that kind of money, there would be relatively quick processing, but hate and online sexual victimization are such broad issues that it's quite likely there will be an extremely large volume of cases, where some complaints will be warranted and others less so, and you end up with a problem of access to justice. So I draw your attention to that.
:
Thank you, Madam Chair.
I would like to welcome the three witnesses. This is a good group of witnesses. I'm pleased to have them here today. I just deplore the fact that we have far too little time to ask such important questions of such competent witnesses.
Mr. Rousseau, I also wish you a speedy recovery. First, I want to mention that we haven't received your opening remarks. It's not mandatory to send them, obviously, but you had some interesting references. So if you have the opportunity to send them to us, I would be very grateful.
I would ask the same of each of the witnesses.
That said, Mr. Rousseau, I'm going to address the issue of the definition of hate. You told us that this is indeed a rather problematic definition. You referred to a Supreme Court decision that contains, if I understood correctly, a definition that might be more appropriate, but I didn't really understand what decision it was about.
First, can you spell the name of the case in question for me so I can write it down properly?
Second, what definition did the Supreme Court propose in this regard?
:
Thank you for your question.
I think I sent my notes, but maybe too late. I was told to send them 72 hours in advance, and I think I did that last night. Perhaps that's why you didn't receive them. They should be arriving a little late. In the worst-case scenario, please don't hesitate to write me an email, and I'll send you my notes in the next few days.
The definition proposed in the bill is inspired by the Supreme Court decision in Saskatchewan (Human Rights Commission) v. Whatcott. In my view, this definition is a bit too broad; it refers to detestation and defamation, the definition of which refers to hate. So it's a circular and vague definition. Also, it comes from a human rights judgment. We know that human rights differ from criminal law, particularly when it comes to the notion of intent. In human rights, when it comes to discrimination, we focus mainly on the effects, regardless of intent, whereas in criminal law, intent is at the heart of the reflection. So it's really not the same logic. That's why it's problematic to create a definition based on a human rights ruling in a bill that is part of a more criminal logic. In addition, the definition is too broad.
I'd like to draw your attention to the decision in R. v. Keegstra, which was rendered in 1990 and was repeated in Mugesera v. Canada (Minister of Citizenship and Immigration). It defines hatred as “an emotion of an intense and extreme nature that is clearly associated with vilification and detestation”. That seems to me to be a bit narrower than the definition in the bill, which is inspired by the Whatcott decision, which refers instead to detestation and defamation, since we're talking here about the intense and extreme nature of emotion. The word “extreme” already prevents it from being interpreted too broadly. However, here too, we're talking about vilification and detestation, so we have somewhat the same problem. I'm not telling you that the definition is perfect, but since it comes from a criminal case, it's preferable to the definition set out in the Whatcott decision.
:
You're absolutely right.
Obviously, the challenge is to protect vulnerable people who are victims of revenge porn or hate speech while protecting freedom of expression. That's your challenge.
Where that balance is very concretely reflected, where it can be found, is in the definition of hate. At the heart of this balance is the definition of content that promotes hatred and the definition of hate. I would draw your attention to the fact that you need to define this very precisely. This concerns freedom of expression for two reasons.
First, if you define the concept of hate too broadly, the courts will sanction people for speech that is perfectly acceptable and that, in an open liberal democracy, should ideally be tolerated. There's that risk.
Second, there is an even greater risk: if it isn't clear, litigants who wish to speak won't know exactly whether their remarks could fall within the scope of the act or not. A litigant might want to say something that is just fine and not subject to the act. However, since the definition isn't clear, the litigant could refrain from making that comment. That chilling effect is perhaps more problematic than the risk of courts convicting people of actions that should be protected by freedom of expression.
:
Thank you very much for that question.
“Artificial intelligence” is not a very well-defined term. It's used very broadly, and it has multiple meanings, but we can think of it as a set of algorithmic techniques. It's part of algorithmic practices on the part of these companies. I prefer to use the term “algorithmic intensification”, rather than “intelligence”, because these algorithms do not comprehend or understand content in the way humans do, so they're very limited in their ability to moderate content, particularly if it's going to be taken down.
AI is being used by the platforms particularly to keep people on their site, to keep the content flowing and people clicking, and they can be quite good at that because they can keep refining it. It's a statistical process. Also, as we've heard, most recently generative AI is being used to create deepfakes, which can be deeply misleading. I think it's very important that when that is done, users clearly understand that what they're seeing is not a real, authentic image. That doesn't address all of the problems—like these AI friends that become seductive in various ways—but it's a start.
Ms. Baron, I'll turn to you for my last question.
In your opening remarks, you were talking about the importance of protecting freedom of expression, and you said this is the new public square. One key difference, though, is that unlike the physical town square, the digital town square is not a passive bystander. We know that on platforms, those algorithms can play a role in amplifying some content while suppressing other content. It can have a very real effect of pushing some people into some pretty dark corners.
We just heard from a witness in the previous panel, a member of the LGBTQ community, who said that her ability to freely express herself with the status quo is being hampered. How would you like to tackle that? We're trying to figure out a way forward here. How do we protect her ability to freely express herself, because the status quo is greatly impugning her right?
:
Thank you, Madam Chair.
Before I ask questions of the witnesses, I want to put forward the motion of which I gave notice at the last meeting. We heard evidence earlier this week, and again today, of the rapid growth of online harms, particularly for our children. Professor Clement, I too am a granddad, and I have an image of my innocent grandchildren in my mind when I hear this evidence, so I'm very motivated to act quickly on this. This has definitely become a global epidemic that requires immediate action.
Now, happily, our Conservative private member's bill, Bill , addresses some of those issues in an immediate manner. Therefore, Madam Chair, I move the following motion, and we're asking for unanimous consent: That the committee urgently undertake a prestudy of Bill C-412, an act to enact the protection of minors in the digital age and to amend the Criminal Code.
I'm asking for unanimous consent on that.
:
Okay, that's good. Thank you.
Thank you to all the witnesses.
Ms. Baron, I have a question for you. I'm reading from an article written by you that was published in The Hub on February 28 of this year. You said, “The internet is an ugly place.” I agree with you. There's a lot of good, and there's a lot of ugliness. You said that the online harms act is “a profoundly anti-free expression bill that threatens draconian penalties for online speech, chilling legitimate expression by the mere spectre of a complaint to the Canadian Human Rights Commission or the new Digital Safety Commission of Canada.”
Now, you heard that the minister has parsed parts 2 and 3 out of this bill, so I'm assuming it's a less offensive bill. Here's my question for you. In your opinion, if we were to also remove part 1, so all that's left is part 4, would that be a good stand-alone bill? Could that work together with Bill as well?
:
Thank you, Madam Chair.
Thank you to the witnesses.
Ms. Baron, I want to pick up on something you said in your opening remarks. You've repeated several times that you're not in favour of a digital safety commission and the process laid out in part 1. You just made it very clear that you don't support Bill because it's too “vague”, which is a word that's been used by virtually every witness who's been asked about it. We've had two categories of witnesses on Bill C-412: They either didn't know about it or didn't like it, so I'll leave that there.
But what don't you like about the idea...? You said in your remarks, “there are other ways of enforcing that”, and then you went on to criticize the court process. Where does that leave us?
:
Thank you, Madam Chair.
Ms. Baron, I thought I saw on your LinkedIn page that you speak French; I'll take advantage of that.
I won't make you repeat what you've already said, but I'd like to bring you to another topic addressed by Mr. Rousseau, which is the abolition of religious exceptions in the Criminal Code.
Bill has been introduced, and it provides for the repeal of paragraphs 319(3)(b) and 319(3.1)(b) of the Criminal Code. These are provisions that serve as a defence for hate speech or anti-Semitic speech, as long as it is based on a religious concept that we believe in and defend in good faith. In my opinion, the spread of hatred seems a bit difficult to accept as part of a religion. I would say that 99% of religions are based on love and communal harmony, not on the spread of hate.
Is it a good idea to abolish these defences of religious exception? I'd like to hear your thoughts.
:
Thank you, Madam Chair.
Ms. Baron, I guess I want to talk to you about your views on the responsibilities that social media companies should assume. An example was given in the previous panel about how Instagram suddenly made private the accounts of everyone who was 16 or younger, because there were people who were going through those images and trying to find ways to approach young teens. Instagram could have done this 10 years ago, but they did it now because there was the threat of regulation coming their way.
These social media companies have algorithms that can amplify certain content and suppress other content. What is your view on the government's role in making these companies adopt some basic standards of practice that allow people to safely participate online? Again, it is people's freedom of expression that's being compromised, their ability to freely express themselves. The great fear is that what happens online can follow them from the online space and manifest itself physically. People are feeling that their actual lives are in danger.
What are your views on how we approach the subject of making social media companies more responsible for making a safer online space?
Thank you very much to our witnesses who appeared in person and virtually. I wish everybody a wonderful, safe rest of today.
Thank you very much to the committee members for a wonderful session. I wish you all a very nice Christmas holiday season, and I will see you in January.
I am going to suspend now. Thank you.
[The meeting was adjourned at 4:40 p.m., Monday, January 6, 2025. See Minutes of Proceedings]