moved that Bill be read the second time and referred to a committee.
He said: Mr. Speaker, hon. colleagues, I am very pleased today to speak to Bill , the online harms act. I speak today not only as a minister and as a fellow parliamentarian, but also as a father, as a South Asian and as a Muslim Canadian.
There are a few moments in this place when our work becomes very personal, and this is one such moment for me. Let me explain why. I ran for office for a number of reasons in 2015. Chief among them was to fight against discrimination and to fight for equality in what I viewed as an increasingly polarized world. In recent years, we have seen that polarization deepen and that hatred fester, including at home here in Canada.
I would never have fathomed that in 2024, Canada would actually lead the G7 in the number of deaths attributable to Islamophobia. Among our allies, it is Canada that has experienced the most fatal attacks against Muslims in the G7. There have been 11. Those were 11 preventable deaths. I say “preventable” because in the trials of both the Quebec mosque shooter, who murdered six men on January 29, 2017, and the man who murdered four members of the Afzaal family in London, Ontario, the attackers admitted, in open court, to having been radicalized online. They admitted what so many of us have always known to be the case: Online hatred has real-world consequences.
Yesterday was the third anniversary of the attack on the Afzaal family, an attack described by the presiding judge as “a terrorist act”. In memory of Talat, Salman, Yumna and Madiha, who lost their lives to an act of hatred on June 6, 2021, we are taking action.
Bill , the online harms act, is a critical piece of that action. This bill is the product of years of work.
[Translation]
We held consultations for over four years. We talked to victims' groups, advocacy groups, international partners, people from the technology industry and the general public. We organized a nationwide consultation and held 19 national and regional round tables. We published a report about what we learned. We listened to the recommendations of our expert advisory group on online safety, a diverse think tank made up of experts who are respected across Canada. We were given valuable advice and gained a great deal of knowledge thanks to those consultations, and all of that informed the development of Bill .
Many of our international partners, such as the United Kingdom, Australia, Germany, France and the European Union, have already done considerable legislative work to try to limit the risks of harmful content online. We learned from their experience and adapted the best parts of their most effective plans to the Canadian context.
[English]
We have also learned what did not work abroad, like the immediate takedown of all types of harmful content, originally done in Germany; or like the overbroad restriction on freedom of speech that was struck down as unconstitutional in France. We are not repeating those errors here. Our approach is much more measured and reflects the critical importance of constitutionally protected free expression in Canada's democracy. What we learned from this extensive consultation was that the Internet and social media platforms can be a force for good in Canada and around the world. They have been a tool for activists to defend democracy. They are platforms for critical expression and for critical civic discourse. They make learning more accessible to everyone.
The Internet has made people across our vast world feel more connected to one another, but the Internet also has a dark side. Last December, the RCMP warned of an alarming spike in online extremism among young people in Canada and the radicalization of youth online. We know that the online environment is especially dangerous for our most vulnerable. A recent study by Plan International found that 58% of girls have experienced harassment online.
Social media platforms are used to exploit and disseminate devastating messages with tragic consequences. This is because of one simple truth. For too long, the profits of platforms have come before the safety of users. Self-regulation has failed to keep our kids safe. Stories of tragedy have become far too common. There are tragic consequences, like the death of Amanda Todd, a 15-year-old Port Coquitlam student who died by suicide on October 10, 2012, after being exploited and extorted by more than 20 social media accounts. This relentless harassment started when Amanda was just 12 years old, in grade 7.
There was Carson Cleland last fall. He was the same age as my son at the time: 12 years old. Carson made a mistake. He shared an intimate image with someone whom he thought was a friend online, only to find himself caught up in a web of sextortion from which he could not extricate himself. Unable to turn to his parents, too ashamed to turn to his friends, Carson turned on himself. Carson is no longer with us, but he should be with us.
We need to do more to protect the Amanda Todds and the Carson Clelands of this country, and with this bill, we will. I met with the incredible people at the Canadian Centre for Child Protection earlier this year, and they told me that they receive 70 calls every single week from scared kids across Canada in situations like Amanda's and like Carson's.
As the father of two youngsters, this is very personal for me. As they grow up, my 10-year-old and 13-year-old boys spend more and more time on screens. I know that my wife and I are not alone in this parenting struggle. It is the same struggle that parents are facing around the country.
At this point, there is no turning back. Our children and teens are being exposed to literally everything online, and I feel a desperate need, Canadians feel a desperate need, to do a better job of protecting those kids online. That is precisely what we are going to do with this bill.
Bill is guided by four important objectives. First, it aims to reduce exposure to harmful content online and to empower and support users. Second, it would address and denounce the rise in hatred and hate crimes. Third, it would ensure that victims of hate have recourse to improved remedies. Fourth, it would strengthen the reporting of child sexual abuse material to enhance the criminal justice response to this heinous crime.
[Translation]
The online harms act will address seven types of harmful content based on categories established over more than four years of consultation.
[English]
Not all harms will be treated the same. Services will be required to quickly remove content that sexually victimizes a child or that revictimizes a survivor, as well as to remove what we call “revenge porn”, including sexual deepfakes. There is no place for this material on the Internet whatsoever.
For other types of content, like content that induces a child to self-harm or material that bullies a child, we are placing a duty on platforms to protect children. This means a new legislative and regulatory framework to ensure that social media platforms reduce exposure to harmful, exploitative content on their platforms. This means putting in place special protections for children. It also means that platforms will have to make sure that users have the tools and the resources they need to report harmful content.
To fulfill the duty to protect children, social media platforms will have to integrate age-appropriate design features to make their platforms safer for children to use. This could mean defaults for parental controls and warning labels for children. It could mean security settings for instant messaging for children, or it could mean safe-search settings.
Protecting our children is one of our most important duties that we undertake as lawmakers in this place. As a parent, it literally terrifies me that the most dangerous toys in my home, my children's screens, are not subject to any safety standards right now. This needs to change, and it would change with the passage of Bill .
It is not only that children are subject to horrible sexual abuse and bullying online; they are also exposed to hate and hateful content, as are Internet users of all ages and all backgrounds. That is why Bill targets content that foments hatred, as well as incitement to violence and incitement to terrorism. This bill would not require social media companies to take down this kind of harmful content; instead, the platforms would have to reduce exposure to it by creating a digital safety plan, disclosing to the digital safety commissioner what steps they are putting in place to reduce risk, and reporting back on their progress.
The platforms would also be required to give users practical options for recourse, like tools to either flag or block certain harmful material from their own feeds. This is key to ensuring community safety, all the more so because they are backed by significant penalties for noncompliance. When I say “significant”, the penalties would be 6% of global revenue or $10 million, whichever is higher, and in the instance of a contravention of an order from the digital safety commission, those would rise to 8% of global revenue or $25 million, again, whichever is higher.
The online harms act is an important step towards a safer, more inclusive online environment, where social media platforms actively work to reduce the risk of user exposure to harmful content on their platforms and help to prevent its spread, and where, as a result, everyone in Canada can feel safer expressing themselves openly. This is critical because, at its heart, this initiative is about promoting expression and participation in the civic discourse that occurs online. We can think about Carla Beauvais and the sentiments she expressed when she stood right beside me as we tabled this legislation in February. The amount of abuse she faced for voicing her concerns about the killing of George Floyd in the United States cowed her and drove her from participating online. We want her voice added to the civic discourse. Right now, it has been removed.
[Translation]
The online harms act will regulate social media services, the primary purpose of which is to enable users to share publicly accessible content, services that pose the greatest risk of exposing the greatest number of people to harmful content.
[English]
This means that the act would apply to social media platforms, such as Facebook, X and Instagram; user-uploaded adult content services, such as Pornhub; and livestreaming services, such as Twitch. However, it would not apply to any private communications, meaning private texts or direct private messaging on social media apps, such as Instagram or Facebook Messenger. It is critical to underscore, again, that this is a measured approach that does not follow the overreach seen in other countries we have studied, in terms of how they embarked upon this endeavour. The goal is to target the largest social media platforms, the places where the most people in Canada are spending their time online.
Some ask why Bill addresses both online harms and hate crimes, which can happen both on and off-line. I will explain this. Online dangers do not remain online. We are seeing a dramatic rise in hate crime across our country. According to Statistics Canada, the number of police-reported hate crimes increased by 83% between 2019 and 2022. B'nai Brith Canada reports an alarming 109% increase in anti-Semitic incidents from 2022 to 2023. In the wake of October 7, 2023, I have been hearing frequently from Jewish and Muslim groups, which are openly questioning whether it is safe to be openly Jewish or Muslim in Canada right now. This is not tenable. It should never be tolerated, yet hate-motivated violence keeps happening. People in Canada are telling us to act. It is up to us, as lawmakers, to do exactly that.
We must take concrete action to better protect all people in Canada from harms, both online and in our communities. We need better tools to deal with harmful content online that foments violence and destruction. Bill gives law enforcement these much-needed tools.
The Toronto Police Service has expressed its open support of Bill because it knows the bill will make our communities safer. Members of the Afzaal family have expressed their open support for Bill because they know that the Islamophobic hate that drives someone to kill starts somewhere, and it is often online.
However, we know there is no single solution to the spread of hatred on and off-line. That is why the bill proposes a number of different tools to help stop the hate. It starts with the Criminal Code of Canada. Bill would amend the Criminal Code to better target hate crime and hate propaganda. It would do this in four important ways.
First, it would create a new hate crime offence. Law enforcement has asked us for this tool, so they can call a hate crime a hate crime when laying a charge, rather than as an afterthought at sentencing. This new offence will also help law enforcement track the actual number of hate-motivated crimes in Canada. That is why they have appealed to me to create a free-standing hate crime offence in a manner that replicates what already exists in 47 of the 50 states south of the border. A hate-motivated assault is not just an assault. It is a hate crime and should be recognized as such on the front end of a prosecution.
[Translation]
Second, Bill would increase sentences for the four existing hate speech offences. These are serious offences, and the sentences should reflect that.
Third, Bill C-63 would create a recognizance to keep the peace, which is specifically designed to prevent any of the four hate propaganda offences and the new hate crime offence from being committed.
This would be modelled on existing peace bonds, such as those used in domestic violence cases, and would require someone to have a reasonable fear that these offences would be committed. The threshold of “reasonable fear” is common to almost all peace bonds.
In addition, as some but not all peace bonds do, this would require the relevant attorney general to give consent before an application is made to a judge to impose a peace bond on a person. This ensures an extra layer of scrutiny in the process.
[English]
Finally, the bill would codify a definition of hatred for hate propaganda offences and for the new hate crime offence, based on the definition the Supreme Court of Canada created in its seminal decisions in R. v. Keegstra and in Saskatchewan Human Rights Commission v. Whatcott. The definition sets out not only what hatred is but also what it is not, thereby helping Canadians and law enforcement to better understand the scope of these offences.
The court has defined hate speech as content that expresses detestation or vilification of an individual or group on the basis of grounds such as race, national or ethnic origin, religion and sex. It only captures the most extreme and marginal type of expression, leaving the entirety of political and other discourse almost untouched. That is where one will find the category of content that some have called “awful but lawful”. This is the stuff that is offensive and ugly but is still permitted as constitutionally protected free expression under charter section 2(b). This category of content is not hate speech under the Supreme Court's definition.
[Translation]
I want to make clear what Bill does not do. It does not undermine freedom of expression. It strengthens freedom of expression by allowing all people to participate safely in online discussions.
[English]
Bill would provide another tool as well. It would amend the Canadian Human Rights Act to define a new discriminatory practice of communicating hate speech online. The legislation makes clear that hate does not encompass content that merely discredits, humiliates, hurts or offends, but where hate speech does occur, there would be a mechanism through which an individual could ask that those expressions of hate be removed. The CHRA amendments are not designed to punish anyone. They would simply give Canadians a tool to get hate speech removed.
Finally, Bill would modernize and close loopholes in the mandatory reporting act. By requiring that information be retained longer and that social media companies report child sexual abuse material, or CSAM, to the RCMP, this would help law enforcement more effectively investigate child sex abuse and exploitation and bring perpetrators to justice.
There is broad support for the online harms act. When I introduced the legislation in February, I was proud to have at my side the Centre for Israel and Jewish Affairs and the National Council of Canadian Muslims. Those two groups have had vast differences in recent months, but on the need to fight hatred online, they are united. The same unity has been expressed by both Deborah Lyons, the special envoy on preserving Holocaust remembrance and combatting anti-Semitism, and Amira Elghawaby, the special representative on combatting Islamophobia.
The time to combat all forms of online hate is now. Hatred that festers online can result in real-world violence. I am always open to good-faith suggestions on how to improve the bill. I look forward to following along with the study of the legislation at the committee stage. I have a fundamental duty to uphold the charter protection of free expression and to protect all Canadians from harm. I take both duties very seriously.
Some have urged me to split Bill in two, dealing only with the provisions that stop sexually exploitative material from spreading and throwing away measures that combat hate. To these people, I say that I would not be doing my job as minister if I failed to address the rampant hatred on online platforms. It is my job to protect all Canadians from harm. That means kids and adults. People are pleading for relief from the spread of hate. It is time we acted.
Bill is a comprehensive response to online harms and the dangerous hate we are seeing spreading in our communities. We have a duty to protect our children in the real world. We must take decisive action to protect them online as well, where the dangers can be just as pernicious, if not more so. Such action starts with passing Bill C-63.
:
Mr. Speaker, we must protect Canadians in the digital age, but Bill is not the way to do it. It would force Canadians to make unnecessary trade-offs between the guarantee of their security and their charter rights. Today I will explain why Bill is deeply flawed and why it would not protect Canadians' rights sufficiently. More importantly, I will present a comprehensive alternative plan that is more respectful of Canadians' charter rights and would provide immediate protections for Canadians facing online harms.
The core problem with Bill is how the government has chosen to frame the myriad harms that occur in the digital space as homogenous, capable of being solved with a single approach or piece of legislation. In reality, harms that occur online are an incredibly heterogeneous set of problems requiring a multitude of tailored solutions. It may sound as though tailored solutions would be more difficult to achieve than a single catch-all approach, but this is not the case. It is relatively easy to inventory the multitude of problems that occur online and cause Canadians harm. From there, it should be easy to sort out how existing laws and regulatory processes that exist for the physical world could be extended to the digital world.
There are few, if any, examples of harms caused in digital spaces that do not already have analogous laws or regulatory structures that could be extended or modified to cover them. Conversely, what the government has done for nearly a decade is try to create new catch-all regulatory, bureaucratic and extrajudicial processes that would adapt to the needs of actors in the digital space, instead of requiring those actors to adapt to our existing laws. All of these attempts have failed to become law, which is likely to be the fate of Bill as well.
This is a backward way of looking at things. It has caused nearly a decade of inaction on much-needed modernization of existing systems and has translated into law enforcement's not having the tools it needs to prevent crime, which in turn causes harm to Canadians. It has also led to a balkanization of laws and regulations across Canadian jurisdictions, a loss of investment due to the uncertainty, and a lack of coordination with the international community. Again, ultimately, it all harms Canadians.
Bill takes the same approach, listing only a few of the harms that happen in online spaces while creating a new, onerous and opaque extrajudicial bureaucracy and deep problems for Canadians' charter rights. For example, Bill C-63 would create a new “offence motivated by hatred” provision that could see a life sentence applied to minor infractions under any act of Parliament, a parasitic provision left unchecked in the scope of the legislation. This means that words alone could lead to life imprisonment.
While the government has attempted to argue that this is not the case, saying that a serious underlying act would have to occur for the provision to apply, that is simply not how the bill is written. I ask colleagues to look at it. The bill seeks to amend section 320 of the Criminal Code, and reads, “Everyone who commits an offence under this Act or any other Act of Parliament...is guilty of an indictable offence and liable to imprisonment for life.”
At the justice committee earlier this year, the minister stated:
...the new hate crime offence captures any existing offence if it was hate-motivated. That can run the gamut from a hate-motivated theft all the way to a hate-motivated attempted murder. The sentencing range entrenched in Bill C-63 was designed to mirror the existing...options for all of these potential underlying offences, from the most minor to the most serious offences on the books....
The minister continued, saying, “this does not mean that minor offences will suddenly receive...harsh sentences. However, sentencing judges are required to follow legal principles”, and that “hate-motivated murder will result in a life sentence. A minor infraction will...not result in it.”
In this statement, the minister admitted both that the new provision could be applied to any act of Parliament, as the bill states, and that the government would be relying on the judiciary to ensure that maximum penalties were not levelled against minor infractions. Parliament cannot allow the government to be this lazy, and by that I mean failing to spell out exactly what it intends a life sentence to apply to in law, as opposed to handing a highly imperfect judiciary an overbroad law that could have extreme, negative consequences.
Similarly, a massive amount of concern has been raised from across the political spectrum regarding Bill 's introduction of a so-called hate crime peace bond, with critics calling it a pre-crime provision for speech. This is highly problematic because it would explicitly extend the power to issue peace bonds to crimes of speech, which the bill does not adequately define, nor does it provide any assurance that the provision would meet a criminal standard for hate.
Equally concerning is that Bill would create a new process for individuals and groups to complain to the Canadian Human Rights Commission that online speech directed at them is discriminatory. This process would be extrajudicial, not subject to the evidentiary standards of a criminal court, and could take years to resolve. Findings would be based on a mere balance of probabilities rather than on the criminal standard of proof beyond a reasonable doubt.
The subjectivity of defining hate speech would undoubtedly lead to punishments for protected speech. The mere threat of human rights complaints would chill large amounts of protected speech, and the system would undoubtedly be deluged with vexatious complaints. There are certainly no provisions in the bill to prevent any of this from happening.
Nearly a decade ago, even the Toronto Star, hardly a bastion of Conservative thought, published a scathing opinion piece opposing these types of provisions. The same principle should apply today. When the highly problematic components of the bill are overlaid on the fact that we are presently living under a government that unlawfully invoked the Emergencies Act, and that routinely gaslights Canadians who legitimately question the efficacy or morality of its policies as spreading misinformation, as the minister did in his response to my question, saying that I had mischaracterized the bill, it is not a far leap to surmise that the new provision has great potential for abuse. That could be true under a government of any political stripe.
The government's charter compliance statement, which is long and vague and was only recently issued, should raise concerns for parliamentarians in this regard, as it relies on this statement: “The effects of the Bill on freedom of expression are outweighed by the benefits of protecting members of vulnerable groups”. The government has already been found to have violated the charter in the case of Bill , based on false presumptions about which benefit outweighs another. I suspect the same would be true of Bill should it become law, which I hope it does not.
I believe in the capacity of Canadians to express themselves within the bounds of protected speech and to maintain the rule of law within our vibrant pluralism. Regardless of political stripe, we must value freedom of speech and due process, because they are what prevents violent conflict. Speech already has clearly defined limitations under Canadian law. The provisions in Bill that I have just described are anathema to these principles. To be clear, Canadians should not be expected to have their right to protected speech chilled or limited in order to be safe online, which is what Bill C-63 would ask of them.
Bill would also create a new three-headed, yet-to-exist bureaucracy. It would leave much of the actual rules the bill describes to be created and enforced by said bureaucracy, under regulations as yet undefined, at some much later date. In many circumstances, we cannot wait to take action. As one expert described it to me, it is like vaguely drawing an outline and expecting bureaucrats, not elected legislators, to colour in the picture behind closed doors, without any accountability to the Canadian public.
The government should have learned from the costs associated with failing when it attempted the same approach with Bill and Bill , but alas, here we are. The new bureaucratic process would be slow, onerous and uncertain. If the government proceeds with it, it means Canadians would be left without protection, and innovators and investors would be left without the regulatory certainty needed to grow their businesses.
It would also be costly. I have asked the Parliamentary Budget Officer to conduct an analysis of the costs associated with the creation of the bureaucracy, and he has agreed to undertake the task. No parliamentarian should even consider supporting the bill without understanding the resources the government intends to allocate to the creation of the new digital safety commission, digital safety ombudsman and digital safety office, particularly since the findings in this week's damning NSICOP report starkly outlined the opportunity cost of the government failing to allocate much needed resources to the RCMP.
Said differently, if the government cannot fund and maintain the critical operations of the RCMP, which already has the mandate to enforce laws related to public safety, then Parliament should have grave doubts about the efficacy of setting up three new bureaucracies to address issues that could likely be managed by existing regulatory bodies, like the CRTC, or through enforcement of the Criminal Code. Canadians should also have major qualms about creating new bureaucracies that would give well-funded and extremely powerful big tech companies the power to lobby and manipulate regulations to their benefit behind the scenes and outside the purview of Parliament.
This approach would not necessarily protect Canadians and may create artificial barriers to entry for new innovative industry players. The far better approach would be to adapt and extend long-existing laws and regulatory systems, properly resource their enforcement arms, and require big tech companies and other actors in the digital space to comply with these laws, not the other way around. This approach would provide Canadians with real protections, not what amounts to a new, ineffectual complaints department with a high negative opportunity cost to Canadians.
In no scenario should Parliament allow the government to entrench in legislation a power for social media companies to be arbiters of speech, which Bill risks doing. If the government wishes to further impose restrictions on Canadians' rights to speech, that should be a debate for Parliament to consider, not for regulators and tech giants to decide behind closed doors and with limited accountability to the public.
In short, this bill is deeply flawed and should be abandoned, particularly given the minister's announcement this morning that he is unwilling to proceed with any change to its scope.
However, there is a better way. There is an alternative, which would be a more effective and more quickly implementable plan to protect Canadians' safety in the digital age. It would modernize existing laws and processes to align with digital advancements. It would protect speech not already limited in the Criminal Code, and would foster an environment for innovation and investment in digital technologies. It would propose adequately resourcing agencies with existing responsibilities for enforcing the law, not creating extrajudicial bureaucracies that would amount to a complaints department.
To begin, the RCMP and many law enforcement agencies across the country are under-resourced after certain flavours of politicians have given much more than a wink and a nod to the “defund the police” movement for over a decade. This trend must immediately be reversed. Well-resourced and well-respected law enforcement is critical to a free and just society.
Second, the government must also reform its watered-down bail policies, which allow repeat offenders to commit crimes over and over again. Criminals in the digital space will never face justice, no matter what laws are passed, if the Liberal government's catch-and-release policies are not reversed. I think of a woman in my city of Calgary who was murdered in broad daylight in front of an elementary school because her spouse was subject to the catch-and-release Liberal bail policy, in spite of his online harassment of her for a very long time.
Third, the government must actually enforce—
:
Mr. Speaker, third, the government must actually enforce laws that are already on the books but have not been enforced recently, due to an extreme lack of political will and to disingenuous politics and leadership, particularly as they relate to hate speech. This is especially important in light of the dangers currently faced by vulnerable Canadian religious communities, such as, as the minister mentioned, Canada's Jewish community.
This could be done through actions such as ensuring that the RCMP, including its specialized integrated national security enforcement teams and national security enforcement sections, is providing resources and working directly with the appropriate provincial and municipal police forces to share information and intelligence to protect these communities, as well as making sure that security infrastructure program funding is accessible in an expedited manner so community institutions and centres can enhance security measures at their gathering places.
Fourth, for areas where modernization of existing regulations and the Criminal Code need immediate updating to reflect the digital age, and where there could be cross-partisan consensus, the government should undertake these changes in a manner that would allow for swift and non-partisan passage through Parliament.
These items could include some of the provisions discussed in Bill . These include the duty to make content that sexually victimizes a child or revictimizes a survivor, or intimate content communicated without consent, inaccessible to persons in Canada in certain circumstances; imposing on online providers certain duties to keep all records related to sexual victimization; allowing persons in Canada to make a complaint to existing enforcement bodies, such as the CRTC or the police, rather than to a new bureaucracy that would take years to materialize and could prove costly and/or ineffective; ensuring, through court orders made to the operators of social media services, that content that sexually victimizes a child or revictimizes a survivor, or that is intimate content communicated without consent, is made inaccessible to persons in Canada; and enforcing the proposed amendment to An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service.
Other provisions the government has chosen not to include in Bill , but that should have been included and that Parliament should be considering in the context of harms being conducted online, must include updating Canada's existing laws on the non-consensual distribution of intimate images to ensure that the distribution of intimate deepfakes is also criminalized, likely through a simple update to the Criminal Code. We could have done this by unanimous consent today had the government taken the initiative to do so. This is already a major problem in Canada: girls in high schools in Winnipeg are seeing intimate images of themselves and, as reports are saying, are sometimes being sexually violated, without any ability for the law to intervene.
The government also needs to create a new criminal offence of online criminal harassment that would update the existing crime of criminal harassment to address the ease and anonymity of online criminal harassment. Specifically, this would apply to those who repeatedly send threatening and/or explicit messages or content to people across the Internet and social media when they know, or should know, that it is not welcome. It could include aggravating factors for repeatedly sending such material anonymously, and it could be accompanied by a so-called digital restraining order that would allow victims of online criminal harassment to apply to a judge, under strict circumstances, to identify the harasser and end the harassment.
This would protect privacy, remove the onus on social media platforms to guess when they should be giving identity information to the police, and prevent the escalation of online harassment into physical violence. It would give police and victims clear and easy-to-understand tools to prevent online harassment and the associated escalation. It would also address a major driver of intimate partner violence and make it easier to stop coercive control.
As well, I will note to the that members of the governing Liberal Party agreed to the need for these exact measures at a meeting of PROC on the online harassment of elected officials this past week.
Fifth, the government should consider a more effective way to regulate online platforms, likely under the authority of the CRTC and the Minister of Industry, to better protect children online while protecting charter rights.
This path could include improved measures to do this, such as defining, through legislation rather than backroom regulation, the duty of care required of online platforms. Some of these duties of care have already been mentioned in questions to the ministers today. This is what Parliament should be seized with, rather than allowing some unnamed future regulatory body to decide it for us while big tech companies and their lobbying arms define it behind closed doors. That is our job, not theirs.
We could provide parents with safeguards, controls and transparency to prevent harm to their kids when they are online, which could be part of the duty of care. We could also require that online platforms put the interests of children first with appropriate safeguards, again, in a legislative duty of care.
There could also be measures to prevent and mitigate self-harm, mental health disorders, addictive behaviours, bullying and harassment, sexual violence and exploitation, and the promotion and marketing of products that are unlawful for minors. All of these could form part of the duty of care.
We could improve measures to implement privacy-preserving and trustworthy age verification methods, which many platforms already have the capacity to do, while prohibiting the use of a digital ID in any of these mechanisms.
This path could also include measures to ensure that the enforcement of these mechanisms, including a system of administrative penalties and consequences, is carried out through agencies that already exist. Additionally, we could ensure that there are other remedies, such as the ability to seek a civil remedy for injury when that duty of care is violated.
This is a non-comprehensive list of online harms, but the point is, we could come to consensus in this place on simple modernization issues that would update the laws now. I hope that the government will accept this plan.
I send out a shout-out to Sean Phelan and David Murray, two strong and mighty workers. We did not have an army of bureaucrats, but we came up with this. I hope that Parliament considers this alternative plan, instead of Bill , because the safety of Canadians is at risk.