An Act to enact the Online Harms Act, to amend the Criminal Code, the Canadian Human Rights Act and An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other Acts

Sponsor

Arif Virani  Liberal

Status

Second reading (House), as of Sept. 23, 2024

Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 of this enactment enacts the Online Harms Act, whose purpose is to, among other things, promote the online safety of persons in Canada, reduce harms caused to persons in Canada as a result of harmful content online and ensure that the operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act.
That Act, among other things,
(a) establishes the Digital Safety Commission of Canada, whose mandate is to administer and enforce that Act, ensure that operators of social media services in respect of which that Act applies are transparent and accountable with respect to their duties under that Act and contribute to the development of standards with respect to online safety;
(b) creates the position of Digital Safety Ombudsperson of Canada, whose mandate is to provide support to users of social media services in respect of which that Act applies and advocate for the public interest in relation to online safety;
(c) establishes the Digital Safety Office of Canada, whose mandate is to support the Digital Safety Commission of Canada and the Digital Safety Ombudsperson of Canada in the fulfillment of their mandates;
(d) imposes on the operators of social media services in respect of which that Act applies
(i) a duty to act responsibly in respect of the services that they operate, including by implementing measures that are adequate to mitigate the risk that users will be exposed to harmful content on the services and submitting digital safety plans to the Digital Safety Commission of Canada,
(ii) a duty to protect children in respect of the services that they operate by integrating into the services design features that are provided for by regulations,
(iii) a duty to make content that sexually victimizes a child or revictimizes a survivor and intimate content communicated without consent inaccessible to persons in Canada in certain circumstances, and
(iv) a duty to keep all records that are necessary to determine whether they are complying with their duties under that Act;
(e) authorizes the Digital Safety Commission of Canada to accredit certain persons that conduct research or engage in education, advocacy or awareness activities that are related to that Act for the purposes of enabling those persons to have access to inventories of electronic data and to electronic data of the operators of social media services in respect of which that Act applies;
(f) provides that persons in Canada may make a complaint to the Digital Safety Commission of Canada that content on a social media service in respect of which that Act applies is content that sexually victimizes a child or revictimizes a survivor or intimate content communicated without consent and authorizes the Commission to make orders requiring the operators of those services to make that content inaccessible to persons in Canada;
(g) authorizes the Governor in Council to make regulations respecting the payment of charges by the operators of social media services in respect of which that Act applies, for the purpose of recovering costs incurred in relation to that Act.
Part 1 also makes consequential amendments to other Acts.
Part 2 amends the Criminal Code to, among other things,
(a) create a hate crime offence of committing an offence under that Act or any other Act of Parliament that is motivated by hatred based on certain factors;
(b) create a recognizance to keep the peace relating to hate propaganda and hate crime offences;
(c) define “hatred” for the purposes of the new offence and the hate propaganda offences; and
(d) increase the maximum sentences for the hate propaganda offences.
It also makes related amendments to other Acts.
Part 3 amends the Canadian Human Rights Act to provide that it is a discriminatory practice to communicate or cause to be communicated hate speech by means of the Internet or any other means of telecommunication in a context in which the hate speech is likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination. It authorizes the Canadian Human Rights Commission to deal with complaints alleging that discriminatory practice and authorizes the Canadian Human Rights Tribunal to inquire into such complaints and order remedies.
Part 4 amends An Act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service to, among other things,
(a) clarify the types of Internet services covered by that Act;
(b) simplify the mandatory notification process set out in section 3 by providing that all notifications be sent to a law enforcement body designated in the regulations;
(c) require that transmission data be provided with the mandatory notice in cases where the content is manifestly child pornography;
(d) extend the period of preservation of data related to an offence;
(e) extend the limitation period for the prosecution of an offence under that Act; and
(f) add certain regulation-making powers.
Part 5 contains a coordinating amendment.


Michelle Rempel Conservative Calgary Nose Hill, AB

I'm glad you brought this up, because it was actually my next question. It's a question between you and Mr. McSorley.

The government, in Bill C-63, has not thought about age verification at all. It's punting this to a regulator that's not created, and it's going to be two or three years down the road.

Witnesses on the other panel have suggested that age verification can be done right now through algorithms, and I agree with that. You can detect someone's age using an algorithm. If Meta knows somebody wants to buy a KitchenAid spatula, it knows how old they are.

I'm wondering, between the two of you, if the way that we should be squaring the circle on age verification to protect personal information, while also ensuring that minors are not subjected to harm, is by requiring online operators to use algorithms or other technological means to determine age within a degree of accuracy.

Does that make sense to you, Ms. Jordan-Smith?

Tim McSorley National Coordinator, International Civil Liberties Monitoring Group

Thank you very much, Chair.

Thank you to the committee for this invitation to speak to Bill C-63.

I'm grateful to be here on behalf of the International Civil Liberties Monitoring Group, a coalition of 44 Canadian civil society organizations that work to defend civil liberties in the context of national security and anti-terrorism measures.

The provisions of this bill, particularly in regard to part 1 of the online harms act, are vastly improved over the government's original 2021 proposal, and we believe that it will respond to urgent and important issues. However, there are still areas of serious concern that must be addressed, especially regarding undue restrictions on free expression and infringement on privacy.

This includes, in part 1 of the act, first, the overly broad definition of the harm of “content that incites violent extremism or terrorism”, which will lead to overmoderation and censorship. Further, given the inclusion of the online harm of “content that incites violence”, it is redundant and unnecessary.

Second, the definition of “content that incites violence” itself is overly broad and will lead to content advocating protest being made inaccessible on social media platforms.

Third, the act fails to prevent platforms from proactively monitoring, essentially surveilling, all content uploaded to their sites.

Fourth, a lack of clarity in the definition of what is considered “a regulated service” could lead to platforms being required to break encryption tools that provide privacy and security online.

Fifth, proposed requirements for platforms to retain certain kinds of data could lead to the unwarranted collection and retention of the private information of social media users.

Finally, sixth, there has been little consideration of how this law will inhibit the access of Canadians and people in Canada to content shared by people in other countries.

Briefly, on part 2 of the act, this section amends Canada's existing hate-crime offences and creates a new stand-alone hate crime offence, and it is only tangentially related to part 1. It has raised serious concerns among human rights and civil liberties advocates in regard to the breadth of the offences and the associated penalties. We've called for parts 2 and 3 to be split from part 1 in order to be considered separately, and we're very pleased to see the government's announcement yesterday that it intends to do just that.

I'd be happy to speak to any of these issues during questions, and I've submitted a more detailed brief to the committee with specific amendments on these issues. However, I'd like to try to focus in the time I have on the first two points that I've made regarding “content that incites violent extremism or terrorism”, as well as a definition of “content that incites violence”.

The harm of “content that incites violent extremism or terrorism” is problematic for three reasons and should be removed from the act. First, it is redundant and unnecessary. The definitions of “content that incites violent extremism or terrorism” and “content that incites violence” are nearly identical, the major difference being that the first includes a motivating factor for the violence it is attempting to prevent. These two forms of harms are also treated the same throughout the online harms act, including requirements for platforms to retain information related to these harms for a year to aid in possible investigations.

Moreover, and maybe most importantly, incitement to violence alone would clearly capture any incitement to violence that arises from terrorist or extremist content. Further definition of what motivates the incitement to violence is unnecessary.

Second, if included, incitement to terrorism will result in the unjustified censorship of user content. “Terrorism”, and with it “extremism”, are subjective terms based on interpretation of the motivations for a certain act. The same opinion expressed in one context may be viewed as support for terrorism and therefore violent, while, in another, it may be viewed as legitimate and legally protected political speech.

Acts of dissent become stigmatized and criminalized not because of the acts themselves but because of the alleged motivation behind the acts. As we have seen, this leads to unacceptable incidents of racial, religious and political profiling in pursuit of fighting terrorism.

Studies have also extensively documented how social media platforms already overmoderate content that expresses dissenting views under the auspices of removing “terrorist content”. The result is that, by including terrorism as a motivating factor for posts that incite violence, the act will be biased against language that is not, in fact, urging violence but is seen as doing so because of personal or societal views of what is considered terrorism or extremism.

I note also that “extremism” is not defined in Canadian law. This ties into the third key part that we're concerned about, and that's that parts of the language used in this definition are undefined in Canadian law or the Criminal Code. This contradicts the government's main justification for all seven harms—that they align with the Criminal Code and do not expand existing offences.

Alistair MacGregor NDP Cowichan—Malahat—Langford, BC

Thank you very much, Madam Chair.

I'd like to echo colleagues in thanking the witnesses for joining our committee and helping us wade through a very difficult subject. I'm a father of three daughters. I have 12-year-old twins, so we are dealing with that as parents, with them getting access to the Internet, and the challenges of finding ways to allow them to do that safely.

Ms. Todd and Ms. Lavers, I'd like to start with you, because part of the debate on the subject of Bill C-63 has been on whether we should just modernize existing laws through changes to the Criminal Code or whether we should add another layer of bureaucracy.

Briefly, when you had your experiences in reporting this to the police and when the police were trying to make use of existing Criminal Code provisions to solve this for your children, can you talk about some of the limitations you experienced with that and illustrate why you think more is needed based on your personal experiences?

December 5th, 2024 / 11:35 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

Thank you for allowing me to do this.

I will continue about why I feel that Bill C-63 is important.

I also want to say that we aren't the only country that has done this. The U.K. has an Online Safety Act that was written into law in 2023, and Australia put its Online Safety Act into law in 2021. The EU also has online harms legislation similar to what Canada is doing. Canada has been in collaboration with the U.K., Australia and the EU regarding Bill C-63.

Why is this important? It's important because it protects children. What I don't understand—and this is from my own thinking—are all the people who are negative on Bill C-63, saying that it's not about children and it's not about protection. They focus on the parts that Minister Virani has said he and his cabinet would rewrite. It is about protecting children. It's about protecting children and families from the online behaviours of others.

We can't do this without the tech companies' help. It's really important that we understand this. There are so many people who don't understand this. I read the negative comments, and, personally, it just infuriates me, because my daughter died 12 years ago, and I've waited 12 years for this to happen. Parliamentarians and political groups are arguing about this not being necessary, and we're going.... It just hurts me. It hurts me as a Canadian.

We need accountability and transparency. We need to support the victims. Passing Bill C-63 is not just about regulation; it's about taking a stand for the safety and dignity of all Canadians. This is about ensuring that our digital spaces are as safe and respectful as our physical ones.

By supporting this bill, we are committing to a future in which the Internet is a place of opportunity and connection, free from threats of harm and exploitation. Passing Bill C-63 would demonstrate the federal government's commitment to adapting to the digital age and ensuring that the Internet remains a safe space for all users. It balances the need for free expression with the imperative to protect individuals from harm, making it a necessary and timely piece of legislation.

It's also essential to recognize the collective effort in creating platforms that address the challenges faced by children, women and men.

We've come to realize that what happened to Amanda could happen to anyone. As Amanda herself said, “Everyone has a story.” When these stories emerge, and they belong to your child, your relatives or your grandchildren, they carry more weight.

No one is immune to becoming a statistic, and, as I have previously shared, I have waited 12 years for this, because on day one of Amanda's death, I knew things needed to change in terms of law, legislation and online safety. I can't bring my child back, but we can certainly keep other children safe.

Thank you for this time.

James Maloney Liberal Etobicoke—Lakeshore, ON

Do any of the other witnesses want to comment on that before I move on?

No. Okay.

Ms. McDonald, I'll go back to you then.

The 10,000 to 20,000 number that you mentioned a couple of times is quite stark. Without the takedown provisions that are part of Bill C-63.... Let me put it another way. With the takedown provisions that are included in Bill C-63, how would the outcomes be different? What would the time frame difference look like, in your opinion, based on the companies having free rein to make the decision now, versus the provisions of Bill C-63?

James Maloney Liberal Etobicoke—Lakeshore, ON

Thank you, Madam Chair. I want to thank you and the members around the table for allowing us to do this study, particularly those who voted in favour of proceeding with it.

As well, I want to thank the witnesses for their powerful and important presentations today.

I just want to highlight, so that everybody knows, and I think everybody is aware, that the minister announced yesterday that we intend to split Bill C-63 into two parts, with the digital safety and child protection measures separated from the measures that focus on hate. I'd like to get on record that we've agreed to start with a prestudy of three meetings, but I believe that we should continue with three to six meetings on part 1 of the bill. This means a focus on the online harms act and the amendments to the mandatory reporting act. Then we can proceed with a second study, on the balance of the bill, at a later date.

I do have questions for the witnesses. I just want to emphasize our gratitude to all of you for being here, because we know it is incredibly difficult to share your stories in this fashion or in any other fashion. You have our gratitude and respect.

Child sexual abuse in Canada is currently illegal. Law enforcement can and should deal with horrible content, as Ms. Rempel was saying. However, as you said, Ms. Lavers, we need to depoliticize this, and the Criminal Code amendments alone are not enough.

What Bill C-63 would do.... I'll just be clear: A number of the issues that Ms. Rempel Garner was referring to are included in Bill C-63, so I think people need to understand that.

My question to all of you is this: If we were to proceed with just the Criminal Code measures alone, without the digital safety framework, would that be enough to address the problems we're talking about today, in your opinion? I put the question to all of you.

December 5th, 2024 / 11:25 a.m.

Founder and Mother, Amanda Todd Legacy Society

Carol Todd

It was my understanding that there was embedded, in Bill C-63, something about AI, but—

Michelle Rempel Conservative Calgary Nose Hill, AB

Do you realize that Bill C-63 would not do that?

Michelle Rempel Conservative Calgary Nose Hill, AB

Thank you.

Ms. Todd, you just asked me if I could show you that approach, and I can.

There's a bill in front of Parliament right now, called Bill C-412. It outlines a specific duty of care for online operators that says exactly what they have to do in this. It also specifies the regulatory body. If it was passed today, it could be enacted today, and we could have immediate impacts.

That's my concern with Bill C-63. It takes this responsibility and puts it into a regulator that hasn't been built. It also gives online platforms the ability to wiggle out of this two, three or four years in the future. My concern is with regard to how many more kids are going to experience this and have detrimental impacts.

Therefore, I would direct your attention to Bill C-412. However, with the time I have left, I'd like to just ask some questions on whether you think some high-level things that are in there would be a good approach. First of all is the immediate updating of Canada's non-consensual distribution of intimate image laws to include images created by artificial intelligence, otherwise known as deep nudes.

Do you think we need to do that today, Ms. Todd?

Barbie Lavers As an Individual

Good morning. Thank you for inviting my husband and me to speak today.

We want to introduce our son to you today. Harry was a very outgoing and inclusive young man. He was intelligent and handsome. He was an athlete and a brother, and he was loved by his friends and his community.

Harry was a patriot. He loved his country. He joined the cadets at age 14. Then in grade 11, in fall 2022, Harry joined the Prince Edward Island Regiment. He was 16. He was doing his basic training in Summerside, Prince Edward Island, on the weekends, while going to Souris Regional School full time. He only had one weekend left to complete his basic training for the RCAC. He was so proud of Canada, and he planned to dedicate his life to serving his country.

I'm Barbie Lavers. My husband is Carl Burke. We are Harry's parents. Harry was 17 years old when we lost him to sextortion. As a family, we had many conversations with Harry and his sister Ella about safe online use and about the dangers of sharing images online. Unfortunately, our family was not aware of the word “sextortion”. We had never heard of it.

On April 24, Harry came to his dad and told him that he had screwed up. He had shared intimate pictures with a girl, supposedly his own age, from Nova Scotia. This individual was now demanding money, or they would share Harry's images with all of his contacts, and in particular with his commanding officer in the RCAC. Sadly, this individual did share some of the images with his friends in cadets, and Harry knew this. I was also contacted on Instagram by apparently the same individual, who told me they would ruin his life.

When Harry came to us that evening and told us what had happened, all four of us sat at the table, talked about it and made a plan to contact the local RCMP in the morning. We thought Harry was comfortable with this plan, but sadly, he wasn't.

On the morning of April 25, we were getting ready for our day. My husband went down to check on Harry. The sheets in his bed had been pulled back, but the bed was not slept in. He yelled to me, “Where is Harry?” I came running down the stairs. By this time, Carl was in the garage. He found Harry face down on the floor. He shot himself.

What I'm telling you here does not define or demonstrate, in any way, what we found, what we felt or how our family felt, or how our lives have been changed forever.

Just two weeks ago, two teen boys and a young man in P.E.I. were targeted for the under-reported global crime of sextortion. The boys were targeted on social media platforms, where the strangers posed as age-appropriate girls for sex photo swaps. This has to be stopped.

We as a family support Bill C-63 to protect our children. As advancements continue with technology and as access to devices continues, the risks to our children increase. We must work together as communities, as families and as governments, through user regulations and accountability, to reduce the online abuse of our children and to provide support to all of us.

Social media platforms must be held accountable. They must incorporate regulations to keep our children safe. Children like our Harry are dying. The evidence of harm to our children is abundantly apparent.

Our 17-year-old daughter Ella has a Facebook account. She is unable to access Marketplace on Facebook because she is under 18. If you or I were on Marketplace, occasionally you might get a pop-up that says a seller might not be from your country. Obviously, Facebook has the ability to review IP addresses from incoming messages to their system. Can we not use this for our children's safety?

Now is not the time to enact or to dramatize politics. Colours need not matter in this discussion. Our children are the most important issue here, not colours. This bill provides an opportunity to protect our children and to show political coalition. Our children are in crisis. Some could even say they're at war. It is not time for our children to be used as political pawns to show that one party is more correct than the other. A temporary alliance must be, and is, required to save our children.

The longer Bill C-63 remains a political issue, the more children we will lose. We beg you to please stop wasting time and do something to help save our children.

Carol Todd Founder and Mother, Amanda Todd Legacy Society

Good morning.

I'm speaking to you from Vancouver, British Columbia. I thank you for this invitation to participate in this prestudy session on Bill C-63.

To start, the majority of what I'm going to say in the next five minutes and in answer to the questions are my thoughts and my thoughts only.

Today I must stress the importance of Bill C-63, the online harms act. This bill is a comprehensive approach to addressing the growing concerns of harmful content on the Internet. Online safety, I feel, is a shared responsibility, and everyone—users, parents, educators and platforms—plays a role in creating a safer online world by ensuring protection, accountability and support.

My name is Carol Todd. I'm widely known as the mother of Amanda Todd. I am a teacher-educator in British Columbia with my work primarily centred on education on digital literacy, online safety and child abuse prevention, namely exploitation and sextortion. Providing children, teachers and families with the knowledge and skills to navigate the digital world is essential and is one of the reasons I created a legacy, a non-profit, in Amanda's memory.

My daughter, Amanda Todd, was a Canadian teenager whose tragic story brought international attention to the severe impacts of cyberbullying, online harassment and exploitation. She was born in November 1996 and faced relentless harassment both online and off-line as a young teenager. She ultimately took her life in October 2012. As we know, parents shouldn't outlive their children in preventable situations.

Amanda's ordeal began when she was 12 years old. She was persuaded by an online stranger to expose her breasts on a webcam. This individual saved the image and later used it to blackmail her, threatening to share the photos with her friends and family if she didn't perform more explicit acts. Despite changing schools multiple times, Amanda couldn't escape the harassment, and the blackmailer continued to follow her for two and a half years, creating fake profiles to spread the image and further humiliate her.

In September 2012, five weeks before Amanda took her own life, Amanda posted a YouTube video entitled “My story: Struggling, bullying, suicide, self-harm”, in which she showed flash cards to share her painful experiences. She detailed the bullying, physical assaults and severe emotional distress that she endured both online and off-line. The video went viral after her death, and currently it's been viewed about 50 million times across the world.

Amanda's death prompted significant public and governmental responses. In 2022, Aydin Coban, a Dutch man, was convicted of harassing and extorting Amanda in a Canadian court and sentenced to 13 years in prison. He is currently serving his Canadian time in the Netherlands.

Amanda's story continues to resonate, highlighting the urgent need for stronger protections against online harassment and better supports for victims of bullying, cyber-bullying and exploitation.

There are so many voices that remain unheard due to fear, judgment or shame, or because they can no longer speak. It is vital to let these silent voices be heard and to create a more compassionate and understanding world, where we help and not hurt.

Over the past decade, we have observed rapid changes in technology. We have watched devices that were a useful tool for communication turn into fun devices that can exploit and hurt others. Since its inception, the Internet has taken on darker tones. The word “algorithms” is now in our vocabulary, where it once never was.

Research has highlighted some of the harmful effects related to screen time. These effects include reduced well-being, mood disorders, depression and anxiety. These effects impact children and adults alike in a world filled with online media.

With increased access to the Internet comes easier access to violent and explicit online content that can impact sexual attitudes and behaviours, harm to children through the creation, sharing and viewing of sexual abuse material, and increased violence against women and girls, as well as sex trafficking.

Governments must take action to enact new laws and modify existing ones.

To make the online world safer, we must increase education and awareness. We must have stronger regulations and laws, like Bill C-63. We have to improve the behaviours of the online platforms. We need parental controls and monitoring, and we need to encourage reporting like Cybertip.ca.

The Chair Liberal Lena Metlege Diab

I call the meeting to order.

Welcome to meeting number 125 of the House of Commons Standing Committee on Justice and Human Rights.

Pursuant to Standing Order 108(2) and the motion adopted on December 2, 2024, the committee is meeting in public to begin its study of the subject matter of Bill C-63, an act to enact the online harms act, to amend the Criminal Code, the Canadian Human Rights Act and an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service and to make consequential and related amendments to other acts.

Before welcoming our witnesses this morning, I wish to call your attention to the presence in the room of Ms. Sokmony Kong, Secretary of the Cambodian division of the Assemblée parlementaire de la Francophonie. This parliamentary official was chosen by the Association des secrétaires généraux des parlements francophones, or ASGPF, in recognition of her very highly esteemed work within her organization. Ms. Kong chose the Parliament of Canada for her two-week professional development placement.

We wish you an excellent stay with us, Ms. Kong. As a former member-at-large representing America for the APF, I’m very pleased you chose Canada. I therefore wish you a good stay with us.

I would like to welcome our witnesses for the first hour. They are all appearing by video conference.

Before I say their names, I have a few reminders.

I'm going to ask colleagues in the room or by video conference to please wait until I recognize you by name before speaking, and to ensure you address your questions through the chair. Please do not take the floor until after you are recognized.

For witnesses participating by video conference, please ensure you have selected, on the bottom of your screen, the language of your choice.

I also want to say that all of the equipment belonging to the witnesses here with us this morning was tested and everything is working well.

As the chair, I want to make note of the fact that it is my responsibility, with the help of the clerk, to keep time as best we can in order to allow fairness for the witnesses, and for the members in the room asking questions, and also to suspend for a minute to allow one hour for the second group of panellists to be brought in.

I will now introduce them to you and ask each of them to give their opening remarks for up to five minutes.

With us this morning, from the Amanda Todd Legacy Society, is Madam Carol Todd, founder and mother.

We also welcome Ms. Lianna McDonald, executive director of the Canadian Centre for Child Protection.

We also have Carl Burke and Madam Barbie Lavers, who are participating together as individuals.

Now I will ask Madam Todd to please begin with her opening comments.

December 4th, 2024 / 6:15 p.m.

Professor of Law, Queen's University, As an Individual

Bruce Pardy

It would be my pleasure.

That's what these three bills do. Bill C-11, Bill C-18 and, in part, Bill C-63 grow the administrative state. They grow the bureaucracy. These bills give powers to administrative bodies, to bureaucrats, to make rules. If you look in the statutes, you don't even know what the rules are. That's what we mean by the expansion of the administrative state.

Our freedom of speech, our freedom to listen to what we want, is now in the hands of a bureaucracy. That bureaucracy is not just enforcing the rules made by Parliament. Parliament, instead, has delegated its authority to that bureaucracy to decide what the rules are going to be. This is what I was alluding to when I talked about the disintegration of the separation of powers and the growth of the administrative state. Our rights are now not in the hands of Parliament, but in the hands of the bureaucrats to whom Parliament has delegated its authority. In this way, and in so many others, your freedom of speech is in peril.

You don't even know what the rules are, because those rules have not been made yet. They'll be made in a back corner, in a back room, and not with the sunlight in the House of Commons, in a debate about what the rules ought to be. Therefore, Bill C-11, Bill C-18 and, to some extent, Bill C-63 are all good illustrations of this trend and of how our rights, including our right to free speech, are being eroded.

Jamil Jivani Conservative Durham, ON

Thank you.

This is also for Mr. Pardy.

The chair of this committee bizarrely suggested that the discussion we were having about the growth of the bureaucracy is irrelevant to Bill C-11, Bill C-18 and Bill C-63. Could you maybe explain, for the benefit of everyone listening, why the conversation about the administrative state is important for these pieces of legislation related to freedom of expression?

Jamil Jivani Conservative Durham, ON

Thank you, Madam Chair.

I'd like to direct my questions to Mr. Pardy.

Mr. Pardy, I think what we've seen on display in some of the comments made today at this meeting is a certain logic that has informed the legislation that you've referenced: Bill C-11, Bill C-18 and Bill C-63. That logic seems to be people pointing to problems in society and suggesting that the expansion of the federal bureaucracy is somehow the necessary solution to those problems. They're not really making a case for the efficacy of that bureaucracy but are nonetheless saying that the bureaucracy must grow and that the Canadian taxpayer must pay for that growth.

I'd like for you to speak to your concerns related to the expansion of the federal bureaucracy. In particular, I'm referencing some of your writing on the growth of the administrative state.