Thank you. I appreciate it.
I think it touches on something incredibly relevant, because we could have witnesses before us who could be talking about how important this bill is, yet we have instead a proposal by the Liberals to delay and quite possibly kill this important bill, a bill that would help set a standard in this country to say it is not okay to exploit those who are most vulnerable.
I will continue to share statements from some organizations that have made very clear how important it is that we pass this bill and that we pass it quickly.
When it comes to the members of the governing party, since they supported this bill at second reading, I hope they do not have some nefarious motive in standing up for some of the most egregious actors, both individuals and corporations, in our society.
Certainly, when it comes to the history, I referenced earlier the close connection this bill has with Bill S-210. Quite frankly, it was astounding to have the government, and in particular the Liberal cabinet, bow down to the lobbyists of some of the most egregious corporate actors on the planet instead of standing up for minors, in the case of Bill S-210, and ensuring that they are protected in our society.
In the absence of having witnesses before us—and I would note that they could have been there today, but they're not—I will read a quote from Parents Aware. They describe their organization a bit in the quote, so I will share with the committee their endorsement of Bill C-270. They said:
Parents Aware offers our full support on the Criminal Code amendments that are proposed in the Stopping Internet Sexual Exploitation Act. We feel that the addition of these offences with penalties is an effective way to hold companies and individuals criminally responsible when creating and distributing pornographic content depicting underage participants.
Here you have another organization that does good work in helping to bring awareness to some of the risks that exist in the online world and, in particular, the impact those risks can have on minors.
I'll just note something I found interesting. We did a TikTok study during my time on the ethics committee, which was very enlightening. It connects to this because it speaks to the speed with which technology is evolving. In particular, there are studies that suggest that the use of TikTok produces endorphin-type responses in the brain similar to those from pulling the handle of a slot machine. It's that sort of response, driven by the algorithms and the content that exist.
I know there was a big announcement yesterday—and I won't get into the specifics of it because it would be off topic, and I wouldn't want to get off topic—from the government related to TikTok, which I have no doubt will be studied. It will probably be studied by the ethics committee.
We have this responsibility to ensure that the justice system is responsive to the bad actors preying on some of the advances that have taken place and the access we have.
I think it's access. We are in the Internet age. I've talked quite a bit at different points in time about the first version of the Internet. It was that idea that the world could be connected, that there was access and that one computer could connect to another computer. That was a revolutionary concept. It obviously expanded significantly. It came with the idea that there was information associated with it.
We then moved into this “web 2” type of scenario. “Web 1” was the access part; a news website would be a good example of that. You suddenly had access to content—an encyclopedia, so to speak, at your fingertips—that you might not have had prior to that point. Then “web 2” came along.
That's very much the idea of social media. It's this interactive type. My social media will look different from my colleagues' social media, different from the social media of other folks and, Madam Chair, different from your social media. It all looks different, and the same is true of every aspect of it, because you have algorithms. The idea of “web 2” is that it is no longer just a brochure or a library online; it's something that is actually responsive. It's a kitchen table that is truly the entire world all at once, all speaking at the same time.
We are moving from that, however, to what is often referred to as “web 3”, and that is a world that is certainly less tangible, in the sense that it involves artificial intelligence.
I think there are certain expectations of AI. You look to sci-fi, dystopian-type future movies in which robots take over the world, and that's not what I think the point is. The point is that you now have the ability for the Internet to start to do some of the content curation on its own, so it's not simply responding to you but interpreting how you would want it to respond to something. That can have an impact on the ability for content to be created, and that's what I referenced before—the work that's been done by one of my colleagues in terms of deep fakes. That's one small part of it. That's the creation of content.
It can also be the scraping of content. We see this in terms of copyright for music. You can ask ChatGPT today or any AI chat generator to write you a song, and it's quite something. I'd encourage those who maybe haven't had the chance to do so to go play around with that, because it gives you some insight into the level of interaction that the “web 3” world will have, and you see it in the context of a chat generator.
The reason it connects so closely with Bill C-270 is that, in the absence of a clear framework for accountability, there is nothing to limit how those leaps and bounds of advancement will impact people, including victims of exploitation, in the future. It started off pretty easy with “web 1”, because that was basically just the world going online and being connected; access was a big part of it. “Web 2” algorithms have been, and still are, a big part of what this future is, but “web 3” is now taking it to the next step. We have to make sure, in particular when it comes to content for which there may not be consent, that we develop the legal framework to ensure that there are consequences for the actions of bad actors, both corporate and individual.
When it comes to the role of Parents Aware as an organization, I know there are a whole host of other groups that are doing good work in talking about how to keep kids safe online. It is, of course, the bogeyman type of scenario, with a bad actor on the other end who would try to do terrible things, but it's becoming more than just that. It is opening up a world of danger online that we all carry with us and have access to in the devices we all keep in our pockets and vote on. We have to ensure that the actors who would perpetrate the crimes can still be held responsible. That's what it really comes down to. In the statement from the organization I just referenced, you have a clear example of the call for consequences for those actions.
Madam Chair, I had spoken a little bit about the story of Joy Smith. My wife, Danielle, had the opportunity to volunteer in her office back in 2015 and to see the incredible legacy and the work that's been done.
I know my colleague across the way has done a tremendous amount of work when it comes to helping to combat human trafficking. I believe his bill received royal assent. Did it?