Mr. Speaker, it is my honour to rise again today to address Bill C-11. This bill, when printed, is nearly an inch thick. It is a monster bill for around here. It is a timely bill, as well. I am looking forward to delving into it. I have not had the opportunity to read through it in great detail to this point, but I want to speak to it.
This is a top-of-mind issue for many Canadians. One of the things I want to point out right off the top is that when someone is online with a virtual persona, if they think they are getting a free product, they are actually the product. That is the thing to remember, and many folks do not seem to realize it. That is something important that I have not seen in this bill. I think it is missing, although the bill may not have been seeking to address it specifically.
There could be some sort of public awareness campaign, much the same as we have done with cigarettes. In the past, the public was taught that if someone smoked cigarettes, they would get cancer. We could do the same for online profiles, showing the dangers and what is going on out there.
As well, the member for Port Moody—Coquitlam mentioned what is actually happening with our data. We think we are filling out a fun game or personality test, but we are actually giving away data. It can be harvested commercially to send advertisements and promote certain products.
We continue to see more invasion of our privacy. I do not know about other members, but the thing that jumped out at me, during my first cursory read of this bill, was the term “algorithm transparency”. That is something I am really fascinated by.
On the weekend, my friend was telling me that he took his phone, laid it on the table and he and his friends talked about white rabbits for three to four minutes. They just said the words “white rabbits” often. Then they opened up his phone, went to Facebook and the advertisements he was getting were about white rabbits. Our phones are listening to us and there are algorithms that are promoting certain things.
We can probably turn that feature off and mute the microphones on our phones all the time, if we know how to do that and if we care enough about it or are concerned about that kind of thing. There is a joke that the Chinese are listening to us. It is just an assumption that is being made. I do not think there is actually somebody listening on the other end, but there is an algorithm that is obviously listening to what we are saying and trying to push products toward us that we are interested in.
The white rabbit story is interesting. It is not necessarily something that would come up in day-to-day discussions. However, I know that if we connect to someone else's WiFi then suddenly we start getting different advertisements. My cousin has a CNC plasma cutting table for cutting metal. It is really cool, but what is interesting is that when I go to his house and connect to his WiFi, which is also connected to that CNC plasma table, I start getting advertisements for CNC cutting tables. That is wild and fascinating. The algorithm transparency piece is one of the most fascinating pieces of this bill.
Sometimes on Facebook, we get ads. We can click on the “X” to get rid of an ad, but when one comes up, we are left wondering why we are seeing it. If I could get an answer to that, it would be amazing.
I am interested in that. What is being fed into the system that is promoting this particular ad to me? That is something I am really interested in knowing. At this point, there seems to be no way whatsoever to know why these ads show up. In the virtual personality that lives out on the Internet and in the data collected on me, what recent actions have I undertaken that have driven a particular ad into my feed? I am fascinated to see whether we are going to be able to bring that transparency with this bill. I am not necessarily convinced we will be able to do it, but I am fascinated by it.
The other piece I do not think this bill addresses at all is the question of whether social media platforms, or Internet platforms generally, are message boards or publishers. This continues to be a sticking point. There have been committee hearings with the major social media platforms, and we have seen countries around the world seek to grapple with this issue. This is precisely what governments ought to be doing.
What it means to govern and to legislate is to come up with a system that balances the interests of all people in a way of our choosing. That is what it means to be in a democracy. That is what it means to be governed by ourselves, so to speak. In many cases we see effective lobbying efforts by organized groups, and in particular commercial interests, that do not necessarily allow the government to get that balance right.
We see in the news how we grapple with this. Some large social media platforms have amassed a wealth that exceeds that of many nations. Some of the largest nations in the world are able to compete with this, but many smaller nations do not have the resource capacity that many of these large media companies do, so there is tension there. I compliment this bill in that it is attempting to have that discussion.
Do I trust the Liberals to get it right? No, typically not, but I commend them for bringing this forward and beginning the conversation. This is going to be a long conversation. Like I said before, this bill is an inch thick.
The member for Scarborough—Rouge Park just made a comment. I do not quite know what he said, but I am sure he was complimenting me on my speech. I thank him and appreciate that.
Around algorithmic transparency, the piece that is really important, and that I do not think this bill quite grasps, is whether platforms are curating content, publishing it or choosing winners and losers. The algorithmic transparency of that is a big concern for me, and I know it is a big concern for many people across the country. It is interesting that this is a concern for people on both the right and the left. It is a concern for all the political parties and across ideological differences, and in general it is about what is curated and what is allowed to be on the platform.
This is also a concern for the platforms themselves, in that one particular message that comes from a platform can then become part of a mob mentality. People could then really go after it.
There is no protection, necessarily, for platforms because there is ambiguity about whether they are responsible for messages on the message board and, if they are, whether they are liable as a newspaper would be. That is the major challenge.
While I am not convinced, at this point, that we will get algorithmic transparency in that sense, it is important to be able to tell people, “This is our algorithm, this is how messages get on the board. We are not responsible for the messages and, therefore, this is how the system works.” There is no human input. It is just a sophisticated method of getting messages in front of people that they want to see, that they think are interesting and that they find helpful.
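To illustrate for members what I mean when I say there is no human input, a minimal sketch of how such a ranking might work, purely hypothetical and not drawn from this bill or from any platform's actual code, could look something like the following, where content is simply scored against a profile of inferred interests:

```python
# Purely illustrative, hypothetical sketch of interest-based ranking.
# It does not reflect any real platform's system; it only shows the idea
# that posts and ads are scored against a profile of inferred interests,
# with no human looking at any individual message.

from dataclasses import dataclass, field


@dataclass
class Post:
    text: str
    topics: dict[str, float]  # topic -> how strongly this post relates to it


@dataclass
class UserProfile:
    interests: dict[str, float] = field(default_factory=dict)  # topic -> inferred interest

    def record_signal(self, topic: str, strength: float = 1.0) -> None:
        """Update inferred interest when the user searches, clicks or lingers."""
        self.interests[topic] = self.interests.get(topic, 0.0) + strength


def score(post: Post, profile: UserProfile) -> float:
    """Higher score means the post is more likely to be shown."""
    return sum(weight * profile.interests.get(topic, 0.0)
               for topic, weight in post.topics.items())


def rank_feed(posts: list[Post], profile: UserProfile) -> list[Post]:
    """Sort candidate posts by score, best match first."""
    return sorted(posts, key=lambda p: score(p, profile), reverse=True)


if __name__ == "__main__":
    me = UserProfile()
    me.record_signal("white rabbits", 3.0)   # e.g. repeated mentions or searches
    me.record_signal("metalworking", 1.0)    # e.g. browsing on a relative's network

    feed = rank_feed(
        [Post("Ad: rabbit hutches on sale", {"white rabbits": 1.0}),
         Post("Ad: CNC plasma cutting tables", {"metalworking": 1.0}),
         Post("Ad: garden furniture", {"gardening": 1.0})],
        me,
    )
    for post in feed:
        print(round(score(post, me), 1), post.text)
```

Transparency, in that sense, would simply mean being able to point to the signals and the scoring and explain why a given ad was ranked to the top of someone's feed.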
For the most part, I would say we are getting that right. Where there is some concern is political messaging. We have already seen that Facebook has worked hard on that, but there is always a spectrum, I would say, of political messaging. There is explicit party messaging, which is relatively easy to monitor and manage, but then there is political messaging that goes farther afield. When it is a random, individual Canadian doing political messaging, how is that managed? That is when it will be really important for us to get the algorithmic transparency piece right.
There is another thing I am interested in seeing and have not seen. Part of the government's rollout on this bill has been pushing freedom from hate and from violent extremism. That is important to me. The managing of the Internet and platforms around violent and degrading sexually explicit material has been something I have worked on in this place. It was in 2017 that the House unanimously passed a motion for the government to study the impacts of violent and degrading sexually explicit material.
This was something that had not been studied since 1985. I was not even born in 1985, so that tells us it was a long time ago. The member for Fleetwood—Port Kells is shaking his head at me. I am not sure what that says about me or him, but it was a while back, before I was born and before the Internet existed.
A study on the impacts of violent and degrading sexually explicit material was done in 1985. I remember distinctly, in 1991, going to my uncle's house. He had gotten the Internet. I had heard about it and said I wanted to see the Internet, so he showed me where the phone line plugged into the wall. I asked if that was it and he said we should look at it. He turned his computer on. It had a giant monitor and a big tower beside his desk that hummed. Members may remember the sound coming through the speaker of dial-up Internet. I remember, for the first time ever, seeing the Internet. We went to dogpile.com, which was an early search engine. That was the beginning of the Internet for me, in 1991.
Here we are nearly 30 years later, and we are still grappling with how to manage this. It is a public information highway. There are public highways all over the country, and the government manages a licensing system for folks who get to use the public highways and roads. There is no controversy around that. It seems like an effective way to manage it. Given that it is tangible and we can see it in front of us, that is a manageable thing. In reality, we are dealing with the information highway. Up to this point, there has been very little direction on the role of the government in managing the expectations of Canadians.
Many parents I have talked to are looking for tools they can use to protect their children online, and they are not satisfied with being told they should just be better parents. They say they want help from the Internet service providers. They want help from their government. They want the ability to have some recourse with these large platforms. I am interested to see that.
The government says the Internet should be free from hate and violent extremism. That is something I support notionally. Images and video are the area where I am most concerned. In the other direction, I am concerned about free speech, particularly the use of words and typed messaging. That, I guess, is a little harder to manage. However, particularly with images and video content, I think there is a lot of room for the government to operate in, especially with the violent and extremely degrading sexually explicit material we have seen since 2007.
Since then, we can chart the impacts of that material on Canadian society across a number of different indicators, and they have gotten worse. We see this particularly with our children, with the loneliness index going up and the isolation index going up. All of these things have been exacerbated by the COVID lockdowns.
These are all things we need to ensure are brought into this bill. Freedom from hate and violent extremism is necessary, and we have to get that right. This is what governments are built for. This is what we need to do, and we have to get it right, so I am looking forward to continuing the debate around that.
The last thing I want to point out, which I find a little interesting and on which I am hoping for some answers from the government side, is the procedure of the House and how this bill will roll out over time. I must say this bill was unceremoniously dumped on Parliament. I was not anticipating it. I have been working on these issues for a while, and it was not something that was clearly on my radar.
I had written to the Minister of Canadian Heritage around this issue, wondering how he was going to manage it, because I do remember seeing in his mandate letter that he was to try to remove hate and violent extremism from the Internet in Canada. I had some ideas and concerns around that, so I wrote to him about it. I did not receive any feedback saying the bill was coming, so I was a little surprised that this bill came when it did.
The other thing I am really looking for an answer on is the rumour around here that this bill will be going to the ethics committee. I am wondering why, because this seems like a bill built for the industry committee. That is typically where this would be dealt with, so I am left wondering. The ethics committee is seized with a number of other issues, and industry seems like the committee more in tune with where we would like to go with this particular bill.
I am going to continue monitoring the debate around this bill. I am looking forward to having a robust debate. I know that, given the size of the bill, we will be discussing it for a while, whether in this place, in the other place or in committee, as well as out there among the general public.
I know that this will be a hot topic of discussion. I look forward to continuing that debate, and I look forward to the questions.