Madam Speaker, it is an honour today to rise to speak to Bill C-27, the digital charter implementation act.
I think it is important to reflect on how long it has been since we last had an update to legislation regarding the privacy laws that exist around data. The last time was over 20 years ago. Twenty years might not seem like a long time, but when we think about it, 20 years ago Facebook was probably just a program Mark Zuckerberg was working on in his dorm room.
If we think of iPhones, they were pretty much non-existent 20 years ago. Smartphones existed, but they certainly did not have anywhere near the capabilities they do today. So many other technologies we have come to rely on have been getting smarter over the years. They behave in different ways and are able to do the work they do because of the data being collected from individual users.
Another great example would be Google. Twenty years ago it was literally nothing more than a search engine. One had to type what one was looking for into the search box. Sometimes one had to put weird characters or a plus symbol between words in the search terms. It was essentially a table of contents for accessing information. However, now it is so much more than that. How many of us have, at some point, said to somebody that we would love to get a new air fryer, and then suddenly, the next day or later that day, seen advertisements for air fryers popping up on Google, on Facebook or wherever it might be? I am sure that sometimes it is a coincidence, but in my experience it seems to happen far too often to be a coincidence.
These are the results of new technologies that are coming along, and in particular AI, that are able to work algorithms and build new ones based on the information being fed into the system. Of course the more information that gets fed in, the smarter the technologies get and the more they are looking to feed off new data that can give them even further precision with respect to advertising and targeting tools at people.
This is not just about selling advertising. AI can also lead to incredible advancements we otherwise would not have been able to achieve, such as advancements in health and in the automotive industry. If we think of our vehicles, the big thing now in new cars is the lane-assist feature, which uses sensors such as cameras to read the lane markings on the road.
There is technology, such as CAPTCHA tests, that sometimes requires us, when we enter our passwords, to confirm we are human beings by picking out certain objects from pictures. When we do that, we are feeding information back that helps those images be properly classified. We are not just confirming that we are human beings; an incredible amount of data is being used to refine various formulas and equations based on the things we do.
When we think of things like intelligent and autonomous vehicles, which basically drive themselves, 20 years ago would we ever have thought a car could actually drive itself? We are pretty much halfway there. We are at a point where vehicles are able to see and identify roads and know where they need to be, what the hazards are, and what the possible threats are that exist with respect to that drive.
What is more important is that, when I get into my vehicle, drive it around and engage with other vehicles, it is analyzing all of this data and sending that information back to help develop that AI system for intelligent vehicles to make it even better and more predictive. It is not just the data that goes into the AI, but also the data that it can generate and then further feed to the algorithms to make it even better.
It is very obvious that things have changed quite a bit in 20 years. We are nowhere near where we were 20 years ago. We are so much further ahead, but we have to be conscious of what is happening to that data we are submitting. Sometimes, as I mentioned in a previous question, it can be data that is submitted anonymously for the purposes of being used to help algorithms around lidar and self-driving vehicles, for example. At other times it can be data that can be used for commercial, marketing and advertising purposes.
I think of my children. My six-year-old, who is in grade one, is developing his reading quite quickly. Two years ago, even at the age of four, when he would be playing a video game and would not be able to figure out how to get past a certain level, he would walk up to my wife's iPad and basically say, “Hey, Siri, how do I do this?”
Just saying that, I probably set off a bunch of phones to listen to what I am saying, but the point is that we have children who, already at such a young age, are using this technology. I did not grow up being able to say, “Hey, Siri, how do I do this or that?”
What we have to be really concerned about is the development of children and the development of minors, what they are doing and how that can impact them and their privacy. I am very relieved to see there is a big component of this that, in my opinion, aims to ensure the privacy of minors is maintained, even though I have heard the concern or the criticism from some members today that the definition of “minor” needs to be better reflected in the legislation.
If what constitutes a minor, as it relates to this legislation, is not clear, then I believe it is something that can be worked out in committee. It is something the governing members would be more than open to, in terms of listening to the discussion around it and why further clarifying the definition is or is not important.
I would like to just back up a second and talk more specifically about the three parts of this bill and what they would do. The summary reads as follows:
Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities.
A consequence of this first part would be to repeal other, older pieces of legislation. I think this is absolutely critical, because it goes back to what I have been saying about how things have changed over the last 20 years. We are now at a place where we really do not know what information we are giving up or how it is being used. I realize, as some other colleagues have indicated, that 99.9% of the time we click "yes, I accept the terms" without reading the terms and conditions, not knowing exactly how our information is being used or what is being linked directly back to us.
Through the consumer privacy protection act, there would be protections in place for the personal information of individuals while, at the same time, really respecting the need to ensure companies can still innovate, because it is important to innovate. It is important to see these technologies do better.
Quite frankly, and this will be very selfish of me, it is important to me personally that, when I am watching a show on Netflix that I really like, I get recommendations of other shows I might also like. As the member for South Shore—St. Margarets mentioned earlier with respect to Spotify, it is also important to me that, when I start listening to certain music, other music gets suggested to me based on what other people who share my interests have liked, and that these algorithms end up generating that content for me.
It is important to ensure that companies, if we want them to continue to innovate on these incredible technologies we have, can have access to data. However, it is even more important that they be responsible with respect to that innovation. There has to be the proper balance between privacy and innovation, how people are innovating and how that data is being used.
We have seen examples in recent years, whether in the United States or in Canada, where data that has been collected has been used in a manner not in keeping with how that data was supposed to be used. There has to be a comprehensive act in place that properly identifies how that data is going to be used, because, quite frankly, the last time this legislation was updated, 20 years ago, we had no idea how that data would be used today.
By encouraging responsible innovation and ensuring we have the proper terminology in the legislation, companies would know exactly what they should and should not be doing, how they should be engaging with that data, what they need to do with that data at various times, how to keep it secure and safe and, most importantly, how to maintain the privacy of individuals. It is to the benefit not just of individuals in 2022, or 2023 almost, to have data that is being properly secured. It is also very important and to the benefit of the businesses, so that they know what the rules are and what the playing field is like when it comes to accessing that data.
The second part of this bill, as has been mentioned:
...enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act.
This is absolutely critical, because there has to be somewhere people can go if, from a consumer perspective, they have a concern over the way their data is used and are not happy with the result from the commissioner; they need an avenue to appeal those decisions. If we put all that power in the hands of a few individuals, in this case the Privacy Commissioner under the consumer privacy protection act, without an appeal mechanism, then we will certainly run into problems down the road. This legislation would help ensure that the commissioner is kept in check, and it would also help give consumers the faith they need in terms of accountability when it comes to their data and whether it is being used and maintained in a safe way.
The third part of the bill is the more controversial part, in terms of whether it should be included in this particular legislation or subject to a separate vote. The summary reads:
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate the risks of harm and biased output related to high-impact artificial intelligence systems.
That act would provide for public reporting and would authorize the minister to order the production of records related to artificial intelligence systems. The act would also establish prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system in an intentional or reckless way that causes material harm to individuals.
One of the consequences of artificial intelligence, quite frankly, is that if we allow biased information to be fed into artificial intelligence systems and used to produce results for important algorithms, then we run the risk of those results being biased as well. Therefore, ensuring that individuals are not treated in a biased manner is going to require proper measures and true accountability.
The reality is that artificial intelligence, even in its current form, is very hard to predict. It is very hard to know exactly when a person is being affected by something generated by an artificial intelligence system. Quite often, many of the interactions we already have on a day-to-day basis are driven by artificial intelligence features that use various inputs to determine what we should be doing or how we should be engaging with something.
The reality is that if this is done in a biased manner or in a manner that is intentionally reckless, people might not be aware of that until it is well past the point, so it is important to ensure that we have all of the proper measures in place to protect individuals against those who would try to use artificial intelligence in a manner that would intentionally harm them.
As I come to the conclusion of my remarks, I will go back to what I talked about in the beginning, that artificial intelligence, quite frankly, has a lot of benefits to it. It is going to transform just about everything in our lives: how we interact with individuals, how we interact with technologies, how we are cared for, how we move around by transportation, how we make decisions, as we already know, on what to listen to or what to watch.
It is incredibly important that as this technology develops and artificial intelligence becomes more and more common, we ensure that we are in the driver's seat in terms of understanding what is going into it, and that we are fully aware of anybody who might be breaking the rules as they relate to the use of artificial intelligence. This will become more difficult, quite frankly, as artificial intelligence systems take on new responsibilities and produce new decisions and outputs, and we must ensure that we are always in a position to be in the driver's seat, with the proper oversight in place.
I recognize that some concerns have been brought forward today by different members. When the member for South Shore—St. Margarets and others raised the concern around the definition of a "minor", which is not something I thought of when I originally looked at this bill, I came to appreciate, especially after hearing his response to my question, why it is necessary to put a proper definition in there. I hope the bill gets to committee and the committee can study some of those important questions so we can keep moving this along.
I certainly do not feel as though we should just be abandoning this bill altogether because we might have concerns about one thing or another. The reality, and what we know for certain, is that things have changed quite a bit in the last 20 years since the legislation was last updated. We need to start working on this now. We need to get it to committee, and the proper studies need to occur at this point so we can properly ensure that individuals' privacy and protection are taken care of as they relate to the three particular parts I talked about today.