Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

François-Philippe Champagne, Minister of Innovation, Science and Industry

Status

In committee (House), as of April 24, 2023


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts
April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

I'd like to thank all the witnesses.

Mr. Bennett, in your February 12, 2021, submission to the public consultations on Bill C‑11, you distinguished between the concepts of interoperability and harmonization. I believe this is particularly germane to the subject before us, because these two concepts can be confused. You showed the difference between the two with an example I'd like to quote:

For instance, the processes for doing PIAs should be interoperable between the federal government and the provinces. If an organization does a PIA under the authority of one law, it may need the assurance that the PIA will also be acceptable in another jurisdiction. But that does not necessarily mean the harmonization or convergence of rules.

First, can you provide us with a definition of these two distinct concepts?

Second, can you tell us whether the provisions of Bill C‑27 promote the interoperability of processes among the various levels of government or rather the harmonization of rules?

October 26th, 2023 / 4:20 p.m.



Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

Dr. Brenda McPhail

I think there will always be differences of opinion as to whether definitions are sufficiently stringent or overly weak.

What would address our concerns? There are three categories of concerns that we have around de-identified and anonymized information. The first is that the definition has been weakened between Bill C-11 and the current iteration, Bill C-27. The previous definition included indirect identifiers. You can identify me by my name, but you can also identify me if you have a combination of my postal code, my gender and a few other factors about me. To truly de-identify information to an adequate standard where re-identification is unlikely, I believe—and my co-submitters believe—that the definition should include indirect identifiers.
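
To make the point about indirect identifiers concrete, the following is a minimal editorial sketch of a linkage attack, in which records stripped of direct identifiers are re-identified by matching quasi-identifiers against a public list. Every name, record and field choice here is a hypothetical illustration, not something drawn from the testimony.

```python
# Sketch of a linkage attack on "de-identified" records: direct
# identifiers (names) are gone, but indirect (quasi-) identifiers
# remain and can be matched against a public dataset.
# All data below is hypothetical.

deidentified = [
    {"postal": "V8W 2Y2", "birth_year": 1976, "gender": "F", "diagnosis": "asthma"},
    {"postal": "K1A 0A6", "birth_year": 1988, "gender": "M", "diagnosis": "diabetes"},
]

# A public dataset, e.g. a voter list, that still carries names.
public_list = [
    {"name": "Jane Doe", "postal": "V8W 2Y2", "birth_year": 1976, "gender": "F"},
    {"name": "John Roe", "postal": "K1A 0A6", "birth_year": 1988, "gender": "M"},
]

QUASI_IDENTIFIERS = ("postal", "birth_year", "gender")

def link(record, public):
    """Names in the public list whose quasi-identifiers all match."""
    return [
        p["name"]
        for p in public
        if all(p[k] == record[k] for k in QUASI_IDENTIFIERS)
    ]

for r in deidentified:
    matches = link(r, public_list)
    if len(matches) == 1:  # a unique match re-identifies the person
        print(f"Re-identified {matches[0]}: {r['diagnosis']}")
```

This is why a definition that covers only direct identifiers can leave information that is trivially re-identifiable.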

To some degree, that definition has been weakened because Bill C-27 adds a new category of information: anonymized information. The problem with that new category is that technical experts agree it is extremely difficult to achieve perfect and effective anonymization, and by taking anonymized information out of the scope of the bill, we remove the ability of the Office of the Privacy Commissioner of Canada to inspect the processing that has happened and to ensure that it has been done to a reasonable standard.

Like some of the witnesses you heard from—who would disagree with me about whether or not definitions should be stronger or weaker—I think we all agree on the reality that when personal information is processed, whether it is used to create de-identified information or anonymized information, there should be some checks and balances to make sure that the companies doing it are doing it to a reasonable standard that is broadly accepted. The way to achieve that is by including the ability within the bill for the Office of the Privacy Commissioner to inspect that processing and give it a passing grade, should that be necessary.

The last piece of concern we have with anonymization, which makes that scrutiny even more important, is that the bill conflates anonymization with deletion. It was introduced to great fanfare when this bill was put forward that individuals would now have a right to request deletion of their personal information from the companies with which they deal.

That right, I believe, is rendered moderately illusory. Certainly members of the public would not expect that if they ask for their information to be deleted, an organization could say, yes, they'll do that, and then simply anonymize the information and continue to use it for their own purposes. If we are going to allow anonymized information to be equivalent to deletion, again, it's incredibly important that we are 100% certain that the equivalency is real and valid, that truly no individual can be identified from that information and that it's not going to harm them in its use after they've explicitly exercised their right to ask for deletion.

October 26th, 2023 / 4:10 p.m.



Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

Dr. Teresa Scassa

Thank you.

I have concerns about both the CPPA and the AIDA. Many of these have been communicated in my own writings and in the report submitted to this committee by the Centre for Digital Rights. My comments today focus on the consumer privacy protection act. I note, however, that I have very substantial concerns about the AI and data act, and I would be happy to answer questions on that, as well.

Let me begin by stating that I am generally supportive of the recommendations of Commissioner Dufresne for the amendment of Bill C‑27, as set out in his letter of April 26, 2023 to the chair of this committee.

I will address three other points.

The minister has chosen to retain consent as the backbone of the CPPA, with specific exceptions to consent. One of the most significant of these is the “legitimate interest” exception in proposed subsection 18(3). This allows organizations to collect or use personal information without knowledge or consent if it is for an activity in which an organization has a legitimate interest. There are guardrails: The interest must outweigh any adverse effects on the individual; it must be one that a reasonable person would expect; and the information must not be collected or used to influence the behaviour or decisions of the individual. There are also additional documentation and mitigation requirements.

The problem lies in the continuing presence of “implied consent” in proposed subsection 15(5) of the CPPA. PIPEDA allowed for implied consent because there were circumstances where it made sense and there was no legitimate interest exception. However, in the CPPA, the legitimate interest exception does the work of implied consent. Leaving implied consent in the legislation provides a way to get around the guardrails in proposed subsection 18(3). An organization can opt for the implied consent route instead of legitimate interest. It will create confusion for organizations that might struggle to understand which is the appropriate approach. The solution is simple: Get rid of implied consent. I note that implied consent is not a basis for processing under the GDPR. Consent must be expressed, or processing must fall under another permitted ground.

My second point relates to proposed section 39 of the CPPA: an exception to an individual's knowledge and consent where information is disclosed to a potentially very broad range of entities for “socially beneficial purposes”. Such information need only be de-identified—not anonymized—making it more vulnerable to re-identification. I question whether there is social licence for sharing de-identified rather than anonymized data for these purposes. I note that proposed section 39 was carried over verbatim from Bill C-11, when “de-identified” was defined to mean what we now understand as anonymized. Permitting disclosure for socially beneficial purposes is a useful idea, but proposed section 39, especially with the shift in meaning of “de-identified”, lacks necessary safeguards.

First, there is no obvious transparency requirement. If we are to learn anything from the ETHI committee's inquiry into PHAC's use of Canadians' mobility data, transparency is fundamentally important. At the very least, there should be a requirement that written notice of data sharing for socially beneficial purposes be given to the Privacy Commissioner of Canada. Ideally, there should also be a requirement for public notice. Further, proposed section 39 should provide that any sharing be subject to a data-sharing agreement, which should also be provided to the Privacy Commissioner. None of this is too much to ask where Canadians' data are conscripted for public purposes. Failure to ensure transparency and a basic measure of oversight will undermine trust and legitimacy.

My third point relates to the exception to knowledge and consent for publicly available personal information. Bill C-27 reproduces PIPEDA's provision on publicly available personal information, providing in proposed section 51 that “An organization may collect, use or disclose an individual's personal information without their knowledge or consent if the personal information is publicly available and is specified by the regulations.” We have seen the consequences of data scraping from social media platforms in the case of Clearview AI, which used scraped photographs to build a massive facial recognition database. The Privacy Commissioner takes the position that personal information on social media platforms does not fall within the “publicly available personal information” exception.

Not only could this approach be upended in the future by the new personal information and data protection tribunal, but it could also easily be modified by new regulations. Recognizing the importance of proposed section 51, former Commissioner Therrien recommended amending it to add that the publicly available personal information be “such that the individual would have no reasonable expectation of privacy.” An alternative is to incorporate the text of the current regulations specifying publicly available information into the CPPA, revising them to clarify scope and application in our current data environment. I would be happy to provide some sample language.

This issue should not be left to regulations. The amount of publicly available personal information online is staggering, and it is easily susceptible to scraping and misuse. It should be clear and explicit in the law that personal data cannot be harvested from the Internet, except in limited circumstances set out in the statute.

Finally, I add my voice to those of so many others in saying that data protection obligations set out in the CPPA should apply to political parties. It is unacceptable that they do not.

Thank you.

Dr. Brenda McPhail Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

Thank you, Mr. Chair and members of the committee, for inviting me here today to speak to the submission authored by Jane Bailey, professor at the faculty of law of the University of Ottawa; Jacquelyn Burkell, professor at the faculty of information and media studies at Western University; and myself, currently the acting executive director of the public policy and digital society program at McMaster University.

It is a privilege to appear before you on this omnibus bill, which needs significant improvement to protect people in the face of emerging data-hungry technologies.

I will focus on part 1 and very briefly on part 3 of the bill in these initial remarks, and I welcome questions on both.

Privacy, of course, is a fundamental underpinning of our democratic society, but it is also a gateway right that enables or reinforces other rights, including equality rights. Our written submission explicitly focuses on the connection between privacy and equality, because strong, effective privacy laws help prevent excessive and discriminatory uses of data.

We identified eight areas where the CPPA falls short. In these remarks, I will focus on four.

First of all, privacy must be recognized as a fundamental human right. Like others on this panel, we welcome the amendment suggested by Minister Champagne, but we would note that proposed section 12 in particular also requires amendment so that the analysis to determine whether information is collected or used for an appropriate purpose is grounded in that right.

Second, Bill C-27 offers a significant improvement over PIPEDA in explicitly bringing de-identified information into the scope of the law, but it has diminished the definition from the predecessor bill, Bill C-11, by removing the mention of indirect identifiers. The bill also introduces a new category, anonymized information, which is deemed out of the scope of the act, in contrast to the superior approach taken by Quebec. Given that even effective anonymization of personal data fails to address the concerns about social sorting that sit at the junction of privacy and equality, all data derived from personal information, whether identifiable, de-identified or anonymized, should be subject to proportionate oversight by the OPC, simply to ensure that it's done right.

Third, proposed subsection 12(4) weakens the requirements for purpose specification. It allows information collected by an organization for one purpose to be used for something else simply by recording the new purpose at any time after the initial collection. How often have you shared information with a business and then gone back a year later to see if it had changed its mind about how it's going to use it? At a minimum, the bill needs constraints that limit new uses to purposes consistent with the original consensual purpose.

Finally, the CPPA adds a series of exceptions to consent. I'll focus here on the worst, the legitimate interest exception in proposed subsection 18(3), which I differ from my colleagues in believing should be struck from the bill. It is a dangerously permissive exception that allows collection without knowledge or consent if the organization that wants the information decides its mere interest outweighs adverse impacts on an individual.

This essentially allows collections for organizational purposes that don't have to provide benefits to the customer. Keeping in mind that the CPPA is the bill that turns the tap for the AIDA on or off, this exception opens the tap and then takes away the handle. Here, I would commend to you the concerns of the Right2YourFace coalition, which flags this exception as one in which organizations may attempt to justify and hide their use of invasive facial recognition technology.

Turning to part 3 of Bill C-27, the AIDA received virtually no public consultation prior to being included in Bill C-27, and that lack of feedback has resulted in a bill that is fundamentally underdeveloped and prioritizes commercial over public interests. The bill, by focusing only on high-impact systems, leaves systems that fail to meet the threshold unregulated. AI can impact equality in nuanced ways not limited to systems that may be obviously high-impact, and we need an act that is flexible enough to also address bias in those systems in a proportionate manner.

A recommender system is mundane these days, yet it can affect whether we view the world with tolerance or prejudice from our filter bubble. Election time comes to mind as a time when that cumulative impact could change our society. Maybe that should be in, and maybe it should be out. We just haven't had the public conversation to work through the range of risks, and it's a disservice to Canadians that we're reduced to talking about amendments to a bad bill in the absence of a shared understanding of the full scope of what it needs to do and what it should not do.

Practically, in our brief we nonetheless make specific recommendations: to include law enforcement agencies in scope, to create independent oversight, and to amend the definitions of harm and bias. We further support the recommendations submitted by the Women's Legal Education & Action Fund.

I would be very happy to address all of these recommendations during the question period.

Thank you.

Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Thank you, Mr. Chair and members of the committee. I am very honoured to be speaking with you today regarding Bill C-27.

I am currently a professor of law at the University of Colorado, but when I was the director of CIPPIC at the University of Ottawa, we published two reports in the spring of 2023 that consider AIDA and the CPPA. I am going to focus my remarks on the CPPA, particularly on provisions that relate to the privacy of minors. I would be happy to share some of my thoughts around AIDA as well.

I would like to begin by saying that I agree with everything that Professor Bennett and Professor Geist said. You could treat these remarks as additive.

While it is very welcome that the CPPA, unlike PIPEDA, specifically provides that the personal information of minors is sensitive information, Professor Bennett has already told you that "sensitive information" is not a defined term in the legislation. It is positive that children would have—if this bill passes into law—some recognition of the importance of protecting their personal information to a higher standard. However, we believe that this legislation can do far better.

For context, it is important to realize that children spend increasing amounts of time online, at younger and younger ages. This is a trend that accelerated during COVID-19 and the transition to digital online learning. I am a parent, and I am sure many of you are parents. Our children are using devices under commercial terms of service all the time, and this poses a very significant risk to the privacy rights of children.

While COVID has receded, it's the new reality that kids are using more and more technology at younger ages. What can we do? There are three things, and then a fourth about jurisdictional competence.

The Privacy Commissioner, in his recommendations regarding the CPPA, suggested that "best interests of the child" language should be incorporated into the law, and he suggested doing that in the preamble. I take no position myself as to where that should be done, but it is clear that this is international best practice. The United Kingdom and California have both incorporated such language into recently enacted statutes, and we think that Canada should follow this approach. What would that mean? It means that organizations that handle children's personal data must take the best interests of children into account. That must come ahead of their commercial interests.

Second, we think it is important for the CPPA to require organizations that develop products or services that are likely to be accessed by children to set their privacy settings to the highest level. Defaults play a really important role in our subjective experience of privacy. It is great to have rights, but you can very easily leave those rights on the table if a default setting contracts you out of them. We think that requiring a company to set those defaults to high levels when children are their likely users or their known users is very important.
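
As an editorial illustration of that point about defaults, here is a minimal sketch of privacy-protective defaults for services likely to be accessed by children. The field names and the likely_child_user flag are assumptions invented for the example; a real service would derive that flag from its own audience assessment.

```python
# Sketch: most-protective privacy defaults when children are likely
# or known users. Field names are hypothetical illustrations.

from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_public: bool
    targeted_ads: bool
    location_sharing: bool

def default_settings(likely_child_user: bool) -> PrivacySettings:
    if likely_child_user:
        # Highest privacy by default: anything less protective would
        # have to be an explicit, informed opt-in later.
        return PrivacySettings(
            profile_public=False,
            targeted_ads=False,
            location_sharing=False,
        )
    # Contrast case only; a service could of course choose
    # high-privacy defaults for all users.
    return PrivacySettings(
        profile_public=False,
        targeted_ads=True,
        location_sharing=False,
    )
```

The design point is that rights left to an opt-in setting tend to go unexercised, so the protective value sits in the default itself.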

Third, I'd like to pick up on what Professor Bennett told you about data protection impact assessments, a made-in-Canada idea. Bill C-27 is extremely weak when it comes to data protection impact assessments. The provisions apply only when the legitimate interest exception to consent is being used. This is a problem for everyone, but especially for children.

We believe—and I specifically believe this personally—that the data protection impact assessment requirements of this bill need to be considerably strengthened whenever data-processing activities pose a high risk to the privacy rights of Canadians. I would say that if children's data is sensitive data, that means we basically need to do that impact assessment all the time.

Last, I'd like to talk about constitutional competence here. There may be some concerns that it may be beyond federal competence to protect the privacy rights of children with more expansive provisions. Our analysis suggests otherwise. CPPA, like PIPEDA before it, is being enacted under Parliament's power to regulate trade and commerce.

Now, it is true that in our federal system, provincial governments get to determine the age of majority, but there is plenty of federal legislation that is designed to protect the rights of children. This also leads to how we think of this law, the consumer privacy protection act. It's not just a form of privacy regulation; it's also, when you think about it, a form of consumer protection legislation that is regulating the safety of digital products that invade and interfere with our right to privacy.

In view of the long history of federal regulation directed at protecting children in the marketplace, we think it would be appropriate for the federal government to include stronger privacy protections, and that would not prejudice provincial laws, like Quebec's, that are stronger. Just as PIPEDA yields to provincial legislation when it's substantially equivalent or better, the same could be true of strengthened children's privacy protections in the new CPPA.

Thank you very much.

Prof. Colin Bennett Professor, Political Science, University of Victoria, As an Individual

Thank you very much, Mr. Chair.

I'm from the University of Victoria, although I'm currently in Australia. I wish everybody a good day.

I would like to emphasize five specific areas for reform of the CPPA and to suggest ways in which the bill might be brought into better alignment with Quebec's law 25. I don't think that Bill C-27 should be allowed to undermine Quebec law, and in some respects, it does. I also think these are some of the areas where the bill will be vulnerable when the European Commission comes to evaluate whether Canadian law continues to provide an adequate level of protection.

Some of these recommendations are taken from the report that you have from the Centre for Digital Rights, which I'd like to commend to you.

First, I believe that CPPA's proposed section 15, on consent, is confusing to both consumers and businesses. In particular, I question the continued reliance on “implied consent” in proposed subsection 15(5), which states, “Consent must be expressly obtained unless...it is appropriate to rely on an individual's implied consent”.

The bill enumerates those business activities for which consent is not required, including if “the organization has a legitimate interest that outweighs any potential adverse effect on the individual”. That's a standard that has been imported from the GDPR. However, in the GDPR, “consent” means express consent; it's “freely given, specific, informed and unambiguous”.

In the current version of the CPPA, businesses can have it both ways. They can declare that they have implied consent because of some inaction that a consumer allegedly took in the past because of not reading the legalese in a complex terms-of-service agreement, or they can assert a “legitimate interest” in the personal data by claiming that there is no “potential adverse effect on the individual”. That is a risk assessment performed by the company rather than a judgment made about the human rights of individuals to control their personal information.

In that respect, it's really important that the bill be brought within a human rights framework. There should be no room for implied consent in this legislation. It's a dated idea that creates confusion for both consumers and businesses.

Second, there is no section in the CPPA on international data transfers. I find that very odd. I know of no other modern privacy law that fails to give businesses proper guidance on what they have to do if they want to process personal data offshore. The only requirement is for the organization to require the service provider, “by contract or otherwise,” to ensure “a level of protection of the personal information equivalent to that which the organization is required to provide under this Act.” That's proposed subsection 11(1) of the CPPA.

That due diligence applies whether the business is transferring personal data to another province in Canada or overseas to a country that may or may not have strong privacy protection or, indeed, a record of the protection of human rights. That's particularly troubling because of proposed section 19 of the CPPA, which reads, “An organization may transfer an individual's personal information to a service provider without their knowledge or consent.”

The Canadian government has never gotten into the business of adopting a safe harbour approach or a white list, and I'm not recommending that. However, Quebec, I believe, has legislated an appropriate compromise under section 17 of law 25, which requires businesses to do an assessment, including of the legal framework, when sending personal data outside of Quebec. As many businesses will have to comply with the Quebec legislation, why not mirror that provision in Bill C-27?

Third, the bill ignores important accountability mechanisms that were pioneered in Canada and exported to other jurisdictions, including Europe. Therefore, it's very strange that those same measures do not appear in the CPPA. In particular, privacy impact assessments are an established instrument and a critical component of accountable personal data governance, and they should be required in advance of product or service development, particularly where invasive technologies and business models are being applied, where minors are involved, where sensitive personal information is being collected, or where the processing is likely to result in a high risk to an individual's rights and freedoms. Businesses do the PIAs, and they stand ready to demonstrate their compliance or their accountability to the regulator.

A fourth and related problem is the absence of any definition of sensitive forms of personal data. The word "sensitivity" appears in several provisions of the bill, but with the exception of the specification about data on minors, it is nowhere defined. In my view, the bill should define what "sensitive information" means, and it should also enumerate a non-exhaustive list of categories, as many other privacy laws do.

Finally—I know you've heard about this in the past, and I've researched it—the absence of proper privacy standards for federal political parties is unjustifiable and untenable. The government relies on the argument that the federal political parties' privacy practices are regulated under the Canada Elections Act, but those provisions are nowhere near as strong as those in Bill C-27. I think businesses resent the fact that parties are exempted. This is not an issue that will go away, given advances in technology and its use in modern digital campaigning. Canada is one of the few countries in the world in which political parties are not covered by applicable privacy law.

Thank you so much.

The Chair Liberal Joël Lightbound

Good afternoon, everyone. I call this meeting to order.

Welcome to meeting no. 92 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the standing orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today and also apologize for the brief delay caused by a vote in the House.

Joining us today are Colin J. Bennett, professor of political science at the University of Victoria; Dr. Michael Geist, professor of law and Canada research chair in Internet and e‑commerce law; Vivek Krishnamurthy, associate professor of law at the University of Colorado Law School; Dr. Brenda McPhail, acting executive director of the Master of Public Policy in Digital Society Program at McMaster University; and lastly, Teresa Scassa, Canada research chair in information law and policy, Faculty of Law, Common Law Section, University of Ottawa.

I'd like to welcome you all.

We'll begin the discussion without further ado.

Mr. Bennett has the floor for five minutes.

Motions in Amendment
National Security Review of Investments Modernization Act
Government Orders

October 26th, 2023 / 1:35 p.m.



Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

Mr. Speaker, today, we are debating Bill C-34, an act to amend the Investment Canada Act, at report stage. We are dealing with a new amendment to this bill from the Conservative side of the House, as well as some housekeeping amendments from the government side.

To make sure everybody watching understands what the Investment Canada Act is about, it deals with the acquisition of Canadian companies by foreign entities: companies and governments that come to Canada to try to acquire our businesses. There is a government process, through Investment Canada, that these entities need to go through with the Minister of Innovation, Science and Industry and cabinet. Through the bill before us, cabinet would be removed from the process. I will speak to this in a moment.

Wayne Gretzky, whom I know everybody here admires, said, “You miss 100% of the shots you don't take”, and this bill fits that description. While it would make administrative amendments and speed up the process a little, it missed the opportunity to look at what is happening in the Canadian economy and deal with the increasing acquisitions of assets and businesses of various sizes, from small businesses worth a few million dollars up to mineral rights and large corporations, by states that are hostile to us. As has been said before, it has been 14 years since the act was amended. A lot has changed in the world, in particular around the way that state-owned enterprises have become extraterritorial in taking over companies around the world for their own economic interests. The Conservatives' challenge with the bill is that it thinks small. It did not use this opportunity to take a shot on net and score a goal by recognizing the change in the global economy and what is happening with the outright sales of Canadian businesses and assets to hostile states.

The minister is the minister of broken bills, which is why we are having to make more amendments to this one. On his other bill, Bill C-27, after a year and a half, he has had to make amendments. Perhaps if he had spent more time here in Canada understanding what was going on, he might have produced better legislation. The Liberals missed the chance to think big and understand what is going on in our economy. What is going on in our economy is what I call the Chinese government cold war. We are in a new cold war. It is not one of bombs and the military in that sense; it is the silent takeover of the economic assets of other countries. This is how China is gaining influence all around the world. We all know about the election interference issues, but those things are perhaps a little more obvious than this is to Canadians, this creeping strategic control by the Communist Party of China of Canada's assets and those of other countries. Other countries have put mechanisms in place within their investment acts to recognize this and prevent it. The bill, as it was introduced in the House and debated at second reading, did not contain any of that.

Small businesses in my riding, such as lobster buyers, are $2-million businesses being bought for $10 million by China. The Chinese government owns a number of lobster businesses in my riding. It is how it is getting control of our seafood assets behind the door. It is doing the same in agriculture. It is buying land and farms in western Canada and mineral rights in our land. It is buying more obvious things, which I will speak to. It is buying companies like the only producing lithium mine in Canada. Therefore, Bill C-34 missed a lot and would just make small administrative changes.

The Communist Party of China's cold war being ignored in Canada might be out of incompetence, but it could also be the case, as we know, that the Prime Minister believes that China is his most admired country, so maybe it is more strategic. Let us take a look at the Liberal government's record on this issue.

In 2017, the Liberal government allowed a telecom company from B.C. called Norsat to be acquired by a company called Hytera, which is Chinese-based. Hytera does not make any money. Conservatives demanded, at the time, a full national security review. The Liberal minister of the day refused to do one and approved the acquisition. Lo and behold, in 2022, Hytera was charged with 21 counts of espionage in the United States and was banned from doing business there, but only eight months later, the RCMP in Canada, shockingly, bought telecommunications equipment from Hytera to put in its communications system. When I asked the RCMP, at the industry committee, because it was in all the newspapers, whether its members were aware that eight months before, Hytera had done this and been banned in the U.S., the RCMP, shockingly, said no.

I referred earlier to the Tanco mine, our only producing lithium mine, which was bought by the Sinomine Resource Group, a Chinese-owned mining company. Every ounce of that lithium in our critical minerals industry goes to China.

The record on this is very awkward for the government to hear, but it is a growing concern. The government did not take those things into consideration in drafting the bill before us. As a responsible opposition to His Majesty, the Conservatives proposed a number of amendments in committee, and thanks to the support of the other two opposition parties, over the objections of the Liberals, we made some significant amendments. Those amendments include that, for any state-owned enterprise from a country that does not have a bilateral trade relationship with Canada, the threshold for review by the Government of Canada would now be zero dollars. Any transaction over zero dollars would be reviewed, compared with the current threshold of $512 million. China is buying a lot of assets for under $512 million, and the threshold would now be zero. The same would apply to a new concept we added: all asset sales would need to be included in that test with a state-owned enterprise.

Today, we are also taking this one step further by saying that the minister has made yet another error. That error was trying to consolidate all his power and ignore his cabinet colleagues. The current Investment Canada Act process requires that at the beginning, when an acquisition is made, the minister take to cabinet his recommendation on how far to go with a national security and net benefit review. The bill before us says that he would not have to do that anymore, that he could decide on his own, and that at the end of the process, whatever the results are, he alone would decide whether or not to go to cabinet with them.

Removing cabinet from the decision-making process would mean that we would not get the breadth of experience of people around the cabinet table and that we also would not get the breadth of experience from regional perspectives. For example, there have been companies bought in Quebec. If an industry minister is from Ontario and our public safety minister is from out west, they would make the decision on their own without any input from Quebec. I suspect that the Bloc Québécois would be opposed to that issue and would want to see Quebec representation in those decision-making processes, but the bill before us has the potential to eliminate that part of it.

We are proposing common-sense Conservative amendments, as we did in committee. Thankfully, through the four amendments that were accepted, we upped the ante and made the bill more than an administrative one, so that it would deal with the serious international challenges we face. By the way, there are two national tests in there: one is on national security and the other is on the net benefit to Canada. Conservatives in committee added a third: if a company has been convicted of bribery or corruption, the minister would now have to take that into consideration in deciding whether to approve the acquisition. It would add much benefit, but, for some reason, the Liberals did not think it was worthy when they voted against it.

We believe that Conservatives have improved the bill dramatically. We are trying to improve it again in the spirit of good public policy for Canada and protecting our economy against hostile interests, which the Liberals seem not to care about. I urge the House, including all members from the Bloc Québécois, the NDP and the government, to recognize that cabinet's decision-making process is essential to getting the full breadth of things, and I urge members to vote for our amendment.

October 25th, 2023 / 6:35 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Indeed.

This raises the whole question of online age verification and techniques for determining whether a person is underage or not. This will be important in the context of Bill C‑27, which explicitly grants rights and treats information differently. It's an issue we're looking at, in the privacy field. There's a lot of discussion about it. In fact, the Information Commissioner's Office of the U.K. has issued guidelines on verification tools.

What we're saying, at the Office of the Privacy Commissioner, is that these tools need to be appropriate and not ask for too much personal information. Age verification needs to be managed, but we don't necessarily want to ask for too much personal information to do that. That said, there are ways of doing it and technologies to do it. It's another area where we need to be creative.

Also, it has to be context-appropriate. Some sites may be higher-risk and will require tighter verification. We can think of gambling or pornography sites, for example. Some sites may be less sensitive. Others may be aimed specifically at children. There may be a presumption.

I think this will be part of the implementation of this law. My office will have a role to play in this as it can issue guidelines.

In addition, the bill also provides for the creation of codes of practice and certification programs.

This will encourage organizations to adhere to a series of rules. If they respect them, it will have an effect on the complaints process, which will be beneficial for these organizations. So it will be one more tool. I suspect that the Office of the Privacy Commissioner will be able to work on it, precisely to give these details.

The Office of the Privacy Commissioner also has an advisory mandate. Companies, especially small and medium-sized enterprises, can contact us for answers to specific questions. We're here to help them with questions like these, especially those of a more technical nature.

October 25th, 2023 / 6:30 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Yes, a revision of the two laws is necessary. One is under way for the law in the private sector. This is Bill C‑27. This also includes a specific component for artificial intelligence.

A revision is necessary because the law is 20 years old. It's older than social media. We're still applying it, the principles are there, but technology is advancing rapidly. In my opinion, this calls for stronger proactive obligations, for example. We need to force organizations to make basic assessments that they have to disclose to our office; we also need to impose greater transparency, particularly when it comes to artificial intelligence.

The law governing the public sector, on the other hand, is even older. It dates back 40 years. It needs to be modernized and strengthened, because when it was passed, it was really at a time when the impact of data was not what it is today.

October 25th, 2023 / 6:30 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

It's a problem for kids because of their greater vulnerability. We've made a number of recommendations about making sure that these behavioural nudging techniques are not used. We shouldn't be nudging individuals generally, and certainly not children, into making bad decisions and bad privacy decisions. There needs to be work on that.

There have been reports on social media being addictive, including for children. Sometimes the business model is to try to encourage them to stay longer, because that's what generates more revenue. That has to be taken into consideration with children, who have been online more and more during the pandemic and since then with school. I've seen it, and parents have seen it.

We need to adjust to this new reality as parents, children and society as a whole, so that there's a greater awareness of what this means and what their rights are.

Bill C-27 proposes a right to disposal. That's informing.... When I say that children have a right to be children, that's what I'm alluding to. Children do things online. If it stays online forever, then they're treated as adults right from when they're teenagers. It stays forever, and it could be used against them for jobs and so on and so forth.

We need to deal with this. Bill C-27 will deal with it to some extent, but we certainly need to build greater awareness of it as we are living more and more in a digital world. It brings innovation and it brings great things, but we need to be well equipped to deal with it and we need to learn about it. I would hope to see mandatory training in schools early on, so that individuals can get the tools early on.

We'll get these reflexes. We're going to ask questions. We're going to ask why they need this information. We're going to learn to see what a good privacy policy is, and if it's not, we're going to learn how to complain about it so that it could become a good privacy policy in the future.

That way, we're creating ambassadors for privacy everywhere.

October 25th, 2023 / 6:10 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

That's why the European model, which is the General Data Protection Regulation, the model in Quebec, which is Law 25, and the model proposed in Bill C‑27 provide for a maximum amount of $10 million, for example, or 3% of sales, whichever is greater. I think that addresses the issue you raised.

If a company has significant sales, $10 million isn't a lot; setting a percentage addresses that.
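
To show why a percentage addresses the concern about large companies, here is a minimal worked sketch using the $10-million and 3% figures cited in the testimony. The revenue amounts are hypothetical, and the exact statutory formula should be taken from the bill itself.

```python
# Sketch of a "greater of a fixed cap or a percentage of revenue"
# penalty model, per the figures cited above. Revenues are hypothetical.

def maximum_penalty(global_revenue: float,
                    cap: float = 10_000_000,
                    rate: float = 0.03) -> float:
    """Return the greater of the fixed cap and rate * revenue."""
    return max(cap, rate * global_revenue)

print(maximum_penalty(50_000_000))     # 10,000,000.0 -- the fixed cap binds
print(maximum_penalty(5_000_000_000))  # 150,000,000.0 -- 3% exceeds the cap
```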

October 25th, 2023 / 6:05 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Again, in terms of meaningful consent, Bill C-27 would make the requirement stronger by explicitly saying that the information has to be provided in a form that the person can understand. That's what we look at. This is very complex information. How are you giving those notices? Are you giving a notice that only an expert will understand?

Even if you're an expert, you may be reviewing this at the end of the day. You may be tired. You may be bombarded with so many things. Every time you go on a website, you get a cookie page or whatnot. We provide a number of tips in that guidance: Make it user-friendly. Make it not just a one-time thing. Make sure that you sometimes provide follow-ups. Make it as understandable as possible. In the context of children, make it appropriate to the child. Maybe there are opportunities for video or other ways.

The goal is to provide the information so that individuals can understand what's going on and to bring that same innovation.... We often talk about innovation requiring data, and that's true, but let's use innovation to protect data. That would assist in terms of the consent and the explainability.

October 25th, 2023 / 6 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

There are extra issues, in the sense that we will generally consider minors' information or children's information to be “sensitive information”. That brings with it greater obligations in terms of care and in terms of methods of consent.

We've issued guidance under current law about obtaining meaningful consent. We are expecting organizations to make it user-friendly and so on, but specifically with respect to children, there are circumstances in which they won't be able to give that consent. They may need a parent to do that if they're below a certain age. In our current guidance, although certain provinces might take different views, for us, if they're under 13 years old, there's almost a presumption that you need that parental consent.

It certainly has to be considered in how you look at information. They will have different needs. They will have greater vulnerabilities. That is something that's recognized in the European legislation. It's proposed to be recognized in Bill C-27, which I certainly hope will happen.

October 25th, 2023 / 5:50 p.m.



Privacy Commissioner of Canada, Offices of the Information and Privacy Commissioners of Canada

Philippe Dufresne

Thank you, Mr. Chair.

I'm pleased to now turn to this part of the discussion. I thank the committee for its interest in the ways that social media platforms such as TikTok harvest, handle and share personal information.

The online world brings with it a host of possibilities for innovation and connection, but it also carries potential for significant harm, especially for young people.

As you know, my office, along with our counterparts in Quebec, British Columbia and Alberta, launched an investigation into TikTok in February. We are examining whether TikTok's practices comply with Canadian privacy legislation, and in particular whether it obtains valid and meaningful consent for the collection, use and disclosure of personal information.

We are also looking at whether a reasonable person would consider the purposes for which it handles personal information, in particular children's information, to be appropriate in the circumstances.

This matter is a high priority for my office, especially given the importance of protecting the fundamental right to privacy of young people, who represent a notable proportion of TikTok users. As a result of the ongoing investigation, there are limits to my ability to speak publicly about the company’s practices at the moment.

For that reason, I will focus my remarks today on the privacy principles that underpin my office’s approach to the digital world from the perspective of the privacy rights of children.

Growing up in the digital age presents significant new challenges for the privacy of young people. As children and youth embrace new technologies and experience much of their lives online, we need strong safeguards to protect their personal information and to govern how it may be collected, used and disclosed. Increasingly, their information is being used to create personalized content and advertising profiles that are ultimately aimed at influencing their behaviours.

Children have a right to be children, even in the digital world. As UNICEF notes in its policy guidance on artificial intelligence for children, young people are affected by digital technologies to a greater extent than adults. Young people are also less able to understand and appreciate the long-term implications of consenting to their data collection. Privacy laws should recognize the rights of the child and the right to be a child. This means interpreting the privacy provisions in the legislation in a way that is consistent with the best interests of the child.

I'm encouraged by statements from the Minister of Innovation, Science and Industry indicating that there is a desire to strengthen children's privacy rights in Bill C-27, the Digital Charter Implementation Act, 2022. My office has recommended that the preamble of the modernized federal privacy law should recognize that the processing of personal data should respect children's privacy and the best interests of the child. I believe that this would encourage organizations to build privacy for children into their products and services by design and by default. I was pleased to hear the minister signalling his agreement with that recommendation.

The law must have strong safeguards to protect children’s information from unauthorized access, and reflect greater consideration of the appropriateness of collecting, using and disclosing their information.

Earlier this month, my provincial and territorial colleagues and I adopted a resolution calling on organizations in the private and public sectors to put the best interests of young people first by, among other things, providing privacy tools and consent mechanisms that are appropriate for young people and their maturity level; rejecting the kind of deceptive practices that influence young people to make poor privacy decisions or to engage in harmful behaviours; and allowing for the deletion and de‑indexing of information that was collected when users were children.

I am happy to see this was included in Bill C‑27.

In closing, it's critical that government and organizations take action to ensure that young people can benefit from technology and be active online without the risk of being targeted, manipulated or harmed as a result. I expect that the findings from our investigation into TikTok will be informative not just for that company but also for other organizations that collect and handle children’s sensitive personal information.

I also look forward to seeing Bill C-27 progress through the legislative process in a way that will provide children and minors with the privacy protections that they need in this increasingly digital world.

With that, I will be happy to take your questions.