Evidence of meeting #109 for Industry, Science and Technology in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Nicole Foster, Director, Global Artificial Intelligence and Canada Public Policy, Amazon Web Services, Inc.
Jeanette Patell, Director, Government Affairs and Public Policy, Google Canada
Rachel Curran, Head of Public Policy, Canada, Meta Platforms Inc.
Amanda Craig, Senior Director of Public Policy, Office of Responsible AI, Microsoft
Will DeVries, Director, Privacy Legal, Google LLC
John Weigelt, National Technology Officer, Microsoft Canada Inc.

6:30 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you.

Mr. Masse, you have the floor.

6:30 p.m.

NDP

Brian Masse NDP Windsor West, ON

Thank you, Mr. Chair.

This is the quandary we're in. We have to trust either this legislation process or what we're hearing. This is where we're getting mixed messages from a lot of witnesses.

My question is for Mr. DeVries. You're the director of privacy legal for Google. Perhaps you can answer this.

I know you've paid several different fines and penalties, most recently in an antitrust lawsuit in the U.S. for $700 million. Specifically, you should hopefully know about the lawsuit you had where you were secretly tracking the Internet use of millions of people who thought they were browsing privately. For that, you were fined $5 billion.

Were Canadians caught up in that too? If that is the case, are we going to get compensated for it?

6:35 p.m.

Will DeVries Director, Privacy Legal, Google LLC

I'm not aware of that settlement applying outside of the United States, where it was made, but our changes to the incognito mode we offer in the Chrome browser—I'm actually using it right now—were made globally. All users are benefiting from those changes, which didn't change the functionality of the system but made it clearer what you were using and how you were using it.

6:35 p.m.

NDP

Brian Masse NDP Windsor West, ON

That's really helpful. I appreciate that.

The bottom line, then, is that the product had the same consequences whatever your national boundary. However, the U.S. and its citizens got compensation for the situation because the legal case went through there. Is that a good summary of how that took place?

6:35 p.m.

Director, Privacy Legal, Google LLC

Will DeVries

I would say that was not based on privacy law—the same kind of law that the CPPA in this bill would establish. That was based on private litigation. The private litigation approach in the U.S. is different from Canada's, as you know.

6:35 p.m.

NDP

Brian Masse NDP Windsor West, ON

That's where I'm struggling.

I appreciate the fact that you're open about saving on taxes. I personally get lobbied—as do the rest of the parliamentarians and senators when we go to the U.S.—against Canada's position on the digital tax.

I'm raising that because the position of your companies is to fight all of the fines and penalties that have been paid internationally, mostly in the United States and other places. However, Canada has the same products but doesn't get the same benefit of reparations. Then, on top of that, we even have to wait for taxes to come in from the OECD decision, which could take another decade.

I know that I'm out of time, Mr. Chair, but this is the disappointment that I have. The challenge Canadians have over those who are lobbying us on whether we act or don't act on this legislation is a matter of trust: trust the administration and regulations, or trust the companies and then keep a blind spot for Canadian consumers, because our laws right now don't even give us the same compensation that Americans and Europeans enjoy.

Thank you.

6:35 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Mr. Masse.

Mr. Vis, you have the floor.

6:35 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

Thank you to all of the witnesses for an excellent discussion today. I'm going to move very quickly because I have a lot to cover in a very short period of time.

Ms. Foster, proposed section 40 of the AIDA is in relation to proposed sections 38 and 39. It's for a case where a company or an individual has committed a crime in contravention of the act, and it would apply a penalty of up to “5% of the person’s gross global revenues in its financial year”.

Are you aware of any other legislation related to artificial intelligence that applies such a fine?

6:35 p.m.

Director, Global Artificial Intelligence and Canada Public Policy, Amazon Web Services, Inc.

Nicole Foster

I'm not aware of any jurisdiction that has such stringent fines or penalties.

6:35 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Can any of the other companies present comment on the fine contained in proposed section 40 of the AIDA?

6:35 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

Similarly, we're not aware. We're equally concerned about the remote-access provisions in the bill, which would allow the AI commissioner to remotely access data from all of our companies without any concern for privacy or protection of that data.

6:35 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Ms. Craig, answer very quickly. Is it yes or no?

6:35 p.m.

Senior Director of Public Policy, Office of Responsible AI, Microsoft

Amanda Craig

No, especially for the criminal provisions. We're not aware of other OECD partners taking that approach.

6:35 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

I'll go to Google.

6:35 p.m.

Director, Government Affairs and Public Policy, Google Canada

Jeanette Patell

No, we're not aware of anyone else taking that approach. I would point out that the test is also a likelihood to contravene, so it goes beyond what we tend to see.

February 7th, 2024 / 6:35 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you.

As to the data commissioner in proposed section 33 of the legislation, I'm particularly concerned about this, largely because in other panels we've heard—and it's well known—that the Government of Canada doesn't have the intellectual capacity to regulate AI.

I'm concerned that in this legislation we're going to be giving very broad powers to regulators, and in some cases, the minister may delegate those powers—in the bill's current form—to an artificial intelligence and data commissioner who reports directly to the minister. I'm concerned about this relationship because it may create a conflict among the multiple objectives of the Department of Industry, namely economic development and, in this case, protecting citizens from online harms.

Would any of the panellists be able to comment on whether they believe an artificial intelligence and data commissioner, in the context of artificial intelligence, may be better served if that individual or future government organization reports directly to Parliament and not to the Minister of Industry?

6:40 p.m.

Director, Global Artificial Intelligence and Canada Public Policy, Amazon Web Services, Inc.

Nicole Foster

I don't know if we have a super strong view about that other than to say—just to repeat my previous comments—that I think it is very difficult for one organization or one person to understand risk mitigation in financial services or health care. These are very complex use cases. I would strongly recommend that the government consider devolving responsibilities of overseeing AI in those sectors to those regulators specifically.

6:40 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Okay. Thank you.

Does anyone else want to comment on that?

6:40 p.m.

Head of Public Policy, Canada, Meta Platforms Inc.

Rachel Curran

I think it's a great idea to have an AI commissioner report to Parliament rather than to the minister of the department directly.

As we've said, the remote access provisions in here allow the commissioner to conduct audits and access user data in a company's possession. That legal power is unprecedented in liberal democracies, and even in non-liberal countries.

6:40 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you so much.

The next question I will direct to Google. When I said “Google” earlier, my Pixel 3 said, “Hey, Brad. How's it going?”

The first part of this legislation is on privacy. We've had a big discussion about sensitive information. I think it's equally important in this section of the bill.

What we haven't talked a lot about with respect to AI up until today is the impact it's going to have on children. I heard one witness earlier mention that we need to look at taking a proportionate approach for high- and low-risk systems. Actually, I think everyone has commented on this.

In the context of children, how would Google define sensitive information as it relates to its policies and technologies applied to children?

6:40 p.m.

Director, Government Affairs and Public Policy, Google Canada

Jeanette Patell

Thank you for the opportunity to respond to that.

Safety is our number one priority. We take a comprehensive approach to child safety. That begins with designing age-appropriate products, then providing settings and tools for parents and users to make the right choices for them and, finally, having policies that we enforce.

With regard to the specifics around the sensitivity of data from children, maybe my colleague Will can speak to how we approach privacy for children.

6:40 p.m.

Director, Privacy Legal, Google LLC

Will DeVries

Thank you.

With respect to the data related to AI, or any data, we're going to take the circumstances of children into account, but obviously not as a monolith. You're going to think of children who are younger, whose parents will be very closely involved in their use of our products and services. You're going to think about teenagers, who have more agency but still are a special audience that needs consideration—

6:40 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

I think that's the key point right there. I'm sorry for interrupting.

The United Kingdom designed privacy legislation that was proportionate to age. Do we need to do something similar for artificial intelligence regulation? Perhaps we need to be a little more explicit, as it relates to high-impact systems, with designating certain ages and with the impact the systems could have on the psychological development of children.

Go ahead, Mr. DeVries.

6:40 p.m.

Director, Privacy Legal, Google LLC

Will DeVries

I'm happy to talk about that with respect to data use in general. My colleague Jeanette could talk more broadly about this in the context of the AI bill.

I'd say for children overall, yes, we need something that gives us a framework as providers to design our products in relation to the age of our different users. That's the framework that has emerged globally. I think that's the same kind of idea we want to see here in Canada.

6:40 p.m.

Conservative

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you.

Finally, I have one more quick question for Microsoft related to national defence.

We haven't spoken a lot about national defence in the context of artificial intelligence at this committee, largely because national defence is outside the scope of this regulation. What's interesting is that we hear a lot about the possible harms that Canadians can face from artificial intelligence designed by national defence systems in countries such as Russia and China, which are being discussed right now at the foreign interference commission that's taking place.

What is Microsoft doing with the Government of the United States to counter some of the actions taken by China and Russia with respect to AI and destabilizing democracies like Canada?