Evidence of meeting #155 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Mark Ryland  Director, Security Engineering, Office of the Chief Information Security Officer for Amazon Web Services, Amazon.com
Marlene Floyd  National Director, Corporate Affairs, Microsoft Canada Inc.
John Weigelt  National Technology Officer, Microsoft Canada Inc.
Alan Davidson  Vice-President, Global Policy, Trust and Security, Mozilla Corporation
Erik Neuenschwander  Manager of User Privacy, Apple Inc.
Sun Xueling  Senior Parliamentary Secretary, Ministry of Home Affairs and Ministry of National Development, Parliament of Singapore
Hildegarde Naughton  Chair, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas
James Lawless  Member, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas
Damian Collins  Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Ian Lucas  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Jo Stevens  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

11:25 a.m.

Liberal

David Graham Liberal Laurentides—Labelle, QC

Then at no time did Amazon distribute anti-union marketing materials to newly acquired companies.

11:25 a.m.

Director, Security Engineering, Office of the Chief Information Security Officer for Amazon Web Services, Amazon.com

Mark Ryland

I'm not familiar with that particular scenario.

11:25 a.m.

Liberal

David Graham Liberal Laurentides—Labelle, QC

Well, the next time we have this committee, I hope we have somebody at Amazon who knows the policies of Amazon rather than AWS. You do very good work at web services, but we need to know about Amazon as a whole company.

I think that's all I have for the moment. Thank you.

11:25 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, David.

I did neglect Jo from the U.K., and—

11:25 a.m.

Member, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas

James Lawless

Chair, before you hand it to Ms. Stevens, I must apologize; we need to catch a flight.

I would just thank the committee for the engagement over the last few days.

11:25 a.m.

Conservative

The Chair Conservative Bob Zimmer

All right. Thank you. We will see you again in November of this year.

11:25 a.m.

Member, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas

James Lawless

Absolutely.

11:25 a.m.

Conservative

The Chair Conservative Bob Zimmer

Give our best to Ms. Naughton. Thank you for coming.

11:25 a.m.

Member, Joint Committee on Communications, Climate Action and Environment, Houses of the Oireachtas

James Lawless

Thank you very much.

11:25 a.m.

Conservative

The Chair Conservative Bob Zimmer

Go ahead, Jo.

11:25 a.m.

Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Jo Stevens

Thank you very much, Chair.

I want to go back to something you said earlier, John, about a board or a group that you have in place to look at sensitive uses of AI. Can you give me an example of what sort of AI deployment you would describe as “sensitive”?

11:30 a.m.

National Technology Officer, Microsoft Canada Inc.

John Weigelt

One area is the use of artificial intelligence in medical diagnosis. We look at three criteria: Can it approve or deny consequential services? Is there infringement on human rights or human dignity? Are there health and safety issues at hand?

In one case, researchers were training artificial intelligence algorithms on chest X-rays. They then wanted to put that onto the emergency room floor, and they said, “Hey, this might impact patients. We need to understand how this works.” Our committee came together. We reviewed the datasets. We reviewed the validity of that open-source dataset and the number of people represented in it. Then we provided guidance back to the researchers who were putting this in place. The guidance was along the lines of, “Hey, this is not for clinical use”, because software as a medical device is a completely different area. It requires certifications and whatnot.

However, if we're simply trying to assess whether artificial intelligence could learn from these scans, then that would be a good use. That's how we would tend to look at that situation.

11:30 a.m.

Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Jo Stevens

That's a really useful example. Thank you.

We do know, and there's plenty of evidence about this, that there is both deliberate and unconscious bias inherent in AI. I think there's quite a strong argument for a regulatory approach to govern AI deployment, much like we have in, say, the pharmaceutical sector. Before you can put a product on the market, you have to look at what the unintended side effects of a particular medicine might be.

What do you feel about that? Do you think there is an argument for a regulatory approach, particularly because, as we know, the current deployment of AI does discriminate against women and does discriminate against black and ethnic minority citizens? People are losing jobs and are not gaining access to services like loans and mortgages because of this.

11:30 a.m.

National Technology Officer, Microsoft Canada Inc.

John Weigelt

Absolutely. I think your point is well made about the unintended creep of bias into AI decision-making solutions, so we do need to guard against that. It's one of those engineering principles we're working hard on, to come out with guidance and direction for our teams.

There are some areas where we've advocated for very swift and direct action so that the technology is deployed more carefully and more deliberately, and one area is facial recognition software. It's to your very point: a lot of these models have been trained on a very homogeneous community and therefore don't reflect the diverse community that they must serve.

We are quite strong advocates for putting in place legislative frameworks around some of the consent regimes: whether you have signs on the streets saying that facial recognition is in use, whether you have measurements in place, what the difference is between public and private space, and things like that.

11:30 a.m.

Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Jo Stevens

How willing are you to make the work that you've been doing public? I appreciate that you may be doing it behind the scenes. That's great, but it would be useful to know what you and your colleagues are doing.

11:30 a.m.

National Technology Officer, Microsoft Canada Inc.

John Weigelt

Absolutely. Clearly, we need to do more to advise and alert the community about all the great work that's under way. We've published guidance around bots and how to make sure that bots behave properly, because we had our own negative experience with a foul-mouthed, bigoted bot that was trolled for a while. We learned from that. Our CEO stood behind our team, and we did better. Now we've provided guidance, and it's publicly available.

We have what is called the AI Business School, which has a complete set of lectures for business leaders on putting in place an AI governance model. We're working with that community to help them. We're working to evangelize the work that we're doing internally around our AI ethics overview.

Lastly, I would say that we're engaged in the 60-plus different regulatory guidance efforts that are happening around the world, so that we can start to socialize this from a practical experience perspective. Here in Canada, the AI impact assessment and the AI ethics guidance standard are being developed.

11:30 a.m.

Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Jo Stevens

It would be really nice to see a virtual assistant that is not a subservient female in the future. I look forward to seeing something different.

Thank you.

11:30 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Jo.

Now we'll get into our closing comments from our vice-chairs and then the co-chair.

Nate, would you start with your 60 seconds, please?

11:30 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I think if we've learned anything from the last few days, it's that we continue to live in an age of surveillance capitalism that has the potential for serious consequences to our elections, to our privacy and to innovation, frankly.

While it has been frustrating at times, I do think we have made progress. We have had every single platform and big data company now say what they haven't said previously: They are going to embrace stronger privacy and data protection rules.

Yesterday the platforms noted that they need public accountability in their content control decisions, and they acknowledged corporate responsibility for algorithmic impacts, so there is progress. But there is also a lot more work to do with respect to competition and consumer protection, and with respect to moving from an acknowledgement of responsibility for the algorithms they employ to real accountability and liability when there are negative consequences of those decisions.

I think there's a lot more work to do, and that will depend upon continued global co-operation. I think our Canadian community has worked across party lines effectively. This international committee has now worked effectively across oceans, in some cases, and across countries.

The last thing I will say is that it's not just about addressing these serious global problems with serious global co-operation among parliamentarians; it requires global co-operation from companies. If there is any last takeaway, it is that the companies simply didn't take it seriously enough.

11:35 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Erskine-Smith.

Next we will go to Charlie.

11:35 a.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

Thank you to our two excellent chairs. Thank you to our witnesses.

I think we have seen something extraordinary. I've been very proud of the Canadian Parliament and our willingness to be part of this process.

There's been some extraordinary testimony in terms of the quality of questions, and I've been very proud to be part of it. Two extraordinary facts are that we have never in my 15 years ever worked across party lines on pretty much anything, and yet we came together. Also, we have never, ever worked across international lines. We can thank a Canadian whistle-blower, Christopher Wylie, who opened the door to the digital Chernobyl that was happening around us.

As politicians, we stay away from complex technical things. They frighten us. We don't have the expertise, so we tend to avoid them, which I think was a great advantage for Silicon Valley for many years.

These things are not all that technical. I think what we've done these last two days with our international colleagues—and what we will continue to do internationally—is to make it as simple and clear as possible to restore the primacy of the person in the realm of big data. Privacy is a fundamental human right that will be protected. Legislators have an obligation and a duty to protect the democratic principles of our country, such as free expression and the right to participate in the digital realm without growing extremism. These are fundamental principles on which our core democracies have been founded. It's no different in the age of the phone than it was in the age of handwritten letters.

I want to thank my colleagues for being part of this. I think we came out of this a lot stronger than we went in, and we will come out even further. We want to work with the tech companies to ensure that the digital realm is a democratic realm in the 21st century.

Thank you all.

11:35 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Charlie.

Go ahead, Damian.

11:35 a.m.

Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Damian Collins

Thank you very much, Mr. Chairman.

I'd just like to start by congratulating you and the members of your committee for the excellent job you've done in hosting and chairing these sessions. I think it's done exactly what we hoped it would do. It has built on the work we started in London. I think it's a model for co-operation between parliamentary committees in different countries that are working on the same issues and benefiting from related experience and insights.

The sessions were split between what we would call the social media companies yesterday and the other data companies today. Really, what we're talking about is that, while they have different functions, these are all basically huge data businesses. What we're interested in is how they gather their data, what consent they have for doing so and how they use it.

Across the sessions, time and again we saw companies unwilling to answer direct questions about how they gather data and how they use it. Whether it's asking how Amazon and Facebook share data.... Even though this is widely reported, we don't know. My colleague, Mr. Lucas, asked about LinkedIn and Microsoft data being shared. It's possible to totally integrate your LinkedIn data with your Microsoft tools, and a quick Google search can tell you exactly how to do it.

I don't understand why companies are unwilling to talk openly about the tools they put in place. People may consent to use these tools, but do they understand the extent of the data they're sharing when they do? If it's as simple and straightforward as it seems, I'm always surprised that people are unwilling to talk about it. For me, these sessions are important because we get the chance to ask the questions that people won't ask and to continue to push for the answers we need.

Thank you.

May 29th, 2019 / 11:35 a.m.

Conservative

The Chair Conservative Bob Zimmer

I'll speak to the panellists first and then get into some closing comments.

I want to encourage you. You had promised, especially you, Mr. Ryland, to give us a lot of the documents that you didn't.... Various witnesses didn't have all the information we were asking for. I would implore you to provide the information we requested to the clerk next to me so we can get a comprehensive answer for the committee. We'll provide it to all the delegates here.

Something that's really going to stick with me is a comment by Roger McNamee about the term "voodoo dolls”.

I watch my kids. I have four children. One is 21, one is 19, one is 17 and one is 15. I watch them becoming more and more addicted to these phones. I've seen the work done by our colleagues in London about the addictive capabilities of these online devices, and I wondered where they were going with this. You see that the whole drive of surveillance capitalism, the whole business model, is to keep them glued to that phone, despite the bad health it brings to those children, to our kids. It's all for a buck. We're responsible for doing something about that. We care about our kids, and we don't want to see them turned into voodoo dolls controlled by the almighty dollar and capitalism.

Since we like the devices so much, I think we still have some work to do to make sure we still provide access. We like technology and we've said that before. Technology is not the problem; it's the vehicle. We have to do something about what's causing these addictive practices.

I'll say thanks and offer some last comments.

Thanks to our clerk. We'll give him a round of applause for pulling it off.

He has that look on his face because events like this don't come off without their little issues. We deal with them on a real-time basis, so it's challenging. Again, I want to say a special thanks to Mike for getting it done.

Thanks also to my staff—over to my left, Kera, Cindy, Micah, Kaitlyn—for helping with the backroom stuff too. They're going to be very much de-stressing after this.

I'll give one shout-out before we finally close—oh, I forgot the analysts. Sorry. I'm always forgetting our poor analysts. Please stand.

Thank you for everything.

Thanks to the interpreters as well. There were three languages at the back, so thank you for being with us the whole week.

I'll give a little shout-out to our friend Christopher Wylie, despite being upstaged by sandwiches. I don't know if somebody saw the tweets from Christopher Wylie: “Democracy aside, Zuckerberg also missed out on some serious sandwich action.” He suggested that I mail the leftovers to Facebook HQ. Maybe that's the way we get the summons delivered into the right hands.

I want to thank all the media for giving this the attention we think it deserves. This is our future and our kids' future.

Again, thanks to all the panellists who flew across the globe, especially our U.K. members, who are our brothers and sisters across the water.

Singapore is still here as well. Thank you for coming.

Have a great day.

We'll see you in Ireland in November.

The meeting is adjourned.