Evidence of meeting #49 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Clerk of the Committee: Ms. Nancy Vohl
Bianca Wylie, Partner, Digital Public
Matt Malone, Assistant Professor, Thompson Rivers University, As an Individual
Mary Francoli, Director, Arthur Kroeger College of Public Affairs and Associate Dean, Faculty of Public Affairs, Carleton University, As an Individual
Patrick White, As an Individual

3:30 p.m.

Conservative

The Chair Conservative John Brassard

I call the meeting to order.

Welcome to meeting number 49 of the House of Commons Standing Committee on Access to Information, Privacy and Ethics.

Today's meeting is taking place in a hybrid format, pursuant to the House order of June 23, 2022. Therefore, members can attend in person in the room and remotely using the Zoom application.

Should any technical challenges arise, please advise me. Please note that we may need to suspend for a few minutes, as we need to ensure that all members are able to fully participate.

Pursuant to Standing Order 108(3)(h) and the motion adopted by the committee on Monday, November 14, the committee is commencing its study of privacy concerns in relation to the ArriveCAN application.

I would now like to welcome our witnesses today.

3:30 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

For the benefit of the interpreters, I'd like to know whether the sound checks were done before the meeting.

3:30 p.m.

Conservative

The Chair Conservative John Brassard

Were the sound checks done, Ms. Vohl?

3:30 p.m.

The Clerk of the Committee Ms. Nancy Vohl

It wasn't necessary to do sound checks for the witnesses appearing in person, here in the room.

In the second panel, one of the witnesses will be appearing by video conference, and we'll let you know then how the sound check went.

3:30 p.m.

Conservative

The Chair Conservative John Brassard

Thank you.

I'd like to welcome our witnesses today.

From Digital Public, we have Bianca Wylie, who's a partner. As an individual, we have Mr. Matt Malone, assistant professor at Thompson Rivers University.

Ms. Wylie, the floor is yours, and you have five minutes for an opening statement. Thank you.

3:30 p.m.

Bianca Wylie Partner, Digital Public

Good afternoon. Thank you for the opportunity to speak to you about ArriveCAN today.

Our firm, Digital Public, does work focused on digital transformation, both in government and more broadly. I'm sharing thoughts today based on my experience working with software as a product manager and as a facilitator to support democratic process.

There is a long list of what went wrong with ArriveCAN. At the top of the list is the inequity in public service delivery it created and the damage it did to public trust in government, particularly during a public health crisis.

We can discuss the specific details of what went wrong together, but for the purpose of these short remarks, I'm going to share three proposals that may help us avoid replicating our ArriveCAN mistakes. The recommendations fall under three headings—equity, sovereignty, and democratic accountability and oversight.

First, on equity, most importantly, ArriveCAN should always have been a voluntary app. It never should have been mandatory. The first proposal here is to implement mandatory redundancy in our digital public service delivery. What this means is that if there is a digital way to access a public service, there always needs to be a non-digital mode as well, including in emergencies, one that is properly staffed and delivers just as high-quality an experience.

Two very telling things happened over the course of ArriveCAN that illustrate why we need this kind of policy as a gating mechanism to force equity in public service delivery.

First, the government roundly ignored the federal, provincial and territorial privacy commissioners who stated clearly that technology used during the pandemic must be voluntary in order not to destroy public trust. To quote from the 2020 joint statement by federal, provincial and territorial privacy commissioners entitled “Supporting public health, building public trust”:

Consent and trust: The use of apps must be voluntary. This will be indispensable to building public trust. Trust will also require that governments demonstrate a high level of transparency and accountability.

Second, the public service should have had a deep and clear knowledge of the access and digital literacy issues, the discomfort and the fear that mandating this technology created for people in this country. This is about public service ethics. Yes, we were operating under emergency powers. If anything, this should have increased the care taken to support comfortable human experiences. Instead, the moment was used to accelerate an underlying desire to modernize the border.

Our work of democracy is easing access to each other's care. The mandatory nature of this app did the opposite. It created barriers. It devalued the work and possibility of the public service.

My second proposal is on sovereignty: Do not deliver public services through apps and app stores, full stop. We should not be building the delivery of public services with and through digital infrastructure that we don't own or control. This should be a non-starter.

The app stores are for consumer products. They are not for government service delivery. There is also a significant issue with moving the work done by the public service away from physical interactions and into private devices done in private places.

One of the problems with homing in on procurement is that we talk about purchasing. We skip over what it would mean to build our digital infrastructure, which is a conversation we need to have more of.

Finally, on democratic accountability and oversight, a third proposal is to create an independent public advisory board to oversee ArriveCAN's ongoing development and use. This will help address transparency problems, open the code, explain where the data goes and how it's used, and engage with communities on changes and updates to the app. The app's development is funded into next fall, so there's lots of time to set up an improved oversight mechanism.

In closing, the development, design, launch and implementation of ArriveCAN were rife with digital governance issues and errors. We can do better in the future, but only if we understand, acknowledge and accept the harm caused by ArriveCAN and the lack of a defensible public health rationale for it.

Thank you. I'm happy to discuss any and all of this further.

3:35 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Ms. Wylie. I appreciate your staying to time because that will give us a lot more opportunity for questions.

Mr. Malone, you have five minutes, sir.

The floor is yours.

3:35 p.m.

Matt Malone Assistant Professor, Thompson Rivers University, As an Individual

Thank you, Mr. Chair.

My name is Matt Malone. I am an assistant professor at Thompson Rivers University in the faculty of law. I am attending the hearing today in a personal capacity, representing only my own views.

I'd like to thank the committee for this unexpected invitation and opportunity to discuss my privacy concerns regarding the ArriveCAN application.

After my opening remarks, I would be glad to answer the committee members' questions.

First, I would like to talk about how the government failed to take reasonable steps to ensure that personal information collected and retained by the app was kept safe. Unquestionably, the worst example of this was the glitch that sent 10,200 people who had correctly used the app faulty quarantine orders. The government’s response to and transparency about the glitch were appalling. Some affected users were not notified that they were victims of the glitch for 12 days. During those 12 days, the ArriveCAN privacy notice stated that disobeying a quarantine order issued by the app was punishable by a fine of up to $750,000, or six months in jail.

When I wrote about this issue in the Globe and Mail in August, I received numerous harrowing stories from Canadians. This correspondence made it very clear that many elderly and rural Canadians in particular were seriously affected. In my own experience, when I requested the personal information about me that was collected by CBSA through the app, it was not forthcoming for four months. When I finally received it, there were many errors in my personal information.

The foregoing suggests that the government failed to take reasonable steps to ensure that the personal information it collected was both adequately safeguarded as well as accurate, up to date, and complete, as required by section 6 of the Privacy Act.

Second, I want to talk about secrecy. CBSA has not been forthcoming with Canadians or Parliament, including this committee. On November 14, 2022, the CBSA president told the government operations and estimates committee that the CBSA spent 4% of its budget on ArriveCAN for security. But it has produced almost no records speaking to those efforts.

The work of the primary contractors involved in building ArriveCAN also raises serious concerns. Based on my review of previous access to information requests, extensive correspondence between GC Strategies’ managing partner, Kristian Firth, and Canada's chief technology officer, Marc Brouillard, shows that GC Strategies appears to operate more as an unregistered lobbyist than a primary contractor. As a primary contractor, it appears that the only real service they offer is secrecy, by subcontracting work through contracts that are shielded from disclosure as proprietary information. This is a deeply unsettling way to deliver government services that involve the mandatory collection and retention of Canadians’ personal information.

Third, I want to talk about the justification for the app. I have noted in my public and academic writings that the mandatory use of ArriveCAN did not meet the threshold under the Quarantine Act for emergency measures. Moreover, the government’s rationale for the app kept changing. This became most obvious following the introduction of the “advance CBSA declaration”, an optional feature that was inserted into the mandatory architecture of the ArriveCAN app. When the advance CBSA declaration was unveiled, it was done so hastily that the government did not include a privacy notice as required under subsection 5(2) of the Privacy Act. I believe this also likely implicated sections 4 and 7 of the Privacy Act.

Fourth, I want to talk about the government's disregard for existing oversight measures when it introduced ArriveCAN. Many of these measures were simply ignored entirely.

It is crucial to point out that the government disregarded key measures in a number of acts and directives—the Directive on Automated Decision-Making, for one.

Fifth, I believe this episode underscores the need for urgent reform in the access to information system. We need robust access to information that sheds light on the work of quasi-lobbyists like GC Strategies. Using such entities to deliver services that are making decisions about Canadians and are subject to neither disclosure nor review is concerning in the context of mandatory collection and retention of Canadians’ personal information.

Ironically, GC Strategies itself once pitched the Treasury Board Secretariat on using subcontractors to reform the access to information system's search function. The existing system needs more funding and more disclosure. Many of my own requests have been egregiously delayed. Some have simply been ignored. I'm happy to discuss those.

Finally, to echo the comments of my colleague Bianca Wylie, for whom I have great respect, I want to emphasize that the government should never have deviated from its own promises early in the pandemic that it would introduce health apps only on a voluntary basis. This was echoed and supported by a joint statement of all privacy commissioners, who came together to say the same.

I believe public trust is essential in driving successful technology adoption, and I believe this kind of trust cannot be mandated.

Again, I'd like to thank the committee for inviting me.

3:40 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Malone.

We'll start with our questions. The first round will be six minutes.

We'll start with Mr. Barrett.

3:40 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Thank you, Chair, and thanks, Ms. Wylie and Mr. Malone, for joining us here today.

I will start with you, if I can, Ms. Wylie.

What are the risks when security clearances are waived for some subcontractors working on an app that deals with Canadians' biometric data, personal health data and passport information? What are the risks to their personal privacy rights when something like that occurs?

3:40 p.m.

Partner, Digital Public

Bianca Wylie

They are numerous. When you don't know how data can move and evolve, and when you lose control of it and allow it to be used outside of the construct within which people thought it was collected, you can have problems.

One thing that's important to know is that data is so easily replicated and then adapted and moved that losing control over how it's managed or used is a serious risk and creates significant liability. Should there be reasons to have exceptions to these rules? One would hope that those would be made clear and would make sense. The rules are there for a reason. That's a question of process. If there's an exception, why?

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Right. All of our public service employees already have the requisite clearances. Keeping a program like this in-house and developing it over time would eliminate that particular risk.

You talked about some examples of how data can move and how it can later be used. Can you give us a brief example of one of those risks?

3:45 p.m.

Partner, Digital Public

Bianca Wylie

With any kind of data breach, it is difficult to follow where the data goes and how it's been used, and we're seeing how numerous breaches are. Really, this is why you want to minimize data collection in the first place: once there is a breach, it's very difficult to follow and understand what happened. This is one of those situations where you can't put the toothpaste back in the tube.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Right.

You talked about public trust. What impact does mandating the use of this technology have on public trust?

3:45 p.m.

Partner, Digital Public

Bianca Wylie

Thanks for asking, because I think that was the most significant outcome here. Without confidence in how the government is using data, the public can't trust it.

In this instance, the app was mandated, and there was certainly a lack of clarity from the government to the people as to how this data was being used, beyond the idea that we're in a pandemic, there's a crisis and therefore you must do X. When there are already issues with trust, that accelerates the distrust. This was so unnecessary, because some people like this app, and if they like it, feel comfortable using it and can consent, perfect. If someone is not that person, they need a great path to access public services too. The failure to create that path really inflamed this distrust, at a very difficult point in time.

We can see that this was a completely unnecessary loss of trust, and it happened. As it was happening, it was shocking to me. I don't know how much people here saw, but there were concerns about how this data could or couldn't be used, because it wasn't clear, and that accelerated and fomented distrust. That's the word to use here.

The obvious antidote is that you build alternatives for people. This lack of investment to make sure people were comfortable.... To Matt's point earlier, if you want to get into good digital service delivery, you're going to get there by building trust and bringing people along with you. You don't force it; you open it up. If you like the option, you use it and then you continue along.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

Thanks very much.

I'll turn to you, Mr. Malone, with respect to the governance issues that you see with the execution, but also with the development of the app.

3:45 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

The government has in place many policies and directives that should guide the development, construction and deployment of an app like ArriveCAN. What's mind-boggling in this instance is that it essentially threw all of these well-developed policies out the window. For example, the Directive on Automated Decision-Making states that an algorithmic impact assessment should be done when artificial intelligence is going to be deployed, that is, when the system is being constructed at the outset of the program. That never occurred.

The only algorithmic impact assessment that is available, to my knowledge, is one that was done a year and a half after ArriveCAN was introduced. The directive also says that the assessment should occur whenever the app is significantly updated. That happened on many occasions, but rather than adhering to its own policies, the government simply unveiled the updates in the app store. That's what got the government into trouble when it introduced the advance CBSA declaration into the iOS version of the app, because it was an update in June that caused the glitch.

3:45 p.m.

Conservative

Michael Barrett Conservative Leeds—Grenville—Thousand Islands and Rideau Lakes, ON

To be clear, with respect to the assessment of potential impacts, you're talking about the potential impacts that it would have on the user, like mandatory quarantine or, effectively, house arrest and facing possible jail time or substantial monetary fines.

Is that the type of impact?

3:50 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

No, not precisely—

3:50 p.m.

Conservative

The Chair Conservative John Brassard

Give a very short answer, please, Mr. Malone.

3:50 p.m.

Assistant Professor, Thompson Rivers University, As an Individual

Matt Malone

The directive has risk mitigation items, and they're slightly different from that.

3:50 p.m.

Conservative

The Chair Conservative John Brassard

Thank you, Mr. Barrett.

Next we'll go for six minutes to Mr. Fergus.

3:50 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Thank you, Mr. Chair.

Thank you to both witnesses for being here.

Ms. Wylie, I was fascinated by your remarks, so I have a series of questions based on what you've said here and in posts in which you express your views on the subject.

First of all, you raised concerns about the security of people's personal information. Are you aware that the Public Health Agency of Canada, PHAC, had asked the Privacy Commissioner of Canada to evaluate the safeguards relating to Canadians' data in the ArriveCAN app?

3:50 p.m.

Partner, Digital Public

Bianca Wylie

3:50 p.m.

Liberal

Greg Fergus Liberal Hull—Aylmer, QC

Are you aware that the commissioner had done an evaluation of the ArriveCAN app and found no major concerns?