Evidence of meeting #135 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Amanda Clarke  Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual
Jeffrey Roy  Professor, School of Public Administration, Dalhousie University, As an Individual
David Eaves  Lecturer in Public Policy, Digital HKS, Harvard Kennedy School, As an Individual

3:30 p.m.

Conservative

The Chair Conservative Bob Zimmer

We'll call to order meeting 135 of the Standing Committee on Access to Information, Privacy and Ethics, pursuant to Standing Order 108(3)(h)(vii), on the study of the privacy of digital government services.

Today we have witnesses as individuals. We have Jeffrey Roy, Professor, School of Public Administration, Dalhousie University; David Eaves, Lecturer in Public Policy, Digital HKS, Harvard Kennedy School; and last but not least, Amanda Clarke, Assistant Professor, Carleton University.

Before we get into it, I wanted to say that most of you have seen by now the release about the International Grand Committee. It was sent out at noon today. This morning I spoke with Damian Collins, the U.K. chair, as well. It's going to be a developing story as things roll out. We'll send out requests for groups to appear. Likewise, we'll be adding countries to those that we already have. That will be forthcoming. If you want any further information, feel free to ask.

Madame Fortier had submitted a witness. I just wanted to say, for the record, that you can submit witnesses at any point. We had a deadline just because we needed an initial number to get going. If you have a witness who you think would benefit the committee, send the name to the clerk or to my office and we'll make sure it gets put on the list. That said, I do want to get the witness list to you—where we stand right now—so you know where your witness is in the queue.

I have a question from Charlie.

3:30 p.m.

NDP

Charlie Angus NDP Timmins—James Bay, ON

I have two points. One is that I'd like us to look at a witness list. So far, we haven't done what is normally done on committee, which is to say how many meetings we'll have and then break down witnesses and decide whether we need them all. I'd like to do that.

I don't want to take any time from our wonderful, astute witnesses, but we're going into a week break and I would like to put out for attention that, given the Globe and Mail article on the allegations about SNC-Lavalin, and given what's being posted about the lobbying that was done, it will fall to Ethics to start to look at this, particularly the question of what kind of lobbying was being done by SNC-Lavalin.

I will be bringing a motion for discussion, because our committee will be expected to look at anything that has to do with allegations about improper lobbying that may have changed the direction of any kind of policy. I'll be bringing that forward at our next meeting.

3:30 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Angus. Are there any further comments?

Okay, we'll get going. To all the witnesses today, thank you for appearing. You have 10 minutes.

We'll start with Ms. Clarke—ladies first.

3:30 p.m.

Dr. Amanda Clarke Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Thank you very much.

My name is Amanda Clarke. I'm an Assistant Professor at Carleton University's School of Public Policy and Administration, here in Ottawa, where I hold the public affairs research excellence chair.

I've been researching and advising governments on digital government for the past 10 years. This work actually first began here. I used to be an analyst with the Library of Parliament, and I was on the scene when parliamentarians first started asking us questions about things like Twitter, Facebook and open data. It's very interesting to be back here speaking on some of these topics again.

My work in this field continued with doctoral studies at the Oxford Internet Institute at the University of Oxford, where I completed a Ph.D. study comparing digital government reforms in Canada and the United Kingdom. The U.K. portions of that research looked quite a bit at the “government as a platform” model that the U.K. government has instituted. You've talked a little bit about that and the Government Digital Service, so I'll be happy to speak to that in the questions.

The Canadian portions of that research most recently have been published in a book laying out the history and the trajectory of digital government in Canada, where I focus in particular on the tensions between some of the demands of digital government and our tradition of Westminster government in Canada.

I'm currently leading a research project on civic technologies and data governance. In particular, this work is unpacking the role that private actors play in digital government service delivery. It explores governance mechanisms that can be used to ensure more accountable and equitable stewardship of personal and public data.

I'm really grateful for the opportunity to speak to the committee today. I applaud you for putting what I think is a really important issue on the parliamentary agenda.

I'm going to focus on three topics. The first is the tensions and the complementarities between digital government services and privacy and security. Second, I want to look at data governance and the privatization of digital government services. Third, I'll talk very briefly about indigenous data governance.

On the first theme, the committee's study really aims to promote effective digital government services, while also protecting Canadians' privacy and the related issue of security. I think you're right to identify these objectives as potentially being in competition and to try to seek a balance between those priorities.

In discussing this balance, the committee and earlier witnesses have identified a number of ways in which it appears that federal public servants in particular are too lax in regard to the privacy imperative. There has been discussion about lost hard drives containing user data and about withholding information from the Privacy Commissioner regarding data breaches, for example. At the same time, in my research with federal civil servants, I regularly—

3:35 p.m.

Conservative

The Chair Conservative Bob Zimmer

Ms. Clarke, our interpreters are having a hard time keeping up with you. Can you just slow it down a little bit? Thank you.

3:35 p.m.

Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Dr. Amanda Clarke

Fair enough. I did have a coffee right before I came here.

3:35 p.m.

Voices

Oh, oh!

3:35 p.m.

Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Dr. Amanda Clarke

My students complain of the same thing. I'm sorry.

3:35 p.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

We all do it. Don't worry.

3:35 p.m.

Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Dr. Amanda Clarke

Okay.

All right. I'll take a breath.

The committee seems to be under.... I mean, there was a lot of discussion about the federal civil service not having an appropriately robust appreciation for privacy. At the same time, in my research with federal civil servants, I regularly hear an alternative narrative. That narrative presents public servants as, in some cases, overly zealous in their concerns over privacy and the related question of cybersecurity.

Now, many of you might respond with, “How can governments be overly careful when it comes to privacy and security? Shouldn't that always be top of mind?” But that view, if you buy into it, essentially allows those concerns to be a trump card. In many instances, that can directly undercut scope for much-needed innovation and improvements to the services that governments provide to Canadians. It can also really undermine the efficiency and effectiveness of the daily operations of the government, in particular when it comes to policy analysis. Oftentimes, this overzealous concern for privacy and security doesn't even really address real privacy and security questions.

There are three concrete examples. Many government offices don't have Wi-Fi, in part because overly risk-averse managers have decided that the risk to security and privacy is simply too high. Many government officials similarly can't download the tools they need to do their work, such as free software online that would allow them to do sophisticated data analysis, or even really simple data analysis. They're often banned from accessing websites with information really pertinent to their policy work—websites that are regularly used by stakeholders and by service users.

Perhaps more significantly, in part due to privacy concerns, current legislation, vertical accountability regimes, and corporate information management strategies favour the siloing of data in the civil service. This approach really undermines the potential to produce important improvements, not only in service delivery but also in allowing for policy analysts to work with data across many different policy areas. That kind of crosscutting policy analysis that draws on data from multiple departments is increasingly important as we acknowledge that the policy challenges of today don't sit nicely into departmental silos. They are inherently crosscutting. They don't respect departmental boundaries.

In these instances, civil servants—this is a regular, daily complaint—really can't access the tools, data and people they need to do their jobs well. This creates work environments that are reinforcing the stereotype of government as being out of touch and not being innovative, which certainly does very little for the recruitment efforts of our current public service. More significantly, it ensures that we're going to continue to see substandard and failing government services that reinforce Canadians' already low levels of trust in the state.

To be clear, I'm not advocating that the government cast aside a concern for security and privacy. Rather, I'm suggesting that the approach the committee should be advancing is one that accounts for the trade-offs and costs to the efficiency and effectiveness that can come from overly prioritizing privacy and security without taking a more balanced approach. Here I'm advocating for permissive, flexible frameworks. What would those look like in practice? We can actually look to some current efforts already under way in the federal government that show there are some really promising efforts being taken by civil servants to strike the balance the committee is seeking. I'll just name a few for you.

First, the Government of Canada recently introduced a digital service standard that prioritizes privacy and security but provides means of upholding those principles while also developing services that meet users' needs. In addition, Canada has actually recently emerged as a global leader in developing very progressive frameworks on the responsible use of artificial intelligence in government. This work is attempting to balance, again, those imperatives of respecting principles of equity, democratic representation, transparency, and privacy and security while also being very innovative in how we use data to improve government services and develop more robust policy solutions.

Importantly, this work on responsible artificial intelligence was done in the open. It was developed with stakeholders and experts through a Google doc, giving it a degree of legitimacy but also adding an important level of transparency that I think we should be applauding.

Last, I'd point to the Canadian Digital Service, housed in the Treasury Board Secretariat and created in the 2017 budget. This is another place where the government is recruiting a fleet of talent with a lot of technological expertise but also bringing in some really sharp policy minds in order to balance the imperative of improving government services while also upholding principles of privacy and security. Here I mean top-of-class, industry-leading privacy and security standards that I think are really pushing some great innovations in government.

I think, then, that we actually face a very promising landscape, and the takeaway for this committee should be the need to keep reinforcing that work. This demands funding to enable hires and to build up staffing in these areas. It also means giving some of the existing units leading this work the legislative, policy and administrative levers they need to scale their work across the bureaucracy. While these are promising efforts, they really are just the beginning, and they're largely housed at the centre of government right now.

I have only a few minutes left, so I'll move to the second point that I wanted to discuss, which is on data governance issues that arise in the privatization of digital government services.

What I really want to hit home here is that it's important for the committee to acknowledge that many digital government services are not actually delivered by the state directly or at all. This was the early hope of digital government, in fact, that governments would release their data and others would use it to innovate.

That narrative has been nuanced quite a bit, and I think governments are much more realistic about this now. Nonetheless, there are many federal government services that we access through platforms the government doesn't own. I would turn you to the example of the TurboTax software, which, since 2012, more than 12 million Canadians have used to file their taxes, or something such as CANImmunize, an app developed in partnership with hospitals, but also partially funded and endorsed by a number of governments. This is a mobile application to keep track of vaccinations.

There are also countless other digital interfaces that we use to access government services. Some are directly procured by government; others aren't, but are endorsed by government, and some are independently run.

I think the important question to ask here is, when those digital interfaces not managed by government—and thus privately owned—become the only or the easiest way of accessing government services, what is the role of government and how can government ensure that the data those interfaces collect respects privacy concerns and also adheres to other principles of good data governance? When they are contracting private actors, let's say, to deliver digital services, governments need to be very aggressive in defining what data can be collected, how such data can be used and monetized, and who would benefit from that monetization.

We also need to be very realistic about citizens' capacity to give informed consent to some of these private services. One recent New York Times editorial calculated that if the average person were to read all the digital privacy policies they agreed to in a year, it would take 76 working days. I think that in our models of consent for some of these private services, we need to be a bit more thoughtful about this matter as well.

3:45 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Ms. Clarke.

3:45 p.m.

Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Dr. Amanda Clarke

Can I just put in the last one, which I really think we need to get on the agenda?

3:45 p.m.

Conservative

The Chair Conservative Bob Zimmer

You're a minute over already. If you can do it in 10 seconds, go ahead.

3:45 p.m.

Assistant Professor and Public Affairs Research Excellence Chair, School of Public Policy and Administration, Carleton University, As an Individual

Dr. Amanda Clarke

It's just a quick last point. I haven't seen it yet in the discussions, but some of the discussions the committee has had haven't been made public yet, so maybe it has been on the agenda.

I would urge the committee to specifically take up the issue of indigenous data sovereignty in its work. I'm not an expert on this, but I can suggest others you should speak to. There are unique concerns at play here about the way the Government of Canada collects and uses data relating to indigenous people, and in particular about the way services are delivered to those communities. Given the ongoing ways in which that data has been used to marginalize and oppress indigenous peoples, I think it's really incumbent upon this committee to carve out some space for that issue.

Thank you.

3:45 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Ms. Clarke.

Next up is Jeffrey Roy, for 10 minutes.

3:45 p.m.

Dr. Jeffrey Roy Professor, School of Public Administration, Dalhousie University, As an Individual

Thank you very much.

I'm just going to read a brief opening statement, in order to discipline myself to stay within our time limits today.

Good afternoon, Chair, members of the committee and esteemed colleagues. I wish to thank the committee for this opportunity today. I am pleased to participate in this discussion on such an important topic, namely how governments can both expand and improve digital service capacities, while protecting the privacy and security of citizens and all stakeholders.

I will begin by building upon a few of the comments made by the Privacy Commissioner in his thoughtful remarks to the committee last week. Three points stood out for me, in particular: first, the importance of the Government of Canada's data strategy road map; second, notions of barriers versus safeguards; and third, the Estonian model, as a comparator for Canada and other countries.

In my view, the data strategy road map is an important reference point in this debate, as the Privacy Commissioner observed. It is a comprehensive discussion of both opportunities and challenges, based on the cumulative efforts of both Liberal and Conservative-led governments over the past decade, as well as like-minded efforts across all government levels and the private and non-profit sectors.

Data-driven capabilities are now widely regarded as critical enablers of service innovation in today's digital age. Such capabilities often imply, and even necessitate, data sharing across multiple government entities, yet the road map aptly describes a fragmented public sector environment, often more vertical than horizontal, as Amanda noted, with a host of legislative and cultural barriers impeding a whole-of-government approach.

From his vantage point, the Privacy Commissioner observes that “what is a legal barrier to some may be seen as a privacy safeguard by others.” In my view, the essential task of this committee is to reconcile the inherent tensions embedded within this prescient observation with the shifting realities of today's society and the emerging opportunities presented by digitization. While I laud the critical efforts of the Privacy Commissioner to safeguard and enforce privacy rights, it is also the case that many legislative, organizational and political barriers do inhibit greater innovation through information and data sharing.

Several pilot initiatives across the country, across all government levels and often encompassing more than one government level, have demonstrated how information and data can be shared without sacrificing privacy. Nonetheless, such pilots all too often flow against the currents of traditional public administration and proprietary notions of protection and control.

In an era where openness and engagement are drivers of networked and agile governance models that challenge traditional hierarchies, privacy is bound to be a contested notion. While a large segment of society remains deeply concerned about privacy, others have simply written off the concept as dated and unrealistic. Bridging this widening cleavage requires trust in public and collective governance mechanisms, and key enablers of such trust are openness and dialogue stemming from political institutions, in large part.

In this regard, and to your point, Estonia is an enlightened example of a country embracing open-source technologies and leading-edge solutions for more integrated and online services. Central to that country's widely recognized success in this regard is the sustained and transpartisan political commitment to making digital transformation a societal project in the aftermath of the collapse of the Soviet Union.

In terms of political history and institutional structures, a closer comparator to Canada is Australia, which also presents a compelling case study. Despite widely reported digital failures and privacy breaches, which all countries experience, Australia has steadily climbed to the upper echelon of the United Nations global e-government surveys over the past decade—which is inversely correlated to Canada's performance—partly due to a robust political dialogue and strong engagement on digital matters by elected officials from both the House and the Senate.

Such political literacy helps to facilitate digital literacy across society at large. Australia has also recently created a new national agency, with both federal and state-level involvement, devoted to e-health solutions and, by extension, reconciling privacy and sharing in that critical space. While I have great respect for the boundaries and benefits of federalism, an important lesson for Canada in health care reform is the need for greater intergovernmental collaboration in devising new digital frameworks for shared policies and more virtual forms of delivery.

More broadly, in this country, the absence of more robust collaboration, particularly with respect to financing and shared political accountability, is a major inhibitor of greater progress in digital service innovation. The plethora of public sector service centres in large and medium-sized cities merely underscores this point, further encouraging each level of government to focus on its own service apparatus largely in isolation from the others.

Canada is not alone in facing such struggles, of course. I am presently engaged as a consultant to the OECD, assisting in a groundbreaking study examining digital government from subnational and interjurisdictional perspectives. An emerging theme from this project is the essential role of a holistic governance architecture for the public sector as a whole.

I would offer two final observations. First, privacy in a digital era should not be framed solely or even predominantly as a matter of rights. Citizens, too, have responsibilities in becoming “data activists,” to quote CBC journalist Nora Young in her book entitled The Virtual Self.

A new social contract for the digital era cannot be predicated upon unrealistic promises for unfettered privacy rights, especially in a world where governments must themselves challenge such rights for a host of reasons. Of course, the private sector also carries important responsibilities to customers and to all stakeholders. A more sophisticated dialogue is essential as a basis for public education and collective action. As well, in my view, new forms of more direct public engagement in devising digital service solutions are also warranted.

The final observation I would make is the essential role within the legislative branch for what I would call anticipatory capacities to better understand the challenges and the risks that lie ahead. The committee has undoubtedly heard experts discuss the potential of blockchain technologies, which some might associate with cryptocurrencies such as Bitcoin.

Beyond Estonia's widespread adoption of blockchain, Finland is deploying such technologies to deliver support services to refugees, while a separate Finnish pilot brings together agricultural producers and local governments in a shared effort to improve employment services in rural communities. The European Union has funded several like-minded blockchain pilots, and it is notable here that the European Parliament has appointed a special adviser on blockchain to facilitate collective learning.

In closing, I would commend this committee for its efforts as an important enabler of strengthened digital innovation in the delivery of public services, and I look forward to your questions.

3:50 p.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you once again.

Our last witness for today is Mr. Eaves, for 10 minutes.

3:50 p.m.

David Eaves Lecturer in Public Policy, Digital HKS, Harvard Kennedy School, As an Individual

Thank you.

Good evening, everybody.

My name is David Eaves. I'm a lecturer here at Harvard University. I teach technology in government and digital transformation at the Harvard Kennedy School. That said, I was born and raised in Vancouver Quadra, so I know Ms. Murray, who may be in attendance. I used to live in her riding until a few years ago.

I have also been advising on and thinking about transformation for about 15 years now. In fact, I appeared twice before the ETHI committee to talk about open data and my framework around open data, open information and open dialogue. It kind of turned into the policy framework that I think is still broadly used to organize transparency in government.

Today I want to talk a little bit about digital transformation and its impact on privacy. Particularly, I'm concerned with issues of governance and trust. One thing that the chair may, if he is so inclined.... Just today, I published an article in Policy Options about lessons from Estonia. It deals with some of the governance issues that I think are particularly pressing, questions that need to be asked. If it is of interest, it might be worth translating so that the committee can share it with all its members.

First, I just want to level set about what we're actually talking about when we're talking about Estonia, and what Estonia has done that makes it unique and worth talking about. There are really three things I think the members need to take away about what Estonia has done.

The first is that it has created a set of what we would call canonical databases, where it stores information about its citizens—that is, where you live, what your driver's licence number is and so on. All these things are stored in databases, but each type of information is stored in a single database: there's only one database for addresses, one for driver's licences, and so on.

The second is that the information in these databases is linked together because every citizen has a unique identifier. Everybody has their own number. The number gets attached to the information in those various databases, so it's easy to pull disparate information about a citizen together to get a very clear view of who that person is, and then to offer that information to different parts of government as they deliver services. This is a very different model from what you would find in most countries, including Canada, where these databases tend to be what my colleagues refer to as siloed. The information is stored in several places. It doesn't get shared. It's hard to get a full picture, and it's hard to gather all the information you have about someone, and that's why you have to keep collecting it over and over again.

Finally, the third big piece the Estonians have done is that they've gathered information, connected it to individuals through unique IDs and then made those databases—what I want to call “core infrastructure”—available to anybody who works in government, across all government agencies, so they can then leverage it to build new services or improve the services they already offer.

Those three innovations, for me, are at the core of what we're talking about, and if you don't understand those, then it's very hard to talk about the innovations or the costs or the dangers that are facing us if we want to go down that path. First, I just want to level set the committee around understanding those core issues.

Why does this matter? Just speaking a little bit to my predecessor Amanda Clarke's point, once you have this infrastructure in place, it's much easier to innovate and build new services. The core promise that the Estonian government makes to its people is that, by law, it will only ever ask for a piece of information from you once. If, say, the Canada Revenue Agency asks for your address, that means that if you go to the passport office, they'll already have your address on file and you won't have to give it to them again. The advantage of this is that, as you're building services as a government, you don't have to re-collect and re-store all this information. You have it in a single place, so you can leverage it when you build a new service and not have to ask for that information again, nor do you have to build all the infrastructure in that service to store and manage that information.
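To make the model just described concrete, a minimal sketch follows: separate canonical registries, a single personal identifier linking them, and a once-only collection rule so that a second service reuses an existing record instead of asking for it again. This is illustrative only; every name, structure, and identifier here is an assumption made for the example, not Estonia's actual implementation.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Dict, Optional


@dataclass
class Registry:
    """One canonical database: the single authoritative store for one kind of fact."""
    name: str
    records: Dict[str, Any] = field(default_factory=dict)  # personal_id -> value

    def get(self, personal_id: str) -> Optional[Any]:
        return self.records.get(personal_id)

    def put(self, personal_id: str, value: Any) -> None:
        self.records[personal_id] = value


# Separate canonical registries, all keyed by the same unique personal identifier.
addresses = Registry("addresses")
driver_licences = Registry("driver_licences")


def collect_address(personal_id: str, ask_citizen: Callable[[], str]) -> str:
    """Once-only principle: ask the citizen only if no canonical record exists yet."""
    address = addresses.get(personal_id)
    if address is None:
        address = ask_citizen()              # first contact: collect once...
        addresses.put(personal_id, address)  # ...and store it in the one address registry
    return address                           # every later service simply reuses this record


def citizen_view(personal_id: str) -> Dict[str, Any]:
    """Pull disparate facts together through the shared identifier."""
    return {r.name: r.get(personal_id) for r in (addresses, driver_licences)}


# Example: the tax agency collects the address once; the passport office never re-asks.
pid = "1985-0412-12345"  # illustrative unique personal ID
collect_address(pid, ask_citizen=lambda: "123 Main St, Ottawa")  # citizen is asked here
print(collect_address(pid, ask_citizen=lambda: "never called"))  # record already exists
driver_licences.put(pid, {"class": "5", "expires": "2026-08"})
print(citizen_view(pid))
```

In practice each registry would sit in a different ministry and the linking would happen over a secure exchange layer; the sketch only shows why a shared identifier makes reuse, and therefore governance, the central question.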

There are three key questions I would really like the committee to think about.

The first is that, as you're thinking about privacy information, I would love for you to be at least asking this question: What is the threat model that we're trying to protect ourselves against? There are predominantly two types of concerns people have about privacy, particularly in government. One is that they're worried about an external actor attacking the system and gaining access to data that the government stores about people. This is typically a foreign power. The fear is that it will then use that information to undermine the government or possibly even collapse confidence in government institutions and thereby cause people not to want to access information or not to trust the government.

The other core threat model that I hope a lot of time is actually spent thinking about is the internal threat model. I'm actually much more concerned about what my own government can do to me than I am concerned about what a foreign government might do to me. I'm significantly more concerned about what my own government can do to me than what a private actor might be able to do to me. In this particular example, this can range from a government engaged in surveillance to relatively narrow activities.

I'm particularly concerned about, for example, ex-husbands using their access to government information to track where their former spouses are living and what they are doing. We certainly have ample history of that happening in all sorts of places, particularly in police forces, but in other places as well.

Even in small ways, this happens and comes up on our radar. People may remember that when Rob Ford went to the hospital, his records were illegally looked at by multiple people within the hospital records system, and relatively recently, two of those people were charged and fined. That type of access, what you can do with someone's personal information and the way you can share it as an internal actor, in some ways, concerns me more than what an external actor can do. Who we are worried about matters a lot here.

The second piece is that, while I am concerned about internal actors, this does not mean I want to create so many burdens around using these types of systems, or gaining access to them, that they become unworkable. I very much want to echo Professor Clarke's point: increasing security can be good, but if it comes at the cost of usability, then you create a system that's highly secure and that no one can access or use. I have students here who work in the military and who tell me their laptops take 45 minutes to boot because they carry so much security. As a result, people don't tend to use their laptops. I'm not sure we want a system that's so secure that nobody will end up using it.

The third is that privacy is not actually absolute. We want some flexibility. I may not want you to be able to look at my health care records at any point, but if I'm dying in the street, as my colleague Jim Waldo says, I definitely want you to have access to my health care records, and I might not be in a place where I'm able to give you permission to do that. We need a system that, while secure, provides some flexibility.

My key recommendations on this particular piece are these: before any technical work happened on their systems, the Estonians did a lot of work updating their privacy laws for the 21st century and, more importantly, creating systems of logs and audits, so that individual citizens could see who was accessing their data, could question whether that access was legitimate, and could challenge the authorities accordingly.
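A minimal sketch of that log-and-audit idea, assuming a simple hash-chained, append-only access log that a citizen can query; it also folds in the earlier point about flexibility by allowing, but always recording, break-glass emergency access. The class and field names are illustrative assumptions, not Estonia's actual design.

```python
import hashlib
import time
from dataclasses import dataclass


@dataclass(frozen=True)
class AccessEvent:
    subject_id: str    # whose record was read
    accessor_id: str   # which official or system read it
    purpose: str       # declared reason ("passport renewal", "emergency care", ...)
    emergency: bool    # break-glass access: allowed, but always logged
    timestamp: float
    prev_hash: str     # hash chain makes after-the-fact tampering detectable

    def digest(self) -> str:
        payload = f"{self.subject_id}|{self.accessor_id}|{self.purpose}|{self.emergency}|{self.timestamp}|{self.prev_hash}"
        return hashlib.sha256(payload.encode()).hexdigest()


class AccessLog:
    """Append-only log of every read, queryable by the citizen it concerns."""

    def __init__(self):
        self.events: list[AccessEvent] = []

    def record(self, subject_id: str, accessor_id: str, purpose: str, emergency: bool = False) -> AccessEvent:
        prev = self.events[-1].digest() if self.events else "genesis"
        event = AccessEvent(subject_id, accessor_id, purpose, emergency, time.time(), prev)
        self.events.append(event)
        return event

    def for_citizen(self, subject_id: str) -> list[AccessEvent]:
        """What a citizen sees when asking: who looked at my data, and why?"""
        return [e for e in self.events if e.subject_id == subject_id]


# Example: routine access plus a break-glass emergency read; both are visible to the citizen.
log = AccessLog()
log.record("1985-0412-12345", "passport-officer-77", "passport renewal")
log.record("1985-0412-12345", "er-physician-12", "emergency care", emergency=True)
for e in log.for_citizen("1985-0412-12345"):
    print(e.accessor_id, e.purpose, "EMERGENCY" if e.emergency else "")
```

The data structure is the easy part; the governance questions that come up later in the testimony, such as who reviews the log and how a citizen challenges an entry, are the harder ones.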

The second thing that I'm particularly concerned about is whether building this type of infrastructure might break the social contract that government has with its citizens. This may be humorous to hear, but most people are often quite comfortable giving information to their government because they believe their government does not have the capability to actually use that information to know very much about them. They're willing to hand information over because they don't actually think government has the competency to weave information together to create a story about them.

In the type of world that the Estonian government has created, this is simply not true anymore. The government's ability to pull together information about someone and actually really understand the totality of that person's life is vastly increased. Estonia has a very specific history and context that allowed that to happen. It's not clear to me that this exists in Canada, so I would strongly encourage the committee to do outreach to the Canadian public to understand how much comfort there is in the public for them to have that type of experience, what they want the government to know about them and what they want the government to be able to do with it.

The particularly large challenge I think you will have is that the citizens will tell you they want two things simultaneously. They will want you to treat them as Amazon does, which means they will want you to recommend new services to them, and they will want customized experiences. They will not want to have to re-enter their information over and over again, but they will say, “Don't you dare use my data to figure out that I have not been filling out my tax forms correctly, or that I actually owe money to the government for this other reason, and I don't want you to invade my life in ways that will make me unhappy.” It's not clear to me that you can have one without the other, or if you can, it's going to require a fair amount of rule thinking in order to get to that place. I don't think we've even begun to have the public conversation to engage and educate the public about how to get to that place and rate what their comfort levels are about such a possible future.

Finally, I'm very concerned about who's going to end up building—and more importantly, controlling—the infrastructure that Estonia has built. These database systems and the unique identifiers that come with them.... I wrote a case recently about a similar system in India, and I went in thinking there was a way to build this infrastructure to prevent a future political actor or a future actor from abusing this infrastructure, and the short answer is that there is not. There is not going to be a technology solution to the types of problems of privacy that we're talking about. There may be technology that can help, but ultimately, we're going to be relying on governance solutions. What is the governance that's going to protect the public from current actors and from future actors?

There are three futures that I can imagine for us. One is that we decide that building this infrastructure is simply too scary, that a government that knows this much is not one that we're comfortable with.

There's a second model, which is that we build it the way the Estonians did: highly distributed, so different ministries own different parts of this core infrastructure and they're sharing their databases with other ministries. The dangerous piece about this is that I actually think the governance in some ways is quite weak; ministries may be unwilling to cut off other ministries' access to data if they're doing something inappropriate, because they fear retaliation from that ministry cutting them off.

Finally, the third option might be that we build it in a way that's highly centralized, where there may be new governance models around the central institution.

I'm almost done, sir.

4 p.m.

Conservative

The Chair Conservative Bob Zimmer

Go ahead and finish up.

4 p.m.

Lecturer in Public Policy, Digital HKS, Harvard Kennedy School, As an Individual

David Eaves

But there, I worry that a single actor would have control over this type of infrastructure and could use that control as leverage over other parts of government: preventing them from launching services, or forcing them to design services in ways that please the central actor rather than in the way Parliament, or the ministry itself, would like to offer those services.

My recommendation here is that a lot of investigation around the governance models needs to take place.

I'll pause there, and I can answer your questions.

4 p.m.

Conservative

The Chair Conservative Bob Zimmer

That's perfect. Thank you very much.

We'll start the first seven-minute round with Mr. Saini.

4 p.m.

Liberal

Raj Saini Liberal Kitchener Centre, ON

Good afternoon, everybody.

Mr. Eaves, I'll start with you first because you are living in the town where I also went to school. I went to Northeastern, in Boston, so let's start with you.

You brought up the concept of platform government. If we leave privacy aside for a moment and we look at the core infrastructure, which I think is an important way to recognize what is really involved, as you know, as a developed country we don't have any greenfields as Estonia does. It received its independence and it basically started from scratch. To some extent, part of India, depending on where you look, was also greenfielded. But we are an advanced country. We have advanced systems—systems that have been in place for 20 or 30 years. We have a way of doing business, and certain protocols.

However, when we look at Estonia, it's a unitary government, and there are only 1.3 million people. In Canada we have two issues. We have cross-department sharing of information, and also, because of the system of our federalism, each level of government controls different pieces of information. You have X-Road in Estonia, which goes across one level of government with separate databases, but here, in some cases.... Where I'm from, the Waterloo region, we have four levels of government: municipal, regional, provincial and federal.

When you talk about this infrastructure, and if we use the Estonia model, in which all information is not housed in one database but spread across multiple databases—which also provides a certain level of security—how do we do it in Canada, where you have four different levels of government with four different core responsibilities?

4:05 p.m.

Lecturer in Public Policy, Digital HKS, Harvard Kennedy School, As an Individual

David Eaves

All the constraints you have just mentioned.... The Estonians did have a greenfield, which meant that they did not have existing infrastructure, and it's much easier to build something from scratch than it is to try to basically rebuild a plane while it's flying in the air.

I think there are two answers that I would say need to happen simultaneously in order for us to do this. I actually think the technical challenges of building this are going to be significantly smaller than the governance challenges. Finding ways to get governments to agree on how to share information and how to share data is enormously difficult, so we'd better start getting the lawyers in the room together now because it's going to take many, many years probably for them to get to a place where they feel comfortable.

In fact, I was just chronicling this today. In the HealthCare.gov debacle, the data you needed in order to sign up for health care in the United States had to come from 12 different agencies just within the federal government. It took them, I think, a year and a half to negotiate agreements for one service among stakeholders just within the federal government in order to share data so they could pump it into a single system to do one type of service delivery. So we'd better start thinking about that now.

My other piece of advice on that is that if you start just doing that, it will never get done. You need a forcing function, so it might behoove us to find the critical service that we think would have the highest impact on Canadians, the one it would be most helpful to make easy, and start working today on that service and figuring out what data we need from various provincial stakeholders, local stakeholders, ministerial stakeholders and the federal government, and pull that in now to work on something very practical and very real. We shouldn't get overly ambitious. We should focus on one, and then we would probably learn a lot about what we need to be doing.

4:05 p.m.

Liberal

Raj Saini Liberal Kitchener Centre, ON

My second question is for Professor Roy. We have heard from other people that data collected by the government should be used only for the reasons for which it is collected. The term I think you used before is “data minimization”. When we look at the Estonia model, it's one-touch. In that regard, if you're going to have this system in Canada and advance digital government, there cannot be a continual repeat of information.

Now, the way Estonia works is that once you sign in, there is certain basic information—address, date of birth, social insurance number, or whatever they call it there—that is housed in one place, and from that place it goes to different areas. Again, that concept in law in Estonia, which I believe is one-touch, how do we do that here? How are we going to make sure that we can have the same effect? The purpose of digital government is to make things more efficient and easier. How do we put that in place here?

We look at the complexity of the country. We look at the population, which is 20 or 25 times greater. It's an advanced country in other areas. How are we going to be able to have that concept? If you don't have that concept of one-touch, then the efficiency won't be there and you won't get public buy-in, which is the other thing that I think all of you have spoken about.

4:05 p.m.

Professor, School of Public Administration, Dalhousie University, As an Individual

Dr. Jeffrey Roy

I think this is one of the most interesting contradictions or paradoxes of government right now, this notion of privacy protection and the idea of using information only for the reason it's collected.... Let's be clear, that contradicts a lot of what governments are promising to achieve with respect to more citizen-centric, more integrated service models. So there is that contradiction there.

Certainly David Eaves could speak about Estonia much better than I can, but prior to this committee I was looking at some of the data governance work that's been done by the Government of Australia over the past year. They're currently preparing a new legislative framework to address your question. They put out a thought-piece late last year talking about data sharing and reusability within the public sector, how that could work and how to make that work essentially with a privacy framework that recognizes the need for limitations and transparency.

To be very concrete, probably in the short to medium term at least, there will need to be an opt-out clause so that people who aren't comfortable can choose not to participate. I'll give you two examples. One is from B.C. several years ago, when they introduced the new integrated services card that brought together the driver's licence and the health care card. Working with the privacy commissioner in that province, the decision was made to allow citizens who weren't comfortable with that integration to opt out. I believe a small minority chose to do so, and that continues to this day.

The second example is with respect to digital health and the new agency that has been created in Australia to build a health record for every citizen. There, too, very clearly, there is an opt-out clause that allows individuals to have their digital record removed from the system. I don't know whether they do it themselves or whether they sign in and make a request. I suspect it will have to be a tiered approach where we create these new models, but there will be some opt-out.
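As a rough illustration of the opt-out mechanism just described, the sketch below gates an integrated, cross-program view behind a per-citizen consent flag while the underlying program records stay in their home systems. All names and structures are assumptions made for the example, not how the B.C. card or the Australian health record system is actually implemented.

```python
from typing import Any, Dict, Optional

# Per-program records stay in their home systems (illustrative data).
health_records: Dict[str, Any] = {"1985-0412-12345": {"immunizations": ["MMR", "tetanus"]}}
licence_records: Dict[str, Any] = {"1985-0412-12345": {"class": "5", "expires": "2026-08"}}

# Citizens can withdraw from integrated delivery at any time.
opted_out: Dict[str, bool] = {"1985-0412-12345": False}


def integrated_record(personal_id: str) -> Optional[Dict[str, Any]]:
    """Assemble a cross-program view only if the citizen has not opted out."""
    if opted_out.get(personal_id, False):
        return None  # each program continues serving the citizen separately
    return {
        "health": health_records.get(personal_id),
        "licence": licence_records.get(personal_id),
    }


print(integrated_record("1985-0412-12345"))  # integrated view is available
opted_out["1985-0412-12345"] = True          # citizen asks to be removed from the shared system
print(integrated_record("1985-0412-12345"))  # None: fall back to separate delivery
```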

Finally, I would go back to what I said earlier about your committee's work and the need to have a wider public conversation about what level of comfort citizens have in data sharing, and also bringing citizens more into the conversation, having perhaps citizens' advisory panels, citizens' oversight committees, to provide tangible input in understanding the trade-offs and the solutions going forward.