Evidence of meeting #7 for Access to Information, Privacy and Ethics in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

David Lyon  Professor Emeritus, Queen's University, As an Individual
David Murakami Wood  Director, Surveillance Studies Centre and Associate Professor, Department of Sociology, Queen's University, As an Individual
Christopher Parsons  Senior Research Associate, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual
Alain Deneault  Professor of Philosophy, As an Individual

11:40 a.m.

NDP

Matthew Green NDP Hamilton Centre, ON

Before we pass it over to Mr. Parsons, who can take the rest of the time, I would ask each of you—perhaps you would be able to submit your remarks to this committee in writing—for ways in which we can improve on the EU's general data protection regulation. It's always my intention that we have the opportunity as legislators to create global leadership in this regard, so I'd like for you to be bold and expect the impossible, demand the impossible.

Mr. Parsons, you've been identified by some pretty esteemed colleagues as being a subject matter expert. I'll leave the last two minutes to you.

11:40 a.m.

Senior Research Associate, Citizen Lab, Munk School of Global Affairs and Public Policy, University of Toronto, As an Individual

Christopher Parsons

Thank you for the question.

In the brief I submitted to the committee, there are a number of specific recommendations that I make throughout. I won't and can't go through all of them right now. However, the first one that I think is important for the committee to remember is that the ETHI committee actually did a study of the Privacy Act a few years ago. It heard from a number of esteemed experts and produced a report. I would recommend starting there to see what still resonates. I believe much of what's in there still does.

More broadly as it pertains to the current PHAC situation, I think it is important and essential that the Government of Canada, when it's obtaining datasets from private organizations, whether it be identifiable or de-identified data, whether it be aggregated or not, be able to demonstrate that meaningful consent was first received before that information was collected by those private entities and then shared with the government. The Privacy Commissioner of Canada should both be apprised of and be required to approve any and all such projects. Further, within the Privacy Act itself, there should be a requirement that privacy impact assessments are performed and are made public. Currently, that's not often occurring.

Shifting slightly to PIPEDA, one of the real problems here is that a series of private organizations collected information and subsequently disclosed it. That information was largely collected without the knowledge of individuals. Privacy policies don't work. They do not constitute meaningful consent. However, the Privacy Commissioner of Canada does have guidance as to what should be done. I believe there should be a requirement that this kind of guidance should be built into PIPEDA itself.

Furthermore, there will, of course, be situations where information is disclosed to government agencies and others. One way that Industry Canada has worked with industry in the context of law enforcement has been to recommend that private companies produce what are called transparency reports. I have more on this in my brief. I would argue that while that is a step in the right direction from several years ago, these reports are not mandatory. They should be; moreover, they should be more comprehensive. They should include not just law enforcement disclosures. They should also pertain perhaps to copyright information and, in this case, the sharing of aggregated and de-identified data, and to whom that is shared.

11:45 a.m.

Conservative

The Chair Conservative Pat Kelly

Thank you very much, Mr. Parsons.

My thanks to all of our panellists.

Members, we will reconvene with the second panel after the vote.

Until then, we are suspended.

12:25 p.m.

Conservative

The Chair Conservative Pat Kelly

Welcome to the second panel of our meeting.

Today's meeting had to be interrupted by votes, which of course is our first and primary responsibility as members of Parliament.

I would now like to welcome our witness for the second part of the meeting. We have Alain Deneault, professor of philosophy.

If we have a five-minute opening statement, then we should get a full round for our first four members, at six minutes each.

Take it away, Monsieur Deneault.

12:25 p.m.

Alain Deneault Professor of Philosophy, As an Individual

Thank you.

I'm a professor of philosophy at the Shippagan campus of the Université de Moncton, in the Acadian Peninsula. I teach ethics and environmental ethics courses.

I'd like to quickly provide five pieces of context.

First, as we know, the health policies surrounding COVID‑19 have led governments to adopt freedom‑destroying measures in terms of lockdowns, curfews and mandatory disclosure of medical information. These measures have led to the non‑renewal of contracts or dismissals and to electronic surveillance. The scientific basis for these decisions has often been debated and challenged. This has given some people the impression that public authorities are taking advantage of, or even exacerbating, the health situation to give free rein to unconstitutional practices.

Second, the technological infrastructure required to produce more big data at a faster rate leads to an increase in harmful environmental effects. To produce the big data that we use so much today, we need industrial server farms that consume a great deal of electricity, not to mention the 5G network that we must soon “accept” and the increasing production of information technology products in Asia. There are serious consequences in terms of greenhouse gas emissions, the depletion of rare metals and water issues. These consequences don't in any way point to sustainable practices in keeping with solutions to the environmental challenges that governments have claimed to be addressing in recent years.

Third, the production of big data, which comes from what I'll quickly call GAFAM, meaning Google, Amazon, Facebook, Apple and Microsoft—you understand that I mean the entire computer engineering sector—also constitutes a legal impoverishment from the governments' perspective. These companies, which hold a technical monopoly over what they generate, very often end up making law through the giant contracts that we must constantly accept when we use the software “given” to us.

These private ways of legislating result in law on which many court decisions are based. As you know, when it comes to information technology, representatives of these large companies often advise you, members of Parliament, since they have the best technical knowledge.

Fourth, this commercial stewardship of big data in the midst of the health crisis has been largely profitable for the major information technology companies, or GAFAM. The profits of these companies have increased by tens of billions of dollars, at the expense of SMEs and workers, who are far more trapped by the situation resulting from the health policies than these major companies.

I'll focus on the fifth point, even though I have very little time left. We'll discuss it later. The production of big data is, in itself, a totalitarian device. It involves monitoring the behavioural reality of subjects and making it predictable, even controllable. We know that, when we can monitor 150 actions of Facebook users, we know them better than their relatives do. When we can follow 300 actions, we know them better than they know themselves. It's a manipulation tool of the kind Cathy O'Neil summed up in the title of her book Weapons of Math Destruction.

I personally advocate not that we regulate this sector and make it ethical or acceptable, but that we prevent its production at the source. This should be done in the manner of wartime diplomacy, where parties sometimes agree to refrain from developing certain methods or processes.

12:30 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you, Mr. Deneault.

We now have Mr. Patzer, for six minutes.

12:30 p.m.

Conservative

Jeremy Patzer Conservative Cypress Hills—Grasslands, SK

Thank you, Mr. Chair, and thank you to the witness for his testimony here today.

Quite often, when we have an emergency.... We can look back at what happened with 9/11 and the level of surveillance and security at that point in time. We're now looking at the measures that are going on throughout this COVID-19 pandemic.

What are the risks here that, because of the extraordinary measures that we have gone through to collect all this data, the government is not going to relinquish some of the ways and means by which it is surveilling citizens? Are they going to let people revert back to normal? I guess this is kind of what we're looking for. Is there going to be a backing off in the amount of surveillance and the amount of data that's being collected here?

12:35 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

The risk issue is broad. The first mistake one could make here, and one that should be avoided (I am not saying that this is your case), would be to read things in light of a single criterion. We are not in a situation where everything is black and white. The issue is to look at several criteria and ask how much risk there is.

There may be risks in not using massive data, but we also have to take into account the fact that we are dealing with a totalitarian mechanism that consists in controlling people to such an extent and with such efficiency that we even make them susceptible to manipulation.

The risk is to trivialize surveillance and turn it into a management technique reduced to an almost technical modality, without gravity. This is what we have been doing for the last two years because of the emergency situation. In fact, we renew the health emergency every 10 days, in discrete increments, without justification.

There will come a time when we will extend the scope of these so-called emergency measures to citizens who will be deprived of their constitutional rights. We cannot treat lightly the fact that we can have information about people on the grounds, for example, that they have not been vaccinated—which is a constitutional right, by the way—or that they are participating in demonstrations, which are also constitutionally protected, in principle.

Therefore, the perceived risk is to generate a mechanism that, in the name of technical management, allows for unconstitutional attitudes, measures and processes.

12:35 p.m.

Conservative

Jeremy Patzer Conservative Cypress Hills—Grasslands, SK

I feel like we're getting very close to a threshold here of infringing too far into people's lives, and the data people are generating, their own data, is being used against them.

Is there an ethical concern about the use of this data?

12:35 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

In my opinion, the problem is the mechanism itself. It is inherently totalitarian. To monitor people's every action, every move and every purchase, to cross-reference that data, and thus make it so that we know these people better than they know themselves, is a problem right from the outset. It is the very possibility of generating this amount of information that we should be mobilizing against.

I'm not going to give you a lot of bibliographic data. However, look at the thickness of this book written by Marc Goodman, a former Interpol and UN employee, in which he sums up the technological crimes linked to mass data. I invite you to read this book, Future Crimes, the original version of which is in English. It shows the extent to which the citizens of states that allow this data to be collected and used, states that are then no longer governed by the rule of law, are structurally at risk of falling into an order where control is total.

I would have an example to give you, but I will let you ask a question so as not to monopolize your time.

12:35 p.m.

Conservative

Jeremy Patzer Conservative Cypress Hills—Grasslands, SK

Maybe you could give that quickly, and then comment really quickly as well on consent. Are people able to very clearly give consent to their data being used or taken?

12:35 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

Thank you for asking me this question. The answer is no, quite simply. Studies have been done on how difficult it is to really understand the contracts we are made to sign when we become users of these software programs that collect our data the moment we use them. We all know the saying: when we are given something such as software, it is because we are the product. It takes a legal background, and then some, to make an informed judgment about what we are signing up for when we use this software.

In any case, today, if you want to work and organize yourself socially, these instruments are coercive. Either you live in your basement and don't leave your house, or you use them, because society demands them. The issue, basically, is letting a totalitarian device unfold without any form of control and trying, after the fact, to patch things up in frameworks that will always be shaky, because the mechanism itself is problematic.

12:40 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you.

Now we will go to Ms. Khalid.

February 14th, 2022 / 12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you very much, Mr. Chair.

Thank you to the witness for his testimony today.

Perhaps I'll start by reframing the questions that have been posed so far. With respect to how the data was used in this specific instance, we have heard from various witnesses throughout this study that a balance needed to be struck in order for those COVID restrictions, for example, to be applied properly, in a way that limited the infringement on people's rights.

To our witness, would you agree that this mobility data helped us to better understand how people were moving and to better implement policies that protected people's health and safety, while also ensuring that their rights were protected as much as they could be?

12:40 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

First of all, to the question itself, I would like to answer with a question that explains the confusion in which we find ourselves as citizens faced with this mechanism: who can answer this question?

Who can know if this data is used in a fair way? Who controls it? Are the bodies that have access to this data not using it for purposes other than those for which it was intended to be used in the context in question?

We don't know. There is an opacity that arises at some point, ultimately, and no citizen has the time to check that out.

So we are strictly bound by relationships of trust. Whether we trust these entities or not, in the first instance we cannot verify any of this. Secondly, the question that arises here must be broader. I insist on this, ladies and gentlemen. The question cannot simply be about one tiny use; it must be about the mechanism and all its possible uses.

12:40 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you. I appreciate that.

Understanding the complexity of this whole conversation and this issue, I really hesitate to go down the path of whataboutism and philosophically talking about what the possibilities are, what the best practices are or what that perfect scenario is.

I've heard members and witnesses compare this to the 9/11 situation. In this instance, we're talking about a government using data to really protect and to develop COVID health policies for our nation, whereas, as we go down this path of more complex data and data production, as you mentioned in one of your five points, there is this role that private companies play that was not there with 9/11.

Can you compare, as our member talked about, what the distinction is between a government using this data and restricting a government's use of this data versus private companies doing so? What role does a government have to play in ensuring an adequate balance in the use of data?

12:40 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

First, I would like to put the situation into perspective.

For at least a year, we have known that the COVID‑19 pandemic is not comparable to the pandemics that struck down a third or half of the population in the Middle Ages. Indeed, this has been officially established by several countries in recent weeks.

We are dealing with a disease that has very clearly, in the past few months, been behaving in a way that can be characterized as endemic. It is particularly serious for certain categories of people, for example those who are gravely ill or older. Public policies should therefore be able to protect certain groups of people.

Personally, if I had to answer the question about the relevance of this research, this is what I would say.

First, the health system is underfunded; basically, that is the crisis. If the health system were not underfunded, we would be able to support and accommodate groups that are vulnerable to this virus.

Secondly, the problem is ecological. This is where we should invest and do research. It is an ecological problem because we are dealing with zoonotic diseases, as we have seen many of them since the beginning of the century. Ebola and H1N1, among others, are zoonoses caused by the loss of biodiversity.

We can always develop even more polluting and destructive techniques—I said this earlier and I would not like us to forget it—that create even more problems with regard to the causes of these epidemics. Furthermore, we must stop locking ourselves into advanced techniques, which are likely to be used by ill-intentioned entities or to be used excessively.

12:45 p.m.

Liberal

Iqra Khalid Liberal Mississauga—Erin Mills, ON

Thank you.

I have just a very short question.

I know you mentioned it a bit in your opening remarks with respect to private companies and big data being produced. Do you think government should be regulating that usage?

12:45 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

Yes, I think the government should ban it. It's hard to hear a statement like that, because it's not often made. Yet I think we should, as a precaution, make sure that this data...

12:45 p.m.

Conservative

The Chair Conservative Pat Kelly

Thank you. We're out of time for this round.

Now we will go to Monsieur Villemure for six minutes.

12:45 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Thank you, Mr. Chair.

Good afternoon, Mr. Deneault.

I'm going to ask you two questions and I'm going to give you time to answer them in the six minutes I'm allotted.

Until now, the government side has often told us about the benefits of the end purpose, regardless of the premise itself. You have talked about banning the production of data, for example. People have said that there are benefits, without regard to the rest. The situation is trivialized. In fact, the Minister of Health was evasive when I put the question to him.

In Le Monde diplomatique, you talked about mediocracy. You have in fact published a book entitled Mediocracy: The Politics of the Extreme Centre. You assessed the topic based on the following elements, among others: education, economy and culture. You mentioned that there was a loss of critical thinking.

Do you believe that this loss of critical thinking is also operative in government?

12:45 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

Critical thinking means trying to identify the ideological motivation for everything we are offered.

Why are we being offered such and such a thing?

Maybe, indeed, there is a benefit to using this data if it is done in a surgically relevant way. Let's face it, it's like putting a lid on a boiling pot. You're trying to control a mechanism that wasn't created to allow the Canadian government to deal with an epidemic. That's what critical thinking is all about, trying to identify the ideological motivation of products and social modelling. A mechanism has been created that allows for surveillance, that allows for control, that allows for predictability and manipulation.

I lived in East Germany. I saw people who could, if they wanted to, access the files that the Stasi had compiled on them. These files contained all sorts of entries, including telephone taps and the tailing of citizens who were considered to be undesirable elements of society. The people who had access to their Stasi files were terrified. Yet these files were nothing compared to what Google, Microsoft and Apple know about us. The Stasi files were nothing compared to that.

Today, if people can get access to the harvested data... I can tell you that it happens. Sometimes lobbyists go to public decision-makers and show them what they know about them. It's not pleasant.

When you find yourself in that kind of situation, then you think that there may be a tiny percentage of relevant uses that you can make of these instruments, but are they essential to those uses? I don't know, but I doubt it.

In any case, we cannot avoid asking the question in a general way. Today, there are a considerable number of books on this subject. As you can see, I've collected some myself, and I'm not working on that. They're all books criticizing the hold of digital technology on our lives, which dispossesses us intellectually and rationally.

12:50 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Do you believe that this kind of situation is likely to erode public confidence or trust in government institutions?

12:50 p.m.

Professor of Philosophy, As an Individual

Alain Deneault

It's interesting, because on COVID‑19 and health policies, there have been two ethical discourses. I refer to documents from the Quebec government on trust and transparency. In these government documents, which are written by in‑house ethicists, they say that for there to be trust, there must be transparency. Yet, at the same time, the message must be unique enough and unassailable enough to be accepted by minds that might capsize if the science were called into question.

Science thus holds a discourse that is supposed to generate confidence. In the case of the health crisis, we are always told about science and public management methods to generate confidence, but this is only achieved if there is no evidence that causes doubt. When measures are presented, we are informed of their benefits and told that, since we have been informed, we must believe in those benefits.

These same ethicists say we need transparency. People feel they have all the conclusive evidence to trust what they are told. However, officials should not say too much. That's what this document says. I could send it to you, if you like, for the committee's work. I am on page 15 of the Quebec government document entitled “Cadre de réflexion sur les enjeux éthiques liés à la pandémie de COVID‑19”.

12:50 p.m.

Bloc

René Villemure Bloc Trois-Rivières, QC

Since we only have a minute left, I'll go back to critical thinking. You know I have ethical reservations when I see what happened at the Public Health Agency of Canada, and when I look back at the WE Charity case and the Aga Khan case.

Is critical thinking still operative, or on the contrary, are we drifting towards a mediocracy?