Evidence of meeting #119 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Also speaking

Bianca Wylie  Co-founder, Tech Reset Canada
Maurice Stucke  Professor, College of Law, University of Tennessee, As an Individual

11:05 a.m.

Conservative

The Chair Conservative Bob Zimmer

Welcome, everybody, to the Standing Committee on Access to Information, Privacy and Ethics. This is meeting number 119. Pursuant to Standing Order 108(3)(h)(vii), we are doing a study of the breach of personal information involving Cambridge Analytica and Facebook.

The witnesses are Maurice Stucke as an individual and Bianca Wylie from Tech Reset Canada.

We'll start off with Ms. Wylie for 10 minutes.

October 4th, 2018 / 11:05 a.m.

Bianca Wylie Co-founder, Tech Reset Canada

Thank you very much for having me today.

I am here on behalf of Tech Reset Canada. We are an advocacy organization looking at the innovation economy, the public good and the impacts of the innovation economy on our society. I am really happy to get to talk to you today, because it means we're talking more about the issues related to technology.

The Facebook and Cambridge Analytica case has been one I have used often when speaking and doing public education and community events to highlight one core truth right now—there are a lot of unintended consequences coming out of the use of technology. Framing that as the reality we're dealing with, I'm just going to share some remarks regarding our work, what we have found in it and how it ties into this particular issue and, more broadly, data governance and technology and society.

Having said that, I spent some years running public consultations. I am currently living in Toronto, and one of the projects that is front and centre for me is Sidewalk Toronto. Is everyone in the room familiar with this project? It's a subsidiary company of Alphabet, a sister company to Google. It's investing up to $50 million to create a plan for a smart city on Toronto's waterfront. It's just a plan. There's no real estate transfer. It's about a year old now. What it has given us in Toronto, and I think others, is a very focused view of the level of education we have as people in this country to engage in this discourse around technology and society.

What I would like to say about all of that is that a lot of us have no idea what is going on, what data is, where our data goes, who has our data or how our data can be used. All of these issues, which are fundamental and central to making decisions about them, we do not have a good handle on.

I'm at almost the year mark of watching a company hold consultations with the public while knowing that nobody understands what anybody is truly talking about. As someone who has done public consultation and who holds the profession and the practice dear to my heart—and I think it is central to democracy—I am extremely troubled by the state of that project and by the idea that we should be making any kind of quick decision or policy. If we do that right now, I can tell you for sure that it will not be inclusive of the people who live in this country and what they want to do about the issues related to Cambridge Analytica, or to any sort of tech company and its relationship to people. I just want to set that up as one big thing, starting at a high level.

Another theme related to this that I think is really important to consider, whether it's Facebook, Google or any other large company, is that we're beginning to blur the line between the market and the state in this country. We're beginning to lose track of who's in charge of what, who's responsible for what, and the implications of data being used by non-government actors.

In this country, we work from a social contract. People give us data—us in terms of government—and people understand what government does with their data generally. We are introducing corporate actors into social situations, whether it's using Facebook to communicate and organize in a community and do many things, or maybe existing in a city. This sort of blurring of this line, I should hope, is becoming more visible to the people in this room. I think it is a thing of grave concern, and we need to delineate and understand who is in charge of this whole....

What's happening now is this enthusiasm for technology, and it's somehow making everybody forget what their roles are, that we have rules and laws, and that those are things that help us determine how our society looks. I don't think it was ever the intention to be enthusiastic about the innovation economy and have that then become governance of social impacts. I really don't think that was something that happened on purpose, and I think we need to be very aware of the fact that this is now happening regardless.

There is an article written in 1998 by a scholar named Lawrence Lessig that said “code is law”. Software code is, in some cases, determining.... They are not “law laws”, but they are determining social norms and the ways we interact with each other. I just think these are things we might not have understood as this began. I do not want to ever think—and I don't want anyone here to think—that people who are technologists even have a handle on the implications of all of this.

Having said those things, I have just a couple more points.

One of them is that democracy moves slowly. This is good. This stuff is hard. I would really caution everyone in this room to consider how much education we need to be doing before we can even be making decisions that are informed by people who live in this country.

I know there's a lot of enthusiasm, and everybody says tech moves incredibly quickly. We have agency over technology. Technology is not something that just pops up and doesn't exist because of humans and their agency, so we need to remember some of those facts.

Another thing to be very clear about is that we are blurring the lines between procurement and the influence of purchasing products, or using products, and how that trickles down to the people who live here.

In my opinion, what is happening in Toronto is problematic because you should not be making policy with the vendor. This is essentially what we're doing. We are allowing someone who is going to be a vendor to influence how the policy for said vendor's work will go. I do not understand how anyone could have thought this was a good idea to begin with. I don't think we should continue this for much longer. In these cases, we really need to be aware of the ways these two issues are linked to each other.

Another thing that relates to this is that we've been thinking about technology as an industry. I see that in this country, a lot of the narrative is about wanting to do well, wanting to be innovative, wanting to do the things that make us leaders in technology, and there being a lot of opportunity for prosperity and wealth development. This is true. However, there's also a much larger narrative about what it means to lead in the governance of technology and the governance of data, and Canada has an opportunity right now to lead.

You have probably heard a lot of good things about the General Data Protection Regulation in Europe. It's not perfect but it is definitely moving towards some of the things we should be thinking about. I am confident that if we really take this seriously, if we look at impacts and engage people better, we can lead.

This is an opportunity. There's a lot of fear and anxiety about what to do. If we don't go fast and we are very considerate in what we're doing, I see a great opportunity here for the country to show global leadership in what to do with data governance and governance around technology. I don't want us to miss that in this need to react to fear, anxiety or issues that are quite complicated. I really don't want to miss that point.

I also want to talk about opportunity as a technologist. I think it is something we need to think more about. How do we develop social and public structures that use all the wonderful things that technology can produce, more for public good and more within government? We need to look at our academic institutions and ask ourselves why we're not developing technology that we are using.

If you go out into our communities where people are talking about digital rights and digital justice, they are wondering why we aren't building tools that we could be using for community organizing or for social good—lots of the ways people use Facebook or other things.... Why aren't we doing better at building systems and building competency so that we can be building those products, figuring out different models, and thinking about how we can use these things within government?

I really want to stress this. The idea that government can't keep up with tech, or that there's a problem here because people in government don't.... This is not my belief. I'm telling you what I hear a lot. We really need to shut that down and start to show that if there is an interest in really using technology well across the board in our society, we can be intentional and make investments to make sure that happens. These are all opportunities for the country.

Again, when you respond to fear, you respond quickly, and I don't think that will be a good response. I think this case is a very good one to watch, as is the Sidewalk Toronto example. There are big issues coming out of here. There is nothing wrong. I will say this as a technologist: Everybody will think we are doing wonderful things for technology if we take it slow and figure out what to do.

This includes industry. It is not helpful to industry if you are not clear with them as to what the guardrails are, how their operations have to be law-abiding and how they can be encouraged to reflect some of the values that we as technologists think should be there in terms of sharing values, being open with things and considering things that aren't necessarily proprietary.

There are lots of ways to use technology. There are lots of ways to use math. We shouldn't think this is only a business thing. This is a social thing. There are a lot of really exciting things to do in there.

I'm trying to end on a hopeful note here because I truly believe there is great opportunity. I want to make sure we follow processes that ensure people are engaged in the development of what we're going to do next, and we do not rush that. There is no need. There is a lot of urgency in terms of not going fast. We need to really quickly decide that we are going to not go fast and be thoughtful about the process we follow from here.

Thank you.

11:10 a.m.

Conservative

The Chair Conservative Bob Zimmer

You finished within 10 seconds, so that's pretty good.

Next up is Mr. Stucke, for 10 minutes, please.

11:10 a.m.

Professor Maurice Stucke Professor, College of Law, University of Tennessee, As an Individual

Thank you very much.

I recently co-authored two books on the data-driven economy. The first, with Allen P. Grunes, is Big Data and Competition Policy, and the second, with Ariel Ezrachi, is Virtual Competition. In both books we discuss some of the benefits of a data-driven economy. We also discuss some of the risks, including algorithmic collusion, data-driven mergers and behavioural discrimination. I won't touch on those today.

I'd like to talk to you today about the risks if a few powerful firms monopolize our data. I'd like to break it up into four parts. First, what are data-opolies? Second, how have competition officials in the EU and U.S. viewed them? Third, from an antitrust perspective, do these data-opolies pose any risk of harm to consumers? Finally, I'll offer some closing thoughts.

First, what are data-opolies?

Data-opolies control a key platform through which a significant volume and variety of personal data flows. The velocity of acquiring and exploiting this personal data can help these companies obtain significant market power. In Europe, they're known as GAFA—Google, Apple, Facebook and Amazon. As these firms have grown in size and power, they have also attracted significant antitrust scrutiny, particularly in Europe.

In the United States, it's been relatively quieter. I'll give you a couple of stats. From 2000 onward, the U.S. Department of Justice brought only one monopolization case, in total, against anyone. In contrast, the DOJ, between 1970 and 1972, brought 39 civil and three criminal cases against monopolies and oligopolies.

One question is this. Is there a difference in the perception of harm across the Atlantic between the U.S. and the EU over these data-opolies? In the U.S., antitrust plaintiffs must allege actual or potential harm to competition. Ordinarily when we think of harm, we think of a cable company—higher prices, reduced output, lower quality. Superficially, it appears that data-opolies pose little if any risk of these traditional harms. Ostensibly, Google's and Facebook's services are free. Amazon is heralded for its low prices. Because of network effects, the quality of the products can improve.

If you have low or free prices and better quality, what's the problem? Some, such as the late antitrust scholar Robert Bork, have argued that there “is no coherent case for monopolization”.

One factor in this divergence may be the perceived harm. If there is a consensus over the potential harms, then the debate can switch to the best policy measures to address them. I've identified at least eight potential antitrust harms from these data-opolies.

The first is degraded quality. Companies can compete on multiple dimensions, including price and quality as well as privacy, so a data-opoly can depress privacy protection below competitive levels and collect personal data above competitive levels. The data-opoly's collection of too much personal data can be the equivalent of charging an excessive price. Data-opolies can also fail to disclose what data they collect and how they'll use the data, and they face little competitive pressure to change their opaque privacy policies. Even if the data-opoly were to provide better disclosure, so what? Without a viable competitive option, the notice and consent regime is meaningless when the bargaining power is so unequal.

A second concern involves surveillance. In a monopolized market, data is concentrated in a few firms and consumers have limited outside options that offer better privacy protection. This has several implications. One is government capture. The fewer the firms that control the personal data, the greater the potential risk that a government can capture the firms, using its many levers.

One risk is covert surveillance. Even if the government cannot obtain the data directly, it can try to get it indirectly. The data-opoly's rich data trove increases a government's incentive to circumvent the data-opoly's privacy protections to tap into the personal data. This is what happened with Cambridge Analytica. There are several implications of a security breach or violation of a data-opoly's data policies. A data-opoly has a greater incentive to protect its data, but hackers also have a greater incentive to tap into this data because of the vast trove it holds. While consumers may be outraged, a dominant firm has less reason to worry about consumers switching to rivals.

A third concern involves the wealth transfer from consumers to data-opolies. Traditionally, you'd think of a monopoly taking money out of your pocket. Even though the product may be free, data-opolies can extract significant wealth in several ways. The first is not paying for the data's fair value. The second is that data-opolies can get creative content from users for free, for example, from YouTube videos or contributions on Facebook. The third is that data-opolies can extract wealth from suppliers upstream. This includes scraping content from photographers, authors, musicians and newspapers, and posting it on their own websites. Finally, data-opolies can engage in what's called “behavioural discrimination”. Basically, this is getting us to buy what we would not otherwise want to buy, at the highest price we're willing to pay. It's a more pernicious form of price discrimination.

A fourth concern is the loss of trust. We can view this as a dead-weight welfare loss. Some consumers will simply forgo the technology out of privacy concerns.

A fifth concern is that the data-opoly can impose significant costs on third parties. Here in our work, we talk about the frenemy relationship that data-opolies have with app makers. They need these app developers in order to attract users to their platform, but once they start competing with them, they can have an enemy relationship. There are various anti-competitive practices they can engage in, including degrading the app's functionality. What is particularly important for you is that data-opolies can impose costs on companies seeking to protect our privacy interests. One example, which our book Virtual Competition explores, is how Google kicked the privacy app Disconnect out of its Android app store.

A sixth concern involves less innovation in markets dominated by data-opolies. Here we point out how data-opolies can promote innovation, but also hinder it. One tool they possess that earlier monopolies did not have is what we call “nowcasting radar”. They can perceive trends and nascent competitive threats well in advance of, let's say, the government antitrust enforcer, and they can squelch those threats by either acquiring them or engaging in anti-competitive tactics.

A seventh concern is the social and moral concerns of data-opolies. A historical concern of antitrust was about individual autonomy. Here, a data-opoly can hinder the individual autonomy of those who want to compete on their platform. A related concern is data-opolies making their products intentionally addictive. Here you have an interesting interplay between monopoly and competition. Ordinarily, a monopolist doesn't have to worry about consumers going elsewhere. Here, however, the data-opolies can profit by getting users addicted to spending more time on their platform. They can thereby obtain more data, target them with advertising and increase their profits.

The eighth concern is the political concerns of data-opolies. Economic power often translates into political power, and here data-opolies have tools that earlier monopolies didn't—namely, the ability to affect the public debate and our perception of right and wrong. Data-opolies, as shown in the Facebook emotional contagion study, can affect how we think and feel, particularly as we migrate to digital personal assistants and much greater interaction with the data-opolies' products. You have several risks. One of them is bias. The news we receive will be more filtered, creating echo chambers and filter bubbles. The second risk is censorship. A third is manipulation.

Several themes, in conclusion, run through my papers.

The first theme is that the potential harms from data-opolies can exceed those from monopolies. They can affect not only our wallets. They can affect our privacy, autonomy, democracy and well-being.

Second, markets dominated by these data-opolies will not necessarily self-correct.

Third, global antitrust enforcement can play a key role, but here, antitrust is necessary but not sufficient to spur privacy competition. There really needs to be coordination with the privacy officials and the consumer protection officials.

Thank you.

11:20 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you.

First up is Mr. Erskine-Smith for seven minutes.

11:20 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

Thank you very much.

Mr. Stucke, you talked a lot about potential harms. Facebook, Google, Amazon, Apple—they've all existed for quite some time, or in my lifetime. Show me some actual harms.

11:20 a.m.

Prof. Maurice Stucke

Let's start off with privacy protection. There's a perception that consumers aren't concerned about their privacy, but if you look at the data, it actually shows that consumers are resigned about privacy. They want greater privacy protection—this goes across age groups, not just the older group—but they don't really feel they have any power to get it.

Then think about Facebook and Cambridge Analytica. There was this whole “delete Facebook” movement. Nonetheless, when Facebook reported its first quarterly earnings after the scandal broke, it did not take a hit on either its number of users or its revenues. In a competitive marketplace, you would think that consumers would then get products and services tailored to their privacy interests, but they don't.

The other thing is to just look at the EU and the Google shopping case. There you can see the power that the platform can have in promoting a product. According to the European Commission, Google recognized that its product was subpar, yet it was able to allocate traffic in such a way as to promote its own products, putting its own product on the first page of the search results and hiding competitors' products on the fourth or later pages. That had a significant impact on rivals.

That's a concern. I mean, we went through the annual reports of companies, and one of the things they identified as a risk was their dependency on these super-platforms and how these super-platforms, in hindering the functionality and the like, can really adversely affect them. We have the example with the Google comparison shopping case.

I could go through all eight that are in my paper, which was published by Georgetown University, and give specific evidence for each of those eight.

11:25 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I have only four minutes left, so rather than doing that, let's turn to solutions.

We tabled a report in the House in February of this year recommending changes to PIPEDA. We tabled an interim report in June recommending some additional changes to better protect privacy. I'm not sure whether you've read those reports.

Where do you see the answers?

11:25 a.m.

Prof. Maurice Stucke

One thing is that there is not a simple answer. The way I look at it, you can look at ex post, after-the-fact measures, such as increased antitrust enforcement. That would be one thing.

11:25 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

What does that look like in this case—that Facebook can't acquire Instagram? What are we talking here?

11:25 a.m.

Prof. Maurice Stucke

That would be one. Right now, Canadian competition officials, as well as U.S. competition officials, have a very price-centric focus on mergers, so one step would be improving their tools for assessing non-price effects, including data-driven mergers.

One way would be more informed antitrust enforcement. That's ex post. Then you would have, ex ante, GDPR-like requirements that could help kick-start privacy. That might be greater data portability so that users can transfer their data. Another might be greater resolution on who owns the data and on the property rights an individual has with regard to personal data.

I would look at it from both an ex post and ex ante perspective.

11:25 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

With the rest of the time, Ms. Wylie, you in general terms set out some of the big picture problems. If we were to get more granular at some of the solutions this committee should be looking at.... In February we recommended data portability. We've recommended privacy by default. We've recommended not quite GDPR-level standards in some cases because we didn't recommend such a strong version of the right to be forgotten and suggested further study, but certainly well above where we're at right now.

Are we missing anything, and if so, what are we missing?

11:25 a.m.

Co-founder, Tech Reset Canada

Bianca Wylie

The issue is bigger than privacy. You need to go up a notch and get into ownership and control because we're talking about issues of power. Privacy is definitely an issue.

I'm going to give you the example from the Sidewalk Toronto project that concerns me and might indicate a solution. One track...and I totally agree. I think we're looking for a bundle of solutions here, not one. In city planning people are looking for more data. They say they need better data, more data, and they need to use that data to inform public service delivery. We should not be losing control of the inputs to our policy creation, whether it's in vehicles or ways of getting at data or ownership or access to data. This is just one little example that applies across every piece of every policy in this country. Honestly, we cannot lose access to data that we need to make policy and that is what is further down the line. When you lose control of data, that for me is terrifying.

11:30 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I read about the café where they give coffee to students for free. Students have to share information about what they're studying and, I think, what year they're in, and then they get free coffee as long as they're in university. It's an exchange. They're giving out something that isn't particularly meaningful to them but that, when aggregated, is quite useful for the company. It is a market exchange. How do you get away from that?

To put it more bluntly, Google is giving me a service for free and I'm giving them data, and now we want to say that data has to be used for a public good, but I've given it to them and I've gotten an exchange of value.

11:30 a.m.

Co-founder, Tech Reset Canada

Bianca Wylie

This is early days of thinking about this for me, but we are talking about ownership and control of data. We need to start thinking about usage. We need specificity from people as to what they're doing with data, and to start to negotiate at a more granular level what can be done with people's data, because right now people are getting open-ended access. At the other end of the contract, the person who is getting the coffee is not being given any real insight into what might be done with that data after the coffee. Shifting thinking to usage means being clearer about how people's data are applied in particular cases and what the true exchange is, rather than opening it all up so that you've completely lost track of how your data is being used.

11:30 a.m.

Liberal

Nathaniel Erskine-Smith Liberal Beaches—East York, ON

I'm out of time. I hope to come back though.

11:30 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thanks, Nate.

Next up for seven minutes is Mr. Kent.

11:30 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Thank you, Chair.

Thanks to you both for very helpful, very informative presentations today.

Just to start, Ms. Wylie, with regard to your concern about Sidewalk Labs and the Toronto waterfront revitalization partnership, the Auditor General of Ontario has actually launched a value-for-money study to find out exactly what the details are. She has questions about some of the issues that you raised, including whether the assignment of a very large and valuable part of downtown Toronto to the control of Google's sister company for $50 million was a deal worth the value they've placed on it.

I'd like to start first with one of the little-explored areas of the new U.S.-Mexico-Canada trade agreement announced this week. We're still waiting for details on specific points with regard to digital data from the Canadian government. There are translation issues to be resolved. From the office of the U.S. trade commissioner, under what he considers to be the key highlights of the digital trade chapter, there is, to me, a very concerning point, which says:

The new Digital Trade chapter will....

Limit the civil liability of Internet platforms for third-party content that such platforms host or process, outside of the realm of intellectual property enforcement, thereby enhancing the economic viability of these engines of growth that depend on user interaction and user content.

This would seem to be a strengthening of, as you say, Professor, the data-opolies' rush for revenue-generating profit, as opposed to concerns for protecting individual privacy. It's been suggested by some tech commentators here in Canada that this digital trade chapter will in fact make it much more difficult for governments like ours to set new standards, whether closer to the GDPR protection regulations or not, and would basically allow Facebook to remain aloof and above any investigation of Cambridge Analytica's bad or illegal practices.

Professor, could you respond first?

11:35 a.m.

Prof. Maurice Stucke

I'm unfamiliar with that provision, so I can't speak directly on that point. More generally, the point is well taken that if these companies have very little to fear in terms of liability, their incentives can be askew. To Bianca's earlier point, there was discussion about a fair exchange. The point she raises is correct: users may get a free cup of coffee, but they don't necessarily know, first, what the value of their data is; second, who else can have access to that data; or third, how that data could be used to profile them. That could have significant implications, not only economic implications but also implications for governance, for start-ups and the like.

Any limitation of the liability of these data-opolies should be scrutinized quite carefully to ensure that the incentives the data-opolies have are aligned with citizens' interests.

11:35 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Would that apply also to the responsibilities of those who use their platforms? In other words, would it apply to third parties, as in the case of Cambridge Analytica, AggregateIQ and Facebook, and that interaction where there seem to be claims of plausible deniability among all of the players in this scandal, because data came and went and was manipulated or processed and applied?

11:35 a.m.

Prof. Maurice Stucke

Exactly. Here you have consumers whose data was being used in ways they could never have envisioned, as with Cambridge Analytica, I think. I think it's telling, because if you've had companies come to you and say, look, we're going to promise greater transparency and the like, but they're not going to hold others who have access to that data accountable, then that's a real problem.

11:35 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Ms. Wylie, what are your thoughts?

11:35 a.m.

Co-founder, Tech Reset Canada

Bianca Wylie

To build on what Maurice opened with, the major theme is power asymmetry. We're sitting with this big power asymmetry, and technology entrenches those things. Anything that takes what exists now and just entrenches it further will only accelerate all of these negative impacts. I'm the same: I'm not familiar with the specifics, but if what this is doing is trying to hold onto the status quo, that's not a good thing.

11:35 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Again, the U.S. trade commissioner calls this an unprecedented accomplishment in the area of digital trade, and it provides for the movement of data across borders, which in some cases would raise concerns for Canadians and their privacy.

I have another key question that I'd like to pose to you, Professor, but I'll save it for my next round when I have a little more time.

I'll yield. Thank you.

11:35 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Kent.

Next up, for seven minutes is Mr. Angus.