Evidence of meeting #109 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Christopher Wylie  As an Individual

10:30 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Gourde.

Next up, for five minutes, is Madam Fortier.

10:30 a.m.

Liberal

Mona Fortier Liberal Ottawa—Vanier, ON

Mr. Wylie, thank you very much for contributing to our study today. Your input is extremely important.

We've heard a lot about what went on. I would like to focus, however, on the future and talk about transformation.

A few times, you said that steps had to be taken to protect the personal information of Canadians. Do you have any such measures to suggest to the committee?

10:30 a.m.

As an Individual

Christopher Wylie

The first thing—and this is a really straightforward example that I was speaking about before—is that, in the same way that you require transparency for donations and spending, you should require transparency for the use of information and advertising. When a party puts out an ad online, it should have to report that. It should also have to report whom the ad is going to.

Personally, I think that companies should do that, too. I don't see why not. I think it would be healthy for people to be able to scrutinize the advertising markets online, in general.
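Mr. Wylie's proposal amounts to a public registry of disclosure records for each online political ad. A minimal sketch of what one such record could look like follows; every field name and value here is an illustrative assumption, not an actual registry schema.

```python
from dataclasses import dataclass, asdict

# Hypothetical sketch of the disclosure record described above: each online
# political ad is reported along with who paid for it and who it targeted.
# All field names and values are illustrative assumptions.
@dataclass
class AdDisclosure:
    sponsor: str              # party or company that paid for the ad
    ad_text: str              # the creative that was shown
    spend_cad: float          # amount spent on this placement
    target_criteria: list     # audience segments the ad was directed to

report = AdDisclosure(
    sponsor="Example Party",
    ad_text="Vote for us on election day.",
    spend_cad=1500.00,
    target_criteria=["age 18-34", "region: Ottawa", "interest: environment"],
)

# A public registry could publish these records as plain JSON-style dicts.
print(asdict(report))
```

Publishing records in this shape would let anyone scrutinize both the spending and the targeting, which is the parallel Mr. Wylie draws with existing donation transparency.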

The other thing is that we have to understand that this is not always going to be a data issue. The development of algorithms and artificial intelligence going forward means that it will not always be clear whether or not there was consent in an inference, for example. I will give you a tangible example. Your cousin joins a genetic-profiling company, like 23andMe for example. That company is later acquired by an insurance company or some other kind of company that looks at that genetic profile and infers, based on your relationship—because you're their cousin—that you have a 95% chance of having a particular type of breast cancer, and then denies you health insurance. This might not be as applicable in Canada, but it absolutely is in the United States.

Here, there was consent for the actual data, because the data was the genetic profile of your cousin, who consented to that use. However, the behaviour, action, or result applies to you, and you didn't know that this was happening and you didn't consent to it.

Currently, it's difficult to say whether that information was about you. Was that an inference about you? When we're looking at artificial intelligence, we're looking at memories, understandings, behaviours, and inferences.
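The consent gap in Mr. Wylie's cousin example can be sketched in a few lines. The risk figure and the shared-DNA weighting below are invented purely for illustration and are not real genetics; the point is only that the consenting customer and the person the inference lands on are different people.

```python
# Illustrative sketch of the consent gap described above: the data subject
# (the cousin) consented, but the inference lands on a relative who did not.
# The risk numbers and the shared-DNA weighting are made up for illustration.

def inferred_relative_risk(profile_risk: float, shared_dna_fraction: float) -> float:
    """Naively scale a consenting customer's genetic risk onto a relative."""
    return profile_risk * shared_dna_fraction

# The cousin consented, and their profile shows a high marker-based risk.
cousin_risk = 0.95
# First cousins share roughly 12.5% of their DNA (illustrative constant).
you = inferred_relative_risk(cousin_risk, 0.125)

print(f"Inferred risk for a non-consenting cousin: {you:.5f}")
# The cousin consented to the data; you never consented to this inference.
```

Nothing in the dataset is "about you", yet the decision that follows from it is, which is why Mr. Wylie argues consent over data alone cannot capture the harm.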

In the law, when we regulate people's behaviour, we do have a component as to what's in their heads, their intent, but we also have another component, which is their behaviour itself. Taking a step back and not just looking at the issue of data and consent, but looking at the behaviour, and the acceptable behaviour, of AI in general is a really healthy mindset for people to have, I think.

These are decision-making machines, so we should be regulating how they can make decisions. This is really important because, as society moves forward, all of this information is going to start being connected together. What you do with your toaster may affect what your office computer does later down the road, or it may affect the price you pay when you walk into Starbucks.

There are real issues that have nothing to do with consent, but everything to do with the ultimate impact on behaviour. That's a broader mindset.

The third thing is that when you look at technology—whether you're a Canadian, an American, or a Brit, or whoever—the Internet is here to stay. You do not have a choice. You have to use Google. You have to use social media. You cannot get a job anymore if you refuse to use the Internet. This means that the issue of consent is slightly moot. In the same way that we all have to use electricity...it's a false choice to say that if you don't want to be electrocuted, don't use electricity. In the same way, if you don't want to participate in the modern economy, don't use data collection platforms.

We should be looking at these platforms as a utility in the same way that we would look at electricity, water, or roads as a utility, rather than as an entity where people or consumers are “consenting”.

The fourth thing is that there should be rules on reasonable expectations. When I joined Facebook in 2007, it did not have facial profiling algorithms. I put all of my photos onto Facebook, and I consented to “analysis of the data that I put on”, but that technology did not yet exist. Facebook then created facial recognition algorithms that read my face. Was that reasonably expected at the time? For a lot of people, it probably was not. There are very few regulations or rules on something that is unique to technology: the rapid development of new things.

Having some sort of rule or principle about reasonable expectation.... You might have consented to some platforms several years ago, but if something new happened, was that reasonably expected? If the answer is “no”, then maybe it shouldn't be allowed.

10:40 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you.

Next up, for five minutes, is Mr. Kent.

10:40 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Thank you, Mr. Wylie. Your recommendation that every political party disclose the advertising it has micro-targeted in elections is a good one, and I suspect that the commissioner, on receiving that sort of advice, may well come back and recommend it himself.

When Chris Vickery, director of cyber-risk research at UpGuard, testified before this committee, he described his accidental discovery, on the public code-hosting platform GitLab, of the subdomain gitlab.aggregateiq.com, which he said was essentially inviting the entire world to register and participate in what he called a collaboration portal.

Do you think that the portal was left open deliberately by AIQ so that the company, or individuals in the company, could have plausible deniability of what was going on within areas of that portal?

10:40 a.m.

As an Individual

Christopher Wylie

I can't speak to the intentions of AIQ, so I don't know why that misconfiguration was there. It seems like it was a pretty careless—which is one word you could use—oversight, to have their entire code base and systems exposed to the public. If not intentional, it certainly was extremely reckless.

10:40 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Did you ever avail yourself of the open accessibility of that site? Were you aware of it?

10:40 a.m.

As an Individual

Christopher Wylie

I've used GitHub, as most people in tech have used GitHub, but this particular subdomain that you're referring to, I have not accessed before, no.

10:40 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

Were you aware of GitLab-AIQ's Ephemeral project?

To be more specific, Chris Vickery walked us through the Ephemeral project, as he discovered it. That site has been taken down, but it had within it something called the database of truth, and then it had two different project levels to influence election outcomes. One was called Saga and one was called Monarch. Were you familiar with these projects?

10:40 a.m.

As an Individual

Christopher Wylie

I am vaguely familiar with the names, actually through Chris Vickery, because those weren't necessarily the names that SCL would use to refer to the products. It's hard for me to answer specifically, because I am familiar with the general set-up. For example, for the “database of truth” that you're referring to, the parlance I used was the “database of records”. I am familiar generally with the set-up, but I'm not familiar with the specific names off the top of my head.

10:40 a.m.

Conservative

Peter Kent Conservative Thornhill, ON

I know there is good humour among those who are experts in the digital world, but the subtitle of the Ephemeral project on this website was, “Because there is no truth.” Do you think that was just a bit of humour, or more of an underlying reflection of the mentality of AIQ?

10:40 a.m.

As an Individual

Christopher Wylie

I can't speak to the specific intentions of AIQ and why they put certain things there, but there was a systemic culture in the group of companies that we've been speaking about that completely disregarded the importance of truth in an election. SCL and Cambridge Analytica regularly advertised disinformation as a service offering.

Part of it could be dark humour, and part of it could also be reflective of the fact that this kind of humour would be completely acceptable in this group of companies.

10:40 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you.

Next up, for five minutes, is Mr. Baylis.

10:40 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

I'd like to continue with our discussions about the foundations of Cambridge Analytica. What was the intention in setting up Cambridge Analytica?

10:45 a.m.

As an Individual

Christopher Wylie

Robert Mercer and Steve Bannon wanted to use the services of SCL, but there were two issues. First of all, there were issues in the United States about using a foreign contractor, particularly a foreign military contractor, in domestic U.S. elections, both in a compliance sense—that's not allowed—and in an optics sense—that doesn't look good. They needed a domestic, U.S.-focused brand for that operation.

Secondly, Rob Mercer wanted more control over the project than simply handing it to a client would allow. He wanted a role as an investor and as a shareholder and as a director.

10:45 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

He put Steve Bannon there as the vice-president.

10:45 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Was it your understanding that he was doing this for a profit motive?

10:45 a.m.

As an Individual

Christopher Wylie

Actually, no; the opposite. You have to remember that Robert Mercer is one of the wealthiest people in the United States. He's a billionaire. He doesn't need more money. The contract values you get in politics pale in comparison with corporate finance. For him, he wasn't necessarily out to make money, particularly, on political projects. He was out to play the sport of billionaires, which is to compete in elections.

10:45 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

In fact, this company was so poorly run it went bankrupt, right?

10:45 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

So we can look at it and say that he didn't do it for financial profit.

Now, let's say I set up a sign company here in Canada and say to a given party, “I'm going to sell you your political signs at a discount. I'm going to lose money. I'm a rich person and I want to 'subsidize' the election signs you're going to stick up. I'm selling them below cost.”

So I'm actually giving money indirectly, which is against the law.

10:45 a.m.

Liberal

Frank Baylis Liberal Pierrefonds—Dollard, QC

Is it possible that Steve Bannon and Robert Mercer, in the structure they had in this money-losing operation, were also circumventing election spending limits—or election spending laws, if you will?

10:45 a.m.

As an Individual

Christopher Wylie

So one of the ancillary benefits.... I can't speak to the specific intention or the reason it was set up this way. I mean, it was set up with an incredibly convoluted structure, but as it was explained to me, there is a benefit. That is, if you are an investor in a company that provides political services, and you put money into that company, which then provides services at a particular rate, that money is not a donation. It's an investment in a company that you're the owner of. In that way, it doesn't necessarily need to be reportable, because it's an investment.

If you are then putting in millions and millions and millions of dollars to generate IP, which is then worth millions and millions and millions of dollars, but you then only need to charge your clients a nominal amount, you could argue that this is a subsidy. You could argue that it's a donation in kind, or that it's a proxy donation. The problem in many cases in the United States is that companies are much more opaque than in other countries, so it's difficult to actually parse out how much money goes where.

So that was explained to me as an ancillary benefit.
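The investment-versus-donation arithmetic Mr. Wylie outlines can be made concrete in a few lines. All dollar figures below are invented for illustration; the point is only that the gap between the market value of the services and the nominal fee never appears in any spending report.

```python
# Sketch of the in-kind-subsidy arithmetic described in the testimony above.
# An owner invests heavily to build IP, then charges a client campaign only
# a nominal fee; the gap behaves like a donation, yet only the fee is
# reportable spending. All figures are invented for illustration.

investment_in_ip = 15_000_000       # owner's "investment", not a reportable donation
market_value_of_services = 5_000_000
fee_charged_to_campaign = 50_000    # nominal rate actually billed

implicit_subsidy = market_value_of_services - fee_charged_to_campaign
print(f"Reported spend: ${fee_charged_to_campaign:,}")
print(f"Implicit in-kind subsidy: ${implicit_subsidy:,}")
```

Because the money enters as an investment rather than a contribution, the subsidy is invisible to donation-disclosure rules, which is the opacity problem Mr. Wylie describes.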