Evidence of meeting #152 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Damian Collins  Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Jim Balsillie  Chair, Centre for International Governance Innovation, As an Individual
Roger McNamee  As an Individual
Shoshana Zuboff  As an Individual
Maria Ressa  Chief Executive Officer and Executive Editor, Rappler Inc., As an Individual
Ian Lucas  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Jo Stevens  Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons
Edwin Tong  Senior Minister of State, Ministry of Law and Ministry of Health, Parliament of Singapore
Sun Xueling  Senior Parliamentary Secretary, Ministry of Home Affairs and Ministry of National Development, Parliament of Singapore
Jens Zimmermann  Social Democratic Party, Parliament of the Federal Republic of Germany
Keit Pentus-Rosimannus  Vice-Chairwoman, Reform Party, Parliament of the Republic of Estonia (Riigikogu)
Antares Guadalupe Vázquez Alatorre  Senator, Senate of the United Mexican States
Mohammed Ouzzine  Deputy Speaker, Committee of Education and Culture and Communication, House of Representatives of the Kingdom of Morocco
Carolina Hidalgo Herrera  Member, Legislative Assembly of the Republic of Costa Rica
Andy Daniel  Speaker, House of Assembly of Saint Lucia

8:30 a.m.

Conservative

The Chair Conservative Bob Zimmer

We're calling to order meeting 152 of the Standing Committee on Access to Information, Privacy and Ethics. Today in Ottawa, we have the international grand committee on big data, privacy and democracy.

I'm going to go over the countries quickly. We're not going to go through introductions because it would take up too much time, unfortunately.

We have the United Kingdom. With me today is Damian Collins, the co-chair of the international grand committee. He'll make comments in a few minutes.

We have the Parliament of Singapore with us. The Houses of the Oireachtas are here from Ireland. The Parliament of the Federal Republic of Germany is with us. The Chamber of Deputies of the Republic of Chile is with us. The Parliament of the Republic of Estonia is here. The Senate of the United Mexican States is with us. The House of Representatives of the Kingdom of Morocco is with us. We have the National Assembly of the Republic of Ecuador. The Legislative Assembly of the Republic of Costa Rica is here. Finally, the House of Assembly of Saint Lucia is with us.

I want to introduce my co-chair, Mr. Damian Collins.

Welcome.

8:30 a.m.

Damian Collins Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Thank you.

It's a pleasure to be with you here in Ottawa.

It's great to see that since the first meeting of the grand committee in London in November we have new members of the committee here today with additional countries represented. I think it just shows how these issues are only growing in significance. I'm sure today's discussions will add greatly to that debate.

8:30 a.m.

Conservative

The Chair Conservative Bob Zimmer

Absolutely. Thank you, Mr. Collins.

We'll start off with our witnesses this morning.

As individuals, we have Mr. Jim Balsillie, chair, Centre for International Governance Innovation; Mr. Roger McNamee, author of Zucked; Shoshana Zuboff, author of The Age of Surveillance Capitalism; and last but certainly not least, from Manila in the Philippines, we have Maria Ressa, chief executive officer and executive editor of Rappler Inc.

Today, we'll start off with our very own, Jim Balsillie.

Go ahead.

8:30 a.m.

Jim Balsillie Chair, Centre for International Governance Innovation, As an Individual

Thank you.

Co-chairs Zimmer and Collins and committee members, it's my honour and privilege to testify today.

Data governance is the most important public policy issue of our time. It is cross-cutting, with economic, social and security dimensions. It requires both national policy frameworks and international coordination.

Over the past three years, Mr. Zimmer, Mr. Angus and Mr. Erskine-Smith have spearheaded a Canadian bipartisan effort to deal with data governance. I'm inspired by the seriousness and integrity they bring to the task.

My perspective is that of a capitalist and global tech entrepreneur for 30 years and counting. I'm the retired chairman and co-CEO of Research in Motion, a Canadian technology company that we scaled from an idea to $20 billion in sales. While most are familiar with the iconic BlackBerry smartphone, ours was actually a platform business that connected tens of millions of users to thousands of consumer and enterprise applications via some 600 cellular carriers in more than 150 countries. We understood how to leverage Metcalfe's law of network effects to create a category-defining company, so I'm deeply familiar with multi-sided, platform business model strategies, as well as with navigating the interface between business and public policy.

I'll start with several observations about the nature, scale and breadth of our collective challenge here.

First, disinformation and fake news are just two of the many negative outcomes from unregulated attention-based business models. They cannot be addressed in isolation. They have to be tackled horizontally as part of an integrated whole. To agonize over social media's role in the proliferation of online hate, conspiracy theories, politically motivated misinformation and harassment is to miss the root and scale of the problem.

Second, social media's toxicity is not a bug—it's a feature. Technology works exactly as designed. Technology products, services and networks are not built in a vacuum. Usage patterns drive product development decisions. Behavioural scientists involved with today's platforms help design user experiences that capitalize on negative reactions, because they produce far more engagement than positive reactions.

Third, among the many valuable insights provided by whistle-blowers inside the tech industry is this quotation: “The dynamics of the attention economy are structurally set up to undermine the human will”. Democracy and markets work when people can make choices aligned with their interests. The online advertisement-driven business model subverts choice and represents a foundational threat to markets, election integrity and democracy itself.

Fourth, technology gets its power through control of data. Data at the micro-personal level gives technology unprecedented power to influence. Data is not the new oil. It's the new plutonium—amazingly powerful, dangerous when it spreads, difficult to clean up and with serious consequences when improperly used. Data deployed through next generation 5G networks is transforming passive infrastructure into veritable digital nervous systems.

Our current domestic and global institutions, rules and regulatory frameworks are not designed to deal with any of these emerging challenges. Because cyberspace knows no natural borders, digital transformation's effects cannot be hermetically sealed within national boundaries. International coordination is critical.

With these observations in mind, here are my six recommendations for your consideration.

One, eliminate tax deductibility of specific categories of online ads.

Two, ban personalized online advertising for elections.

Three, implement strict data governance regulations for political parties.

Four, provide effective whistle-blower protections.

Five, add explicit personal liability alongside corporate responsibility to affect CEO and board of director decision-making.

Six, create a new institution for like-minded nations to address digital co-operation and stability.

Technology is disrupting governance and, if left unchecked, could render liberal democracy obsolete. By displacing the print and broadcast media in influencing public opinion, technology is becoming the new fourth estate. In our system of checks and balances, this makes technology coequal with the executive, the legislative bodies and the judiciary.

When this new fourth estate declines to appear before this committee, as Silicon Valley executives are currently doing, it is symbolically asserting this aspirational coequal status, but is asserting this status and claiming its privileges without the traditions, disciplines, legitimacy or transparency that check the power of the traditional fourth estate.

The work of this international grand committee is a vital first step towards redress of this untenable current situation. As Professor Zuboff said last night, we Canadians are currently in a historic battle for the future of our democracy with a charade called Sidewalk Toronto.

I'm here to tell you that we will win that battle.

Thank you.

8:35 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. Balsillie.

Next up, for five minutes we'll go to Mr. McNamee.

8:35 a.m.

Roger McNamee As an Individual

Co-Chairs Zimmer and Collins, members of the committee, thank you for the opportunity to address you today. My remarks will build on last night's presentations by Professor Zuboff, Professor Tworek, Ben Scott and today's by Jim.

For the 35 years I spent as an investor, I shared Silicon Valley's commitment to technology that empowers the people who use it. Beginning in 2004, however, I noticed a transformation in the culture of Silicon Valley, and over the course of a decade, customer-focused models were replaced by the relentless pursuit of global-scale monopoly and massive wealth.

As Professor Zuboff told you, Google was the first to see the economic opportunity from converting all human experience into data. Google wants to make the world more efficient. They want to eliminate user stress that results from too many choices. Now, Google knew that society would not permit a business model based on denying consumer choice and free will, so they covered their tracks. Beginning around 2012, Facebook adopted a similar strategy, later followed by Amazon, Microsoft and others.

For Google and Facebook, the business is behavioural prediction. They build a high-resolution data avatar of every consumer—a voodoo doll, if you will. They gather a tiny amount of data from user posts and queries, but the vast majority of their data comes from surveillance: web tracking, scanning emails and documents, data from apps and third parties, and ambient surveillance from such products as Alexa, Google Assistant, Sidewalk Labs and Pokémon GO.

Google and Facebook use data voodoo dolls to provide their customers, who are marketers, with perfect information about every consumer. They use the same data to manipulate consumer choices. Just as in China, behavioural manipulation is the goal.

The algorithms of Google and Facebook are tuned to keep users on site and active, preferably by pressing emotional buttons that reveal each user's true self. For most users, this means content that provokes fear or outrage. Hate speech, disinformation and conspiracy theories are catnip for these algorithms. The design of these platforms treats all content precisely the same, whether it be hard news from a reliable site, a warning about an emergency or a conspiracy theory. The platforms make no judgments: users choose, aided by algorithms that reinforce past behaviour. The result is 2.5 billion Truman Shows on Facebook, each a unique world with its own facts.

In the U.S., nearly 40% of the population identifies with at least one thing that is demonstrably false. This undermines democracy. The people at Google and Facebook are not evil. They are products of an American business culture with few rules, wherein misbehaviour seldom results in punishment. Smart people take what they can get and tell themselves they've earned it. They feel entitled. Consequences are someone else's problem.

Unlike industrial businesses, Internet platforms are highly adaptable, and this is the challenge. If you take away one opportunity, they will move on to the next one, and they are moving upmarket, getting rid of the middleman. Today they apply behavioural prediction to advertising, but they have already set their sights on transportation and financial services.

This is not an argument against undermining their advertising business, but rather a warning that it may be a Pyrrhic victory. If your goals are to protect democracy and personal liberty, you have to be bold. You have to force a radical transformation of the business model of Internet platforms. That would mean, at a minimum, banning web tracking, scanning of email and documents, third party commerce and data, and ambient surveillance. A second option would be to tax micro-targeted advertising to make it economically unattractive.

You also need to create space for alternative business models, using anti-trust law. Start-ups can happen anywhere. They can come from each of your countries.

At the end of the day, though, the most effective path to reform would be to shut down the platforms at least temporarily, as Sri Lanka did. Any country can go first. The platforms have left you no choice. The time has come to call their bluff. Companies with responsible business models will emerge overnight to fill the void.

Thank you very much.

8:45 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Mr. McNamee.

We'll go next to Ms. Zuboff.

8:45 a.m.

Shoshana Zuboff As an Individual

Thank you, co-chairmen Zimmer and Collins. It's such a pleasure to be here today.

As you know, I hail from the Harvard Business School, where I am a professor emerita. More importantly, I am the author of this book on surveillance capitalism. I say that because I want you to know that any statements and conclusions I reach today are amply supported by the information and analysis in that work. I might add that my scholarly work on the digital future began in the year 1978. I'll let you do the math on that.

My remarks this morning cover some highlights of a longer written statement that I have submitted to the committee. I add for the record that I am deeply committed to the work of this very important group. That includes continuing to support your work in any way I can, off-line or in future meetings, as we engage in this world-historic challenge.

The Internet is now an essential medium of social participation, and it is owned and operated by private surveillance capital. The questions of law and regulation that this committee seeks to explore cannot be answered without a clear grasp of surveillance capitalism as a novel economic logic defined by distinct economic imperatives that compel specific practices. I don't want to repeat everything that I talked about last night. Roger has touched on some of the key issues, as has Jim, so I will skip ahead to the idea of economic imperatives.

What we see in surveillance capitalism is the unilateral claiming of private human experience, its translation into behavioural data and their fabrication into prediction products, which are sold in a new kind of marketplace that trades exclusively in human futures. When we deconstruct the competitive dynamics of these markets, we get to understand what the new imperatives are. First of all, it's scale. They need a lot of data in order to make good predictions; economies of scale. Secondly, it's scope. They need varieties of data to make good predictions. Ultimately, in the third phase of this competitive struggle, it was discovered that the most predictive data comes from actually intervening in human behaviour, intervening in the state of play, in order to have predictions that come closer and closer to actual observations so that they can guarantee outcomes to their business customers. That is how you win in human futures markets.

I'll share with you one brief quote from a data scientist that rings in everybody's ears when they hear it. He said to me, “We can engineer the context around a particular behaviour and force change that way.... We are learning how to write the music, and then we let the music make them dance.”

Friends, this is behavioural modification, systemically institutionalized on a global scale, mediated by a now-ubiquitous digital infrastructure. It began online. It travelled off-line into the real world on our telephones, our cellphones, and ultimately now we live in a world of devices, which allows this to be amplified and perpetuated. This digital architecture is growing every day. I call it the “big other”. It is at this new level of competitive intensity that it is no longer enough to automate information flows about us. The goal now is to automate us. The goal is to automate us not only as individuals, not only as small groups, but increasingly also on the scale of populations. The goal is to have surveillance capitalism's computational analysis that favours its own commercial outcomes replace democracy and governance as we know it.

In fact, at this very moment in the city of Toronto, Alphabet-owned Sidewalk Labs is spinning its own new euphemisms, which it calls “governance innovation”. This is Orwellian code for the deconstruction of local democracy in favour of Sidewalk's computational rule, which is, in the final analysis, a reincarnation of a kind of absolutist tyranny that we thought we had left behind us in the 18th century, now served with cappuccino and draped in ones and zeroes.

Surveillance capitalism assaults democracy from below and from above. From below, it is a direct assault on human autonomy and agency essential for the possibility of a democratic society. From above, it is marked by asymmetries of knowledge and power the likes of which human history has never seen.

I want to move on to the question of what is to be done, because this is what we really didn't have time to discuss very much last night, and build on Jim's excellent, excellent recommendations, all of which I agree with.

Surveillance capitalism has thrived in the absence of law, as we all know. I take that as a positive sign, because what this means is that we have not failed to rein in this rogue mutation of capitalism. The real issue is that we haven't really tried. The accompanying good news is that our societies have experience in reining in the raw excesses of a destructive capitalism. We did it to end the Gilded Age. We did it to mitigate the Great Depression. We did it in the post-war era. We did it in the seventies to save creatures, air, water, workers and consumers. We know how to do this. This is what democracy is for. It is time to do it again.

The great business historian Tom McCraw wrote a brilliant history of regulation in the 19th and 20th centuries. He identified several phases of regulatory regimes, starting in the late 19th century with the muckrakers and moving into the early 20th century with the progressives. Later, in the New Deal and in the early 1970s, the regulatory frameworks were run by legal minds, legal scholars and legal experts. Finally, by the late 1970s, the eighties and right down to today, it's the economists who have held sway.

But this has been a changing dynamic, and what he notes is that at the end of the day, when you look at the more than a century of regulatory issues and regulatory frameworks, the emphasis has come down on fairness and justice over narrow considerations of economic growth. McCraw asks this question: The economists' hour will not last; what is it that will come next?

I want to tell you what it is that will come next. The next great regulatory vision will be framed and implemented by you and by us. It will be elected officials, citizens and specialists, allied in the knowledge that, despite its failures and shortcomings, democracy is the one idea to emerge from the long human story that enshrines the people's right to self-governance and asserts the ideal of the sovereign individual, which is the single most powerful bulwark against tyranny. We give up these ideas at our peril, but only democracy can impose the people's interests through law and regulation.

McCraw also warns that regulators have failed when they did not adequately frame strategies appropriate to the particular industries that they were regulating. The question is, what kind of law and regulation today will be 21st-century solutions aimed at the unique 21st-century complexities of surveillance capitalism?

There are three arenas in which legislative and regulatory strategies can effectively align with the structure and consequences of surveillance capitalism.

Briefly, first, we need lawmakers to devise strategies that interrupt and in many cases outlaw surveillance capitalism's foundational mechanisms. This includes the unilateral taking of private human experience as a free source of raw material and its translation into data. It includes the extreme information asymmetries necessary for predicting human behaviour. It includes the manufacture of computational prediction products, based on the unilateral and secret capture of human experience. It includes the operation of prediction markets that trade in human futures.

Second, from the point of view of supply and demand, surveillance capitalism can be understood as a market failure. Every piece of research over the last decades has shown that when users are informed of the backstage operations of surveillance capitalism, they want no part of it. They want protection. They reject it. They want alternatives.

We need laws and regulatory frameworks designed to advantage companies that want to break with the surveillance capitalist paradigm. Forging an alternative trajectory to the digital future will require alliances of new competitors who can summon and institutionalize an alternative ecosystem. True competitors who align themselves with the actual needs of people and the norms of market democracy are likely to attract just about every person on earth as their customers.

Third, lawmakers will need to support new forms of citizen action—collective action—just as, nearly a century ago, workers won legal protection for their rights to organize, to bargain and to strike. New forms of citizen solidarity are already emerging in municipalities that seek an alternative to the Google-owned smart city future, in communities that want to resist the social costs of so-called “disruption” imposed for the sake of others' gain, and among workers who seek fair wages and reasonable security in the precarious conditions of the so-called gig economy.

Citizens need your help but you need citizens, because ultimately they will be the wind behind your wings. They will be the sea change in public opinion and public awareness that supports your political initiatives. If, together, we aim to shift the trajectory of the digital future back toward its emancipatory promise, we resurrect the possibility that the future can be a place that all of us might call home.

Thank you.

8:55 a.m.

Conservative

The Chair Conservative Bob Zimmer

Thank you, Ms. Zuboff, for that testimony.

We'll go next to Ms. Ressa, for 10 minutes.

8:55 a.m.

Maria Ressa Chief Executive Officer and Executive Editor, Rappler Inc., As an Individual

Co-chairmen Zimmer and Collins, I'm still in the same clothes. Good evening from Manila.

As I said early in our morning—your night last night—we here in the Philippines are a cautionary tale for you, an example of how quickly democracy crumbles and is eroded from within and how these information operations can take over the entire ecosystem and transform lies into facts. If you can make people believe that lies are facts, you can control them. Without facts, you don't have truth. Without truth, you don't have trust.

Journalists have long been the gatekeepers for facts. When we come under attack, democracy is under attack. When this situation happens, the voice with the loudest megaphone wins.

The Philippines is a petri dish for social media. As of January 2019, as We Are Social and Hootsuite have said, Filipinos spend the most time online and the most time on social media globally.

Facebook is our Internet, but as I'll show you with some of the data—you should get them handed to you—this is about introducing a virus into our information ecosystem. Over time, that virus of lies, masquerading as facts, takes over the body politic, and you need to develop a vaccine. That's what we're in search of, and I think we do see a solution.

I've been a journalist for more than 30 years. My book, published in 2011, From Bin Laden to Facebook, looked at how this transformation, this virulent ideology of terrorism, moved from the physical world to the virtual world, and how the al Qaeda-linked group, the Abu Sayyaf here in the Philippines, actually in 2011 used YouTube to try to negotiate ransoms for the people it kidnapped.

I first began looking at social networks in this spread of the virulent ideology. While writing the book, I stumbled on the strategy for Rappler, the start-up that we created in 2012. Using social media and journalism—we embraced it, I drank the Kool-Aid—we built communities of action in a country with weak institutions and endemic corruption. If social networks are your family and friends in the physical world, social media is your family and friends on steroids—no boundaries of time and space.

Understanding information cascades was essential to the growth of Rappler. We were alpha partners of Facebook. We believed and made real social media for social change, and we grew by 100% to 300% year-on-year from the time we were founded in 2012 to 2015. Then, like in the rest of the world, 2016 happened. In May of 2016, President Duterte was elected. A month later, there was Brexit and so on and so on. That was a tipping point for the information operations in our system.

In the Philippines, the weaponization of social media began in July 2016, after President Duterte won—not coincidentally when our brutal drug war began. In a global study with 12 other research groups, we helped define patriotic trolling: online state-sponsored hate meant to pound you into silence, to incite hate against the target and to stifle dissent or criticism. One of the first targets of attack was journalists and newsgroups.

I'm going to quickly show you here the astroturfing that's typical of a three-pronged attack on a target in the Philippines.

The first step is to allege corruption. It doesn't have to be true. Just allege it. If you do it exponentially, it becomes truth. A lie told a million times is truth. Step two, for a woman, if you're a female, you will get attacked sexually. Step three is to lay the groundwork for what you want to happen, whatever that policy is.

In this case, the propaganda machine tried to trend—if you can zoom in here on what I'm showing you, hopefully you'll get this—#ArrestMariaRessa. From there, it went on to jump from the government's creator, the blogger, to a Twitter account that was used in the campaign, so whatever was used in the campaigns then became weaponized. In Tagalog, it says, [Witness spoke in Tagalog], “Call her to the Senate #ArrestMariaRessa.” Then it moves to “I can smell an arrest and possible closure of Rappler.com”. Then finally it moves to the sexual attacks: “Maybe Maria Ressa's dream is to become the ultimate porn star in a gangbang scene”—it is not.

Then finally—and this is a real person who just graduated from college—“Me to the RP government, make sure Maria Ressa gets publicly raped to death when martial law expands to Luzon. It would bring joy in my heart.” #ArrestMariaRessa was an attempt to trend this, to astroturf it. This was in May 2017. My first arrest was in February 2019.

When I was arrested...the methodology is all too familiar. You astroturf on social media, you jump laterally to co-opted traditional media, then repeat and pound top down. In the case of the attack against me and Rappler, it came from President Duterte himself during his state of the nation address in July 2017.

Social media, in 2016, began to lay down the foundation of the legal cases that were filed against us. Starting in January 2018, the government filed 11 cases and investigations against me and Rappler in a 14-month period—roughly a case a month. In about three months, I posted bail eight times. In a five-week period, I was arrested twice and detained once. My only crime is to be a journalist, to speak truth to power, to defend the press freedom that is guaranteed under our constitution.

Here's how it happened. Let me show you.

This is a database that we actually began to put together as a defence. Since we lived on social media, we were able to identify the attacks early on. We found a sock puppet network of 26 fake accounts. As journalists, we then did due diligence to make sure it was fake, and then we went and counted manually. How many accounts could it impact? From 26 fake accounts, they could impact as many as three million.
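For readers who want to see the arithmetic behind that figure, here is a minimal sketch of the union-of-audiences count described above. The data layout and account names are illustrative assumptions, not Rappler's actual data.

```python
# Minimal sketch (illustrative data, not Rappler's): estimate how many unique
# accounts a small sock-puppet network can reach by taking the union of each
# fake account's direct audience (friends, followers, group members).

from typing import Dict, Set

fake_account_reach: Dict[str, Set[str]] = {
    "sock_puppet_01": {"user_a", "user_b", "user_c"},
    "sock_puppet_02": {"user_b", "user_d"},
    # ...in the real case, 26 accounts with audience lists in the thousands
}

def estimated_reach(network: Dict[str, Set[str]]) -> int:
    """Count unique accounts reachable by at least one fake account."""
    reached: Set[str] = set()
    for audience in network.values():
        reached |= audience
    return len(reached)

print(f"{len(fake_account_reach)} fake accounts reach "
      f"{estimated_reach(fake_account_reach)} unique users")
```

With real friend, follower and group lists in place of the toy sets above, the same union count is how a 26-account network can plausibly touch millions of users.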

That became the basis of this database. This is over time, from January 2015 all the way to April 2017. You can basically see the same thing that's happened in the west, which is that there is a fracture line of society, and then, after the drug war began, it was pounded, literally pounded a million times, and it becomes fact. It becomes a solid line.

After this, bayaran—it translates to corrupt—was pounded so frequently that it had 1.7 million comments in a one-month period.

I want to show you the database and the very crude UX that we built for our social media team, because it shows you how the information ecosystem is interrelated. This one shows you the URLs that are controlled, or can be, by Google or YouTube. In the middle rung here, you'll see the Facebook pages that actually spread that URL. Then here, you'll see the average reposting time.

What we did for our team so they could find the difference between information operations and a real person was to actually show, after we published the propaganda series in October 2016.... When it's red, that means it's been reposted more than 10 times. We zoomed in on one account, and you can see that this is actually just the same post reposted over and over again, not just on websites but also on Facebook pages that were used in the campaign, not just that of President Duterte but also that of vice-presidential candidate Bongbong Marcos.
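The "red when reposted more than 10 times" rule is, in effect, a repost-frequency threshold. A minimal sketch of that heuristic, with illustrative field names rather than Rappler's actual schema, might look like this:

```python
# Minimal sketch of the repost-frequency heuristic: flag URLs that get pushed
# out more than a threshold number of times. Field names are illustrative.

from collections import Counter
from typing import Dict, List

REPOST_THRESHOLD = 10  # "red" in the crude UX meant more than 10 reposts

def flag_repeated_content(shares: List[Dict[str, str]]) -> Dict[str, int]:
    """Return URL -> share count for URLs shared more than the threshold."""
    counts = Counter(share["url"] for share in shares)
    return {url: n for url, n in counts.items() if n > REPOST_THRESHOLD}

# Toy usage: twelve pages pushing the same URL trips the flag.
shares = [{"url": "http://example-site.test/post1", "page": f"page_{i:02d}"}
          for i in range(12)]
print(flag_repeated_content(shares))  # {'http://example-site.test/post1': 12}
```

The point of the threshold is exactly what the witness describes: a real person rarely reposts the same item dozens of times, so repeat counts separate information operations from organic sharing.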

So what do we do? Here's the last thing I want to show you. This is data, which, when you look at it this way, actually doesn't show you much. It's just a list of Facebook pages, and then the weighted degree—in degree, out degree, and then a weighted degree. But, if you put it together, you will see this network. This is the social network that was behind the attack on our vice-president, Leni Robredo, in 2017. I think it's because these same.... It was so organized and it has been sustained. We're talking about almost three years that we've lived through this. The content creators are broken down by demographic. This account—this is where the attack began—takes care of the pseudo-intellectual, the supposed thinking class.

Next is the middle-class content creator in this account, and then we have the mass base account. From there it jumps to traditional media, but the co-opted one is the newspaper and, essentially, the chairman emeritus is the man in charge of international public relations for President Duterte. From there, it connects with state media, and then you close the link on this entire group.

By the way, at that point in time, in 2017, the Philippines and Russia inked a partnership, and we actually had state media employees in Sputnik's offices.

Finally, you close it by taking that mass base account and appointing her to head social media for the presidential palace. It's an incredible ecosystem.
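The in-degree, out-degree and weighted-degree figures Ms. Ressa refers to are standard directed-graph measures: how many pages feed an account, how many it feeds, and how heavily. A minimal sketch using networkx, on an invented toy graph rather than Rappler's data, could compute them as follows:

```python
# Minimal sketch: degree metrics on a directed sharing network. Nodes and edge
# weights are invented for illustration; in an analysis like this, an edge
# "A -> B" with weight w would mean account or page A pushed content to page B
# w times.

import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("content_creator_a", "coopted_newspaper", 40),
    ("content_creator_a", "mass_base_account", 55),
    ("mass_base_account", "state_media_page", 30),
    ("coopted_newspaper", "state_media_page", 12),
])

for node in g.nodes:
    print(node,
          "in:", g.in_degree(node),
          "out:", g.out_degree(node),
          "weighted:", g.degree(node, weight="weight"))
```

Accounts with the highest weighted degree are the hubs holding the network together, which is what makes the kind of network map described above legible.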

Where does this go and what can we do about it? In the long term, it's education. You've heard from our other three witnesses before me about exactly some of the things that can be done. In the medium term, yes, there is media literacy, but in the short term, frankly, it's only the social media platforms that can do something immediately. We're on the front lines. We need immediate help and immediate solutions.

Rappler is one of three fact-checking partners of Facebook in the Philippines, and we do take that responsibility seriously. We don't look at the content alone. Once we check to make sure that it is a lie, we look at the network that spreads the lie. The first step is to stop a new virus from entering the ecosystem. It is whack-a-mole if you look only at the content, but when you begin to look at the networks that spread it, then you have something that you can pull out.

9:05 a.m.

Conservative

The Chair Conservative Bob Zimmer

Ms. Ressa, could we have you close off your testimony? We're at 12 minutes. I'd like to get to questions if we could.

9:10 a.m.

Chief Executive Officer and Executive Editor, Rappler Inc., As an Individual

Maria Ressa

Sure.

To end with this, I don't know...unless you've been the subject of an attack.... It's very difficult to go through 90 hate messages per hour, sustained over days and months. That is what we're going through, that kind of astroturfing that turns lies into truth. For us, this is a matter of survival.

9:10 a.m.

Conservative

The Chair Conservative Bob Zimmer

My apologies for cutting you off. Your testimony is powerful, and we have watched your story from afar.

We'll get to questions.

I will have to warn you that we're only going to have enough time for one question per delegation in this particular round. In the next round, we have enough time for everybody.

We're going to start off with Damian Collins, then go to Nathaniel Erskine-Smith, Peter Kent and Mr. Angus, and then go through the delegation. That should give us five minutes each. Again, it's going to be tight. I'm going to try to keep us on a five-minute timeline as much as I possibly can.

We'll start with Mr. Collins.

Go ahead.

9:10 a.m.

Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Damian Collins

Thank you, Mr. Chair.

Is that five minutes per delegation?

9:10 a.m.

Conservative

The Chair Conservative Bob Zimmer

Yes, that's correct.

9:10 a.m.

Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Damian Collins

I have two short questions and hopefully my colleagues will be able to get in.

Roger McNamee, in your book, you said, “As far as I can tell, Zuck has always believed that users value privacy more than they should.” On that basis, do you think that we are going to have to establish in law the standards we want to see enforced in terms of users' rights and data privacy, with independent regulators to oversee them? Because the companies will never do that effectively themselves. They just don't share the concerns we have about how the systems are being abused.

9:10 a.m.

As an Individual

Roger McNamee

Yes, I believe that not only is that correct in terms of their philosophy, but as Professor Zuboff points out, it is baked into their business model. It is this notion that any data that exists in the world, claimed or otherwise, they will claim for their own economic use.

Again, framing how you do that privacy is extremely difficult and, in my opinion, would be best done by simply banning the behaviours that are used to gather the data.

9:10 a.m.

Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Damian Collins

This is the final question for me.

You also suggest in your book that the problems in terms of election interference could have started around the time that certain advertising tools, such as lookalike audiences, were launched on the platform. I'd be interested to hear if you have anything more to say about that.

Also, do you believe that some of these targeting tools—as I think Professor Zuboff suggested as well—should be banned from digital advertising? Maybe you shouldn't be able to use lookalike audiences. Indeed, in the U.K., the information commissioner has already questioned whether they're legal under GDPR.

9:10 a.m.

As an Individual

Roger McNamee

Essentially, the problem here is the inversion of politics from the advocacy of a set of policies, and convincing people to join you on those policies, to an election where the number of campaigns is equal to the number of voters and you can use the micro-targeting to take these campaigns to the individual level.

In the United States, it was used to suppress the vote. I can't speak to exactly how it was done in Brexit, but it's very obvious there was a dramatic effect there.

The essential point here is whether you believe that one can have a healthy democracy in an environment where there is advertising that's completely unaccountable because the only people who see it are the intended recipients.

9:10 a.m.

Chair, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Damian Collins

Thank you.

I'll cede the rest of my time.

9:10 a.m.

Conservative

The Chair Conservative Bob Zimmer

It's two and a half minutes.

May 28th, 2019 / 9:10 a.m.

Ian Lucas Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

I was very interested, Mr. Balsillie, in what you were saying about creating a structure of holding the platforms to account.

Do you think that the creation of a liability for platforms empowering citizens to take action for damage caused, like the law of torts, would be a way of holding platforms to account?

9:10 a.m.

Chair, Centre for International Governance Innovation, As an Individual

Jim Balsillie

One thing I can assure you is that when a board of directors or a CEO has to sign an attestation whereby they're personally liable, whether it's civil or criminal, I guarantee you, that sobers the mind and introduces a form of prudence and conservatism into their behaviour. If you introduce that tort or criminal construct and you get an attestation they have to sign, and if the citizens have the ability to be compensated for that, I assure you, that focuses the mind in the corporate boardrooms of tech companies and others, in my experience.

9:10 a.m.

Member, Digital, Culture, Media and Sport Committee, United Kingdom House of Commons

Ian Lucas

I raised the tort concept because we heard from the broadcasting regulator in the U.K. last week that she doesn't really consider that a regulator alone has sufficient flexibility or resources to deal with the scale of the challenge.

I wonder whether, if we individualize the accountability through the development of a liability for the platforms, that would be a way to empower citizens to take the action we need.

9:15 a.m.

Chair, Centre for International Governance Innovation, As an Individual

Jim Balsillie

I think you create liability, whether it's through class action or an individual or whether it's through regulators. I assure you, if it's corporate, that's one liability, but if it's personal and it ensnares....

The other thing is that it's one thing to be the CEO who is liable. If you're a board person who says, “My board fees aren't enough for me to be ensnared in this”, that changes behaviour. If you introduce liability and shifts on that—how you specifically create somebody who can apply through the courts and all that is specific to each jurisdiction—it changes the decision approaches.