Evidence of meeting #151 for Access to Information, Privacy and Ethics in the 42nd Parliament, 1st Session.

Also speaking

Keit Pentus-Rosimannus, Vice-Chairwoman, Reform Party, Parliament of the Republic of Estonia (Riigikogu)
Sun Xueling, Senior Parliamentary Secretary, Ministry of Home Affairs and Ministry of National Development, Parliament of Singapore
Edwin Tong, Senior Minister of State, Ministry of Law and Ministry of Health, Parliament of Singapore
Jens Zimmermann, Social Democratic Party, Parliament of the Federal Republic of Germany
Jason Kint, Chief Executive Officer, Digital Content Next
Jim Balsillie, Chair, Centre for International Governance Innovation, As an Individual
Roger McNamee, Author of Zucked, As an Individual
Taylor Owen, Associate Professor, McGill University, As an Individual
Ben Scott, As an Individual
Heidi Tworek, Assistant Professor, University of British Columbia, As an Individual
Shoshana Zuboff, As an Individual
Maria Ressa, Chief Executive Officer and Executive Editor, Rappler Inc., As an Individual

7:25 p.m.

Dr. Ben Scott, As an Individual

Thank you very much, Mr. Chairman. It's a privilege and an honour to be here in front of this assembled international committee.

I appear before you this evening as an unlikely witness. I say that because I've spent pretty much my entire career promoting the virtues of the open Internet. I came of age during the Internet revolution of the late 1990s. I worked on the first truly digital political campaign for President Barack Obama in 2008. I was one of Hillary Clinton's digital diplomats in the heyday of Internet freedom during the Arab Spring. It was a moment in time when it seemed that smartphones and social media were the genuine catalysts of social and political movements to democratize the world. It was an inspiring moment. These technologies did help those things happen.

I'm an idealist at heart. I wanted to be in the middle of that revolution, but I sit before you today as a troubled idealist. I went back and worked for my old boss in 2016 on her presidential campaign. I ran the technology policy advisory committee. I had a ringside seat to what happened in America between 2015 and 2017. What I saw was that the open Internet that was meant to expand freedom instead turned into a powerful technology of social manipulation and political distortion. You all know the story. What was once the great hope for the revitalization of democracy is now considered by many to be among its greatest threats. My friends, that is a bitter irony—bitter—but it doesn't need to be that way.

The promise of information networks to distribute power to the people is a promise that we can reclaim, but we need to see at what point the astonishing concentration of wealth and power in this industry began to develop and steer things off course. The roots of this are deep, and we can track them back for decades, but I pinpoint a moment in time between 2014 and 2017 when machine learning technologies, so-called artificial intelligence, were applied to social media platforms.

These technologies were not core to the Facebook and Google business models in 2011 and 2012, during the heyday of the Arab Spring. They arrived on the scene sometime later. If you want to know exactly when they arrived, look at the profit and revenue charts of Google and Facebook. I've written down the numbers for Facebook as a case in point. In 2011, Facebook for the first time made $1 billion in profit on $4 billion in revenue. In 2017, just six years later, after the advent of these new technologies, it made $16 billion in profit on $40 billion in revenue. That is a tenfold increase in revenue, and a sixteenfold increase in profit, in six years.

How did that happen? It happened because they figured out a business model for superprofits. Step one: track everything that billions of people do online and put it in a database. Step two: sort that data, group people into target audiences and sell access to their attention. Step three: engineer your entire information marketplace to optimize not for the quality of information or the civility of the dialogue in our society, but for addictiveness and time spent on the platform, because the more time people spend on the platform, the more ads they see, and the more money the platform makes.
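To make that three-step logic concrete, here is a minimal sketch of an engagement-optimized feed ranker in Python. Everything in it is invented for illustration: the user, the behavioural log, the proportionality assumption and the function names are all hypothetical, and real platform ranking systems are proprietary and vastly more complex.

```python
# A minimal, hypothetical sketch of the engagement-optimization loop described
# in the testimony. All names and numbers are invented for illustration.

from dataclasses import dataclass


@dataclass
class Post:
    post_id: str
    topic: str


# Step one (toy version): a behavioural log of what a user has spent time on.
behaviour_log = {
    "user_42": {"conspiracy": 310.0, "cooking": 45.0, "news": 80.0},  # seconds per topic
}


def predicted_time_spent(user_id: str, post: Post) -> float:
    """Step two (toy version): predict attention from past behaviour.

    We naively assume future time-on-topic is proportional to past
    time-on-topic. Note the objective: time spent, not accuracy,
    civility, or quality of information.
    """
    history = behaviour_log.get(user_id, {})
    return history.get(post.topic, 1.0)


def rank_feed(user_id: str, candidates: list[Post]) -> list[Post]:
    """Step three (toy version): order the feed to maximize predicted time on platform."""
    return sorted(candidates, key=lambda p: predicted_time_spent(user_id, p), reverse=True)


if __name__ == "__main__":
    feed = rank_feed("user_42", [
        Post("p1", "cooking"),
        Post("p2", "conspiracy"),
        Post("p3", "news"),
    ])
    print([p.topic for p in feed])  # ['conspiracy', 'news', 'cooking']
```

The point of the sketch is the objective function: the feed is sorted purely by predicted time spent, with no term anywhere for truthfulness or civility, which is the dynamic the testimony goes on to describe.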

It's a beautiful business model, and it works: tenfold revenue growth and sixteenfold profit growth in six years. Very few companies can claim anything like that kind of growth.

Also, it's not just the ads that get targeted. Everything gets targeted. The entire communications environment in which we live is now tailored by machine intelligence to hold our attention. This is not a recipe for truth and justice. What feels true performs better than what is true. Conspiracy and hate have become the organizing themes of social media, and that is a space that is easily exploited by propagandists peddling bigotry, social division and hatred to the disillusioned.

This is the connection between the data markets that we've heard talked about at this table and the abhorrent content you see online, whether we're talking about everyday hate speech or about something truly awful like the shootings in New Zealand. It is the algorithms that lead us into the temptation of our biases. This is what we have to address. We cannot rely on the industry to fix this problem.

The core problem lies at the heart of the business model, what Professor Zuboff calls “surveillance capitalism”, and these companies are kings of the market. Publicly traded companies—

7:30 p.m.

The Chair Conservative Bob Zimmer

Just a second. The translation just went from English to Spanish, I believe. We'll just let the translators know.

I'm good at English, but not at Spanish.

7:30 p.m.

Dr. Ben Scott, As an Individual

That's that machine intelligence coming in to steer you away from my testimony.

7:30 p.m.

The Chair Conservative Bob Zimmer

Translation, are we good to go? Okay.

Go ahead, Mr. Scott. Sorry.

7:30 p.m.

Dr. Ben Scott, As an Individual

I've spent the last two and a half years studying this problem, pretty much from the day I woke up after the U.S. presidential election in November 2016, and I'm convinced of the thesis I've just laid out for you. However, I want to be clear: technology doesn't cause this problem. It accelerates it and shapes it; it shapes its growth and its direction, and it determines the ways in which social development and history flow. Technology is an amplifier of the intentions of those who use it. These consequences are, in my view, not inevitable. There is no technological determinism here. We can fix this.

Just as we made policy decisions to expand access to affordable Internet and to make net neutrality the law of so many lands—we did that to support the democratizing potential of the technology—we can now make policies to limit the exploitation of these tools by malignant actors and by companies that place profits over the public interest. We have to view our technology problem through the lens of the social problems that we're experiencing.

This is why the problem of political fragmentation, hate speech or tribalism in digital media, depending on how you want to describe it, looks different in each of your countries. It looks different in each of your countries because it feeds on the social unrest, the cultural conflict and the illiberalism that is native to each society. There are common features that stretch across the board, but each country is going to see this in a slightly different way.

To be fair, our democracies are failing a lot of people. People are upset for good reason, but that upset is not manifesting as reform anymore. It's manifesting as a kind of festering anger. That radicalism comes from the way technology is shifting our information environments and shaping how we understand the world. We rarely see the world through the eyes of others. We are divided into tribes, and we are shown a version of the world day in and day out, month after month, that deepens our prejudices and widens the gaps between our communities. That's how we have to understand this problem.

To treat this sickness, this disease, we have to see it holistically. We have to see how social media companies are part of a system. They don't stand alone as the supervillains, as much as we might like to brand them that way, although they carry a great deal of responsibility. Look and see how the entire media market has bent itself to the performance metrics of Google and Facebook. See how television, radio and print have tortured their content production and distribution strategies to get likes and shares and to appear higher in the Google News search results. It's extraordinary. The traditional media and the new media reinforce each other.

Yes, I completely agree with Professor Owen that we need a public policy agenda and that it has to be comprehensive. We need to put red lines around illegal content. We need to limit data collection and exploitation. We need to modernize competition policy to reduce the power of monopolies. We also need to pull back the curtain on this puppet show and show people how to help themselves and how to stop being exploited.

I think there's a public education component to this that political leaders have a responsibility to carry. We need to invest in education, and we need to make commitments to public service journalism so that we can provide alternatives for people, alternatives to the mindless stream of clickbait to which we have become accustomed, the temptations into which we are led as passive consumers of social media.

I know this sounds like a lot, but I invite you to join me in recommitting yourself to idealism. It isn't too much to ask because it's what democracy requires.

Thank you very much.

7:30 p.m.

The Chair Conservative Bob Zimmer

Thank you, Mr. Scott.

Next up is Heidi Tworek, and on deck is Ms. Zuboff.

7:35 p.m.

Dr. Heidi Tworek, Assistant Professor, University of British Columbia, As an Individual

Thank you so much, Mr. Chair.

Thank you to the distinguished members of the international grand committee for the kind invitation to speak before you today. It's really an honour to support international co-operation in this form.

In my work, I wear two hats. I'm a historian and I analyze policy. I know wearing two hats is a bit of a strange fashion choice, but I think it can help to lead us to much more robust solutions that can stand the test of time.

In my policy work, I have written about hate speech and disinformation in Canada, the United States and Europe. I'm a member of the steering committee of the Transatlantic High-Level Working Group on Content Moderation Online and Freedom of Expression.

Wearing my history hat, I've been working for nearly a decade on the history of media. I just finished this book, which is called News From Germany: The Competition to Control World Communications, 1900-1945. Among other things in this book, I detail how Germany's vibrant interwar media democracy descended into an authoritarian Nazi regime that could spread anti-Semitic, racist and homophobic propaganda around the world.

While I was writing this book, the present caught up with history in all sorts of, frankly, disturbing ways. The far right around the world revived Nazi terminology, using Lügenpresse and Systempresse—the lying press and the system press—to decry the media. Marginalized groups were targeted online and blamed for societal ills they did not cause. News was falsified for political and economic purposes. As with radio in the first half of the 20th century, a technology designed with utopian aims became a tool for dictators and demagogues.

As our other witnesses have described, some aspects of the Internet are unprecedented, such as the micro-targeting, the scale, the machine learning and the granular level of surveillance, but some of the underlying patterns look surprisingly familiar to the historians among us.

I'm going to offer five brief lessons from this history that I think can guide our policy discussions in the future and enable us to build robust solutions that can make our democracies stronger rather than weaker.

The first lesson is that disinformation is also an international relations problem. Information warfare has been a feature, not a bug, of the international system for at least a century. The question is not if information warfare exists, but why and when states engage in it.

What we see is that it's often when a state feels encircled, weak or aspires to become a greater power than it already is. This is as true for Germany a hundred years ago as it is for Russia today. If many of the causes of disinformation are geopolitical, we need to remember that many of the solutions will be geopolitical and diplomatic as well.

Second, we need to pay attention to the physical infrastructure of what is happening. Information warfare and disinformation are also enabled by physical infrastructure, whether it's the submarine cables a century ago or fibre optic cables today. One of Britain's first acts of war in World War I was to cut the cables that connected Germany to the rest of the world, pushing Germany to invest in a new communications technology, which was radio. By the time the Nazis came to power, one American radio executive would call it the most potent political agency the world had ever known.

We often think of the Internet as wireless, but that's fundamentally untrue; 95% to 99% of international data flows through undersea fibre optic cables. Google partly owns 8.5% of those submarine cables. Content providers also own physical infrastructure. Sometimes those cables get disrupted because they get bitten through by sharks, but states can bite, too. We do know that Russia and China, for example, are surveying European and North American cables.

We know, of course, that China is investing in 5G, but it is combining that, much as Germany once did, with investments in international news networks like the Belt and Road News Network, English-language TV channels like CGTN, and the Chinese news agency, Xinhua.

The third lesson, as many of the other witnesses have said, is that we need to think about business models much more than about individual pieces of content. It's very tempting to focus on examples of individual content that are particularly harmful, but the reason those pieces of content go viral is that a few companies control the bottleneck of information.

Only 29% of Americans and Britons understand that their Facebook news feed is algorithmically organized. The most aware are the Finns, and even among them only 39% understand it. That invisibility accords social media platforms an enormous amount of power, and that power is not neutral. At a very minimum, we need far more transparency about how algorithms work, whether they are discriminatory, and so on. As we strive towards evidence-based policy, we need good evidence.

Fourth, we need to be careful to design robust regulatory institutions. Here, the case of Germany in the interwar period offers a cautionary tale. Spoken radio emerged in the 1920s. Bureaucrats in the democratic Weimar Republic wanted to ensure that radio would bolster democracy in a very new democracy after World War I. As that democracy became more politically unstable, those bureaucrats continually instituted reforms that created more and more state supervision of content. The idea here was to protect democracy by preventing news from spreading that would provoke violence. The deep irony of this story is that the minute the Nazis came to power, they controlled radio. Well-intentioned regulation, if we're not careful, can have tragic unintended consequences.

What does that mean for today? It means we have to democracy-proof whatever the solutions are that we come up with. We need to make sure that we embed civil society in whatever institutions we create.

One suggestion that I made with Fenwick McKelvey and Chris Tenove was the idea of social media councils that would be multi-stakeholder fora and that could meet regularly to actually deal with many of the problems we're describing. The exact format and geographical scope are still up for debate, but it's an idea supported by many, including the UN special rapporteur on freedom of expression and opinion.

Fifth, we need to make sure that we still pay attention to and address the societal divisions exploited by social media. The seeds of authoritarianism need fertile soil to grow. If we do not attend to the underlying economic and social discontent, better communication cannot obscure those problems forever.

Let me then remind you of these five lessons. First, disinformation is also an international relations problem. Second, we need to pay attention to physical infrastructure. Third, business models matter more than do individual pieces of content. Fourth, we need to build robust regulatory institutions. Fifth, we must pay attention to those societal divisions that are exploited on social media.

None of these things can be done within any one nation alone; they must also be done through international co-operation. That's why it's such a great honour to have had the chance to appear before you today.

Thank you very much.

7:40 p.m.

The Chair Conservative Bob Zimmer

Thank you, Ms. Tworek.

Next up is Ms. Zuboff.

With a name like Zimmer, I have always been at the end of every list, as you have with yours, and you're just about there.

Go ahead, Ms. Zuboff. It's good to have you here.

7:40 p.m.

Shoshana Zuboff, As an Individual

Thank you so much, Chairman Zimmer.

Indeed, I'm reminded of elementary school tonight.

Of course, you reverse the order tomorrow morning.

7:40 p.m.

The Chair Conservative Bob Zimmer

That's just because I understand.

Go ahead.

7:40 p.m.

Shoshana Zuboff, As an Individual

It's a lifelong burden.

It's such an honour to be speaking with you tonight, not least because I feel this committee right now is our information civilization's best hope for making progress against the threats to democracy that are now endemic as a result of what you've already heard referred to as surveillance capitalism.

I'm so pleased to hear the kind of synergy already in our comments. The themes that the committee has identified to target, the themes of platform accountability, data security and privacy, fake news and misinformation, are all effects of one shared cause. We've heard that theme tonight, and that's such a big step forward. It's very important to underscore that.

I identify this underlying cause as surveillance capitalism, and I define surveillance capitalism as a comprehensive, systemic economic logic that is unprecedented in our experience. I want to take a moment to say what surveillance capitalism is not, because that sets up a set of distinctions we all need to hear.

First of all, and it has been mentioned—thank you, Ben—surveillance capitalism is not technology. It has hijacked the digital for its own purposes. It is easy to imagine the digital without surveillance capitalism. It is impossible to imagine surveillance capitalism without the digital. Conflating those is a dangerous category error.

Second, surveillance capitalism is not a corporation, nor is it a group of corporations. There was a time when surveillance capitalism was Google. Then, thanks to Sheryl Sandberg, whom I call the typhoid Mary of surveillance capitalism, it could have been called Google and Facebook. Ultimately, it became the default model for Silicon Valley and the tech sector, but by now it is a virus that has infected every economic sector.

That is why you began with such a startling and important claim: that personal data is valued more than content. The reason is that all of these activities, whether we're talking about insurance, retail, publishing, finance, all the way through to product and service manufacturing and administration, are now infected with surveillance capitalism, so much so that we hear the CEO of the Ford Motor Company, the birthplace of managerial capitalism a century ago, saying that the only way for Ford to compete with the kinds of P/Es and market caps that companies like Facebook and Google enjoy is to reconceptualize itself as a data company and stream the data from the 100 million drivers of Ford vehicles. Those data streams would put Ford on a par with the likes of Google and Facebook. “Who would not want to invest in us?” he says. We can no longer confine surveillance capitalism to a group of corporations or a sector.

Finally, surveillance capitalism cannot be reduced to a person or a group of persons. As attractive as it is to identify it with some of the leaders of the leading surveillance capitalists or the duopoly, the Zuckerbergs, the Pages, the Brins and so forth, we have blown past that point in our history when we can make that kind of identification.

Surveillance capitalism is an economic logic that has been structured and institutionalized. Change the characters if you like: there may be good, independent reasons for changing the characters, limiting their roles and limiting their extraordinary and unprecedented power, but that alone will not interrupt or outlaw surveillance capitalism.

Having said what it is not, let us say very briefly what it is. Surveillance capitalism follows the history of market capitalism in the following way: it takes something that exists outside the marketplace and brings it into the market dynamic for production and sale. Industrial capitalism famously claimed nature for the market dynamic, to be reborn as land or real estate that could be sold or purchased. Surveillance capitalism does the same thing, but with a dark and startling turn: it claims private human experience for the market dynamic. Private human experience is repurposed as free raw material, and these raw materials are rendered as behavioural data.

Some of these behavioural data are certainly fed back into product and service improvement, but the rest are declared as behavioural surplus, identified for their rich predictive value. These behavioural surplus flows are then channelled into the new means of production, into what we call machine intelligence or artificial intelligence. From there, what comes out of this new means of production is a new kind of product—the prediction product. These factories produce predictions of human behaviour.

You may recall a 2018 Facebook memo that was leaked, and we still don't know exactly by whom. That Facebook memo gave us insight into this hub, this machine intelligence hub, of Facebook: FBLearner Flow. What we learned there is that trillions of data points are being computed in this new means of production on a daily basis. Six million “predictions of human behaviour” are being fabricated every second in FBLearner Flow.

What this alerts us to is that surveillance capitalists own and control not one text but two. There is the public-facing text. When we talk about data ownership, data accessibility and data portability, we're talking about the public-facing text, which is derived from the data that we have provided to these entities through our inputs, through our innocent conversations, and through what we have given to the screen. But what comes out of this means of production, the prediction products and how they are analyzed, is a proprietary text, not a public-facing text. I call it the shadow text. All of the market capitalization, all of the revenue and the incredible riches that these companies have amassed in a very short period of time have all derived from the shadow text. These proprietary data will never be known to us. We will never own that data. We will never have access to that data. We will never port that data. That is the source of all their money and power.

Now, what happens to these prediction products? They are sold into a new kind of marketplace that trades exclusively in human futures. The first name of this marketplace was online targeted advertising. The human predictions that were sold in those markets were called click-through rates. Zoom out only a tiny bit and what you understand is that the click-through rate is simply a fragment of a prediction of a human future.
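As a hedged illustration of what a “prediction product” can look like in its simplest form, the sketch below scores one hypothetical ad opportunity with a logistic model, turning behavioural signals into a predicted click-through probability. The features, weights and bias are invented for this example; production systems learn such parameters from billions of logged interactions and use far richer models.

```python
# A minimal sketch, in the spirit of the testimony, of a click-through-rate
# predictor: behavioural data in, a "prediction of a human future" (will this
# person click?) out. All features, weights and values are invented.

import math


def sigmoid(z: float) -> float:
    """Squash a raw score into a probability between 0 and 1."""
    return 1.0 / (1.0 + math.exp(-z))


# Invented behavioural-surplus features for one person and one ad opportunity.
features = {
    "late_night_scrolling": 1.0,
    "clicked_similar_ad_before": 1.0,
    "seconds_hovered_on_topic": 12.0,
}

# Invented model weights; a real system would learn these from logged behaviour.
weights = {
    "late_night_scrolling": 0.4,
    "clicked_similar_ad_before": 1.2,
    "seconds_hovered_on_topic": 0.05,
}
bias = -2.0


def predicted_click_probability(feats: dict[str, float]) -> float:
    """The 'prediction product': P(click), offered to whoever is buying."""
    z = bias + sum(weights[name] * value for name, value in feats.items())
    return sigmoid(z)


if __name__ == "__main__":
    p = predicted_click_probability(features)
    print(f"Predicted click-through probability: {p:.2f}")  # ~0.55
```

Seen this way, a click-through rate really is just a small fragment of a predicted human future: a probability, fabricated from behavioural data, that a particular person will do a particular thing.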

By now we understand that these markets, while they began in the context of online targeted advertising, are no more confined to that kind of marketplace than mass production was confined to the fabrication of the Model T. Mass production was applied to anything and everything successfully. This new logic of surveillance capitalism is following the same route. It is being applied to anything and everything successfully.

Finally, when we look at these human futures markets, how do they compete? They compete on the quality of their predictions. What I have understood in studying these markets is that by reverse engineering these competitive dynamics, we unearth the economic imperatives that drive this logic. These economic imperatives are institutionalized in significant ecosystems that thread through our economy, from suppliers of behavioural surplus to suppliers of computational capabilities and analysis, to market makers and market players.

These imperatives are compulsions. Every single headline—we open the paper every day and see a fresh atrocity—can be predicted from these imperatives. It began with economies of scale: we need a lot of data to make great predictions. It moved on to economies of scope: we need varieties of data to make great predictions. Now it has moved into a third phase of competition, economies of action, where the most predictive forms of data come from intervening in human behaviour—shaping, tuning, herding, coaxing and modifying it in the direction of the guaranteed outcomes that fulfill the needs of surveillance capitalism's business customers.

This is the world we now live in. As a result, surveillance capitalism is an assault on democracy from below and from above.

From below, it globally institutionalizes systems of behavioural modification mediated by global digital architectures, a direct assault on human autonomy and on individual sovereignty, the very elements without which the possibility of a democratic society is unimaginable.

From above, surveillance capitalism means that, after all the dreams we held for this technology, which Ben has described to us, we enter the third decade of the 21st century marked by an asymmetry of knowledge, and of the power that accrues to that knowledge, that can be compared only to the pre-Gutenberg era, an absolutist era of knowledge for the few and ignorance for the many. They know everything about us; we know almost nothing about them. Their knowledge about us is not used for us, but for the purposes of their business customers and their revenues.

To conclude, it is auspicious that we are meeting tonight in this beautiful country of Canada, because right now the front line of this war between surveillance capitalism and democracy runs through Canada, specifically through the city of Toronto. Surveillance capitalism began with your online browsing and moved on to everything that you do in the real world. Through Facebook's massive-scale online contagion experiments and the Google-incubated Pokémon GO, it experimented with population-level herding, tuning and behaviour modification.

Those skills, by the way, have now been integrated into Google's smart-city application called Waze. But the real prize here is the smart city itself. This is where surveillance capitalism wants to prove that it can substitute computational rule, which is, after all, a form of absolutist tyranny, for the messiness and beauty of municipal governance and democratic contest.

The frontier is the smart city. If surveillance capitalism can conquer the smart city, it can conquer democratic society. Right now, the war is being waged in Toronto. If Canada gives Toronto to Google, that is, to Alphabet (Sidewalk Labs now goes out of its way to claim that it is not Google), a blow will be struck against the future possibilities of a democratic society in the 21st century.

Thank you for your attention. I hope to return to this discussion tomorrow with the rest of the testimony.

Thank you so much.

7:55 p.m.

The Chair Conservative Bob Zimmer

Thank you, Ms. Zuboff.

Last, I will go to Maria Ressa. She's joining us all the way from Manila in the Philippines.

Go ahead, Ms. Ressa.

7:55 p.m.

Maria Ressa, Chief Executive Officer and Executive Editor, Rappler Inc., As an Individual

Good morning.

First of all, what a privilege it is to be in front of you and to listen to everyone. Take to heart everything that you've heard, especially what Shoshana just said.

I'm living this stuff right now. I've been a journalist for more than 30 years, and in the last 14 months I've had 11 cases, five of them brought against me by the government. I've had to post bail eight times in a little over three months, and I've been arrested twice in five weeks. All of this stuff comes bottom up.... I call it astroturfing on social media: bottom-up information operations that are going on, and then you have top down, which was described for you much more fully earlier.

I'm going to keep it short because I'll give you a formal presentation tomorrow. I think that, in the end, it comes down to everything that you have heard. It comes down to the battle for truth, and journalists are on the front line of this, along with activists. We're among the first targeted. The legal cases and the lobbying weaponized against me came after social media was weaponized. In this battle for truth, at no other time has it been so clear that information is power. If you can make people believe lies, then you can control them, and that's aside from the commercial aspects of it. We're talking about information as a means to gain geopolitical power. If you have no facts, then you have no truth. If you have no truth, you have no trust.

We've seen that erosion. At Rappler, we were first a start-up that really looked at information cascades through social networks, family and friends. Social media are your family and friends on steroids, so we looked at how information cascaded. What I'll show you tomorrow is data that shows exactly how quickly a nation, a democracy, can crumble because of information operations. If you say a lie a million times, it becomes the truth. There is a phrase for this, “patriotic trolling”: online state-sponsored hate that targets an individual, an organization or an activist, pounding them into silence, inciting hate. We all know that online hate leads to real-world violence.

We're the cautionary tale. I've had as many as 90 hate messages per hour. In three years' time, my nation has moved from a very vibrant democracy where social media was really used for social good—we lived it; I believed we were.... My organization was one of the ones that worked very closely with Facebook, and then we saw it weaponized at the end of 2015 and in 2016.... It wasn't until after President Duterte took office in July 2016, at the beginning of the drug war, that the first targets appeared: anyone on Facebook who questioned the number of killings. The UN now estimates that 27,000 people have been killed since July 2016. It's a huge number.

I'll end by saying that tomorrow I will give you the data that shows it. It is systematic. It is an erosion of truth. It is an erosion of trust. When you have that, then the voice with the loudest megaphone wins. In our case, it's President Duterte. We see the same things being carried out in the United States. Whether it's Trump, Putin or Duterte, it's a very similar methodology.

I'll end with this and just say thank you for bringing us in. What's so interesting about these types of discussions is that the countries most affected are the democracies most vulnerable, like ours here in Southeast Asia, in the global south. Every day, action is not taken by the American tech platforms, the social media platforms that should have American values.... The irony, of course, is that they have eroded those values in our countries.

There is some action that has been taken. I will say that we work closely with Facebook as a fact-checker, and I've seen that they're looking at the impact and have been trying to move on it. They have to move much faster.

Here's the last part of this: when they respond to political situations in the West, that normally leads to neutral responses. Neutral responses mean that, in the global south, people will die and people will get jailed. This is a matter of survival for us.

Thank you.

8 p.m.

The Chair Conservative Bob Zimmer

Thank you, Ms. Ressa.

That pretty much brings us to the end of our testimony for this evening.

We're going to Room 225A down the hall to meet more informally and to be able to ask you questions directly. I'm sorry again, Ms. Ressa, that you're not able to be here, but the idea tonight was to get the conversation going, and it will continue over there.

I want to remind everybody that we are going to start crisply with testimony at 8:30 tomorrow morning, so I challenge you to be here on time. I am going to be much stricter: I gave some latitude with time tonight, but tomorrow, folks on the committee, it's five minutes each for questioning. Again, we look forward to the testimony.

We won't hear from some of you again, but we thank you for making the special trip to be part of this panel tonight, and we look forward to this conversation continuing. Even though the Wednesday meeting ends at noon, the conversation will continue on how to make our data world a better place.

We'll see you just down the hall.

The meeting is adjourned.