It's a lifelong burden.
It's such an honour to be speaking with you tonight, not least because I feel this committee right now is our information civilization's best hope for making progress against the threats to democracy that are now endemic as a result of what you've already heard referred to as surveillance capitalism.
I'm so pleased to hear the kind of synergy already in our comments. The themes that the committee has identified to target, the themes of platform accountability, data security and privacy, fake news and misinformation, are all effects of one shared cause. We've heard that theme tonight, and that's such a big step forward. It's very important to underscore that.
I identify this underlying cause as surveillance capitalism, and I define surveillance capitalism as a comprehensive, systemic economic logic that is unprecedented in our experience. I want to take a moment to say what surveillance capitalism is not, because that sets up a set of distinctions we all need to hear.
First of all, and it has been mentioned—thank you, Ben—surveillance capitalism is not technology. It has hijacked the digital for its own purposes. It is easy to imagine the digital without surveillance capitalism. It is impossible to imagine surveillance capitalism without the digital. Conflating those is a dangerous category error.
Second, surveillance capitalism is not a corporation, nor is it a group of corporations. There was a time when surveillance capitalism was Google. Then, thanks to Sheryl Sandberg, whom I call the Typhoid Mary of surveillance capitalism, surveillance capitalism could have been called Google and Facebook. Ultimately, it became the default model for Silicon Valley and the tech sector, but by now this is a virus that has infected every economic sector.
That is why you began with such a startling and important claim: that personal data is valued more than content. The reason is that all of these activities, whether we're talking about insurance, retail, publishing, finance, all the way through to product and service manufacturing and administration, are now infected with surveillance capitalism. So much so that we hear the CEO of the Ford Motor Company, the birthplace of managerial capitalism a century ago, now saying that the only way for Ford to compete with the kinds of P/E ratios and market caps that companies like Facebook and Google enjoy is to reconceptualize the company as a data company and stream the data from the 100 million drivers of Ford vehicles. Those data streams would put Ford on a par with the likes of Google and Facebook. “Who would not want to invest in us?” he asks. We can no longer confine surveillance capitalism to a group of corporations or to a single sector.
Finally, surveillance capitalism cannot be reduced to a person or a group of persons. As attractive as it is to identify it with some of the leaders of the leading surveillance capitalists or the duopoly, the Zuckerbergs, the Pages, the Brins and so forth, we have blown past that point in our history when we can make that kind of identification.
As an economic logic that is now structured and institutionalized, it will not be interrupted by changing the characters. There may be good, independent reasons for changing the characters, for limiting their roles and limiting their extraordinary and unprecedented power, but that will not interrupt or outlaw surveillance capitalism.
Having said what it is not, let us just say very briefly what it is. Surveillance capitalism follows the history of market capitalism in the following way. It takes something that exists outside the marketplace and brings it into the market dynamic for production and sale. Industrial capitalism famously claimed nature for the market dynamic, to be reborn as land or real estate that could be sold or purchased. Surveillance capitalism does the same thing, but now with a dark and startling turn. What it does is it claims private human experience for the market dynamic. Private human experience is repurposed as free raw material. These raw materials are rendered as behavioural data.
Some of these behavioural data are certainly fed back into product and service improvement, but the rest are declared as behavioural surplus, identified for their rich predictive value. These behavioural surplus flows are then channelled into the new means of production, into what we call machine intelligence or artificial intelligence. From there, what comes out of this new means of production is a new kind of product—the prediction product. These factories produce predictions of human behaviour.
You may recall a 2018 Facebook memo that was leaked, and we still don't know exactly by whom. That Facebook memo gave us insight into this hub, this machine intelligence hub, of Facebook: FBLearner Flow. What we learned there is that trillions of data points are being computed in this new means of production on a daily basis. Six million “predictions of human behaviour” are being fabricated every second in FBLearner Flow.
What this alerts us to is that surveillance capitalists own and control not one text but two. There is the public-facing text. When we talk about data ownership, data accessibility and data portability, we're talking about the public-facing text, which is derived from the data that we have provided to these entities through our inputs, through our innocent conversations, and through what we have given to the screen. But what comes out of this means of production, the prediction products and how they are analyzed, is a proprietary text, not a public-facing text. I call it the shadow text. All of the market capitalization, all of the revenue and the incredible riches that these companies have amassed in a very short period of time have all derived from the shadow text. These proprietary data will never be known to us. We will never own that data. We will never have access to that data. We will never port that data. That is the source of all their money and power.
Now, what happens to these prediction products? They are sold into a new kind of marketplace that trades exclusively in human futures. The first name of this marketplace was online targeted advertising. The human predictions that were sold in those markets were called click-through rates. Zoom out only a tiny bit and what you understand is that the click-through rate is simply a fragment of a prediction of a human future.
By now we understand that these markets, while they began in the context of online targeted advertising, are no more confined to that kind of marketplace than mass production was confined to the fabrication of the Model T. Mass production was applied to anything and everything successfully. This new logic of surveillance capitalism is following the same route. It is being applied to anything and everything successfully.
Finally, when we look at these human futures markets, how do they compete? They compete on the quality of their predictions. What I have understood in studying these markets is that by reverse engineering these competitive dynamics, we unearth the economic imperatives that drive this logic. These economic imperatives are institutionalized in significant ecosystems that thread through our economy, from suppliers of behavioural surplus to suppliers of computational capabilities and analysis, to market makers and market players.
These imperatives are compulsions. Every single headline—we open the paper every day and see a fresh atrocity—can be predicted from these imperatives. It began with economies of scale: we need a lot of data to make great predictions. It moved on to economies of scope: we need varieties of data to make great predictions. Now it has moved into a third phase of competition, economies of action, where the most predictive forms of data come from intervening in human behaviour—shaping, tuning, herding, coaxing, modifying human behaviour in the direction of the guaranteed outcomes that fulfill the needs of surveillance capitalism's business customers.
This is the world we now live in. As a result, surveillance capitalism is an assault on democracy from below and from above.
From below, it globally institutionalizes systems of behavioural modification mediated by global digital architectures—a direct assault on human autonomy and individual sovereignty, the very elements without which the possibility of a democratic society is unimaginable.
From above, surveillance capitalism means that we enter the third decade of the 21st century, after all the dreams we held for this technology, which Ben has described to us, marked by an asymmetry of knowledge, and of the power that accrues to that knowledge, that can be compared only to the pre-Gutenberg era: an absolutist era of knowledge for the few and ignorance for the many. They know everything about us; we know almost nothing about them. They know everything about us, but their knowledge about us is not used for us; it is used for the purposes of their business customers and their revenues.
To conclude, it is auspicious that we are meeting tonight in this beautiful country of Canada, because right now the front line of this war between surveillance capitalism and democracy runs through Canada, specifically through the city of Toronto. Surveillance capitalism began with your online browsing and moved to everything that you do in the real world. Through Facebook's massive-scale online contagion experiments and the Google-incubated Pokémon GO, it experimented with population-level herding, tuning and behaviour modification.
Those skills, by the way, have now been integrated into Google's smart city application called Waze. But the real prize here is the smart city itself. This is where surveillance capitalism wants to prove that it can substitute computational rule, which is, after all, a form of absolutist tyranny, for the messiness and beauty of municipal governance and democratic contest.
The frontier is the smart city. If it can conquer the smart city, it can conquer democratic society. Right now, the war is being waged in Toronto. If Canada gives Toronto to Google—that is, to Alphabet, though Sidewalk Labs now goes out of its way to claim that it is not Google—a blow will be struck against the future possibilities of a democratic society in the 21st century.
Thank you for your attention. I hope to return to this discussion tomorrow with the rest of the testimony.
Thank you so much.