Good afternoon.
Thank you for the opportunity to speak before your committee today. I have closely followed the work of this committee, including its superb representation by the chair and vice-chairs in last November's International Grand Committee on Disinformation and ‘fake news’. I am honoured to be here, and I appreciate your overall interest in consumer privacy.
I am the CEO of DCN. Our mission is to serve the unique and diverse needs of high-quality digital content companies. These include small and large premium publishers, both young and centuries old. To be clear, our members do not include any social media, search engine or ad tech companies. Although 80% of our members' digital revenues are derived from advertising, we are working with our members to grow and diversify those revenues.
DCN works as a strategic partner for its membership by advising and advocating with a particular eye on the future.
As you are aware, there are a wide variety of places where consumers can find online content. In light of this dynamic, premium publishers are highly dependent upon maintaining consumer trust. This makes enhancing consumer privacy, while also advancing our members' interests, a critical strategic issue for DCN. As an organization, we have prioritized shining a light on issues that erode trust in the marketplace, and I'm happy to do so today.
Over the past decade, there has been a significant increase in the automation of content distribution and monetization, particularly with advertising. We've shifted to a world where the buying, the bidding, the transacting, and the selling of advertising happens with minimal human involvement. We do not expect nor do we seek to reverse this trend, but today I hope to explore with you a few of the major challenges impacting the industry, the public and democracy.
The first area I would like to explore is the rise of what your December report aptly labels “data-opolies”. Unfortunately, an ecosystem has developed with very few legitimate constraints on the collection and use of consumer data. As a result, personal data is more highly valued than context, consumer expectations, copyright or even facts.
Today, consumer data is frequently collected by unknown third parties without any consumer knowledge or control. That data is then used to target users across the web, without any consideration of context, as cheaply as possible.
In our view, this is the original sin of the web: allowing for persistent tracking of consumers across multiple contexts. This dynamic creates incentives for bad actors, and sometimes criminal actors, particularly on unmanaged platforms like social media, where the bias is for a click, whether it comes from a consumer or a bot.
What is the result? A massive concentration of who benefits from digital advertising, namely Google and Facebook. Three years ago, DCN did the original analysis of this concentration and coined the label “the duopoly”. The numbers are startling. In the $150-billion-plus digital ad market across North America and the EU, 85% to 90% of the incremental growth and over 70% of the total ad spend is going to just these two companies.
Then we started digging deeper and, as in your report, we connected their revenue concentration to their data practices. These two companies are able to collect data in a way that no one else can. Data is the source of their power. Google has tracking tags through which it collects data on users across approximately 75% of the top one million websites. We also learned, thanks to evidence provided to the DCMS committee in the U.K., that Facebook has tracking tags on over eight million sites. This means that both companies see much of your browsing and location history.
Although your work is mostly focused on Facebook, we would strongly encourage you to also review the role of Google in the digital ad marketplace. DCN recently helped distribute research conducted by Dr. Doug Schmidt of Vanderbilt University, which documented the vast data collection of Google.
Google has used its unrivalled dominance as a browser, operating system and search engine to become the single greatest beneficiary in the provision of ad tech services. Google has no peer at any stage of the ad supply chain, whether buying, selling, transacting or measuring advertising. In any other marketplace, this would be illegal. In the financial world, it is akin to being the stockbroker, the investment banker, the stock exchange and the stock itself.
Therefore, we believe that recommendations 12 and 13 in your report are important as you seek to understand the clear intersection between competition and data policy. The emergence of these data-opolies has created a misalignment between those who create the content and those who profit from it. It has also enabled a vicious cycle in which the industry rules and the consumer privacy bar are set to protect incumbent industry interests rather than consumer trust.
We would also encourage you to further explore the arguments of law professor Maurice Stucke, along with those of Anthony Durocher of your Competition Bureau, recommending a shift beyond price-centric analysis as companies offer free products in order to exploit consumer data. Given the U.K. ICO's findings regarding Facebook's privacy practices from 2007 to 2014, which your own report labels as “severe”, I would also call attention to a research paper published last week by Dina Srinivasan, titled “The Antitrust Case Against Facebook”. In it, Ms. Srinivasan documents a bait and switch by Facebook in its early years: the company originally used privacy protection as a paramount differentiator in a very competitive field of free social networks forced to compete on quality, and then, over time, lowered the quality of its privacy protections.
Finally, the scandal involving Facebook and Cambridge Analytica underscores the current dysfunctional dynamic. Under the guise of research, GSR collected data on tens of millions of Facebook users. As we now know, Facebook did next to nothing to ensure that GSR kept a close hold on that data. The data was ultimately sold to Cambridge Analytica to target political ads and messaging, including in the 2016 U.S. elections.
With the power Facebook has over our information ecosystem, our lives and our democracy, it's vital to know whether or not we can trust the company. Many of its practices prior to reports of the Cambridge Analytica scandal clearly warrant significant distrust. Although there has been a well-documented and exhausting trail of apologies, it's important to note that there has been little to no change in the leadership or governance of the company. With this in mind, there is an acute need for a deeper probe, only made more apparent by the company's repeated refusals to have its CEO offer evidence to DCMS and to your grand committee. Facebook has said the buck stops with CEO Mark Zuckerberg, but at the same time he has avoided the most difficult accountability questions. There is still much to learn about what happened and how much Facebook knew about the scandal before it became public. The timeline is troubling to me.
We learned from Mr. Zuckerberg's testimony to the U.S. Senate judiciary committee that a decision was made not to inform Facebook users that their data had been sold to Cambridge Analytica after The Guardian reported on it in December 2015. The Guardian reporter had said he reached out to GSR as early as late 2014, nearly a year before reporting on it. GSR co-founder Aleksandr Kogan testified to Senator John Thune that he and his partner had met with Facebook several times throughout 2015. Even more incredible to me is that Facebook hired Kogan's so-called equal partner at GSR, Joseph Chancellor, onto its own staff on November 9, 2015, an entire month before The Guardian reported. Time and again, Facebook has been asked when exactly Mr. Zuckerberg became aware of Cambridge Analytica, yet Facebook only offers a non-answer by replying that he became aware in March of 2018 that the data had not been deleted. On a personal note, I find this answer offensively obtuse.
Considering that the FTC has a consent decree with Facebook to report any wrongful uses of data, it's incredibly relevant to know when its CEO was first aware of Cambridge Analytica. We now know Facebook spent significantly more time and resources in 2016 helping Cambridge Analytica buy and run ad campaigns than it did trying to clean up its self-titled “breach of trust”. Although Facebook's CEO testified to the U.S. Congress in April 2018 that they immediately worked to have the data deleted upon being made aware in 2015, Facebook has already submitted evidence to DCMS showing that no legal certifications happened with Cambridge Analytica until well into 2017, when Cambridge Analytica's CEO returned a fairly useless piece of paper.
Finally, in September 2018, Facebook disclosed, without any explanation, that Mr. Chancellor no longer worked at the company. This came after a nearly six-month-long investigation, which began only after the TV show 60 Minutes drew further scrutiny to his role.
Equally troubling in all of this is that, other than verbal promises from Facebook, it's not clear what would prevent this from happening again. Moving forward, we urge policy-makers and industry to provide consumers with greater transparency and choice over data collection whenever practices go beyond consumer expectations. Consumers expect website or app owners to collect information about them to ensure that the site or app works. Indeed, data collection and use within a single context tends to meet consumers' expectations, because there is a direct relationship between these activities and the consumer experience, and because the consumer's data is collected and used transparently within the same context. However, as happened in the case of Facebook and Cambridge Analytica, data collected in one context and used in another tends to run afoul of consumer expectations.
Also, it is important to note that secondary uses of data usually do not provide a direct benefit to consumers. We would recommend exploring whether service providers that are able to collect data across a high threshold of sites, apps and devices should be allowed to use this data for secondary purposes at all without informed and specific consent. A higher bar here would solve many of the issues previously mentioned.
Finally, it is important to shed light on these practices and understand how best to constrain them going forward. I appreciate your efforts to better understand the digital landscape. By uncovering what happened and learning from it, you are helping to build a healthy marketplace and to restore consumer trust.