Thank you, Mr. Chair.
What brings me to sit before you today is a tale of regret. I look south at the political and democratic disaster playing out in my own country with great distress and great humility. I was among those young, idealistic, tech-savvy staffers who went to join the Obama administration in the early days after he was elected.
It was a time when we had big ideas about open data, social media, and global digital markets for speech and commerce as liberatory, as a new tool of democratic soft power—and we've benefited tremendously from those forces over the last decade—but they were a double-edged sword. We were not prepared for the way that technology proved instrumental in ushering in one of the darkest chapters in American political history. We didn't do enough.
We're not alone in this. We are now seeing related phenomena across the democratic world—in Britain, Germany, Italy, France, and many other places.
The politics of resentment that we're seeing in contemporary populism mixed with the distorting power of the digital information market are a toxic brew. You have rightly pointed this out in the examination you've conducted so far, and in what we've seen in parallel examinations of this phenomenon in other legislatures.
My message to you today is a simple one: Don't wait to see how it plays out in Canada. Act right now. It will happen here too. The only question is how, and whether the consequences will be effectively mitigated in the Canadian context.
What is to be done? The first thing I want to say is, don't count on the private sector to deal with this problem. Publicly traded monopolies do not self-regulate. If we didn't know that before, we've certainly learned it over the course of the last year and a half. It brings to mind a quote that I like from my favourite chronicler of monopoly capitalism from a century ago, Upton Sinclair. He said, “It is difficult to get a man to understand something when his salary depends on his not understanding it.”
The answer here is not going to be the market; the answer is going to be government using its tools to steer the market back in the direction of the public interest. We need a kind of digital charter for democracy, one that lays out a set of principles and comes in behind it with clear policies that begin to make the changes we need to protect the integrity of our democratic public sphere.
We need to start right away, but we need to expect that this will take time. There are no single solutions to this problem. It's going to be a combination of things, none of which are sufficient by themselves, and all of which are necessary. It's going to be a messy process, because no one thing will appear to be moving the needle and making the difference that we would all like to see. However, together these things can first contain the problem, then treat the symptoms, and ultimately begin to get at the root causes of the structural problems in the market, both on the supply side and the demand side.
We begin first with security. This is the simplest and most important piece of the puzzle. The combination of cyber-attack and disinformation campaigns that we have seen unleashed on elections in several different countries is a dire threat, and we have to treat it that way. We need to increase the cybersecurity applied to our democratic institutions, including not just election administration but also political parties and campaigns. They should be treated as critical infrastructure, in my view. We also need to be much better about coordinating the research, monitoring, and exposure of disinformation campaigns that are happening with security services, with outside research entities, and with companies.
We're beginning to see a model developing in the U.S. that is worthy of examination and expansion, but let me be clear: Even if we solve the security problem, we're only eliminating a minor part of the problem. Most of the threats come from within, not from without. The most important thing in my mind about the foreign interventions we have seen across the world is that they took advantage of standard market-based tools. They were opportunistic amplifications of existing domestic political movements, and they were using tools that are perfectly well known and understood by commercial marketers across the digital world.
The second piece we can begin to deal with is illegal content. Again, it's not a huge part of the problem, but it's an important part. Citizens have a right to be protected from illegal content. There are now categories of content that are illegal in the off-line world; they should be illegal in the online world. These include hate speech, defamation, harassment, and incitement to violence.
All of these things can be removed on an accelerated timetable through a process that is subject to regular judicial oversight and that includes an appeals process, so that we are not endangering freedom of expression when we begin to move into the space of removing illegal content. You can't cede that power to the platform companies, but we need their involvement in order to speed up the process.
Once we've dealt with the security issues and the illegal content issues, we get into the real meat of the problem: How do we mitigate the influence of disinformation campaigns that are homegrown, that begin to separate people from facts that help inform their judgments and that begin to polarize our society over time?
One thing we can do is really cultivate the research community to spend more time, energy, and money studying the problem. We simply don't know enough about how disinformation works and how the digital market works to shape political views and electoral outcomes. We need to develop ways to signal users to be wary and to be critical consumers of digital media.
Consider for a moment the average consumer who is accustomed to the traditional media environment. When you step into a news agent at an airport and look at the periodicals arrayed before you, you see the daily newspapers, and you see the political magazines and the sports, automotive, entertainment, and home and garden magazines. Depending on where you're standing, when you pick a periodical off the rack, you have a pre-set schema in your mind about what to expect.
In the digital environment, all of that is compressed into a single stream, and it looks the same. It's a Facebook newsfeed. It's a Twitter feed. It's a YouTube "Up Next" list of videos. In that environment, all of the signals about source credibility and quality that we once had begin to attenuate. People will tell you that they read an outrageous thing the other day and that it has really shaped their views on an important matter, whether it's climate, immigration or economic policy. You ask them where they read that, and they say they read it on Facebook—but they didn't read it on Facebook. They read it through Facebook on some other source. What was the other source? They don't remember.
We've lost the normative structure that in the old media environment allowed us as citizens to make implicit judgments about source credibility and, when we're reading digital media, to engage in critical thinking. We need to begin to find ways to understand this problem better through the research community and to begin to address it through public education and digital literacy.
As well, there are many things we can do in the market with a regulatory intervention. We can ask the companies and compel them to be much more transparent in the way they operate. This starts with political ads.
There's no reason in the world why every citizen who sees a political ad shouldn't know exactly who bought it, how much they spent, and how many people they paid to reach. Most importantly, why did I as an individual voter get that message? Is it because of my gender, my age, my income? Is it because of where I live? Is it because my characteristics are similar to those of other people they're targeting? I should be able to know that, because when I know that, I can take a much more critical view of why that ad came to me.
To me, transparency is the simplest and easiest way to regulate the companies to move in the right direction. It's something they're voluntarily doing, but only in some countries and only when they're getting public pressure to do it. In no case has there been law laid down to mandate it. I think that's an easy first step.
There are a variety of other things that I think we ought to engage in as well. These are longer-term structural issues. They include algorithmic accountability. We need to look at how algorithms work and how they impact social welfare. We need to look at data privacy; we need to reduce the amount of data that companies collect, and we need to restrict how they use it.
Also, we need to be looking at competition policy. We need to be looking at modernizing antitrust policy to put shackles on anti-competitive practice, to restrict mergers and acquisitions, and to ease access to market entry for new kinds of services that offer alternatives to the existing models whose externalities have led to such negative outcomes.
Finally, we need to focus on the long-term task of addressing public education. We need to help people help themselves by helping them to become stronger and more insightful media consumers.
That includes not only digital literacy but also investments in better and more independent media. We can't expect people to steer their way away from nonsense on the Internet if there isn't a large body of quality information and journalism available to them.
I can't predict where this combination of policies will lead, but I do think it's the right starting point, and I don't think we have a lot of time to lose. I'm encouraged and inspired by the work of this committee; it shows that government is moving in the right direction.
Thank you for your attention. I look forward to the discussion.