Mr. Rosenberg, on page 20, says, “They found that, notwithstanding more assertive moderation and election integrity policies, large social media platforms continued to be home to widespread misinformation.”
It's sort of interesting to think about this topic in relation to this work and study, because there is documented evidence that some social media platforms were used to spread misinformation related to Conservative candidates in the last election. I think it's interesting to dive into that area and find out more, because we've heard, from our national security and intelligence community, that our election was free and fair and that there's no evidence there was any impact on the overall results of the election. Otherwise, it would have triggered the protocol.
In fact, I know we heard.... I'll have to dig out this quote. I remember it, but I can't remember who exactly said it. I think it was David Morrison, but I will check and come back to the committee in a future intervention to verify that. He said something to the effect that the panel did not deliberate as to whether or not any misinformation or attempts at foreign interference in the election reached the threshold. They didn't. The debates and deliberations they had were only around whether or not something actually counted as foreign interference.
This is an interesting distinction, because those are two very different things. Of course, the threshold associated with the protocol is quite high on purpose. The bar for notifying the public of foreign election interference is set at a fairly high level to ensure it's crossed only when an election has actually been affected. I think what he was saying in his remarks.... Again, I will check as to who said that for the committee members. I'll come back with that specific reference.
Listening to the national security and intelligence advisers' community.... I think it's been very clear in the testimony we've heard to date. There are many other things that point to this, in terms of what they were able to say or not say. It was very clearly implied in what they said: If CSIS had that information, as has been claimed, it would have been provided to the RCMP and the Commissioner of Canada Elections. It would have been investigated. When asked, the RCMP said there were no investigations under way.
You know, we heard from national security and intelligence experts as prominent as the director of CSIS. They said they could not verify whether or not this leaked information was coming from CSIS or some other organization. To me, that inspires no confidence. I think Mr. Fergus eloquently spoke to this in his intervention. It's quite concerning when we see uncorroborated allegations floating out there. Obviously, they are dangerous in and of themselves when they're based on what is likely partial intelligence, or pieces of intelligence that might not be interpreted correctly or might not have been analyzed and converted into evidence. I think that's a very big gap, in terms of what's being said, claimed or reported.
These are all very important points to consider. They're all good reasons for understanding the fact that anybody who has a security clearance would be breaking the law if they came to our committee and said things they're not allowed to say.
I don't know what opposition members are hoping to get out of having the chief of staff of the Prime Minister come to the committee, other than trying to perpetuate some kind of false narrative that there's some big cover-up of some big scandal, which to me is just playing politics with a really important issue. What we heard from Fred DeLorey in the Toronto Star article that he wrote was very clear, and I'm going to quote him. I know others have quoted him and I've quoted him in the past. I don't think I used this exact quote, but this is an even better quote. He wrote:
As the national campaign manager for the Conservative Party of Canada during the 2021 election, it’s important to clarify one critical issue. I can confirm, without a shadow of a doubt, that the outcome of the election, which resulted in the Liberals forming government, was not influenced by any external meddling.
The national campaign manager of the Conservative Party of Canada is saying that he can confirm “without a shadow of a doubt” that whatever forms of attempted interference were present did not have an impact on the results of the election. Be that as it may, I think it makes the case for having him come to our committee and speak to why he says something like that with such a degree of confidence.
We have many Conservative members claiming all kinds of things that are untrue, as sensationalistic and absurd as claiming that the Prime Minister is working against the interests of Canada, which is treasonous. Those kinds of claims are hyperpartisan, sensationalistic. They're not true. They're disgusting and they detract from the overall stability of our democracy as a whole. I don't know how anyone who's a member of Parliament can utter such things without having some basis for making such absurd claims. They're completely unfounded claims.
For me, if we have Fred DeLorey, a person as prominent as the national campaign manager for the Conservative Party of Canada, claiming that, beyond a shadow of a doubt, there was no impact on the election results, we should probably hear from that individual. Certainly, hearing from the national campaign directors or managers of the Liberal Party of Canada, the NDP and the other parties would be very helpful as well. I think it would allow us to dive into a very important topic, which is to what degree online campaigns of misinformation and disinformation were present and being spread throughout the country during and perhaps before the election.
I know that an independent third-party analysis was done of the 2021 Canadian federal election. Despite what Mr. Barrett said when I made a point of order earlier—he implied that I had just got out of bed at something like 11 o'clock, which I found very offensive, and I don't know why he would say such a thing—I was actually up early reading this quite extensive 80-page report on misinformation and disinformation during the 2021 Canadian federal election in preparation for our meeting today. I have lots and lots of observations and, I would say, insights from this report that I would like to share.
It all goes to the argument that we should really be having national campaign managers come before the committee because, of course, they would be best placed, in my view, to comment on the level of misinformation that was circulating during the election, and probably corroborate some of this independent report that has been done by a group called the Media Ecosystem Observatory, which consists of the Centre for Media, Technology and Democracy at the Max Bell School of Public Policy at McGill University, and PEARL, the Policy, Elections and Representation Lab at the Munk School of Global Affairs and Public Policy at the University of Toronto.
Prominent individuals in their field have all participated in this work. There's a long list of contributors. For anybody who wants to look up the report and read it, it's called “Mis- and Disinformation During the 2021 Canadian Federal Election”. It's dated March 2022. It's been around for at least a few months, long enough for us to have reviewed and read it. I've spent quite a lot of time looking at it, because I think it has quite a lot of really useful information.
Why is this report important? It's important because we know that, dating back to at least 2018, there were reports done by parliamentary committees on threats to Canadian democracy. Let me put it this way: The major factor that seems to be evolving or changing the threat environment....
We've heard from every national security and intelligence professional that the threat environment is evolving. Why is it evolving? You could say it's evolving predominantly because of the spread of online information, digital platforms and the prevalence with which they're used by Canadians. That is one of the most important vehicles for the spread of information that may mislead Canadians, erode Canadian democracy and change the voting intentions and behaviour of the public.
If that's the case, I would go back to Morris Rosenberg's report. In the report there are instances of misinformation listed. Some of them did target candidates in the last election, but what's interesting is that we can't just immediately jump to conclusions about that. We already know that domestic and foreign interference online is happening all the time. It's ubiquitous. Literally every single day there's information being spread that's not entirely accurate. Sometimes it's entirely fabricated, but most of the time it's partially inaccurate or partially true, so it's stretching the truth, in a sense. What's interesting about this is that it actually has an impact on the population over time. We should be looking at how we make useful recommendations on that topic out of our work today and over the course of this study.
What's interesting is that in the very first pages of this report, they have a summary. I've read the whole thing, so I'm not going to quote from the executive summary. I've done the work here. The most extensively documented misinformation in the 2021 Canadian federal election concerned COVID-19 and widespread claims of voter fraud. Those were the two biggest misinformation campaigns online. They also note at the beginning that a lot of the discussion has focused on Chinese interference, which is interesting, because there is actually a lot more evidence that COVID-19 misinformation and claims of widespread voter fraud circulated on social media platforms. They should actually be part of our conversation on this topic. We're not looking at all the threats to our democracy if we're only focusing on the forms of foreign interference coming from China. We actually have to broaden our scope and look at all forms of foreign interference related to misinformation spread online.
One thing that I think is important for us to note is that Canadians are generally able to detect false stories. That's kind of important when considering this topic. There's really strong evidence here that Canadians are able to tell what is false from what is true. That's not to say that the rapid spread of misinformation isn't having an impact on the population. It is to say, thank God for Canadians' ability to discern what is true and not true. That is somewhat holding up in an era where misinformation and disinformation are so rampant and far-reaching.
The third point in the summary is, “we find no evidence that Chinese interference had a significant impact on the overall election.” For that to be on the front page of an executive summary is pretty important.
That's not to say that there weren't attempts. It also says, “Misleading information and information critical of certain candidates circulated on Chinese-language social media platforms.” There's a lot of comment in the report about that. However, the finding that it did not have an impact, and that there's no evidence of its having an impact on the overall election, again corroborates what we've heard from national security and intelligence experts, from the national security and intelligence adviser to the Prime Minister, from all the public servants who are part of the panel that oversees the protocol during the caretaker period and also from ministers who have come before this committee.
If we trust the experts—the people whose job it is to do that work and to protect us in our democracy—we have to say that not only have we had every level of accountability come before this committee, but we also have independent reports and professionals from outside of government commenting on and corroborating the same conclusion, which is that it did not have an impact on the overall election results.
I think that's important for us to note. I speculate that it's probably why someone like Fred DeLorey, a Conservative campaign manager from the last election, could say so confidently that it had no impact on the election results. That is probably because—and I don't know this for a fact—he had an opportunity to participate in some of those briefings that were given. Again, that is a mechanism our government set up for federal elections: party briefings during the election on attempts at foreign interference.
I think it's good to note that in this research study that was done by the Media Ecosystem Observatory, they certainly have verified the fact that most Canadians believe the election was safe from foreign interference and a minority of Canadians believe that misinformation was a serious problem. That may be changing in our discourse today, as a country. Maybe more Canadians are believing it's a bigger, more serious problem. I think that raises public awareness. If, out of all this, we get a greater degree of public awareness around this issue, that's probably a good outcome, but in terms of this committee's work, I think we have to be working to get to the bottom of things to make really good, clear recommendations on how to move forward.
One thing that struck me as a conclusion that they drew from this huge body of research—and I'll go into a little bit more detail as to how extensive the research was—was that a “cohesive misinformed and misinforming group has emerged”, which is interesting. They said that there's a “rise of a 'big tent' of misinformation, where groups who hold false or conspiratorial beliefs about one topic appear to adopt similarly distorted opinions about a broad range of topics.” That's a direct quote from the report, by the way.
What's interesting to me about that is how we see that the sliver of the population that buys into misinformation campaigns gets co-opted by these distorted opinions they're receiving through online sources and memes. They are then further susceptible to absorbing other sorts of conspiratorial beliefs and opinions that come at them online.
It's interesting, because that coincides with my personal experience at the doors in the last election campaign. I saw that this was becoming more prevalent and more clearly identifiable. It's very disconcerting to me that the population of individuals who might already be slightly susceptible to that will then consume more of that misinformation and adopt it into a world view that becomes more and more extreme.
In other circles and conversations we've had on Parliament Hill, that's part and parcel of the challenges that online digital media presents in this evolving era of information consumption: how we get news, media sources and information today, and how we absorb it. How much do we question it? How prevalent is it in our lives?
It's really important for us to think about that and to think through how we combat that ubiquitous kind of foreign interference. We have to be asking ourselves at every step along the way what the truth is and what is factual about how this is done. How is it being adopted by Canadians? To what degree are people buying into it? To what degree is it impacting their behaviour?
There is some good news in this report, and there's some bad news. There's some good news in relation to foreign attempts at election interference through misinformation, which probably accounts for the vast majority of attempts at foreign interference in our elections: it really came through misinformation online.
I think that's fair to say. I'm not a national security and intelligence expert, but if we read through the reports and information, I think there's a lot of work to be done in this area, at the very least. The changing threat environment that we need to respond to is certainly something that has been documented over and over again, and it continues to evolve very quickly.
One of the other things that they outline at the very beginning of this report is the vulnerabilities that we have as a Canadian society. One of them is what they call “A fracturing of the Canadian information ecosystem”. I'll quote this, because it's probably better said by them than by me. They said:
Canadians are increasingly obtaining their political information from a range of untrustworthy sources. There is an increasing danger of echo chambers or filter bubbles where people will mostly be exposed to information that supports their existing worldview and/or promotes a narrow political view.
This is one of the big vulnerabilities. We've talked about algorithmic transparency and the need to understand how the algorithms that social media companies utilize are feeding people information based on their preferences, and how that can take them down the path to becoming more polarized and potentially having more extreme views that coincide with their overall world view over time. That leads to heightened divisions within Canadian society and less tolerance for sitting down and talking through our differences and really respecting and appreciating the perspectives of others.
One of the other vulnerabilities that is mentioned is “Increasing difficulties in detecting disinformation and coordinated information operations”. What's interesting is that it's hard to detect. The report says:
The rise of platforms focused on privacy that exercise minimal moderation has led to a more vibrant and chaotic environment that can provide opportunities for those seeking to mislead, misinform, or manipulate.
That is another aspect of this that we need to take quite seriously. This activity is difficult to detect, and it's becoming easier and easier for those seeking to mislead and manipulate to do so.
One of the other vulnerabilities was a “gap between the reality and perceptions of mis- and disinformation”. This one's quite concerning as well. It probably plays into the hands of our foreign adversaries, who are attempting to mislead and misinform the Canadian public, whether during elections or outside of election periods. Many times we've heard our members say this. I'm sure we all acknowledge it. Their intention is to sow the seeds of division among us so that our society becomes less trusting, more chaotic, more extreme and more polarized. It really erodes the fabric of our democracy.
This is one of the gaps they mention in this report. It is that over time, in a way, we're sowing the seeds of distrust of all information sources. It doesn't matter whether you're a politician, a journalist or an online platform. Wherever people are getting information, they're able to say, “I don't really trust that.”
How do they really know whether something is truthful or not? Over time, it's shifting. I was happy to hear that the findings in this report still showed that Canadians were generally able to discern what's truthful and what's not, but I think that is changing. There are more and more Canadians who are consuming misinformation and not necessarily identifying it as false or being able to pull out the pieces of falsehood from information that is combined with some truth. You cloak your lies in truth, or the opposite.
It reminds me of a philosophy course called “Truth and Propaganda” that I took when I studied at Carleton. Randal Marlin, from Ottawa, taught us about truth and propaganda. I won't get into that.
The other thing is the emerging distrust in Canadian democratic institutions. This is another vulnerability that was highlighted in this report. It's pretty significant. They link that with individuals who have really tried to use the pandemic to sow the seeds of distrust. They say there is a growing number of individuals who no longer share the same factual reality as the majority of Canadians. To me, that's really scary, because a growing percentage of the population doesn't share in the factual reality of the majority of Canadians. If there were ever a sign that we should be concerned, that, to me, is it.
I would say that, if democracy is about anything, it's about the pursuit of truth. It's amazing that, in our work here on this committee, we're not taking the threat of online misinformation more seriously. It would be great to do some more in-depth work on that.
I'll go to another section here. I think it's important to note a few things that are really helpful for our work. I feel they are important and that they relate to why we would add a campaign director component to this study and have the national campaign managers come before this committee.
One of the summary notes on the global context, which is one of the chapters in this report, is that “The tactics used by large-scale, foreign influence and disinformation operations have increasingly been employed by non-state actors including hate groups, extremist organizations, and populist political parties.” That's really interesting, because one of the big summary points, conclusions or findings is about non-state actors using disinformation. Foreign influence isn't just about state actors. It's about non-state actors as well, which is interesting.
I think we should also be looking at that in our study, in our work, which is to say that if information is coming from foreign non-state actors and that information is being picked up and spread in Canada within our elections process, that potentially has an impact on Canadians. Again, I'm going to call into question how much of an impact that has on Canadian voters' behaviour and intentions. I think there's some interesting data in this report on how much misinformation coming from China had an impact on voters' intentions, even in the ridings that individuals are saying were impacted. It's interesting to look at what this independent report says about that, and there are some really interesting findings there. I'll get to that in a few minutes, but I think it will be eye-opening for a lot of us.
There's another finding here from the summary that says, “Disinformation tactics are no longer simply the dissemination of ‘fake news’ stories by easily identifiable bot networks. They now include more subtle manipulation of pre-existing polarized issues, such as immigration, equity-advancing policies, climate change, and LGBTQ+ rights.” It's interesting that the issues that are already polarizing are the ones that these disinformation tactics and campaigns seem to centre around. If you were a foreign actor, what would you do to try to disrupt Canadian democracy? You'd focus on the more contentious issues and try to amplify the amount of discord in the Canadian public over those issues.
It's interesting that those are some of the tactics that are being used, again, by Conservative members making this a partisan activity. They're playing right into the hands of our foreign adversaries. They're essentially sowing seeds of distrust in our democratic institutions by doing that. They're pushing that narrative and claiming all kinds of untrue things and then having the public start to....
This is a tactic. It's a tactic straight out of the playbook of our foreign adversaries, and I don't know why they would perpetuate that. It doesn't make any sense to me that they would take that approach when we're all sitting here as rational human beings and saying, “Let's do what makes sense and what all the intelligence experts and advisers are telling us, and what many Conservative senators, former senators and their national campaign manager have said.” The former director of CSIS Ward Elcock has said the same thing.
I don't understand why they're continuing to call us back to this committee over and over again to debate something that is so clearly a rational approach, which they just don't want to admit, for whatever reason. The only conclusion I could draw is that political gamesmanship is more important to them than doing real work on this issue. Obviously, to do that, they would have to admit the factual reality that our government has done more on election interference than any previous government, as far as I can tell.
That's another example of how they're not living in the factual reality most Canadians are living in, which is something we've seen quite a number of times, from the denial that climate change is real to.... There are many other examples. I won't get into all of those.
I think the fact that disinformation campaigns are exploiting those polarizing issues is quite concerning, as well. We should be looking at that—at how misinformation wraps around, gravitates to, or is really heightened during times when Canadians are focused on big, polarizing issues.
It's also important to note that one of the big challenges they identified was the accusation of election fraud in the United States in the 2020 presidential election, and just how much that sort of campaign seeped over the border, through our social media platforms, networks and chat groups, etc. That was present during our last federal election campaign.
They also note that Canada has, historically, been relatively resilient to misinformation and disinformation, and has adopted a series of measures to limit the spread of misinformation over past years. Again, this acknowledges the work our government has done, which I made mention of in my previous remarks. The Canada Declaration on Electoral Integrity Online was adopted in 2019. All of the major social media platforms signed onto it. Well, it's not just social media platforms: Google, Microsoft, LinkedIn, Facebook, YouTube, TikTok, Twitter, etc. all signed that declaration. Then, we updated it. Before the last election, it was signed again.
There was also quite a lot of awareness-raising around citizen preparedness. This included the Digital Citizen Initiative, led by Canadian Heritage, which increased digital literacy skills, and a public awareness campaign called “Get Cyber Safe”. There was also training for journalists and all political parties, etc. Again, no one can say our government isn't taking these threats seriously. I would say it has done some very significant things. One public awareness campaign reached over 12 million Canadians. That's pretty significant. No one can say reaching a third of the Canadian population isn't significant in terms of its reach.
How much impact would an awareness campaign have on Canadians? Certainly, it would allow them to, perhaps, start to identify when they're seeing misinformation. Perhaps it even prevented some of the impacts of attempted election interference, both domestically and from foreign sources. We don't know that. It would be hard to establish a causal link, but it's certainly something we could look at. That's a lightning-bolt idea: How do we determine which awareness-raising and citizen-preparedness initiatives have had a positive impact on the Canadian public, in terms of being able to identify and pull apart what is true and what is not in what they're consuming online? That, to me, is a worthwhile pursuit, because we could then optimize our strategies and approach to have the greatest impact. To me, that's very rational.
Another thing was a section on Canadians' attitudes towards misinformation. There are some very important findings here. I'll quote from this briefly, then discuss why it's important:
Canadians perceive many common political phenomena as misinformation, from politicians exaggerating their promises to the publication of completely made-up stories by a media organization to [as extreme as] hate speech. There is significant ambiguity and politicization of the term.
That doesn't help, obviously. Politicizing misinformation as a term is going to create more challenges and exacerbate things, because what we really need to do is understand what it means and what it is, stick to a common definition and then educate the public around it. I would opt for that in terms of an approach.
Another finding in terms of the summary was that “[a]pproximately [one-]quarter of Canadians reported seeing misinformation during the campaign, while approximately 40% believed misinformation was a serious problem during the election”. That's interesting: A quarter of Canadians, or 25%, reported seeing misinformation and 40% believed it was a serious problem.
This is another finding: “A strong majority of Canadians believe that misinformation is a threat to Canadian democracy, [and] is polarizing Canadians and threatens social cohesion.” That's a strong majority, so there you go: A strong majority of Canadians believes that misinformation is a threat to Canadian democracy. I think that's a significant finding that demonstrates why in this work we should have a focus on misinformation and look at that seriously.
Here's another finding, which relates to the Conservative Party:
Supporters of right-wing parties (Conservative[s] and [the] People's Party) report higher levels of exposure to misinformation. However, they [do] not think of misinformation as a more serious problem during the election and tend to perceive misinformation as less threatening to democracy.
That is really interesting. In fact, the Conservatives and the People's Party, the right-wing parties...and this is not me saying this. I'm just quoting from this report, so don't get mad at the messenger here. What it says is that supporters of right-wing parties in general report “higher levels of exposure to misinformation”, which is interesting in itself, but then they don't see it as a serious problem, which is really interesting as well. Why would right-wing perspectives or people with those values...? I don't know what the answer is to that, but I find it an interesting finding in this very thorough work and research.
It brings up questions in my mind. If the opposite is true—that supporters of left-leaning parties are less likely to be exposed to misinformation but see it as a greater threat to democracy—what does that mean? It's interesting. It might enable us to come to terms with some of the differences we have and maybe even highlight a way forward if we were to unpack that a little bit together. I'm not saying that we'll have the opportunity to do that, but I think that would be worthwhile.
It also states: “Canadians are largely in favour of content moderation but tend to believe that social media platforms and not governments should be making moderation and banning decisions.” Interestingly, the Canadian public seems to want content moderation but believes most of it should be done by social media platforms, which is kind of interesting as well.
Also, then, it states: “There are significant differences in perceptions of misinformation and support for content moderation across partisan lines, socio-demographic groups, and media consumption patterns.” That's interesting as well.
There's lots more in here. There are some big aha moments. Maybe I will flip to those and give you some more important findings.
One is that, as shown on page 23, they did an interesting kind of experiment. They took four stories that are based on facts, four online stories that are partially true or that they would consider misinformation, and two stories that are completely fabricated, and then looked at individuals' “exposure” to those stories and their perception of the stories' “truthfulness”. It's interesting to compare those and see what the results are and what that tells us.
There are two or three findings from that piece of research that seem pretty interesting. “Conditional upon exposure to the story, factual stories were perceived as more truthful than misinformation stories both during the election...and post-election”, so it's good news for us that stories that had misinformation in them were more likely to be perceived as false, and factual stories were more likely to be perceived as truthful.
“While exposure might increase the likelihood of believing that a story is true, exposure to the stories might also be driven by citizens' predispositions, with those denying the existence of climate change being more likely to be exposed to the climate lockdown story, for example”, which is interesting.
This speaks to why algorithms can be so impactful, and why algorithmic transparency matters, when you think about how often someone is exposed to a story that has misinformation in it. What it says is that if you're exposed once, you're likely to be able to determine that it is not true, but if you're exposed over and over again, or if you have a specific predisposition, such as not believing in climate change for whatever reason, and you're then exposed to a piece of misinformation like the one they document from MP Gallant about climate lockdowns, you might start to believe it over time.
This is important for us to understand. It's not just about one exposure; it's about the prevalence of this and how often you are exposed. We know, even from marketing professionals and how marketing works, that repeated exposure to something eventually weakens your ability to determine that it's false, and you become lulled into believing that it's true.
I think another really important quote or finding from this is that “a partial truth is perceived as more credible than completely false information”. This is interesting because it suggests that there is a trickle.... I hope one of the points members take from my intervention today is that we should be looking at misinformation, and not just during election campaigns, because what we need to understand is that there is a slow trickle of misinformation happening throughout our society every single day. Mixing partial truths with things that are not true is, predominantly, what misinformation is. Parts of the stories being reported are actually true, but there is some exaggeration or spin, or false claims are extrapolated from what is truthful, and those partial truths are more prevalent. They're also more effective at lulling people into a false sense of security, getting them to let down their guard, absorb the information passively and let it affect their world view.
What I read from this is that it's happening all the time. If we're to take foreign election interference seriously, we also have to be considering what happens outside of the writ period. We have to be considering what misinformation and disinformation is circulating out there and where it is coming from. I don't think it's easy to determine where it's coming from all the time. We heard from security and intelligence professionals who came before us that it's not always easy to determine where information is actually originating when it comes to online sources.
I have covered that and I think that's important.
Here is another really big, important finding. What's interesting is that the highest volume of misinformation in the last election, generally speaking, was on Twitter. It's the highest volume of discussion of misinformation, because people comment on and engage with misinformation more on Twitter than elsewhere.