Thank you, Mr. Chair.
Honourable members, thank you for inviting me today.
Good morning, my name is Bernie Farber. I am the past CEO of the Canadian Jewish Congress, where I worked for almost three decades. I am also the son of a Holocaust survivor, a survivor who was the sole Jewish person to survive in his village of over 1,500 Jews, so I have some visceral understanding of what hate is.
I spent much of my time at the congress monitoring hate, extremism, white supremacy, racism, anti-Semitism and xenophobia. We undertook this work because, of all people, we understood that hatred run wild is a deadly virus without a cure.
Today I am retired, or as I prefer to say “rewired”, since I still act as a social justice consultant with various boards of education, as well as Human Rights Watch and Community Living. I also chair the Anti-Hate Network. The Anti-Hate Network itself is non-partisan. We monitor, expose and counter hate groups. We are journalists, researchers, court-recognized experts, lawyers and leaders in the community. We've held workshops in schools with law enforcement. Our investigations have shut down some of Canada's worst neo-Nazis and exposed so-called patriot groups that are actually anti-Muslim hate groups. We've become the go-to experts nationwide.
Our strategy to counter hate is really one of containment. We monitor and expose the worst of the worst hate propagandists so that they face social consequences. We put pressure on platforms to make the principled decision to remove hate groups both online and in communities across Canada, and we file criminal complaints. I just returned yesterday from a meeting with Facebook. Facebook called this meeting to deal with this exact issue of online hate. I give them credit for becoming, finally, a corporate leader. Let's see where it goes.
I want to emphasize that online harassment is harassment, and that online threats are threats. Our laws apply to the Internet and we need to enforce them. That means holding individuals accountable for what they post and holding social media companies accountable for giving them a platform. Our goal should be to drive the worst hate groups offline, to de-platform them. Often we hear the counter-argument that driving hate groups off the largest online platforms gives them attention and helps them grow. We hear that people will seek them out in the darker corners of the Internet or that it makes them more dangerous. I want to be very clear: there is no evidence for these arguments. They're simply not true.
Last year, our investigations took down one of the largest alt-right, neo-Nazi forums used by Canadians, and we had the opportunity to watch what happened very closely. They had user names on these forums; they got to know each other, trust each other and vouch for each other. They had a huge audience. They had a network. They had propaganda materials. Suddenly it was all gone. When they lose these online platforms, one or more of them may try to move people to a new one, but many of them never make the switch. They lose their megaphone. They lose their network.
Most importantly, it means that the high schooler who has been watching hate propaganda on YouTube and has started to believe that women shouldn't have rights and that some races are biologically inferior is going to have a much harder time finding one of these online echo chambers, where he or she would be exposed to even more insidious propaganda and to people trying to recruit him or her to hatred.
When we deal with online hate by targeting the platforms, we're preventing countless incidents of radicalization. It's similar with hate groups in the anti-Muslim movement. They mostly use Facebook, and when they get barred from Facebook, they usually come back with a new page, but they lose all of their previous work and have a tenth of their previous followers. It defangs them.
The problem is that while Facebook is taking a lead in responding to online hate, it's really only dealing with the tip of an iceberg. For example, it has yet to remove some of the worst Canadian groups out there. Take the Yellow Vests Canada page, for example. We and other organizations have documented hundreds of incidents of overt racism and death threats. That page is still up and we're worried that the next Quebec City mosque shooter is reading that page and pumping himself up with anger.
This is why we need the government to enforce the Canadian Human Rights Act when it comes to social media companies. It's the law that no company in Canada can discriminate in providing a good or service in this country. If I were a baker, I couldn't refuse to bake a wedding cake for a gay couple. Social media companies are breaking this law because different people have very different experiences on social media. Persons of colour, women, LGBTQ+ persons, Jews: when these Canadians go online, they are much more likely to experience harassment, threats and propaganda that dehumanizes them or calls them vermin.
The act says that every company has an obligation to give people non-discriminatory service. The government could give the Human Rights Commission a clear mandate and the resources to enforce the law and beef up our legislation with stricter financial penalties to hold social media companies accountable for their role in spreading hatred.
Of course, it's not just the platforms at fault here. Very bad people are spreading hate propaganda and they are getting away with it. We can deal with most haters by exposing them to the natural social consequences, but we do have subsection 319(2) of the Criminal Code, which makes spreading hate propaganda illegal. However, realistically, these investigations take a long time and few charges are laid.
Most importantly, we need section 13 of the Canadian Human Rights Act back. Section 13 allowed an individual to make a complaint about online hate speech to the Canadian Human Rights Commission. If the commission's investigation found it to be a reasonable complaint, it would go to the tribunal. The Human Rights Tribunal would hear the case, render a decision based on the evidence and could order the person spreading hate to stop and perhaps pay a small fine. The Supreme Court ruled this law was constitutional, but the government of the day repealed it in 2013 anyway. This was an effective law. It shut down some of the worst online purveyors of hate in its day and neutered a generation of white supremacist and neo-Nazi leadership.
Additionally, the CHRC has a central role in enforcing the act and protecting Canadians from the social destruction of online hate. It should be resourced accordingly. Simultaneously with the re-establishment of section 13, we need to continue to encourage individuals and groups within society to file complaints. Over the years, this has proven to be the best mechanism to enforce regulation. The loss of section 13 has left us terribly vulnerable. I can't stress this enough.
We also need the Human Rights Commission and the tribunal to have the resources to hear cases in a reasonably speedy manner.
In conclusion, we need the best tools possible. We've been fighting a losing battle. Our intelligence services acknowledge that they dropped oversight of extremist hate groups many years ago, and only in the last year have they tried to re-establish a presence. Police services no longer have dedicated hate crime units, so their expertise has waned, and hatred is getting worse. It has moved from evil words to evil actions, from minor property damage to outright murder.
We count on our leaders to lead. I ask you today to lead. Be brave. Be bold. Give our country the tools it needs to protect us from this growing menace before it's too late.
Thank you.