When it comes to that “more”, it seems to me there are really two avenues of attack here. One is large social media companies. Google made $9 billion in Q4 of 2018 and Facebook made $7 billion. Combined, they account for 75% of digital ad revenue in Canada, so let's take those two as an example.
One answer is to say: where there's obviously illegal content, we don't want you to be the final arbiter in any way, and there will be judicial appeal as far as that goes, but you're going to be accountable, and we're going to restrict safe harbour for hate speech in the same way we do for terrorism and child pornography. We have to strike the right balance, but that's one avenue.
In regard to the other answer, you highlighted the issue of usefulness. You said you weren't sure about section 13 because you didn't know how useful or effective it was, but you did highlight the need for a non-criminal administrative law remedy. By that, I presume you mean that it's not just about holding social media companies and platforms to account as the broadcasters or publisher hosts, as it were; it's also about holding accountable, in some fashion, the people themselves who are posting this hateful content. Is that right?