In fact, the issue of algorithms is crucial, because by default, that is how speech is regulated on the large platforms. At present, it is the companies that own these large platforms that control the algorithmic processes, and it is they who are targeted by the bill. Let's not kid ourselves: it is not individuals who are targeted, but the large companies that own the large platforms. However, these algorithmic processes have the disadvantage of being very opaque. They are not very visible. We do not know how these platforms and their algorithms work, and no independent authority is in a position to know.
The strength of Bill C‑11 is that it puts in place mechanisms that will allow an independent body, the CRTC, to hold the major platforms accountable, particularly with respect to the operation of their algorithms.
Are the algorithms of these platforms compatible with the principles and values of inclusion that we cherish in Canada? Do these algorithms discriminate against some of our fellow citizens, for example against LGBT+ groups, which were mentioned earlier? We do not know at this point. We must rely on the good faith of the companies, and I do not doubt their good faith. In fact, if these companies are acting in good faith, they should have no difficulty explaining how their algorithms work and demonstrating that they are fully compatible with Canadian values, particularly with regard to equality.