I'd like to thank the hon. member for his question.
You're absolutely right that there are sometimes collateral impacts. The idea behind the current deliberations on regulating social networks is to create basic standards that would make clear to the platforms what regulations they are required to comply with, and to contribute to a form of standardization that factors in the specific features of each of these platforms.
You gave some very clear examples of problematic content that had not been removed from platforms. I don't want to name too many names, but Twitter and a number of other platforms recently placed restrictions on documentaries that were critical of QAnon. These platforms claimed they were restricting the documentaries on the basis of internal policies they could not discuss, and that they had to limit the dissemination of this type of content. That's definitely a problem.
Canada has decided that it now wants regulation, and I believe that's excellent. It will be important to measure how effective such regulation is. One avenue open to us, as you were saying, is to create a digital safety commissioner position in Canada, as other countries have done. The commissioner would be responsible for ensuring that platforms comply with these obligations.
A second key point that the chair is working on, with the support of Canada's Department of Public Safety and the Community Resilience Fund, is the matter of evaluation. This is somewhat related to the previous question you asked my colleague about prevention programs. It's essential to have much more rigorous evaluation mechanisms—and Canada is headed in that direction—to be able to determine what works and what doesn't, particularly for upstream prevention programs. Primary, secondary, and tertiary prevention are all very important today if we are to adjust our practices and adapt how we all work.