I believe you're referring to the duty laid out in sections 67, 68 and the subsequent sections of the proposed act in part 1 of the bill. Those sections address specific types of very harmful content, in particular, content that sexually victimizes a child. The provisions also set out a deadline for the platform operator to assess the flagged content and remove it if the content turns out to be real.
The CCLA does not take issue with those provisions. As we see it, the concern lies instead with the much more general duties laid out for operators. I'm talking mainly about sections 55 to 59 of the proposed act.
Section 55 sets out a general duty to take reasonable measures to prevent users from being exposed to harmful content. When I say harmful content, I'm talking about the seven types listed in the bill. Unfortunately, without adequate parameters, an operator might be tempted to take an overly cautious approach to fulfilling that duty, an approach that could unreasonably limit freedom of expression in Canada.
For example, proactively searching for and deleting content amounts to state surveillance by proxy. The CCLA considers that a problematic practice, yet the bill as it currently stands does not prohibit it. An operator could also decide to take down content without even reviewing it, which we likewise consider problematic.
Frankly, we are not saying that freedom of expression is an absolute right in Canada that should never be subject to reasonable limits. However, the duties imposed on operators need to be circumscribed in a way that makes clear to operators not only what their duties are, but also that they must act reasonably in fulfilling those duties, in accordance with freedom of expression principles.