Our position is similar to what was discussed: in our interpretation, the Criminal Code provision would need to be amended if it were to apply to altered images and deepfakes.
While that's important, it's not going to provide a remedy in many cases, in part because deepfakes are so easy to produce anonymously that the person who produced them often won't be identifiable. As we discussed, it won't necessarily provide the complete remedy that victims are seeking, since the content itself can continue to circulate and cause reputational harm.
It's also our position that it would be important to work with platforms and hold them accountable for the content distributed on their websites. They are the ones that control the algorithms that rank the results, and they are the ones that can take the content down, or at least make it less visible if not remove it entirely.
I can defer to my colleague for further comments.