Thank you for the invitation to appear before you.
My name is Emily Laidlaw. I'm a Canada Research Chair and associate professor of law at the University of Calgary.
At the last committee meeting, and earlier today, you heard horrific stories, bringing home the harms this legislation aims to address. With my time, I'd like to focus on the legal structure for achieving these goals: why a law is needed, why Part 1 of Bill C-63 is structured the way it is and what amendments are needed.
My area of expertise is technology law and human rights: specifically, platform regulation, freedom of expression and privacy. I have spent my career examining how best to write these kinds of laws. I will make three points with my time.
First, why do we need a law in the first place? When the Internet was commercialized in the 1990s, tech companies became powerful arbiters of expression. They set the rules and decided how to enforce them. Their power has only grown over time.
Social media are essentially data and advertising businesses and, now, AI businesses. How they deliver those services to consumers and how they design their products can directly cause harm. For example, the way they design their algorithms affects our mental health, pushing content that encourages self-harm and hate. They use persuasive techniques to nudge addictive behaviour, such as endless scrolling, rewards and constant notifications.
Thus far in Canada, we have largely relied on corporate self-governance. The EU, U.K. and U.S. passed legislation decades ago. Many are on their second-generation versions of these laws, and a network of regulators is working together to create global coherence.
Meanwhile, Canada has never passed a comprehensive law in this space. The law that does apply is piecemeal, mainly a bit of defamation, privacy and competition law, circling important dimensions of the problem, but not dealing with it directly.
Where does that leave us in Canada? Part 1 of Bill C-63 is the product of years of consultation, to which I contributed. In my view, with amendments, it is the best legal structure to address online harms.
That brings me to my second point. This legislation impacts the right to freedom of expression.
Our expert panel spent considerable time on how best to protect freedom of expression, and the graduated approach we recommended is reflected in this bill.
There are three levels to this graduated approach.
First, the greatest interference with freedom of expression is content removal, and the bill requires it for only two types of content, the worst of the worst, the content we all agree should be taken down: child sexual abuse material and the non-consensual disclosure of intimate images, both of which are crimes.
At the next level is a special duty to protect children, recognizing their unique vulnerability. The duty requires that social media integrate safety by design into their products and services.
The third, the foundation, is that social media have a duty to act responsibly. This does not require content removal. It requires that social media mitigate the risks of exposure to harmful content.
In my view, the bill aligns with global standards because it's focused on systemic risks of harm and takes a risk mitigation approach, coupled with transparency obligations.
Third, I am not here to advocate that the bill be passed as is. The bill is not perfect. It should be carefully studied and amended.
There are also other parts of the bill that don't necessarily need to be amended but that entail hard choices that should be debated. These include the scope of the bill; which harms are included and which are not; which social media are included, based on size or type; the regulatory structure, whether a new or an existing body and what powers it should have; and what should be included in the legislation versus left to be developed later in codes of practice or regulations.
There are, however, amendments that I do think are crucial. I'll close with this list. I have three.
One, the duty to act responsibly should also include a duty to have due regard for fundamental rights in how companies mitigate risk. Otherwise, social media might implement sloppy solutions in the name of safety that disproportionately impact rights. This type of provision is in the EU and U.K. legislation.
Two, the duty to act responsibly and duty to protect children should clearly cover algorithmic accountability and transparency. I think it's loosely covered in the current bill, but it should be fleshed out and made explicit.
Three, the child protection section should be reframed around the best interests of the child. In addition, the definitions of harmful content for children should be amended. There are two main amendments here. One, content that induces a child to harm themselves should be narrowly scoped so that children exploring their identity are not accidentally captured. Two, addictive design features should be added to the list.
Thank you for your time. I look forward to our discussion.