It's a fantastic question. A human rights approach, as opposed to a risk mitigation approach, would be structured more like the Charter or like human rights acts, where we identify values that are important, that need to be prioritized and that should not be violated, and, when they are, there's a structure in place to seek accountability and in particular to seek compensation. I think that would mean stepping back from this approach to legislation altogether and revisiting what we think is crucial as we support the growth of the AI industry in Canada. This bill is structured with a risk mitigation approach. Within those confines, we are stuck with how to identify risks and avoid them as much as possible. I think both of these need to go hand in hand.
I'm not saying we shouldn't have risk mitigation. Risk mitigation already exists in a lot of our laws of general application, like negligence and product liability. Some of these obligations on companies to think ahead to what kinds of harms might materialize already exist. How can we provide greater clarity to companies that want to use AI so that they can better identify particular risks that are especially likely to be dangerous and that could arise from the use of artificial intelligence? If we're going to maintain a risk mitigation structure, our recommendation has been that we need to expand our understanding of which risks are pertinent and, ideally, accompany that with government support for companies to better develop equity audits and to build in-house expertise that can aid them.
I would just say that all of this also benefits companies and industries that exist right now in Canada. I mean, there are companies that are going to lose out from recommender algorithms sending you to Amazon instead of to a small business. Those companies are also, I think, probably very concerned about how we structure our AI regulations. What this whole conversation points to is that we need to step back and think this through a different lens.