Thank you for the invitation to appear before this committee for its important review of Bill C-27.
I'm a privacy lawyer and consultant based in Toronto. Having worked in the privacy field for over 15 years while raising three sons, I have a passion for children's privacy, and I will focus my remarks on this area today.
My interest in privacy law was sparked when I was a law student down the street at the University of Ottawa, where I did research with Professor Michael Geist and the late Professor Ian Kerr at a time when PIPEDA was a new bill being debated, much as Bill C-27 is today. When Professor Geist appeared here a few weeks ago, he reflected on his first appearance before committee to discuss PIPEDA, noting that it was important to get it right rather than to get it fast. When Professor Kerr appeared in 2017 to discuss PIPEDA reform, he stated that, at the time, “the dominant metaphor was George Orwell's 1984, 'Big Brother is Watching You'”, noting that technological developments in the years since PIPEDA go well beyond watching.
Both professors Geist and Kerr were right, especially in the context of children's privacy. Given that children are inundated with emerging technologies well beyond Orwell's 1984—from AI tools to ed tech, virtual reality and our current reality of watching war and its accompanying hatred unfold on social media—it is more important than ever to get it right when it comes to children's privacy.
When Bill C-11 was introduced in late 2020, it didn't address children at all. As I argued in a Policy Options article in 2021, this was a missed opportunity, given that children's online activity was at an all-time high during the pandemic.
I commend the legislators for addressing children's privacy in Bill C-27 by stating that “information of minors is considered to be sensitive” and by including language that could provide minors with a more direct route to delete their personal information, otherwise known as the right to be forgotten. I also understand that Minister Champagne proposes further amendments to include stronger protections for minors.
However, as the first witness stated, I think there is more the law can do to get it right for children's privacy. I will focus on two points: first, creating clear definitions, and second, looking to leading jurisdictions for guidance.
First, the law should define the terms “minor” and “sensitive”. Without these definitions, businesses, which already have the upper hand in this law, are left to decide what is sensitive and appropriate for minors. The CPPA should follow other leading privacy laws: the California Consumer Privacy Act, the U.S. COPPA, the EU's GDPR and Quebec's law 25 all establish a minimum age for consent, ranging from 13 to 16.
Further, the law should explicitly define the term “sensitive”. The current wording recognizes that minors' data is sensitive, which means that the other provisions in the statute that turn on sensitivity, whether for safeguarding, consent or retention, must be applied through a contextual analysis. Similar to Quebec's law 25, the law should define “sensitive” and provide non-exhaustive examples of sensitive data so that businesses, regulators and courts have more guidance in applying the legislative framework.
Second, I recommend that you consider revising the law, whether by amendment or regulation, to align the CPPA with leading jurisdictions, namely the age-appropriate design codes in the U.K. and California. Both demonstrate a more prescriptive approach to regulating children's personal information.
The California kids code requires businesses to prioritize the privacy of children by default and in the design of their products. For example, default settings on apps and platforms for users under 18 must be set to the highest privacy level. This is something that could be considered in the CPPA as well.
Further, the California code establishes a level of fiduciary care for platforms such that, if a conflict of interest arises between what is best for the platform and what is best for a user under 18, the child's best interests must come first. This is consistent with the recommendation of former commissioner Therrien and others in these hearings to include “best interest of the child” language in the legislation.
The CPPA should contemplate requirements for how businesses use children's data, considering the child's best interest. For example, use of children's data could be limited to those actions necessary to provide an age-appropriate service.
As I argued in my Policy Options article in January 2023, we need a collaborative approach that includes lawmakers and policy-makers from all levels of government, coordination with global privacy laws, and engagement with parents and educators. For this approach to work, the law needs to strike a balance between privacy and innovation. We want laws flexible enough to last, so that technology can evolve, new business ideas can succeed, and children can be innovators while growing up in a world that recognizes their special needs and rights.