Evidence of meeting #98 of the House of Commons Standing Committee on Industry and Technology, 44th Parliament, 1st Session, November 23, 2023. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

On the agenda: Bill C‑27, an act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act.

Witnesses

Michael Beauvais, Doctoral Candidate, Faculty of Law, University of Toronto, As an Individual
Avi Goldfarb, Professor of Marketing and Rotman Chair in Artificial Intelligence and Healthcare, Rotman School of Management, University of Toronto, As an Individual
Michelle Gordon, Lawyer and Founder, GEM Privacy Consulting, As an Individual
Antoine Guilmain, Counsel and Co-Leader, National Cyber Security and Data Protection Practice Group, Gowling WLG, As an Individual
Luk Arbuckle, Chief Methodologist and Privacy Officer, IQVIA Solutions Canada Inc.

3:35 p.m.

The Chair (Joël Lightbound, Liberal)

I call this meeting to order.

Good afternoon, everyone.

Welcome to meeting No. 98 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C‑27, an act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today. We have Michael Beauvais, a doctoral candidate at the University of Toronto Faculty of Law, by videoconference; Avi Goldfarb, a professor of marketing and the Rotman chair at the University of Toronto Rotman School of Management; Michelle Gordon, lawyer and founder of GEM Privacy Consulting; Antoine Guilmain, counsel and co‑leader of the National Cyber Security and Data Protection Practice Group at Gowling WLG; and Luk Arbuckle, chief methodologist and privacy officer at IQVIA Solutions Canada Inc.

Each of you will have five minutes for an opening statement.

Thank you all for taking the time to join us in this study this afternoon. Without further ado, I'll give the floor to Mr. Beauvais for five minutes.

3:35 p.m.

Michael Beauvais (Doctoral Candidate, Faculty of Law, University of Toronto, As an Individual)

Thank you, Chair Lightbound and members of the committee, for today's invitation.

I'm a doctoral candidate at the University of Toronto's faculty of law, and a graduate fellow of the Schwartz Reisman Institute. I have more than a dozen peer-reviewed publications and numerous policy interventions on privacy and data protection law in Canada, the European Union and the United States.

I submitted a brief on children's issues in the consumer privacy protection act with my colleague, Leslie Regan Shade, who is a professor at the University of Toronto's faculty of information and a faculty affiliate of the Schwartz Reisman Institute. I am here today in my personal capacity.

Children's privacy in the digital environment is essential for their agency, dignity and safety. Indeed, data protection laws are one important piece of a response to mounting evidence that corporate surveillance and persuasive design are undermining children's agency and well-being. At the same time, though, digital technologies are vital for children's inclusion and participation in society. Members of the committee, you are in a special position to help ensure that the digital environment aligns with children's rights.

Before highlighting a few of the recommendations made in our submission, let me note that the UN Committee on the Rights of the Child has consistently recommended more robust and standardized mechanisms for meaningfully obtaining children's views on legal and policy matters affecting them. It is thus regrettable that there is no evidence of youth consultation for this important bill. I respectfully urge you to solicit their views.

Let me briefly discuss our recommendations.

First, several key definitions need to be clarified. These include a definition of a minor and a definition of capacity to determine when a minor is “capable” of exercising rights and recourse under the act. The act must also clarify the scope of and the relationship between parental and child decision-making. Additionally, more specification is needed with regard to what happens when minors reach the age of majority. Information about one's childhood should, furthermore, remain “sensitive information” even after one has attained the age of majority.

Second, the best interests of the child should be included as a fundamental principle in the act. Doing so would make the child's interests a primary concern in all aspects of the proposed legislation. For example, the best interests of children should matter in specifying the purposes of data collection, use and disclosure, as well as data retention.

Third, age and parental consent verification requirements and limitations are needed. Treating minors and adults differently makes verification for both age and parental consent an important part of compliance. Such verification, though, can be highly intrusive, unreliable and insecure. Verification also poses serious threats to the freedom of expression of all Internet users.

Fourth, the Office of the Privacy Commissioner should be mandated to develop a children's design code with meaningful participation from youth. Design codes are age-appropriate standards for youth-directed products to ensure the highest level of privacy by design. They also help ensure that youth-directed products do not undermine children's rights. Businesses also welcome the certainty that codes provide. Since codes only elaborate on general principles and obligations arising from the legislation, robust protections for privacy and agency must be in the law itself.

Finally, kindly recognize that providing robust protections for children should not be a justification for meagre protections for adults.

Before concluding, I want to respectfully remind the committee that the ongoing lack of high-speed Internet access among northern, rural, First Nations, Inuit and Métis communities deprives children and adults alike in those communities of the same opportunities found elsewhere in Canada. The CPPA's promises and potential are illusory without equitable access to the Internet.

I appreciate your work on this important study, and I look forward to your questions.

Thank you.

3:40 p.m.

The Chair (Joël Lightbound, Liberal)

I'll now give the floor to Professor Goldfarb.

3:40 p.m.

Avi Goldfarb (Professor of Marketing and Rotman Chair in Artificial Intelligence and Healthcare, Rotman School of Management, University of Toronto, As an Individual)

Thank you for your kind invitation to appear before the committee and discuss Bill C-27.

I'm a professor of marketing at the University of Toronto, where I hold the Rotman chair in artificial intelligence and health care. My research focuses on the economics of information technology, including several papers on privacy regulation and on artificial intelligence.

Canada is a leader in AI research. Many of the core technologies underlying the recent excitement about AI were developed right here at Canadian universities. At the same time, our productivity is lacking. My research has shown that AI and related data-focused tools are particularly promising technologies for accelerating innovation, productivity and economic growth. In my view, a big worry for the Canadian economy going forward is that we do not have enough AI, and so our standard of living, including our ability to fund health care and education, would stagnate. It would be a shame if Canada's research success did not lead to applications that increase Canadian prosperity.

This act is a careful attempt to ensure that Canadians benefit from AI and related data-focused technologies while protecting privacy and reducing the potential for these technologies to harm individuals.

Next, I'll provide specific comments on AI regulation in part 3 and on privacy regulation in part 1. I have five specific comments on the artificial intelligence and data act.

First, the act correctly recognizes that there is always a human or a team of humans behind decisions enabled by AI. In part 1, proposed subsection 5(2) is commendable for noting that “a person is responsible for an artificial intelligence system”. Proposed sections 7 through 9 make these responsibilities clear. In my experience, such clarity about the role of humans in AI systems is both unusual and commendable.

Second, the act constructively defines explainability and transparency in part 1, proposed sections 11 and 12. By making it clear how and why the high-impact system is being used rather than focusing on the inner workings of the algorithm, it will provide useful information without forcing potentially misleading oversimplification of how the algorithms work.

Third, while the details of the act itself implicitly recognize the role of AI in Canadian prosperity, the preamble to the AI and data act does not recognize that technological progress is fundamental to our prosperity, and instead focuses only on regulation and harms.

Fourth, there are two sections of the act that might create incentives not to adopt beneficial AI, because the liability is not explicitly benchmarked around some human performance level for bias and safety.

In part 1 of the AI act, proposed subsection 5(1) examines bias. The bias definition suggests that any bias would be prohibited. AI systems will almost surely be imperfect, because they're likely to be trained on imperfect and biased human decisions. Therefore, this definition of biased output incentivizes the continued use of biased human decision-making processes over potentially less biased but auditable AI-supported decisions.

In part 2 of the AI act, proposed paragraph 39(a) examines physical and psychological harm or physical damage. As with bias, the benchmark seems to be perfection. For example, autonomous vehicles will almost surely cause serious physical harm and substantial property damage, because vehicles are dangerous. If the autonomous vehicle system, however, generates much less harm than the current human driving systems, then it would be beneficial to enable its adoption.

The fifth comment on the AI and data act is about the definition of an AI system in proposed section 2 of the AI act: “the use of a genetic algorithm, a neural network, machine learning or other technique in order to generate content or make decisions, recommendations, or predictions.” This definition is overly broad. It includes regression analysis and could even be interpreted to include the calculation of averages. For example, if an employer receives thousands of applications for a job, calculates the average score on some standardized test and uses that score to autonomously select above-average applications to be sent to a human resource worker for further examination, that scoring rule would be an AI system, as I understand it, under the current definition.
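A minimal, hypothetical Python sketch of the screening rule described above (the names, scores and function are invented for illustration, not taken from the testimony) shows how little machinery the definition captures: the rule computes only an average, yet it autonomously decides which applications reach a human reviewer.

```python
# Hypothetical sketch: the only "technique" here is an average, yet the rule
# autonomously decides which applications a human reviewer will see, so it
# arguably "makes decisions" under the proposed definition of an AI system.

def screen_applicants(applications: list[dict]) -> list[dict]:
    """Forward only applications whose test score exceeds the pool average."""
    average = sum(a["test_score"] for a in applications) / len(applications)
    return [a for a in applications if a["test_score"] > average]

applicants = [
    {"name": "Applicant A", "test_score": 71},
    {"name": "Applicant B", "test_score": 88},
    {"name": "Applicant C", "test_score": 64},
]

# The pool average is 74.33, so only Applicant B is forwarded.
print(screen_applicants(applicants))
```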

I have two specific comments about the consumer privacy protection act.

First, the purpose of the act in proposed section 5 clearly lays out the often competing goals of protecting privacy while facilitating economic activity. While I do understand the wishful thinking that there would be no trade-offs between privacy and innovation, research has consistently documented such trade-offs. Privacy is not free, but it is valuable. Individuals care about their privacy. In protecting privacy, this act will require companies to rely on legal expertise for interpretation. Such expertise is readily available to large, established companies but costly for small businesses and start-ups. In the implementation by the commissioner, some direction to reduce any unnecessary burden on small businesses and start-ups would be constructive.

Proposed subsection 15(5) makes the cost of an audit payable by the person audited even if the Privacy Commissioner does not bring a successful case. This creates a large burden on small and new businesses if they get audited unnecessarily.

To conclude, while I have specific suggestions to clarify the language of the act, in my view Bill C-27 is a careful attempt to ensure that Canadians benefit from AI and related data-focused technologies while protecting privacy and reducing the potential of these technologies to harm individuals.

Thank you for this opportunity to discuss my research. I look forward to hearing your questions.

3:45 p.m.

The Chair (Joël Lightbound, Liberal)

Thank you very much.

I'll yield the floor to Madame Gordon.

3:45 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

Thank you for the invitation to appear before this committee for its important review of Bill C-27.

I'm a privacy lawyer and consultant based in Toronto. Having worked in the privacy field for over 15 years while raising three sons, I have a passion for children's privacy, and I will focus my remarks on this area today.

My interest in privacy law was sparked when I was a law student down the street at the University of Ottawa, where I did research with Professor Michael Geist and the late Professor Ian Kerr at a time when PIPEDA was a new bill being debated, much as Bill C-27 is today. When Professor Geist appeared here a few weeks ago, he reflected on his first appearance before committee to discuss PIPEDA, noting that it was important to get it right rather than to get it fast. When Professor Kerr appeared in 2017 to discuss PIPEDA reform, he stated that, at the time, “the dominant metaphor was George Orwell's 1984, 'Big Brother is Watching You'”, noting that technological developments in the years since PIPEDA go well beyond watching.

Both Professors Geist and Kerr were right, especially in the context of children's privacy. Given that children are inundated with emerging technologies well beyond Orwell's 1984—from AI tools to ed tech, virtual reality and our current reality of watching war and its accompanying hatred unfold on social media—it is more important than ever to get it right when it comes to children's privacy.

When Bill C-11 was introduced in late 2020, it didn't address children at all. As I argued in a Policy Options article in 2021, this was a missed opportunity, given that the amount of online activity for children was at an all-time high during the pandemic.

I commend the legislators for addressing children's privacy in Bill C-27 by stating that “information of minors is considered to be sensitive” and by including language that could provide minors with a more direct route to delete their personal information, otherwise known as the right to be forgotten. I also understand that Minister Champagne proposes further amendments to include stronger protections for minors.

However, as the first witness stated, I think there is more the law can do to get it right for children's privacy. I will focus on two points: first, creating clear definitions, and second, looking to leading jurisdictions for guidance.

First, the law should define the terms “minor” and “sensitive”. Without these definitions, businesses, which already have the upper hand in this law, are left to decide what is sensitive and appropriate for minors. The CPPA should follow the lead of other leading privacy laws. The California Consumer Privacy Act, the U.S. COPPA, the EU's GDPR and Quebec's law 25 all establish minimum ages for consent, ranging from 13 to 16.

Further, the law should explicitly define the term “sensitive”. The current wording recognizes that minors' data is sensitive, which means that other provisions in the statute have to interpret the treatment of sensitive information through a contextual analysis, whether it be for safeguarding, consent or retention. Similar to Quebec's law 25, the law should define “sensitive” and provide non-exhaustive examples of sensitive data so that businesses, regulators and courts will have more guidance in applying the legislative framework.

Second, I recommend that you consider revising the law—as an amendment or regulation—in order to align the CPPA with leading jurisdictions, namely the age-appropriate design code legislation in the U.K. and California. Both of these demonstrate a more prescriptive approach to regulating the personal information of children.

The California kids code requires businesses to prioritize the privacy of children by default and in the design of their products. For example, default settings on apps and platforms for users under 18 must be set to the highest privacy level. This is something that could be considered in the CPPA as well.

Further, the California code establishes a level of fiduciary care for platforms such that, if a conflict of interest arises between what is best for the platform and what is best for a user under 18, the children's best interest must come first. This is consistent with the recommendation of former commissioner Therrien and others in these hearings about including language around the “best interest of the child” in the legislation.

The CPPA should contemplate requirements for how businesses use children's data, considering the child's best interest. For example, use of children's data could be limited to those actions necessary to provide an age-appropriate service.

As I argued in my Policy Options article in January 2023, we need a collaborative approach that includes lawmakers and policy-makers from all levels of government, coordination with global privacy laws, engagement with parents and coordination with educators. For this approach to work, the law needs to strike the balance between privacy and innovation. We want laws that are flexible enough to last so that technology can evolve, new business ideas can succeed, and children can be innovators while growing up in a world that recognizes their special needs and rights.

3:50 p.m.

The Chair (Joël Lightbound, Liberal)

Thank you very much, Ms. Gordon.

I'll now give the floor to Mr. Guilmain.

3:50 p.m.

Antoine Guilmain (Counsel and Co-Leader, National Cyber Security and Data Protection Practice Group, Gowling WLG, As an Individual)

Mr. Chair, committee members, thank you for inviting me to comment on Bill C‑27.

Although I'll be testifying in English today, I'll answer your questions in either French or English.

I'm co-leader of the national cybersecurity and data protection group at Gowling WLG. I'm a practising lawyer called to the bars of Quebec and Paris. My evidence today represents my own views. I'm here as an individual, not representing my law firm, clients or any third parties.

Much of my legal career has focused on comparative analysis of legal regimes across the globe, advising clients on their compliance obligations in the jurisdictions in which I am qualified to practise.

Bill C-27 presents a tremendous opportunity to modernize Canada's federal privacy regime. It is possible, and indeed essential, that Canada protect the rights and interests of the public while facilitating competition, investment and ambitious innovation.

Many of the proposals in the bill are highly impactful, but I will focus my comments today on the consumer privacy protection act and two areas in particular that I consider to be of great importance. First are lessons learned from Quebec's law 25.

The majority of the provisions under law 25 came into force in September 2023. Over the last summer, Gowling WLG, in collaboration with the Interactive Advertising Bureau of Canada, conducted a readiness survey of over 100 organizations regarding this new law. The results of the survey were clear. Industry was ill-prepared for such an implementation. Specifically, 69% of the respondents expressed a need for greater clarity, and 52% indicated that they lacked sufficient resources. The results also highlighted that the compliance burden for SMEs is especially high.

There are four specific lessons from law 25 that I wish to highlight today.

First, Bill C-27 should not exceed standards set by the EU general data protection regulation. For example, legitimate interest is a flexible legal basis for processing, but it must always be justified and documented in a separate assessment under the GDPR and under other global laws. A similar standard could apply in Bill C-27.

Second, Bill C-27 should not rely on future regulations to substantiate each requirement. This is a recipe for delays and uncertainty. For example, in Quebec, anonymization is currently regarded by the regulator as impossible because the regulations are not yet in place.

Third, Bill C-27's timeline for implementation should be sufficiently long. Based on experience from law 25, implementation should be at least 36 months after the bill becomes law.

Finally, Bill C-27 should be aligned with law 25 on key concepts, including around the legal bases for processing data and legitimate business exceptions. This is especially important when it comes to children's privacy.

I'm a father of two young children, so protecting children in the digital economy is important to me personally, and it's a subject that I engage with regularly in the course of my work. I believe amendments to Bill C-27 are necessary to ensure that minors' data is reasonably, meaningfully and consistently protected.

I wish to highlight four key topics for consideration.

First, unlike the GDPR, Bill C-27 lacks a threshold for determining when services are intended to target children. Practically, organizations will not be able to remain age-blind and will therefore have to ask users' ages each time they engage with them, to the potential detriment of user privacy and data minimization.

Second, alternative legal bases for processing should be available, depending on the maturity of the individual. Specifically, legal capacity, rather than the age of majority alone, should be the baseline for assessing legitimate bases.

Third, the process for collecting parental consent can be extremely complicated. Bill C-27 should set a specific age below which parental consent is required; under 14 years of age seems the most reasonable standard.

Finally, the concept of the best interest of the child should be positioned as a key determinant of how minors' personal information should be treated, rather than relying primarily on the concept of express consent.

With the chair's permission, I would be pleased to submit a copy of the survey report for the committee's consideration, as well as a short written brief in French and English on the issues I've addressed in my opening remarks.

I wish to thank Michael Walsh for his assistance in preparing this material.

Thank you. I look forward to answering the committee's questions.

3:55 p.m.

The Chair (Joël Lightbound, Liberal)

Thank you very much, Mr. Guilmain.

Lastly, I'll now give the floor to Mr. Arbuckle.

3:55 p.m.

Luk Arbuckle (Chief Methodologist and Privacy Officer, IQVIA Solutions Canada Inc.)

Thank you.

I'm very pleased to have been invited to participate in the work of the House of Commons Standing Committee on Industry and Technology on Bill C‑27. I hope to be able to answer your questions on privacy and artificial intelligence services and technologies.

Although my opening remarks will be in English, please know that I will be pleased to answer your questions in either French or English.

My name is Luk Arbuckle. I am chief methodologist and privacy officer at Privacy Analytics, an Ottawa-based IQVIA company employing over 100 privacy experts.

My role at Privacy Analytics is to ensure that our company and our global clients are aligned on the practical applications of privacy-enhancing technologies and to inform our practices based on current guidance and emerging methods. I also provide guidance on the practice and risks of applying artificial intelligence in real-life applications. My role has been largely informed by my time as director of technology analysis at the Office of the Privacy Commissioner of Canada, when I also drafted guidance on anonymization for the office.

Privacy Analytics operates as an independent entity within the global IQVIA group of companies, so that we can provide both IQVIA and our global clients with services and technology for the safe and responsible use and sharing of data. The Privacy Analytics platform has been deployed globally to protect the privacy of close to one billion patients. For example, our software has enabled safe research that improves cancer outcomes for patients through the European Oncology Evidence Network and the American Society of Clinical Oncology's CancerLinQ. We have also worked with multiple government agencies in Canada, Europe, the United States and globally to implement safe data access models that enable faster data access, promote research and innovation and implement data-driven decision-making.

It is against this backdrop that I wish to provide comments today. In particular, I will provide a perspective on the importance of health data and analytics for Canadians. Health care-related research is increasingly driven by analyses that draw from real-world evidence to reveal the effectiveness of treatments beyond the clinical trial phase. The success of that approach is predicated on the availability of the necessary data from various sources within the relevant health care system and on the ability to analyze data across different health care systems.

For Canada to take part in this new frontier of health care research, it is important that we prioritize a responsible data access model that strikes the appropriate balance between privacy and having useful data for the intended purposes. We also need a data protection framework that allows for efficient and effective data sharing and collaboration with stakeholders from all over the world, including the United States and Europe. As COVID-19 has shown, it is crucial that Canada stays active and competitive in life sciences. This means developing an approach to privacy that supports local research and innovation and allows health care research in Canada to align with efforts outside of the country.

I will only summarize three recommendations in my introductory remarks and invite you to consult IQVIA's full-length comment document on Bill C-27 for additional comments and details.

Recommendation one is to consider a reasonableness component within the definition of “anonymize”. The use of anonymized data in health care analytics is a key element in the research and innovation activities that help drive Canada's health care future. Canada's diverse group of health care stakeholders use anonymized information to identify inefficiencies and allocate resources more effectively, to speed up the development and approval of new treatments and to understand the needs of patients and health care professionals. Such uses of anonymized information contribute to better health outcomes and other notable benefits.

Including a reasonableness component within the bill's definition of anonymization would align better with other Canadian frameworks, such as Quebec's law 25 and Ontario's PHIPA. A reasonableness approach would also align better with the growing consensus in the academic and technical literature regarding the need for a realistic framing of risk in describing anonymized information. Take, for example, the risk-based international standard for an anonymization framework, technically known as ISO 27559. This technical standard was developed by experts from around the world and is consistent with the draft guidance I produced while at the OPC.

Recommendation two is to consider expanding the consent exception for “socially beneficial purposes” to include private sector organizations. A more principled approach would be to enable responsible data sharing between a broader range of actors while also mandating adequate oversight and data protection best practices.

Recommendation three is to consider a consent exception for external research, analysis and development purposes. Removing the “internal” qualifier would be a more beneficial approach, as it aligns with existing guidance and would enable a more useful model for health care research and innovation.

With that, I would like to thank the committee again for your time and for the opportunity to speak with you today. I strongly believe that it is possible to safely and responsibly use and share data in ways that protect privacy while driving innovation for the benefit of Canadians. I look forward to the continued discussions.

I will remain at your disposal during the discussion.

Thank you for your attention.

4 p.m.

The Chair (Joël Lightbound, Liberal)

Thank you very much to all of you.

To start the discussion, I will now yield the floor to Mr. Vis for six minutes.

4 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Thank you, Chair.

Ms. Gordon, thank you so much for your comments. Thank you, all. All of the testimony today was amazing.

Ms. Gordon, you mentioned your three kids, and that's sort of what's driving me, with my three children as well, in the work we're doing to make sure this bill is done right and children's privacy is protected.

I've asked other witnesses questions on proposed section 9 of the bill, on privacy management programs. In some cases, what I'm hearing from your testimony is that in addition to having management programs, especially in relation to children, we need to be prescriptive in some aspects of the bill.

Would you support amendments to proposed section 9 or other additions to the legislation that are prescriptive, specifically in the case of children? Maybe proposed section 9 in its current form could apply broadly to privacy concerns, providing protections and making sure that businesses are providing those protections in the products they're producing, but what do we need to do specifically with respect to privacy management programs as they relate to children?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

As I said in my remarks, I do believe the law can be more prescriptive in terms of making sure we get it right for children. That is something we've seen in other jurisdictions. They've done it separately in a different law with a specific children's code. I think we can make certain amendments to the law or have a separate regulation.

Part of the problem with a regulation, as Mr. Guilmain said, is that sometimes it takes forever to get there and to get that guidance, so it's not always best to leave it to regulation, but that is one way of doing it. I do believe there are ways of adding amendments and making them more prescriptive so we can get it right in the law.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Instead of privacy management programs, then, would you support privacy by design, requiring businesses to create products that are designed to have privacy, say, for children as the first and foremost priority?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

I don't think they're two separate things. I think you can have privacy by design in a privacy management program.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Thank you. That's what I was looking for, actually. That's very helpful.

How would you define sensitive information?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

I don't have a specific definition, but I do believe we should be able to give a non-exhaustive list of examples, similar to what we've done in law 25 in Quebec.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Can you give us some examples of a non-exhaustive list? What would be included in that?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

Sure. Some examples we've seen are biometric data, health data, financial data and children's data. Those are the ones off the top of my head, but I could certainly submit some more examples.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

If we apply a non-exhaustive list, is there a risk of leaving out future technologies and allowing companies to break the spirit of the law because a certain form of private or sensitive information isn't included?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

That's definitely a risk, but I don't think that's a reason not to do it right now. PIPEDA has worked really well for 20 years. There's always going to be modernization of laws, but I think it's important to try to get that list now, with the acknowledgement that there is the possibility of updating it as things change.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Okay.

The U.K. model for privacy includes, as I believe the law in California does, certain thresholds. I believe the spirit of the law implies that children at different stages are able to make different types of decisions.

How would you see that type of prescriptive language being included in Bill C-27, which is before us today?

4:05 p.m.

Michelle Gordon (Lawyer and Founder, GEM Privacy Consulting, As an Individual)

I don't have any specific comments right now on prescriptive language. Again, that's something I'd have to go back to.

I know that's something my colleague Mr. Guilmain also referred to, so maybe he has comments on that.

4:05 p.m.

Brad Vis (Conservative, Mission—Matsqui—Fraser Canyon, BC)

Would you like to comment, sir?

4:05 p.m.

Antoine Guilmain (Counsel and Co-Leader, National Cyber Security and Data Protection Practice Group, Gowling WLG, As an Individual)

Yes. I would like to take a look at what is being done in Europe at the moment. Of course, the GDPR has been a change for them, and children's privacy was a key consideration.

Because it's extremely difficult to provide prescriptive requirements, they instead relied heavily on the notion of “best interest of the child”, and this is working. The reason is that it's not a free pass saying, “You know what? We talk about children but there's nothing to be done.” Organizations need to have documentation regarding what they are doing.

There's a second aspect I would like to highlight. Of course, being a lawyer, I like clear definitions, but the notion of evolving concepts such as “best interest of the child” is important as well in laws, as opposed to very clear-cut concepts such as express consent, where essentially we would need to have express consent every time, regardless of whether the person is a teenager or under 13 years old.

I think this is what we are seeing across the world, because many Parliaments understand that we are living in a world that is constantly evolving and that we need future-proof concepts, and the “best interest of the child” is not moot. It's a good concept. At least that's my personal opinion.