Evidence of meeting #92 for Industry and Technology in the 44th Parliament, 1st Session, held on October 26, 2023. (The original version is on Parliament's site, as are the minutes.)

A recording is available from Parliament.

Witnesses

Colin Bennett  Professor, Political Science, University of Victoria, As an Individual
Michael Geist  Professor of Law, Canada Research Chair in Internet and e-Commerce Law, Faculty of Law, University of Ottawa, As an Individual
Vivek Krishnamurthy  Associate Professor of Law, University of Colorado Law School, As an Individual
Brenda McPhail  Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual
Teresa Scassa  Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

3:45 p.m.

Liberal

The Chair Liberal Joël Lightbound

Good afternoon, everyone. I call this meeting to order.

Welcome to meeting no. 92 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the Standing Orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I'd like to welcome our witnesses today and also apologize for the brief delay caused by a vote in the House.

Joining us today are Colin J. Bennett, professor; Dr. Michael Geist, professor of law and Canada research chair in Internet and e‑commerce law; Vivek Krishnamurthy, associate professor of law at the University of Colorado Law School; Dr. Brenda McPhail, acting executive director of the public policy in digital society program at McMaster University; and lastly, Teresa Scassa, Canada research chair in information law and policy, Faculty of Law, Common Law Section, University of Ottawa.

I'd like to welcome you all.

We'll begin the discussion without further ado.

Mr. Bennett has the floor for five minutes.

3:45 p.m.

Prof. Colin Bennett Professor, Political Science, University of Victoria, As an Individual

Thank you very much, Mr. Chair.

I'm from the University of Victoria, although I'm currently in Australia. I wish everybody a good day.

I would like to emphasize five specific areas for reform of the CPPA and to suggest ways in which the bill might be brought into better alignment with Quebec's law 25. I don't think that Bill C-27 should be allowed to undermine Quebec law, and in some respects, it does. I also think these are some of the areas where the bill will be vulnerable when the European Commission comes to evaluate whether Canadian law continues to provide an adequate level of protection.

Some of these recommendations are taken from the report that you have from the Centre for Digital Rights, which I'd like to commend to you.

First, I believe that CPPA's proposed section 15, on consent, is confusing to both consumers and businesses. In particular, I question the continued reliance on “implied consent” in proposed subsection 15(5), which states, “Consent must be expressly obtained unless...it is appropriate to rely on an individual's implied consent”.

The bill enumerates those business activities for which consent is not required, including if “the organization has a legitimate interest that outweighs any potential adverse effect on the individual”. That's a standard that has been imported from the GDPR. However, in the GDPR, “consent” means express consent; it's “freely given, specific, informed and unambiguous”.

In the current version of the CPPA, businesses can have it both ways. They can declare that they have implied consent because of a consumer's alleged inaction in the past, such as not reading the legalese in a complex terms-of-service agreement, or they can assert a “legitimate interest” in the personal data by claiming that there is no “potential adverse effect on the individual”. That is a risk assessment performed by the company rather than a judgment made about the human rights of individuals to control their personal information.

In that respect, it's really important that the bill be brought within a human rights framework. There should be no room for implied consent in this legislation. It's a dated idea that creates confusion for both consumers and businesses.

Second, there is no section in the CPPA on international data transfers. I find that very odd. I know of no other modern privacy law that fails to give businesses proper guidance on what they have to do if they want to process personal data offshore. The only requirement is for the organization to require the service provider, “by contract or otherwise,” to ensure “a level of protection of the personal information equivalent to that which the organization is required to provide under this Act.” That's proposed subsection 11(1) of the CPPA.

That due diligence applies whether the business is transferring personal data to another province in Canada or overseas to a country that may or may not have strong privacy protection or, indeed, a record of the protection of human rights. That's particularly troubling because of proposed section 19 of the CPPA, which reads, “An organization may transfer an individual's personal information to a service provider without their knowledge or consent.”

The Canadian government has never gotten into the business of adopting a safe harbour approach or a white list, and I'm not recommending that. However, Quebec, I believe, has legislated an appropriate compromise under section 17 of law 25, which requires businesses to do an assessment, including of the legal framework, when sending personal data outside of Quebec. As many businesses will have to comply with the Quebec legislation, why not mirror that provision in Bill C-27?

Third, the bill ignores important accountability mechanisms that were pioneered in Canada and exported to other jurisdictions, including Europe. Therefore, it's very strange that those same measures do not appear in the CPPA. In particular, privacy impact assessments are an established instrument and a critical component of accountable personal data governance, and they should be required in advance of product or service development, particularly where invasive technologies and business models are being applied, where minors are involved, where sensitive personal information is being collected, or where the processing is likely to result in a high risk to an individual's rights and freedoms. Businesses do the PIAs, and they stand ready to demonstrate their compliance or their accountability to the regulator.

A fourth and related problem is the absence of any definition of sensitive forms of personal data. The word “sensitivity” appears in several provisions of the bill, but with the exception of the specification about data on minors, it is nowhere defined. In my view, the bill should define what “sensitive information” means, and it should also enumerate a non-exhaustive list of categories, as, in fact, many other privacy laws do.

Finally—I know you've heard about this in the past, and I've researched this—the absence of proper privacy standards for federal political parties is unjustifiable and untenable. The government is relying on the argument that the federal political parties' privacy practices are regulated under the Canada Elections Act, but those provisions are nowhere near as strong as those in Bill C-27. I think businesses resent the fact that parties are exempted. This is not an issue that will go away, given advances in technology and its use in modern digital campaigning. Canada is one of the few countries in the world in which political parties are not covered by applicable privacy law.

Thank you so much.

3:50 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Bennett.

I'll now give the floor to Professor Geist, who is here with us in Ottawa.

3:50 p.m.

Dr. Michael Geist Professor of Law, Canada Research Chair in Internet and e-Commerce Law, Faculty of Law, University of Ottawa, As an Individual

Thank you very much, Chair.

Good afternoon. As you heard, my name is Michael Geist. I am a law professor at the University of Ottawa, where I hold the Canada research chair in Internet and e-commerce law and am a member of the Centre for Law, Technology and Society. I appear in a personal capacity representing only my own views.

I’d like to start by noting that the very first time I appeared before a House of Commons committee was in March 1999 on Bill C-54, which would later become PIPEDA. I must admit that I don't think I really knew what I was doing at that appearance, but my focus at the time was on whether or not the law would provide sufficient privacy protections for those just coming online who had little background or knowledge of privacy or security, or even the Internet, for that matter.

I highlighted some of the shortcomings of the bill, including poorly defined consent standards that would lead to overreliance on implied consent, broad exceptions on the use or disclosure of personal information and doubts about enforcement. I urged the committee to strengthen the bill, but I have to say that I did not fully appreciate that the policy choices being made back then would last for decades.

I start with this brief trip down memory lane because I feel that we find ourselves in a similar position today, with policy choices on things like artificial intelligence and emerging technologies that will similarly last for far longer than we might care to admit.

It is for that reason that I think it is important to emphasize the need to get it right rather than to get it fast. I often hear the minister talk about being first, at least on AI, and I must admit that I don’t understand why that is a key objective. Indeed, if you leave aside the fact that the core of at least the privacy part of this bill was introduced in 2020 and languished for years, we now find ourselves in a race to conduct hearings that I don’t totally get. We have an AI bill where there is a major overhaul with no actual text available yet. Witnesses seemingly have to pick between privacy and AI, creating the risk of limited analysis all around.

I think we need to do better. I’ll focus these remarks on privacy, but to be clear, the AI bill and the proposed changes raise a host of concerns, including the need for independent enforcement and the high-impact definitions that puzzlingly include search and social media algorithms.

The other lesson from the past two decades is that you can seek to create a balanced statute—I know there's been a lot of talk about balance—but the playing field will never be balanced. It's always tilted in favour of businesses, many of which have the resources and expertise to challenge the law, challenge complaints and challenge the Privacy Commissioner. Most Canadians don’t stand a chance. That’s why we must craft rules that seek to balance the playing field, too, with broad scope of coverage, better oversight and audit mechanisms, and tough penalties to ensure that the incentives align with better privacy protection.

How do you do that? Given my limited time, I have five quick ideas.

First, to pick up where Professor Bennett ended, we must end the practice of “do what I say, not what I do” when it comes to privacy. I think it's unacceptable in 2023 for political parties to exempt themselves from the standards they expect all businesses to follow. Indeed, you can't credibly argue that privacy is a fundamental right and then claim that it should not apply in a robust manner to political parties.

Second, the addition of language around the fundamental right to privacy is welcome, but I think it should also be embedded elsewhere so that it factors more directly into the application of the law. For example, as former commissioner Therrien noted, it could be included in proposed subsection 12(2) among the factors to consider in an “appropriate purposes” test.

Third, the past 20 years have definitely demonstrated that penalties matter for compliance purposes and are a critical part of the balance. The bill features some odd exclusions. There are penalties for elements of the appropriate purposes provision in proposed section 12, but not for the main provision limiting collection, use and disclosure to appropriate purposes.

In the crucial proposed section 15 provision on consent, there are no penalties around the timing of consent or for using implied consent within the legitimate interest exception. The bill says such a practice “is not appropriate”, whatever that means; it is an odd turn of phrase in a piece of legislation. Regardless, the penalty provision doesn't apply.

Fourth, the committee has already heard debate about the appropriate standard of anonymized data. I get the pressure to align with other statutes. I’d note that proposed subsection 6(6) specifically excludes anonymized data from the act, and yet I think we want to ensure that the commissioner can play a data governance role here with potential audits or review, particularly if a lower standard is adopted.

Finally, fifth, provided we ensure that the privacy tribunal is regarded as an expert tribunal that will be granted deference by the courts, I’m okay with creating another layer of privacy governance. I appreciate the concerns that this may lengthen the timeline for resolution of cases, but the metric that counts is not how fast the Privacy Commissioner can address an issue but how fast a complainant can get a binding final outcome. Given the risks of appeals and courts treating cases on a de novo basis, existing timelines can go far beyond the commissioner's decision, and the tribunal might actually help.

Thanks for your attention. I look forward to your questions.

3:55 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much, Mr. Geist.

We'll now turn to Professor Vivek Krishnamurthy. The floor is yours.

4 p.m.

Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Thank you, Mr. Chair and members of the committee. I am very honoured to be speaking with you today regarding Bill C-27.

I am currently a professor of law at the University of Colorado, but when I was the director of CIPPIC at the University of Ottawa, we published two reports in the spring of 2023 that consider AIDA and the CPPA. I am going to focus my remarks on the CPPA, particularly on provisions that relate to the privacy of minors. I would be happy to share some of my thoughts around AIDA as well.

I would like to begin by saying that I agree with everything that Professor Bennett and Professor Geist said. You could treat these remarks as additive.

While it is very welcome that the CPPA, unlike PIPEDA, specifically provides that the personal information of minors is sensitive information, Professor Bennett has already told you that “sensitive information” is not a defined term in the legislation. It is positive that children would have—if this bill passes into law—some recognition of the importance of protecting their personal information to a higher standard. However, we believe that this legislation can do far better.

For context, it is important to realize that children spend increasing amounts of time online, at younger and younger ages. This is a trend that accelerated during COVID-19 and the transition to digital online learning. I am a parent, and I am sure many of you are parents. Our children are using devices under commercial terms of service all the time, and this poses a very significant risk to the privacy rights of children.

While COVID has receded, it's the new reality that kids are using more and more technology at younger ages. What can we do? There are three things, and then a fourth about jurisdictional competence.

The Privacy Commissioner, in his recommendations regarding the CPPA, suggested that “best interests of the child” language should be incorporated into the law, and he suggested doing that in the preamble. I take no position myself as to where that should be done, but it is clear that this is international best practice. The United Kingdom and California have both incorporated such language into recently enacted statutes, and we think that Canada should follow this approach. What would that mean? It means that organizations that handle children's personal data must take the best interests of children into account. That must come ahead of their commercial interests.

Second, we think it is important for the CPPA to require organizations that develop products or services that are likely to be accessed by children to set their privacy settings to the highest level. Defaults play a really important role in our subjective experience of privacy. It is great to have rights, but you can very easily leave those rights on the table if a default setting contracts you out of them. We think that requiring a company to set those defaults to high levels when children are its likely or known users is very important.

Third, I'd like to pick up on what Professor Bennett told you about data protection impact assessments, a made-in-Canada idea. Bill C-27 is extremely weak when it comes to data protection impact assessments. The provisions apply only when the legitimate interest exception to consent is being used. This is a problem for everyone, especially for children.

We believe—and I specifically believe this personally—that the data protection impact assessment requirements of this bill need to be considerably strengthened whenever data-processing activities pose a high risk to the privacy rights of Canadians. I would say that if children's data is sensitive data, that means we basically need to do that impact assessment all the time.

Last, I'd like to talk about constitutional competence here. There may be some concerns that it may be beyond federal competence to protect the privacy rights of children with more expansive provisions. Our analysis suggests otherwise. CPPA, like PIPEDA before it, is being enacted under Parliament's power to regulate trade and commerce.

Now, it is true that in our federal system, provincial governments get to determine the age of majority, but there is plenty of federal legislation that is designed to protect the rights of children. This also leads to how we think of this law, the consumer privacy protection act. It's not just a form of privacy regulation; it's also, when you think about it, a form of consumer protection legislation that is regulating the safety of digital products that invade and interfere with our right to privacy.

In view of the long history of federal regulation directed at protecting children in the marketplace, we think it would be appropriate for the federal government to include stronger privacy protections, and that would not prejudice provincial laws, like Quebec's, that are stronger. Just as PIPEDA yields to provincial legislation when it's substantially equivalent or better, the same could be true of strengthened children's privacy protections in the new CPPA.

Thank you very much.

4:05 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

I will now turn the floor over to Dr. Brenda McPhail.

4:05 p.m.

Dr. Brenda McPhail Acting Executive Director, Master of Public Policy in Digital Society Program, McMaster University, As an Individual

Thank you, Mr. Chair and members of the committee, for inviting me here today to speak to the submission authored by Jane Bailey, professor at the faculty of law of the University of Ottawa; Jacquelyn Burkell, professor at the faculty of information and media studies at Western University; and myself, currently the acting executive director of the master of public policy in digital society program at McMaster University.

It is a privilege to appear before you on this omnibus bill, which needs significant improvement to protect people in the face of emerging data-hungry technologies.

I will focus on part 1 and very briefly on part 3 of the bill in these initial remarks, and I welcome questions on both.

Privacy, of course, is a fundamental underpinning of our democratic society, but it is also a gateway right that enables or reinforces other rights, including equality rights. Our written submission explicitly focuses on the connection between privacy and equality, because strong, effective privacy laws help prevent excessive and discriminatory uses of data.

We identified eight areas where the CPPA falls short. In these remarks, I will focus on four.

First of all, privacy must be recognized as a fundamental human right. Like others on this panel, we welcome the amendment suggested by Minister Champagne, but we would note that proposed section 12 in particular also requires amendment so that the analysis to determine whether information is collected or used for an appropriate purpose is grounded in that right.

Second, Bill C-27 offers a significant improvement over PIPEDA in explicitly bringing de-identified information into the scope of the law, but it has diminished the definition from the predecessor bill, Bill C-11, by removing the mention of indirect identifiers. The bill also introduces a new category, anonymized information, which is deemed out of the scope of the act, in contrast to the superior approach taken by Quebec. Given that even effective anonymization of personal data fails to address the concerns about social sorting that sit at the junction of privacy and equality, all data derived from personal information, whether identifiable, de-identified or anonymized, should be subject to proportionate oversight by the OPC, simply to ensure that it's done right.

Third, proposed subsection 12(4) weakens requirements for purpose specification. It allows information collected for one purpose by organizations to be used for something else simply by recording that new purpose any time after the initial collection. How often have you shared information with a business and then gone back a year later to see if it had changed its mind about how it's going to use it? At a minimum, the bill needs constraints that limit new uses to purposes consistent with the original consensual purpose.

Finally, the CPPA adds a series of exceptions to consent. I'll focus here on the worst, the legitimate interest exception in proposed subsection 18(3), which, unlike my colleagues, I believe should be struck from the bill. It is a dangerously permissive exception that allows collection without knowledge or consent if the organization that wants the information decides that its mere interest outweighs adverse impacts on an individual.

This essentially allows collection for organizational purposes that need not provide any benefit to the customer. Keeping in mind that the CPPA is the bill that turns the tap for the AIDA on or off, this exception opens the tap and then takes away the handle. Here, I would commend to you the concerns of the Right2YourFace coalition, which flags this exception as one under which organizations may attempt to justify and hide their use of invasive facial recognition technology.

Turning to part 3 of Bill C-27, the AIDA received virtually no public consultation prior to being included in Bill C-27, and that lack of feedback has resulted in a bill that is fundamentally underdeveloped and prioritizes commercial over public interests. The bill, by focusing only on high-impact systems, leaves systems that fail to meet the threshold unregulated. AI can impact equality in nuanced ways not limited to systems that may be obviously high-impact, and we need an act that is flexible enough to also address bias in those systems in a proportionate manner.

A recommender system is mundane these days, yet it can affect whether we view the world with tolerance or prejudice from our filter bubble. Election time comes to mind as a time when that cumulative impact could change our society. Maybe that should be in, and maybe it should be out. We just haven't had the public conversation to work through the range of risks, and it's a disservice to Canadians that we're reduced to talking about amendments to a bad bill in the absence of a shared understanding of the full scope of what it needs to do and what it should not do.

Practically, we nonetheless make specific recommendations in our brief to include law enforcement agencies in scope, to create independent oversight and to amend the definitions of harm and bias. We further support the recommendations submitted by the Women's Legal Education & Action Fund.

I would be very happy to address all of these recommendations during the question period.

Thank you.

4:10 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you very much.

For the final opening statement, Teresa Scassa has the floor for five minutes.

4:10 p.m.

Dr. Teresa Scassa Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

Thank you very much, Mr. Chair, for the invitation to address this committee.

I am a law professor at the University of Ottawa, where I hold the Canada research chair in information law and policy. I'm appearing today in my personal capacity.

I have concerns about both the CPPA and the AIDA. Many—

4:10 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Mr. Chair, I must intervene to advise you that we have no interpretation.

4:10 p.m.

Liberal

The Chair Liberal Joël Lightbound

Okay.

Wait just one minute, Ms. Scassa.

We'll make sure the interpretation is working.

4:10 p.m.

Bloc

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

It's working now.

Thank you very much.

4:10 p.m.

Liberal

The Chair Liberal Joël Lightbound

Amazing. You can now resume.

4:10 p.m.

Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

Dr. Teresa Scassa

Thank you.

I have concerns about both the CPPA and the AIDA. Many of these have been communicated in my own writings and in the report submitted to this committee by the Centre for Digital Rights. My comments today focus on the consumer privacy protection act. I note, however, that I have very substantial concerns about the AI and data act, and I would be happy to answer questions on that, as well.

Let me begin by stating that I am generally supportive of the recommendations of Commissioner Dufresne for the amendment of Bill C‑27, as set out in his letter of April 26, 2023, to the chair of this committee.

I will address three other points.

The minister has chosen to retain consent as the backbone of the CPPA, with specific exceptions to consent. One of the most significant of these is the “legitimate interest” exception in proposed subsection 18(3). This allows organizations to collect or use personal information without knowledge or consent if it is for an activity in which an organization has a legitimate interest. There are guardrails: The interest must outweigh any adverse effects on the individual; it must be one that a reasonable person would expect; and the information must not be collected or used to influence the behaviour or decisions of the individual. There are also additional documentation and mitigation requirements.

The problem lies in the continuing presence of “implied consent” in proposed subsection 15(5) of the CPPA. PIPEDA allowed for implied consent because there were circumstances where it made sense and there was no legitimate interest exception. However, in the CPPA, the legitimate interest exception does the work of implied consent. Leaving implied consent in the legislation provides a way to get around the guardrails in proposed subsection 18(3): an organization can opt for the implied consent route instead of legitimate interest. It will also create confusion for organizations that might struggle to understand which is the appropriate approach. The solution is simple: Get rid of implied consent. I note that implied consent is not a basis for processing under the GDPR. Consent must be express, or processing must fall under another permitted ground.

My second point relates to proposed section 39 of the CPPA: an exception to an individual's knowledge and consent where information is disclosed to a potentially very broad range of entities for “socially beneficial purposes”. Such information need only be de-identified—not anonymized—making it more vulnerable to re-identification. I question whether there is social licence for sharing de-identified rather than anonymized data for these purposes. I note that proposed section 39 was carried over verbatim from Bill C-11, when “de-identified” was defined to mean what we now understand as anonymized. Permitting disclosure for socially beneficial purposes is a useful idea, but proposed section 39, especially with the shift in meaning of “de-identified”, lacks necessary safeguards.

First, there is no obvious transparency requirement. If we are to learn anything from the ETHI committee's inquiry into PHAC's use of Canadians' mobility data, transparency is fundamentally important. At the very least, there should be a requirement that written notice of data sharing for socially beneficial purposes be given to the Privacy Commissioner of Canada. Ideally, there should also be a requirement for public notice. Further, proposed section 39 should provide that any sharing be subject to a data-sharing agreement, which should also be provided to the Privacy Commissioner. None of this is too much to ask where Canadians' data are conscripted for public purposes. Failure to ensure transparency and a basic measure of oversight will undermine trust and legitimacy.

My third point relates to the exception to knowledge and consent for publicly available personal information. Bill C-27 reproduces PIPEDA's provision on publicly available personal information, providing in proposed section 51 that “An organization may collect, use or disclose an individual's personal information without their knowledge or consent if the personal information is publicly available and is specified by the regulations.” We have seen the consequences of data scraping from social media platforms in the case of Clearview AI, which used scraped photographs to build a massive facial recognition database. The Privacy Commissioner takes the position that personal information on social media platforms does not fall within the “publicly available personal information” exception.

Not only could this approach be upended in the future by the new personal information and data protection tribunal, but it could also easily be modified by new regulations. Recognizing the importance of proposed section 51, former Commissioner Therrien recommended amending it to add that the publicly available personal information be “such that the individual would have no reasonable expectation of privacy.” An alternative is to incorporate the text of the current regulations specifying publicly available information into the CPPA, revising them to clarify scope and application in our current data environment. I would be happy to provide some sample language.

This issue should not be left to regulations. The amount of publicly available personal information online is staggering, and it is easily susceptible to scraping and misuse. It should be clear and explicit in the law that personal data cannot be harvested from the Internet, except in limited circumstances set out in the statute.

Finally, I add my voice to those of so many others in saying that data protection obligations set out in the CPPA should apply to political parties. It is unacceptable that they do not.

Thank you.

4:15 p.m.

Liberal

The Chair Liberal Joël Lightbound

Thank you, Professor Scassa.

Before I open the discussion to Mr. Williams, I think I speak on behalf of the committee in asking you, if you could, to provide the committee with that language to improve proposed section 51. It would be much appreciated.

4:15 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

Great.

4:15 p.m.

Liberal

The Chair Liberal Joël Lightbound

Then Mr. Perkins won't have to ask for it.

4:15 p.m.

Conservative

Rick Perkins Conservative South Shore—St. Margarets, NS

That will save four minutes.

4:15 p.m.

Liberal

The Chair Liberal Joël Lightbound

Mr. Williams, the floor is yours.

4:15 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Thank you, Mr. Chair.

After eight long years, we finally have privacy legislation in front of this committee. Of course, we've heard from witnesses that it's actually been 24 years since our privacy legislation was last updated.

When we looked at this at second reading in the House, we really focused on what was missing in this bill. What was missing was listing privacy as a fundamental right. However, when we came to committee and we had witnesses lined up, the minister added a bunch of amendments. The amendments seemed to indicate that he was listening. Of course, we're not sure where we are, because amendments will go in certain parts of the bill.

Mr. Geist, thank you for appearing today. I understand you were part of the original iteration of this bill, PIPEDA, 24 years ago. You don't look that old, sir.

The minister came and presented a bill and did not list privacy as a fundamental right, and now there are all these amendments. Did the minister break this bill?

4:15 p.m.

Professor of Law, Canada Research Chair in Internet and e-Commerce Law, Faculty of Law, University of Ottawa, As an Individual

Dr. Michael Geist

When the minister first came to committee and suggested a whole raft of changes, and then indicated that the government was not prepared to provide the actual text of those amendments until clause-by-clause, to me, that broke the hearings. It broke them for my fellow witnesses and for the many witnesses to come. We follow closely.... The idea that you can come before a committee and comment intelligently when you don't have the actual text of the legislation means to me that everybody's time is being wasted a bit, because you're basically commenting on an old bill, rather than where things are headed.

I'm glad there are now some amendments there, but obviously we're carrying on. We don't have the specific language around the AIDA. Also, as I mentioned in my opening remarks, even around the issue of the fundamental right to privacy, I think we can still do better.

4:15 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

For the record, sir, do you believe that privacy should be listed as a fundamental right in the first two sections of this bill?

4:15 p.m.

Professor of Law, Canada Research Chair in Internet and e-Commerce Law, Faculty of Law, University of Ottawa, As an Individual

Dr. Michael Geist

I do. I think that would provide clarity in how it is interpreted by the commissioner, obviously, as well as by the courts, and it would provide a strong signal from the legislative branch of the importance it accrues to privacy.

However, as I mentioned, in many respects, I'd love to see this in some core provisions that are ultimately going to serve as a testing ground when there's analysis, when you make the determination, for example, of whether consent is appropriate or whether it is for an appropriate purpose. That's when you can begin to bring in privacy as a fundamental right, because at that stage you're engaged in some of that balancing, rather than the more overarching review, which, it seems to me, may come at a later date, when a court is reviewing a decision: Did the commissioner take adequate account of the fact that privacy has that elevated status?

4:15 p.m.

Conservative

Ryan Williams Conservative Bay of Quinte, ON

Speaking of how important this is for all sections, you were involved 24 years ago in the first iteration—I'm sorry. I'm going to keep repeating that.

How hard is it, once we enact this legislation, to reverse or update it?