Digital Charter Implementation Act, 2022

An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Sponsor

François-Philippe Champagne, Minister of Innovation, Science and Industry

Status

In committee (House), as of April 24, 2023


Summary

This is from the published bill. The Library of Parliament has also written a full legislative summary of the bill.

Part 1 enacts the Consumer Privacy Protection Act to govern the protection of personal information of individuals while taking into account the need of organizations to collect, use or disclose personal information in the course of commercial activities. In consequence, it repeals Part 1 of the Personal Information Protection and Electronic Documents Act and changes the short title of that Act to the Electronic Documents Act. It also makes consequential and related amendments to other Acts.
Part 2 enacts the Personal Information and Data Protection Tribunal Act, which establishes an administrative tribunal to hear appeals of certain decisions made by the Privacy Commissioner under the Consumer Privacy Protection Act and to impose penalties for the contravention of certain provisions of that Act. It also makes a related amendment to the Administrative Tribunals Support Service of Canada Act.
Part 3 enacts the Artificial Intelligence and Data Act to regulate international and interprovincial trade and commerce in artificial intelligence systems by requiring that certain persons adopt measures to mitigate risks of harm and biased output related to high-impact artificial intelligence systems. That Act provides for public reporting and authorizes the Minister to order the production of records related to artificial intelligence systems. That Act also establishes prohibitions related to the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system and to the making available for use of an artificial intelligence system if its use causes serious harm to individuals.

Elsewhere

All sorts of information on this bill is available at LEGISinfo, an excellent resource from the Library of Parliament. You can also read the full text of the bill.

Votes

April 24, 2023 Passed 2nd reading of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Francesco Sorbara Liberal Vaughan—Woodbridge, ON

Thank you, Mr. Chair.

Welcome, everyone.

Bill C-27 is a very important bill for consumers, for individuals and for businesses, both domestically and internationally. One of the things I've been able to glean just from the testimony today is the regulatory alignment that's needed between us and other jurisdictions, and also that we in Canada benefit, sometimes, from what's called a fiscal federation. Sometimes the provinces move first, and sometimes we do, but we need to be on the same page due to the importance of the material here.

This is for the Canadian Bankers Association.

In 2018, I was part of the finance committee when we did the statutory review on money laundering and terrorist financing. “Moving Canada Forward” was a report that we issued in November 2018. You've raised some good things and some potential amendments and so forth with regard to the CPPA in relation to money laundering and terrorist financing. Can you comment on that and add any more colour that you wish to add in that vein?

Ulrike Bahr-Gedalia Senior Director, Digital Economy, Technology and Innovation, Canadian Chamber of Commerce

Good afternoon, everyone.

Yes, indeed, we have received a long list of Bill C-27 recommendations from our members. A detailed brief was submitted to INDU in September and is available on the committee page, just so you're all aware. Please note that our analysis of the bill is ongoing as new material becomes available, such as the eight government amendments. Therefore, we are working with members to produce additional feedback to complement our earlier submission.

I’d like to take the opportunity to underscore a few key recommendations. First, a core position of the Canadian Chamber of Commerce is that there need to be amendments to better define many of the principles and concepts in Bill C-27 and to harmonize the bill with the norms and standards found in existing provincial and international law. Interoperability is paramount.

Among our recommendations on the CPPA, we are suggesting that the following elements align with Quebec’s law 25: that the term “minor” be defined to include an age, that the definition of “anonymize” be in line with industry standards, and that the scope of the private right of action be narrowed. We also want to underscore the importance of legitimate interest exceptions in the current bill.

On AIDA, we were encouraged to see that government amendments would be forthcoming with respect to defining high-impact systems, creating clearer obligations along the AI value chain, and ensuring alignment with the EU AI Act and comparable frameworks in other advanced economies. We look forward to seeing the text of these amendments so that we can provide more specific feedback.

However, other matters remain unaddressed thus far, such as better defining the use of the term “harm”. Our members have also raised serious concerns around the criminal liability element of AIDA, noting that Canada is the only jurisdiction in the world with such penalties. There is a belief that this provision might discourage businesses developing or deploying AI from setting up operations in Canada or even force some to leave, based on risk assessment.

Finally, in terms of coming into force, it’s important that our businesses, especially SMEs—because small business is big business in Canada—have adequate time to adapt to new environments and requirements. We therefore recommend a phased implementation of CPPA and AIDA over a period of 36 months.

Thank you very much.

Steve Boms Executive Director, Financial Data and Technology Association of North America

Thank you very much, Chair, and good afternoon.

I am the executive director of the Financial Data and Technology Association of North America, or FDATA. We're the leading trade association advocating for consumer-permissioned access to financial data in both Canada and the United States.

Our members include firms with a variety of different business models, which collectively provide more than six million Canadian consumers and SMEs with access to vital financial services and products. Utilizing these products, services and tools, Canadian consumers can, for example, access more competitive banking services, including more affordable credit. They can also use more efficient payment options and benefit from technology to better manage their finances and grow their wealth. Canadian SMEs depend on FDATA North America member companies to manage their accounting and credit needs and more easily send and receive payments.

We are strong advocates of Canada's implementation of an open finance regime, which was first outlined as a government priority in budget 2018. The core idea of open finance is this: A Canadian consumer or SME should be able to safely and securely share access to their data held at one provider with another provider that offers a better financial product, service or tool. Whether it's a chequing, savings, business, brokerage, pension, mortgage, or auto loan account, or data held by a payroll or benefits provider, open finance is the straightforward notion that the customer should have the right to use that data for their own benefit.

Once built, open finance in Canada will put consumers and SMEs in full control of their financial data, facilitating a more transparent and competitive Canadian financial services marketplace that provides safe and secure data portability. The data portability right and data privacy framework included in Bill C-27 are fundamental cornerstones of this modernized approach to financial services.

A survey of Canadians commissioned last year by FDATA North America and Fintechs Canada found that half of Canadians feel stress when interacting with Canada's existing financial services sector and that more than two-thirds believe more competition in the financial services marketplace would lead to greater choice in products and lower financial services fees. Ninety per cent of Canadians indicated that they found fintech products easy to use, with more than 80% reporting they paid lower fees to fintechs than to their banks for similar services or products. Canadians deserve access to these alternatives.

Canada lags behind virtually every other G20 country with regard to open finance, data portability and data privacy. The U.K., Australia, New Zealand, Singapore, Brazil, the European Union and other jurisdictions have all enacted some version of government-led open finance, under which consumers and SMEs have legally binding data access rights and privacy protections afforded to them.

In contrast, today Canadian consumers and SMEs have no legal right to access or share access to their financial data. Unlike in the overwhelming majority of other countries, in Canada a consumer's or SME's bank is empowered to determine whether its customer may share elements of their data with a third party to get a better deal, access a new product or tool or avoid paying exorbitant fees. To the extent that a bank does allow this, it generally imposes onerous and, in some cases, restrictive terms dictating the limitations under which its customers may share their data.

While Canada has taken important steps towards such a regime since budget 2018, significant work remains to reach implementation.

Meanwhile, the rest of the world advances. Earlier this month, the United States formally launched its own open finance regime with a CFPB rule-making. Recognizing that incumbents in the financial services market will not, on their own, deliver a more competitive, customer-centric ecosystem, the director of the CFPB noted in his announcement that the rule will “supercharge competition, improve financial products and services, and discourage junk fees”. Like Bill C-27, the CFPB rule provides data portability rights to consumers and requires firms that access end-users’ data—with their express consent—to abide by strict data privacy and security provisions.

To advance its open finance regulations, the U.S. had an advantage that the Department of Finance and the Department of Innovation, Science and Economic Development currently lack: strong statutory authority to do so. Finance Canada has been studying how to deliver open finance in Canada for the better part of five years. FDATA views enactment of Bill C-27 as a critical element of the transition from open finance ideation to implementation. Once consumer and SME data portability has been enshrined in law, ISED and Finance Canada will have the statutory tools required to finally deliver open finance.

Consumers and SMEs in Canada are being left behind as the rest of the G20 build and deploy open finance frameworks that facilitate competition, enable greater access to and inclusion within the financial services marketplace and provide their citizens with appropriate data protections. The data portability and privacy provisions included in Bill C-27 represent integrally important statutory tools for ISED and Finance Canada that will help Canada catch up.

Thank you. I would be pleased to answer any questions.

Jim Balsillie Founder, Centre for Digital Rights

Chairman Lightbound and honourable members, thank you for the opportunity to share my views on Bill C-27, legislation that will have profound consequences on Canada's economic prosperity, freedom, democracy, consumer protection and child well-being.

The Digital Charter Implementation Act prioritizes the interests of large data monopolies and their ecosystem of traffickers. It sets a dangerous precedent by allowing corporations to allocate to individuals, children and vulnerable groups the harmful economic, political and social consequences of the data-driven economy. It normalizes and expands surveillance, treating human rights as an obstacle to corporate profits.

Bill C-27 requires a wholesale redo, and my written submission includes comprehensive proposed amendments.

At a high level, the foundational flaws of the bill as tabled include the following: one, use of a notice and consent framework, which creates a pseudo-compliance system that enables personal data harvesting and intrusive profiling while spamming users with misleading consent barriers; two, a legitimate business interest carve-out that allows corporations to put the pursuit of profits above the interests of consumers, where businesses are allowed to privately self-determine what constitutes legitimate surveillance and behavioural modification to trample on fundamental rights but are under no obligation to notify consumers how they are tracking and profiling them; three, a diminishment of protections for children and vulnerable persons and an omission of meaningful measures that curtail insidious surveillance and behavioural manipulation practices that are driving the current youth mental health crisis; and four, an artificial intelligence and data act that doesn't include an independent and expert regulator for automated decision systems and excludes the right to contest decisions made with AI, such as insurance, school admissions and credit scoring. AIDA needs to be scrapped completely.

There are many more flawed parts of this legislation, all detailed in my submission.

The recent letter by Minister Champagne indicating willingness to make some unspecified amendments is a woefully inadequate approach to dealing with the serious flaws in this bill. It joins the long list of bad governance practices, which is how we ended up with this untenable bill in the first place.

There has been much gaslighting from industry lobbyists and self-interested parties whose profits depend on mass surveillance, arguing that meaningful AI and privacy regulations limit innovation. Privacy and AI regulations are not impediments to innovation. As innovation economists and digital policy experts have shown, the unique features of the data-driven economy—specifically, data's network effects alongside economies of scope, scale and information asymmetry—mean that the more data a company gathers, the more value it gains from it. Every new dataset makes all pre-existing datasets in the hands of the same few companies more valuable, disproportionately enhancing the power of established data giants and their vested assets. This is why, in less than a decade of the data-driven economy, we have seen the greatest market and wealth concentrations in economic history, a reduced rate of entrepreneurship, innovation and business dynamism, and lower wages.

Properly regulating insidious data collection and trafficking, as other jurisdictions are doing, would not only address concentrated economic power, but also force business to compete on the level of quality and innovation, not surveillance and manipulation, as is currently the case.

I am an entrepreneur, investor, co-founder of the Council of Canadian Innovators, and a vocal advocate for Canadian technological and innovation success in global markets. It's deeply troubling to hear the government talk about advancing Canadian innovation, because earlier this year the government admitted that it has no AI strategy. We are merely funding basic research that principally supports the growth of foreign data monopolies.

This lack of capacity to understand and regulate the digital economy has real consequences, chief among them a steady decline in the standard of living and prosperity for the average Canadian, particularly in Ontario and Quebec, which used to drive our national prosperity. Because Canada is unable to create policies to harness the potential of IP, data and AI, the OECD recently projected that Canada's economy will be the worst-performing advanced economy of 2020-30 and the three decades thereafter.

The choice you have is to adopt Bill C-27, a deeply flawed attempt at privacy regulation, or to create new legislation that builds trust in the digital economy, supports Canadian prosperity and innovation and protects Canadians not only as consumers but as citizens. The choice is a continued erosion of Canadian prosperity, emboldening surveillance and manipulation and deepening the mental health crisis of our youth, or a healthy democracy, long-term prosperity, robust freedoms and the protection of our children.

Thank you.

Siobhán Vipond Executive Vice-President, Canadian Labour Congress

Good afternoon, committee members. It is my honour to be here with you today.

The 55 national and international unions affiliated with the Canadian Labour Congress bring together three million workers in virtually all sectors, industries, occupations and regions of the country. We are grateful for the opportunity to speak to the artificial intelligence and data act, AIDA, enacted by Bill C-27.

Across sectors, industries and occupations, workers in Canada increasingly encounter AI applications in their work and employment. Many report that AI has the potential to improve and enrich their work. In certain instances, AI applications could reduce time and energy spent on routine tasks. This could free workers up to focus on more skill-intensive aspects of their jobs, or on directly serving the public.

However, workers are also concerned about the negative potential consequences for jobs, privacy rights, discrimination and workplace surveillance. Workers are troubled by the potential for displacement and job loss from AI. Workers in creative industries and the performing arts are concerned about control over, and compensation for, their images and work. Workers are concerned about the collection, use and sharing of their personal data. Workers and unions are concerned about the use of AI in hiring, discipline and human resource management functions. Almost every week, we hear from workers who have real-life experience with the impact this is already having on their jobs. AI systems carry serious risks of racial discrimination, gender discrimination, and labour and human rights violations.

The number one demand from Canada's unions is greater transparency, consultation and information sharing around the introduction of AI systems in workplaces and Canadian society. Unfortunately, AIDA falls short in this respect.

Our concerns about AIDA are as follows.

First, unions are troubled by the lack of public debate and broad consultation on regulating AI in Canada. We feel there should have been proper public debate prior to the drafting and introduction of AIDA.

Second, the major deficiency of AIDA is that it exempts government and Crown corporations. The Government of Canada is a leading adopter and promoter of AI. Despite this, AIDA provides no protection for public service workers, whose work and employment are affected by AI systems. Government is responsible for many high-impact AI systems for decision-making—from immigration and benefits claims to policing and military operations. AIDA should be expressly expanded to apply to all federal departments, agencies and Crown corporations, including national security institutions.

Third, the bill only requires measures to prevent harms caused by high-impact systems. It leaves the definition of “high-impact systems” to regulation. As well, it is silent on AI systems that can cause real harms and discrimination despite falling outside the classification of “high-impact”.

Fourth, AIDA contemplates a senior Innovation, Science and Economic Development Canada official acting as the AI and data commissioner. The commissioner should be an independent position. An office tasked with supervision and regulatory oversight should not be housed within the department responsible for promoting the AI industry.

Fifth, while AIDA authorizes the minister to establish an advisory committee, we strongly believe the government must go much further than the current advisory council on artificial intelligence, established in 2019. The advisory council is dominated by industry and academic voices, with no participation from civil society, human rights advocacy organizations, unions and the public. The CLC urges the government to create a permanent representative advisory council that makes recommendations on research needs, regulatory matters, and the administration and enforcement of AIDA.

Finally, the purpose clause of the act should be strengthened. Currently, AIDA is intended in part “to prohibit certain conduct in relation to artificial intelligence systems that may result in serious harm to individuals or harm to their interests.” This should be revised to prohibit conduct that may result in harm to individuals and groups, not just “serious harm”. Currently, AIDA is focused on individual harms, not on societal risks, such as to the environment or Canadian democracy.

In summary, the CLC believes there should be much more institutionalized transparency, information sharing and engagement around AI in the workplace and Canadian society.

Thank you. I welcome any questions the committee may have.

Lorraine Krugel Vice-President, Privacy and Data, Canadian Bankers Association

I would like to thank the committee for the opportunity to speak on Bill C-27, the consumer privacy protection act, or CPPA.

My name is Lorraine Krugel, and I am vice-president of privacy and data for the Canadian Bankers Association. The CBA is the voice of more than 60 banks operating in Canada, employing more than 280,000 Canadians and helping to drive Canada’s economic growth and prosperity.

Banks have long been entrusted with significant amounts of personal information, and privacy and trust are paramount to our banks' customer relationships. As global data flows and technological advances have continued to increase, Canadian banks have been able to responsibly innovate to meet consumer demand for even more convenience, value and simplification. The CPPA reflects a unique, made-in-Canada approach that aims to address the needs of consumers and organizations in our evolving digital world.

We need to get this right. Some of the proposed provisions in the CPPA need to be better tailored for the Canadian context. We are concerned that there is a real risk of significant adverse consequences if the scope of certain provisions is not better defined and necessary exceptions are not included.

In particular, we would like to avoid situations where organizations would be required to provide too much information in order to be transparent. For example, certain transparency provisions could end up replicating the equivalent of consent fatigue or cookie banner fatigue, with no meaningful value to the consumer. Transparency obligations also require appropriate limits so that they cannot be abused or leveraged by criminals to circumvent processes designed to protect against fraud, money laundering or cyber-threats. In addition, we need to take care so that any requirements that are highly complex or operationally onerous would, in fact, address the right underlying risks and policy intent without negatively impacting legitimate operations, product and service delivery or the safeguarding of information.

The CBA is supportive of many of the key foundations of the CPPA. The CPPA is principles-based, scalable and technology-neutral and requires organizations to comply with a collection of interconnected provisions that provide a solid privacy foundation based on accountability, reasonability and proportionality; however, we see the need for targeted amendments in the following key areas: de-identification and anonymization, disposal requests and retention, and automated decision systems.

Relating to consent, we recommend an important technical amendment that will ensure continued alignment with provincial approaches while preserving policy intent and avoiding unintended consequences regarding consent obligations. In addition, we recommend an amendment to the CPPA to legally allow certain organizations to share personal information to combat money laundering and terrorist financing as part of a legislative framework that would be further defined through the Proceeds of Crime (Money Laundering) and Terrorist Financing Act. Done in the right way, such sharing could increase privacy protections for Canadians by reducing unnecessary reporting to the government on low-risk transactions and simultaneously increase the effectiveness of Canada’s anti-money laundering regime through targeted and more effective reporting.

Finally, we believe that a minimum two-year implementation period is necessary to accommodate the scope of change and the development of regulations and guidance associated with the CPPA.

Regarding the artificial intelligence and data act, or AIDA, we are in the process of evaluating the minister’s recent proposals and will be submitting comments and recommendations to the committee when the study focuses on the AI portions of the bill.

We have provided the committee with written comments and recommendations on the CPPA and look forward to your questions.

Thank you.

The Chair Liberal Joël Lightbound

I call this meeting to order.

Good afternoon, everyone.

Welcome to meeting No. 93 of the House of Commons Standing Committee on Industry and Technology.

Today's meeting is taking place in a hybrid format, pursuant to the standing orders.

Pursuant to the order of reference of Monday, April 24, 2023, the committee is resuming consideration of Bill C-27, An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts.

I would like to welcome our many witnesses today and also apologize for the brief delay caused by votes in the House.

Today we welcome, from the Canadian Bankers Association, Lorraine Krugel, who is vice-president, privacy and data.

From the Canadian Labour Congress, we have Siobhán Vipond, who is executive vice-president, and Chris Roberts, director, social and economic policy. From the Centre for Digital Rights, we have its founder, Jim Balsillie. From the Financial Data and Technology Association of North America, Steve Boms is with us via video conference.

From the Canadian Marketing Association, we have Sara Clodman, vice-president, public affairs and thought leadership, and David Elder, head of the privacy and data protection group at Stikeman Elliott LLP. Lastly, we have, from the Canadian Chamber of Commerce, Catherine Fortin LeFaivre, who is vice-president, strategic policy and global partnerships, and Ulrike Bahr-Gedalia, senior director, digital economy, technology and innovation.

So we have a lot of witnesses with us today. Once again, I thank you for being here.

I would also inform my member colleagues that the meeting will adjourn at 6:00 p.m. today. Please bear that in mind.

Without further ado, I give the floor to Ms. Krugel for five minutes.

October 26th, 2023 / 5:40 p.m.



Dr. Teresa Scassa Canada Research Chair in Information Law and Policy, Faculty of Law, Common Law Section, University of Ottawa, As an Individual

I completely agree that there are problems with this provision.

The one I flagged in my opening comments is that it refers to de-identified information. This was taken verbatim from Bill C-11 and put into Bill C-27, but in Bill C-11, “de-identified” was given the definition that is commonly given to anonymized information.

Under Bill C-27, we have two different categories: de-identified and anonymized. Anonymized is the more protected. Now you have a provision that allows de-identified information—which is not anonymized, just de-identified—to be shared, so there has actually been a weakening of proposed section 39 in Bill C-27 from Bill C-11, which shouldn't be the case.

In addition to that, there are no guardrails, as you mentioned, for transparency or for other protections where information is shared for socially beneficial purposes. The ETHI committee held hearings about the PHAC use of mobility data, which is an example of this kind of sharing for socially beneficial purposes.

The purposes may be socially beneficial. They may be justifiable and it may be something we want to do, but unless there is a level of transparency and the potential for some oversight, there isn't going to be trust. I think we risk recreating the same sort of situation where people suddenly discover that their information has been shared with a public sector organization for particular purposes that have been deemed by somebody to be socially beneficial and those people don't know. They haven't been given an option to learn more about it, they haven't been able to opt out and the Privacy Commissioner hasn't been notified or given any opportunity to review.

I think we have to be really careful with proposed section 39, partly because I think it's been transplanted without appropriate changes and partly because it doesn't have the guardrails that are required for that provision.

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

We've had some meaningful discussions. However, I'm wondering whether this committee will really have the will or capacity to move quickly and help get this bill passed. To be honest, I even wonder if the government really wants to get Bill C‑27 passed at this point, in the context of this legislature.

Having said that, I feel like asking you some questions, Dr. McPhail.

In your publications, you put a great deal of emphasis on developing responsible artificial intelligence and transparent governance of artificial intelligence.

Because the rapid development of technology poses significant data security and privacy challenges, what are your thoughts on establishing a technological sandbox that would isolate emerging technologies in a separate environment, with a view to assessing their compliance with privacy standards before they are made available to the public?

October 26th, 2023 / 5:30 p.m.



Vivek Krishnamurthy Associate Professor of Law, University of Colorado Law School, As an Individual

Undoubtedly, the Bill C-27 package of amendments is an improvement over the status quo. I think all of us would acknowledge that. However, I'm not sure we should settle for a C+ bill. I think Canadians deserve A+ privacy protection, and amendments to this bill can get us there.

I think that is the spirit in which all of us who are scholars and activists, and who think about privacy and take a big-picture approach to this, think of it. We understand that private information does need to be collected and processed, but that needs to be done in a way that respects what is a very fundamental human right, one that is becoming more important in our digital age over time, as technology becomes more invasive, and it is important to get that right.

Political oxygen is scarce. Again, you have many priorities, many things to legislate, so if this is our shot, we have to do our very best. I think everyone here today has provided lots of really good ideas, and if this committee would embrace them and enact some amendments, this could be a much better bill.

Viviane LaPointe Liberal Sudbury, ON

Thank you, Mr. Chair.

My question is for Mr. Krishnamurthy.

You wrote an article in May 2022 called “With Great (Computing) Power Comes Great (Human Rights) Responsibility: Cloud Computing and Human Rights”—great title, I might add. In the article, you stated that the human rights impacts of cloud computing have not been studied to nearly the same extent as newer technologies that are powered by the cloud. Can you expand on this in relation to Bill C-27, recognizing privacy as a human right?

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you, Mr. Chair.

Dr. Geist, I'd like us to discuss the data the government collects.

Is this something we should be concerned about? Do people feel that the public and private sectors are equally subject to the provisions of Bill C‑27? Should we feel reassured? Is our data adequately protected, given what the various levels of government do with it?

Brad Vis Conservative Mission—Matsqui—Fraser Canyon, BC

Thank you, Mr. Chair.

Thank you to our witnesses here today.

I'm somewhat concerned about this bad bill before us today.

With Bill C-11, the Government of Canada had an opportunity to enshrine the fundamental right to privacy for children, to define what a minor is, to define perhaps an age of consent and do a whole bunch of stuff to ensure that children were protected. That bill died on the Order Paper.

Then, we had Bill C-27 when this Parliament opened up again. The minister again had an opportunity to enshrine the fundamental right for children to protect their privacy in some of the actions they may take online. Then the government had the opportunity to define what sensitive information is—likely in the context of a child. They had an opportunity to define what a socially beneficial purpose was in the context of a child.

The minister came before us a few weeks ago. He said, “I have this bill. It's going to do so much work to protect children, but we have to amend it.” Then we had to put a motion forward to get a copy of those amendments. We're here today. I am not going to relent on this until we have more clarification and I hear from as many witnesses as possible to ensure that children's rights are protected.

My question is open-ended. I'll start with you, Mr. Geist. What clauses of the bill do you believe need to be amended to ensure that a child's fundamental right to privacy and their online actions are not used in a way that will compromise them as adults, or at a future period of time in their life?

Sébastien Lemire Bloc Abitibi—Témiscamingue, QC

Thank you very much.

With respect to the shortcomings of the Canadian law, in an article entitled “What political parties know about you”, one thing you talk about is the factors affecting how political parties, MPs or independent candidates protect the personal information of Canadians that they may have in their possession. In the current context, Bill C‑27 makes no mention of protection of this kind.

Is the government falling short of protecting voter data and perhaps moving forward in the quest for open and transparent governance?

Do you think Canada should follow Quebec's lead and subject federal parties to the same privacy standards as organizations?

October 26th, 2023 / 4:30 p.m.



Prof. Colin Bennett Professor, Political Science, University of Victoria, As an Individual

Thank you for that question.

I was trying to draw, in that statement, a distinction between harmonization, or convergence, meaning a harmonization of text ensuring that the statutes essentially say the same thing, and interoperability, which I think means something subtly different. It means that if businesses have a requirement to do something in one province or one jurisdiction, such as a privacy impact assessment under Quebec's law 25, it will in fact be accepted by a regulator elsewhere. You can see that distinction in Canada among different provincial laws that have been worked out over time pragmatically, but it's also important to see it internationally through the GDPR.

That was the point I was trying to make. I'm not an expert on Quebec law, but I was trying to point out certain areas in Quebec's law where I think businesses would be required to do more under that law than they would under the current text of Bill C-27. Then you have to ask this question: What might be the economic impact of that across Canada if the CPPA is perceived to be lowering the standard within the Quebec legislation? That's the point I was making.

I think the particular provision on international data flows is an interesting example, because in the CPPA at the moment there's really nothing explicit for businesses on what to do when they are processing data offshore, and the vast majority of data protection laws that I know of.... This is also something that's of critical importance to the European Union when it comes to making a judgment about the adequacy, and the continued adequacy, of our laws in Canada. What happens when data on Europeans comes to Canada and then it is processed offshore elsewhere? Those are critical questions. I think there would be some concerns about that by our European friends when they come to make those judgments.

I hope that answers your question.