I would like to thank the committee for the invitation today to discuss the privacy implications of online platforms and appropriate legislative responses to the concerns of citizens about how their personal information is being used.
As you are aware, I received a complaint about this matter and announced some weeks ago that my office is conducting a formal investigation into how the personal information of Canadians has been affected by the activities of Facebook and Aggregate IQ.
Due to my confidentiality obligations under the law, I'm not in a position to discuss the details of this investigation with you today. I cannot prejudge our findings.
What I can share with you, however, is some perspective on the wider context that may assist you as you begin your study.
Canadians want to enjoy the many benefits of the digital economy, but they rightly expect they can do so without fear that their rights will be violated and their personal information will be used against them. They want to trust that rules, legislation, and government will protect them from harm.
In the recent Facebook matter, what happened, as acknowledged by CEO Mark Zuckerberg, was, quote, a “major breach of trust”. As recognized by the CEO of another giant tech company, Tim Cook of Apple, the situation is so dire that it is now time to develop well-crafted legislation to regulate the digital economy. The time of self-regulation is over.
In Canada, we of course have privacy legislation, but it is quite permissive and gives companies wide latitude to use personal information for their own benefit. Under PIPEDA, organizations have a legal obligation to be accountable, but Canadians cannot rely exclusively on companies to manage their information responsibly. Transparency and accountability are necessary, but they are not sufficient.
To be clear, it is not enough to simply ask companies to live up to their responsibilities. Canadians need stronger privacy laws that will protect them when organizations fail to do so. This was a major conclusion of my annual report to Parliament last year, and a point I made during your recent study of PIPEDA, Canada's private sector privacy law.
Significantly, given the opaqueness of business models and complexity of data flows, the law should allow my office to go into an organization to independently confirm that the principles in our privacy laws are being respected—without necessarily suspecting a violation of the law.
The time has also come to provide my office with the power to make orders and issue financial penalties, helping us to more effectively deal with those who refuse to comply with the law.
Strengthened legislation does not need to be an impediment to innovation. We know that personal information plays a key role in the digital economy, including advances in the field of artificial intelligence, which are necessary for Canada's social and economic development. We need legislation that ensures, as a general rule, that Canadians provide meaningful, informed consent for the collection and use of their personal information. But consent will not always be possible in the world of big data and artificial intelligence, where personal information may be used for multiple purposes not always known when it is collected.
This is why we recommended that Parliament examine exceptions to consent. We believe such exceptions, subject to conditions that would offer other forms of privacy protection, are preferable to relying on an interpretation of consent that is so broad as to become meaningless. We prefer narrower, specific exceptions, but we recognize that one option could be a European-style legitimate interest exception.
I'm of course very pleased that your committee recently issued a report calling for comprehensive changes to the federal private sector privacy law, which included several recommendations I had made but also others that would significantly improve the privacy rights of Canadians. Your report has shown that you are attuned to the issues stemming from the dated state of federal privacy laws in Canada, and you have actively called upon the government to make comprehensive changes.
Many in society, particularly in the last few weeks, are making similar calls. Even leaders of the tech industry now see the need for enhanced regulations.
If there was ever a time for action, I think, frankly, this is it.
Another area ripe for action concerns privacy protections and political parties.
As you are aware, no federal privacy law applies to political parties; British Columbia is the only province with legislation that covers them.
This is not the case in many other jurisdictions. The UK, much of the EU and New Zealand all cover political organizations with their laws.
In point of fact, in many EU states, information about political views and membership is considered highly sensitive, even within existing data protection regimes, requiring additional protections.
There are also now—in the digital environment—so many more actors involved: data brokers, analytics firms, social networks, content providers, digital marketers, telecom firms and so forth.
So while I am currently investigating commercial organizations such as Facebook and Aggregate IQ, I am unable to investigate how political parties use the personal information they may receive from corporate actors.
In my view, this is a significant gap.
Some independent authority needs to have the ability to review the practices of political parties and to assess whether privacy rights are being truly respected by all relevant players.
This gap should be addressed in one statutory form or another, whether in privacy laws, in the Canada Elections Act, or in a specific statute.
In conclusion, I would again highlight the urgency to act, as well as the stakes involved.
The integrity of our democratic processes and trust in our digital economy are both clearly facing significant risks.
I cannot think of more relevant questions for legislators to confront, and I applaud you for doing so.
Thank you again for your invitation, and I would welcome your questions.