Thank you, Mr. Chair.
Thank you for having me here again. My name is Tamir Israel, and I am a staff lawyer with CIPPIC, the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic at the University of Ottawa's Centre for Law, Technology and Society, which is at the Faculty of Law. CIPPIC is a legal clinic that works to advance the public interest in policy debates that arise at the intersection of law and technology.
I want to thank you for inviting us once again to contribute to the important work the committee undertakes, in this instance in relation to its review of PIPEDA.
We note at the outset that in our view the principled framework adopted by PIPEDA has largely withstood the test of time. Its general adaptability has allowed it to keep pace with often rapid and tectonic social and technological changes. That being said, some targeted clarifications and additions to PIPEDA's consent and transparency mechanisms are desirable, while PIPEDA's lack of effective enforceability continues to hinder the full realization of the important rights it grants Canadians.
As this committee has heard, the modern era has strained one of PIPEDA's core pillars: consent. This strain arises from the increasingly complex nature of modern data practices, which in turn leads to opaque data capabilities, powerful incentives that are often directly at odds with those of consumers, and inaccessible privacy policies that seek either to capture this complexity, or at the other extreme, to obscure it in order to maintain flexibility for future organizational practices.
In light of this complexity, it is neither practical nor desirable to expect every individual to acquire the expertise needed to assess the data practices of every data service encountered on a daily basis. It would be equally undesirable, however, to jettison the concept of consent in favour of a risk-based accountability framework. Such a framework would effectively amount to open season on individual data. Moreover, it is likely to undermine the adoption and usage of services, as empirical research suggests that individuals' confidence in and adoption of services are closely tied to their ability to exercise consent over data practices.
Too often, however, this confidence is misplaced. Frequently, individuals' expectations are simply not reflected in the unintuitive privacy policies and data practices to which they implicitly consent on a regular basis. In this regard, formalizing some elements of PIPEDA's existing principled framework could assist in realigning practices with expectations.
PIPEDA generally recognizes that more explicit forms of consent are required where such a disconnect occurs, and especially where sensitive data is involved. However, recognizing an explicit “privacy by default” approach will further underscore the need to obtain user input in relation to privacy practices, helping to narrow the gap between individual expectations and actual practice.
Formally empowering the Privacy Commissioner to impose context-specific restrictions may encourage greater use of the office's existing power under PIPEDA to designate certain practices as generally unacceptable, and to create context-specific regulatory policies. Greater recourse to such tools would enhance certainty and consistency on the business side, while allowing for more frequent proactive policies from the Privacy Commissioner. A formal procedural mechanism for their development would in turn strengthen the quality and legitimacy of such policies.
Finally, some measures might be considered to address specific data protection challenges raised by data brokers. Such entities amass detailed profiles on individuals from disparate online and offline sources, typically without the knowledge or input of the affected individual, who is usually far removed from the collection process. Information held by data brokers is increasingly used by a range of secondary entities to make decisions that often have serious impacts on individuals. A 2014 report issued by the Federal Trade Commission recommended that data brokers be obligated to create readily accessible portals that would allow individuals to easily determine whether their data is being held by a particular broker and that data's initial source. This would then act as an avenue for the exercise of other rights, such as the rights of correction or erasure, that are already integral components of PIPEDA's existing data quality mechanisms.
This framework could be imposed by the Privacy Commissioner as a sector-specific regulatory policy under subsection 5(3) of PIPEDA, but legislating it may provide a stronger and clearer mechanism.
With respect to enforcement, PIPEDA's recommendation and de novo enforcement model is significantly out of touch with the realities of modern data protection. The stakes for individuals, and the counter-incentives under which many organizations operate, require a serious and responsive regulatory regime. PIPEDA's enforcement mechanism is procedurally difficult, unnecessarily time-consuming, and lacking in deference to the expertise of the Privacy Commissioner.
Personal data is the commodity of the information age and requires a regulatory framework of commensurate formality. It is unsurprising that most jurisdictions with data protection regimes have included enforceability measures in recognition of this basic truth. Imbuing the Privacy Commissioner with order-making powers will assist the office in its interactions with large multinational organizations, enabling it to better carry out its mandate with the authority of a regulatory body.
Further, the prospect of incurring damages for PIPEDA violations remains distant, and the anticipated quantum of such damages is minimal. Recent developments in tort law have filled this gap to a certain degree and have led to a notable improvement in proactive compliance, with privacy invasions becoming the subject of class actions.
Class actions in tort are, however, limited in scope to certain types of privacy invasion, and there remains little incentive for robust and proactive compliance with other critical elements of PIPEDA. We would therefore encourage imbuing the Office of the Privacy Commissioner with the power to issue administrative monetary penalties comparable in character to those recently allotted to the Canadian Radio-television and Telecommunications Commission.
We would further recommend examining the development of an independent private right of action, which would allow individuals and classes of litigants to advance their privacy claims directly. This could be supplemented with statutory damages, applying either to specific principles and violations or to the act as a whole, which would facilitate an analogous regime of private enforcement and further incentivize compliance.
Finally, some transparency mechanisms would address specific and pressing problems under PIPEDA's current regime. It has become accepted practice in many industries, and particularly those engaged in facilitating electronic communications, to periodically report on the scope and nature of state agency requests for customer data. While such reporting is arguably required under PIPEDA's openness principle, we would recommend adopting a legislative mechanism that would explicitly empower the Privacy Commissioner to designate transparency reporting obligations on a sector-by-sector basis and to impose detailed requirements as to the substance of that reporting. This would lead to more consistent and standardized transparency reporting in lieu of the current incomplete and ad hoc reporting.
A secondary transparency mechanism that would benefit from legislative adoption relates to algorithmic decision-making. Automated processes are responsible for a growing range of determinations that significantly affect individuals' lives. Academic and legal literature has demonstrated that algorithmic decision-making often operates as a proxy for decision-making that is discriminatory on religious, ethnic, racial, disability, gender-based, and other protected grounds. Algorithmic decision-making can also gloss over important individual distinctions in favour of broad generalizations, leading to incorrect outcomes for affected individuals. More generally, algorithmic decision-making often obscures the reasoning that animates a given output, making it impossible to determine precisely why a teacher was fired, a consumer was denied particular advantages, or an individual's credit request was rejected. It then becomes difficult to assess whether a decision is accurate, fair, or discriminatory.
Transparency in algorithmic decision-making intersects directly with core and long-standing data protection principles designed to ensure the quality of data used for decision-making. In PIPEDA this is encoded through the data accuracy principle and the right of individual access to personal information held by an organization. However, commercial secrecy is increasingly used as a means of obscuring the underlying logic of an algorithmically determined outcome. In addition, and in the absence of strong transparency obligations, more sophisticated algorithms are now evolving that wholly obscure underlying considerations even from the companies relying on them.
CIPPIC would therefore recommend the addition of a distinct right of access to the underlying logic of any automated decision-making process, particularly in relation to automated decision-making with a substantial impact on individuals' lives, their access to economic opportunities, or their treatment on the basis of protected grounds.
The committee may further wish to consider the need to undertake a broader study of automated decision-making in both private and public sectors.
Those are my comments for today. I welcome any questions.