Mr. Chair and members of the committee, good morning. My name is Tamir Israel, and I am a staff lawyer with CIPPIC, the Samuelson-Glushko Canadian Internet Policy and Public Interest Clinic at the University of Ottawa's Centre for Law, Technology and Society and the Faculty of Law. CIPPIC is a public interest legal clinic that works to advance the public interest in policy debates arising at the intersection of law and technology.
I wanted at the outset to thank you for inviting us to testify before you today, as well as for undertaking this important review of the federal Privacy Act, a central component of Canada's privacy, transparency, and accountability framework.
Since the introduction of the Privacy Act in the late 1970s, the policy landscape surrounding data protection has evolved dramatically, driven by tectonic shifts in the technical capabilities and general practices surrounding the collection and use of personal information. The federal Privacy Act has simply not kept pace with these dramatic changes, a reality that hinders its ability to continue to achieve its objectives in light of heightened incentives and technical capacities to collect and keep personal information at unprecedented scales. The objectives incentivizing state data practices have also evolved rapidly in the years since the adoption of the act, which initially focused primarily on regulating data practices animated by administrative purposes.
Today's privacy challenges are driven by a far more diverse set of incentives. The era of data-driven decision-making, colloquially referred to as “big data”, increasingly pushes state agencies to cast wide nets in their data collection efforts. Additionally, more often than not, the act is applied in review of activities motivated by law enforcement and security considerations that are far removed from the administrative activities that animated its initial introduction.
Finally, data sharing between domestic and foreign state agencies now occurs on a more informal, and often technologically integrated, basis than could have been envisioned in the late 1970s.
The Privacy Act is in drastic need of modernization and, to that end, CIPPIC has reviewed and largely endorses the recommendations made by the Office of the Privacy Commissioner of Canada to this committee with respect to the changes necessary to ensure today's data protection challenges are met. We will elaborate on a few of these, as well as on some additional recommendations that we have developed, in our comments today. In addition, in our written comments, which will eventually make their way to the committee, we provide some suggested legislative language, which we hope will help guide your review of the act.
The remainder of our opening comments focus primarily on discussing and highlighting specific recommendations designed to enhance proportionality, transparency, and accountability, as well as address shortcomings that have arisen from specific technological developments.
Before turning to these broader themes, however, our first recommendation addresses the Privacy Act's purpose clause, which we believe should be updated to explicitly recognize the objectives of the act: to protect the right to privacy of individuals, and to enhance transparency and accountability in the state's use of personal information. Express recognition of these purposes, as is done in provincial counterparts to the Privacy Act, will assist in properly orienting the legislation around its important quasi-constitutional objectives, and will help to secure its proper and effective application if ambiguities arise in the future, as they surely will.
Necessity and proportionality are animating principles that have become central to data protection regimes around the world, but are absent from the aging Privacy Act. It's important to explicitly recognize these principles in the act, and to adopt additional specific measures that are absent from its current purview, but are nonetheless essential to ensuring private data is collected in a proportionate manner.
First, as a starting point, the Privacy Commissioner's recommendation for explicit recognition of necessity as the standard governing data collection practices should be implemented. Necessity is a formative data protection concept and provides important context for assessing when data should or should not be collected, used, or disclosed. The existing standard, which requires only that data practices relate directly to an operating program or activity, is simply too imprecise in the age of big data, where organizations are increasingly encouraged to collect data that has minimal clear, immediate connection to current objectives.
Second, the Privacy Act imposes no explicit limitations on how long data can be retained once it is legitimately collected. The lack of any explicit obligation to adopt reasonable retention limitations can mean that data is kept well beyond the point where its utility has expired, dramatically increasing the risk of data breach and of inappropriate uses. The lack of an explicit retention limitation requirement can even lead to the indefinite retention of data that has only a very short window of utility, greatly undermining the proportionality of a particular activity.
As an example, our clinic, along with the Citizen Lab at the Munk School of Global Affairs, recently issued a report examining the use of a surveillance tool called a cell site simulator. These devices operate by impersonating cellphone towers in order to induce all mobile devices within range to transmit certain information that is then used to identify or track individuals or devices. The devices operate in a coarse manner: for each individual target they are deployed against, the data of hundreds or thousands of individuals within range will be collected. The non-target data collected is only immediately useful for identifying which datasets belong to the legitimate target of the search and which do not, an objective that could be accomplished within 24 to 48 hours of collection. However, as the underlying collection of these thousands of non-target datasets is legitimate, they might be kept indefinitely. These large datasets can then be reused at any point in the future and, subject to ancillary statutory regimes such as the Security of Canada Information Sharing Act, which was recently adopted via former Bill C-51, can be shared across a wide range of other agencies.
Including an explicit retention limitation provision would not only mandate state agencies to adopt clear retention policies, but would also allow the commissioner to address unreasonable retention in a principled manner. This, in turn, will reduce the risk of data breach and generally increase the proportionality of data collection practices.
Third, we would recommend the adoption into the Privacy Act of an overarching proportionality obligation that would apply to all collection, retention, use, and disclosure of personal information by government agencies. This would be comparable to the obligation currently found in subsection 5(3) of PIPEDA. As you have heard from other witnesses, the Privacy Act increasingly provides an important avenue for ensuring Charter principles for the protection of fundamental privacy rights are fully realized. An overarching proportionality or reasonableness obligation modelled on subsection 5(3) of PIPEDA would provide an avenue for assessing Charter considerations across all data practices. It would also provide the Privacy Act with a measure of flexibility, allowing it to keep pace with technological change by supplying a general principle against which unanticipated future developments can be measured.
In addition to these proportionality measures, there are clear gaps in the Privacy Act's current transparency framework and further opportunities to enhance the openness of state practices, which in turn will encourage accountability and public confidence.
At the outset, we encourage the adoption of the Privacy Commissioner's recommendation for a public policy override to the act's confidentiality obligations. This would allow important information regarding anticipated privacy activities to enter the public record in a timely manner.
Second, the Privacy Act should be amended to include statistical reporting obligations attached to the various electronic surveillance powers in the Criminal Code. As Mr. Rubin mentioned, statistical reporting obligations were once a hallmark of electronic surveillance regimes and remain attached to certain activities, such as wiretapping, but those activities have largely been superseded by other forms of electronic surveillance that carry no comparable statistical reporting obligations.
One investigation conducted by the Privacy Commissioner's office recently found that law enforcement agencies themselves did not have a clear picture of the scope of their own practices in relation to the collection of subscriber information from telecommunication companies. Understanding the nature and scope of state surveillance practices is all the more important in light of the tendency for rapid change in practices in this sphere. Imposing a statistical reporting obligation in the Privacy Act that applies across the spectrum of electronic surveillance powers would therefore provide an important transparency mechanism.
Finally, the adoption of a general obligation on state agencies to explain their data practices would greatly enhance transparency. While the act currently obligates government agencies to explain to individuals the purposes for which their personal information is collected and used, it lacks a general obligation to explain agency practices.
An obligation modelled on PIPEDA's openness principle would be beneficial. If this concept is adopted, it should address the challenges raised by algorithmic non-transparency, which would entail an obligation to explain the logic of any automated decision-making mechanisms adopted by the state.
We have some suggestions on accountability and compliance measures that I will submit in writing and you folks can review at a later time.
I did want to very quickly touch on a couple of recommendations we have that address very specific technological developments that have led to gaps in the Privacy Act.
We would recommend updating the definition of “personal information” so that it is aligned with the comparable definition under PIPEDA. The current definition applies only to personal information that is recorded, whereas many modern data collection and use practices never actively record any personal information but can still have a very salient privacy impact.
In addition, we would endorse the Privacy Commissioner of Canada's recommendation to include an explicit obligation to adopt reasonable technological safeguards, as well as individual breach notification obligations.
Finally, and very briefly, we would also endorse the Privacy Commissioner's recommendation to formalize the privacy impact assessment requirement, as well as recommend an avenue for facilitating public input into the process so that discussions of privacy-invasive programs can occur with public input at the formative stages.
Thank you. Those are my comments for today.