Thank you, Mr. Chair and members of the committee, for the invitation to contribute to your study on the federal government's use of technological tools capable of extracting personal data from mobile devices and computers.
Last fall, CBC/Radio-Canada reported that 13 federal institutions had acquired such tools. The media reports raised questions about the reasons for their use and whether these organizations were respecting their privacy obligations in using the tools.
Initial reports referred to them as covert surveillance or spyware. Since then, it has been clarified that the tools are digital forensic tools, which are distinct from spyware. Digital forensic tools are used to extract and examine large numbers of files from laptops, hard drives or mobile devices. They are typically used in investigations or technical analysis, and often with the knowledge of the device owner.
They can be used to analyze the metadata of a file or to create a timeline of events, such as when an account was used, when websites were accessed, or when an operating system was changed. These tools can also be used to recover deleted data or to ensure that data has been properly wiped from a device before it is discarded or repurposed. This makes them useful investigative tools that can help to preserve the integrity of an evidence chain.
Digital forensic tools are distinct from spyware in that spyware is typically installed remotely on a person's device without their knowledge. It can then covertly collect personal information, such as keystrokes and web-browsing history. One example would be on-device investigative tools, or ODITs, which are used by law enforcement to obtain data covertly and remotely from targeted devices. Importantly, in the context of law enforcement, judicial authorization is required prior to their use.
In August 2022, I testified before this committee as part of your study about the use of ODITs by the RCMP. You will recall that in that case, the RCMP advised the House that it had been using ODITs in recent years to obtain data covertly and remotely from targeted devices, but had not completed a privacy impact assessment, or PIA, and had not advised my office.
In my appearance at the time, I noted that PIAs were required under Treasury Board policy, but were not a legally binding requirement under privacy legislation. I recommended that the preparation of PIAs should be made a legal obligation for the government under the Privacy Act.
In its November 2022 report, the committee endorsed this recommendation and also called for an amendment to the preamble of the Privacy Act to indicate that privacy is a fundamental right, and for the act to be amended to include the concept of privacy by design and explicit transparency obligations for government institutions. I welcomed and supported these recommendations, and the committee may wish to reiterate them as they remain outstanding and relevant.
With technology increasingly changing the manner in which personal information is collected, used and disclosed, it continues to be important that government institutions carefully consider and assess the privacy implications of their activities to determine if and when PIAs are required.
My vision for privacy is one where privacy is treated as a fundamental right, where privacy supports the public interest and innovation, and where Canadians trust that their institutions are protecting their personal information. Conducting a PIA and consulting my office before a privacy-impactful new technology is used would strengthen privacy, support the public interest and generate trust. This is why it should be a legal obligation for government institutions under the Privacy Act.
Currently, the Treasury Board Secretariat's directive on privacy impact assessment requires that institutions conduct PIAs when personal information may be used as part of a decision‑making process that directly affects an individual; when there are major changes to existing programs or activities where personal information may be used for an administrative purpose; when there are major changes to existing programs or activities as a result of contracting out or transferring programs or activities to another level of government or to the private sector; and when new or substantially modified programs or activities will have an impact on overall privacy, even where no decisions are made about individuals.
In our advisory discussions with federal institutions, we promote the use of PIAs as an effective risk management process. PIAs ensure that potential privacy risks are identified and mitigated, ideally at the front end, across programs and services that collect and use personal information. That said, the use of a new tool does not always trigger the need for a PIA. This will depend on how the tool is being used and what is being done with the information that it collects.
The OPC has used digital forensic tools, for instance, in the context of certain breach investigations to determine the nature, scale and scope of the incident, including how a breach occurred and what types of personal information, if any, may have been compromised.
Digital forensic tools, however, can be used in ways that do raise important risks for privacy that would merit a full privacy impact assessment.
Examples include conducting an internal investigation into an employee's conduct, where a decision will be made that directly affects that individual, or using the tool as part of an inquiry into alleged criminal activity.
In those types of cases, a privacy impact assessment would be required, addressing not only the specific tool being used to collect personal information but also the broader program under which the tool is being used.
It is incumbent on all federal institutions to review their programs and activities accordingly. Where digital forensic tools are used in the context of employee monitoring, institutions must take steps to ensure respect for the fundamental right to privacy and foster transparency and trust in the workplace. There should be clear rules about when and how monitoring technologies are to be used. My office updated its guidance on privacy in the workplace in May 2023, and my provincial and territorial colleagues and I issued a joint resolution on employee privacy in October 2023.
In the present case, following the CBC/Radio-Canada reports regarding the use of digital forensic tools in the federal government, my office followed up with the institutions that were listed there and in this committee’s motion to proceed with this study.
To summarize what we learned, three organizations indicated that they had completed and submitted a PIA on the relevant program; one organization indicated that it had procured the tool but never used it; another organization indicated that a PIA was not required; and the remaining eight organizations indicated that they had either started work on a new PIA or were considering whether to conduct a new PIA or to update an existing one in light of their use of the tools.
We will continue to follow up with institutions to insist that PIAs be completed in cases where they are required under the Treasury Board policy, but without a requirement in the Privacy Act there are limits to what we can do to ensure compliance. Privacy impact assessments, in appropriate cases, are good for privacy, good for the public interest, and generate trust. In this increasingly digital world, they should be a requirement under privacy law.
I'd be happy to take your questions.