In terms of the public sector, again, this notion of proportionality is not included in the Privacy Act. We recommended, as did this committee, that a necessity and proportionality requirement be added. At this point, the requirement that a use be necessary to achieve the desired objective exists only in a Treasury Board directive.
Currently, the act requires only that the use be related to a mandate of the organization. For our part, at the Office of the Commissioner, we apply that standard of necessity and proportionality by raising questions about it in our investigations. We are raising it now, just as we raised it during the investigations into the measures taken during the pandemic, in particular. When we do, though, we have to recognize at the outset that this is not a legal obligation and that failing to respect it in a given situation would not be a violation of the act.
This is a very important recommendation. The approach is very similar to how we proceed in the context of the Canadian Charter of Rights and Freedoms to determine whether there is discrimination or a violation of fundamental rights. We determine whether the objective sought is important, whether the proposed measure achieves the objective, whether the method used to achieve it is the least intrusive and, lastly, whether the method is proportional.
You're absolutely right: we may be tempted to use a tool because we find it very efficient and quick. Artificial intelligence comes to mind. Yes, it's effective, but we're talking about a fundamental right here.
Having said that, it's not an either-or. Personally, I'm in favour of technology. At the office, we recently made technology one of our three strategic priorities. We want to use technology, but in a way that protects privacy. In that sense, privacy impact assessments are essential. These assessments must not only be done, but also be seen to be done.