We need to make sure the law will hold up despite the rapid evolution of technology, if not alongside it. There's a lot of talk about generative artificial intelligence right now. A year from now, it'll be even more powerful. Who knows? So the law has to be able to adapt. That's why the bill contains principles and doesn't talk specifically about generative artificial intelligence, for example, but rather about automated decisions. The definitions need to encompass all of this, and there needs to be flexibility for the government to set regulations and for my office to set guidelines, so we can adapt to new technologies.
The recommendation we're making on privacy impact assessments is very important in this regard. Every time we develop something, we have to document it, assess the risks and carry out consultations, precisely to stay ahead of these technologies. This is one of my priorities, along with protecting children's privacy. We have to keep up with the evolution of technology, and this measure makes that possible.
Another of our recommendations concerns de-identified information. De-identified information is defined a little too broadly, in my opinion, particularly in French. This definition must be very strict, because it limits legal obligations. In these definitions, we must also take into account the risk of "re-identification." The bill says that more can be done with de-identified information, and that if it's anonymized, the law doesn't apply at all. So there's a big responsibility that comes with that. These definitions need to be strict.
On the issue of de-identified information, I recommended that we take into account the risk of "re-identification," because technology evolves. A piece of information may be de-identified today, but if, in two or three years' time, technology makes it possible to determine once again who it relates to, we'll be right back where we started. This has to be able to evolve over time.
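To make the re-identification concern concrete, here is a minimal, hypothetical sketch of a so-called linkage attack: a "de-identified" dataset (direct identifiers removed) is matched against an auxiliary dataset that becomes available later, using quasi-identifiers such as postal code, birth year and sex. All records, names and field choices below are invented for illustration; nothing here comes from the bill or the recommendations discussed.

```python
# Toy illustration (hypothetical data): why de-identification can erode
# over time as new auxiliary data sources appear.

# A "de-identified" dataset: direct identifiers removed,
# quasi-identifiers (zip, birth_year, sex) retained.
deidentified = [
    {"zip": "H2X", "birth_year": 1984, "sex": "F", "diagnosis": "asthma"},
    {"zip": "H2X", "birth_year": 1990, "sex": "M", "diagnosis": "diabetes"},
    {"zip": "G1K", "birth_year": 1984, "sex": "F", "diagnosis": "flu"},
]

# An auxiliary dataset published later (e.g., a public registry)
# that pairs the same quasi-identifiers with names.
auxiliary = [
    {"name": "Alice Tremblay", "zip": "H2X", "birth_year": 1984, "sex": "F"},
]

def reidentify(deid_rows, aux_rows, keys=("zip", "birth_year", "sex")):
    """Link de-identified rows back to named individuals whenever the
    combination of quasi-identifiers singles out exactly one record."""
    matches = []
    for aux in aux_rows:
        candidates = [row for row in deid_rows
                      if all(row[k] == aux[k] for k in keys)]
        if len(candidates) == 1:  # a unique match means re-identification
            matches.append((aux["name"], candidates[0]["diagnosis"]))
    return matches

print(reidentify(deidentified, auxiliary))
# → [('Alice Tremblay', 'asthma')]
```

The point of the sketch is the one made above: whether a record counts as "de-identified" is not a fixed property of the data but depends on what other data exists, which is why the definition has to account for re-identification risk over time.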