Thank you. I was actually thinking about that as Ian made his case.
There are a couple of things. I don't think we disagree on the essence of the ethic we're looking for, which is control. I use that word. That's what our law is right now: control of your personal information. It may be honoured in some ways in the breach today. In fact, you could probably make a case that it's honoured a lot in the breach with big data. In my submission I give some examples of that, and really, a lot of what Ian is talking about with AI and his other research comes down to big data. So what's the practicality?
I will say that I don't disagree with the essence of much of what our colleague at the Université de Montréal is saying. We have to protect that. I do disagree with the professor on one point, though. I know you asked me to respond to Ian, but notice is not the solution. Notice is the public sector rule, which is what the professor has alluded to: you just have to give notice, and then you can do whatever you want.
That's what we've got today with the so-called opt-out rule. You give notice, and if you don't like it, you opt out. If the notice isn't adequate, you may not have enough information to opt out, or you may not have the opportunity to.
Coming back to your question, I think the consent rule we've got is very strong, and it really should be applied. As I made the point earlier, I'm not saying we couldn't build into PIPEDA some actual mechanisms to enhance that rule or to address some of the machine learning issues that Ian has raised, but I think the realistic way to do that is through Privacy Commissioner guidance. The commissioner has done a wonderful job. In fact, here in Canada, and I'm not trying to minimize our own authority at all, we are guided by the FTC, the Federal Trade Commission in the United States, which has no general privacy law and yet has done a phenomenal job. We listen to it and are guided by it, and the commissioner is guided by it.
Developing mechanisms that can address these issues is frankly how I would respond to Ian's concern. I agree that you shouldn't have unpredictable results occurring because somehow your data has been amassed with everybody else's and, boom, they're determining something about you that you didn't expect. I'm completely onside with that. The bottom line is, I don't think that's something we could put into PIPEDA as a statutory rule.