Mr. Chair, could I add just a very quick supplemental to that?
Mr. Masse, the question you ask is very important, but we need to recognize that one of the reasons there is discrimination in algorithms is that they're being trained on data that already reflects the discrimination occurring in other contexts.
I think we need to recognize that this problem is not just an algorithmic problem. It's a problem with discrimination that we have in this country that's not being addressed. What's the best entity to deal with that? It's the Canadian Human Rights Commission.
We don't need to solve this problem with AIDA. There's ample authority under the Canadian Human Rights Act to make extensive regulations. The Commission needs to be given the power to deal with all discrimination that's currently not covered, including algorithmic discrimination. Put the expertise in the Commission, and do not split its mandate so that it handles only older forms of discrimination but not algorithmic discrimination. Let it deal with this so it can take a holistic view of the problem.