Let me quickly state that in addition to being the president of the African Scholars Initiative, based in Canada, I am also an assistant professor in the faculty of law at the University of Calgary, and my field of specialization is artificial intelligence and law. This is one area in which I have professional expertise.
I have researched the implications of race in artificial intelligence. That is the major focus of my research.
On what is artificial intelligence based? Artificial intelligence technology is trained with data, and the problem is garbage in, garbage out: if you use racist data to train an artificial intelligence technology, that technology will simply regurgitate that racism or discrimination. That is the concern I have with regard to the use of that technology by IRCC.
The statistics in the report show a low approval rate for applicants from African countries. The racism and discrimination are evident from the human review of these applications. If we train artificial intelligence technology using this data, we're going to have a regurgitation of that same problem, this time not by humans but by technology.
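To make the mechanism concrete, here is a minimal sketch in Python, using scikit-learn on entirely synthetic data. It is not IRCC's system or data; the feature names, group labels, and numbers are hypothetical, chosen only to show that a model trained on historically biased approval decisions reproduces the same disparity for new, equally qualified applicants.

```python
# Illustrative sketch only: synthetic data, not IRCC's actual system or data.
# It demonstrates "garbage in, garbage out": a model trained on historically
# biased decisions reproduces that bias when scoring new applicants.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical features: applicant qualifications (same distribution for
# both groups) and a group indicator (0 = group A, 1 = group B).
qualifications = rng.normal(size=n)
group = rng.integers(0, 2, size=n)

# Hypothetical "historical" decisions: equally qualified applicants from
# group B were approved far less often than those from group A.
approval_prob = 1 / (1 + np.exp(-(qualifications + 1.5 - 3.0 * group)))
approved = rng.random(n) < approval_prob

# Train a model on those past decisions, with the group indicator included.
X = np.column_stack([qualifications, group])
model = LogisticRegression().fit(X, approved)

# Score two new applicants with identical qualifications, one from each group.
new_applicants = np.array([[0.0, 0], [0.0, 1]])
print(model.predict_proba(new_applicants)[:, 1])
# The predicted approval probability for group B comes out much lower,
# even though the qualifications are identical: the bias is regurgitated.
```

The point of the sketch is only that the discrimination does not need to be programmed in; it is learned from the past decisions the model is trained on.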
The IRCC has not even made things easy, because the entire use of Chinook technology and artificial intelligence is shrouded in secrecy and a black box. I have made access to information requests for these documents, and those requests have been pushed back. The last response I received provided for a 160-day extension. I don't even have access to this to be able to tell you, MP Kwan, whether this technology is amazing or discriminatory. I can't do that, because I don't have access to the data.
It might also help if members of the committee... You probably have more access to that data than I do, as well as to those technologies, so take a look at them, and you will probably come to the same conclusion.
There is a very serious risk in IRCC's use of those technologies now, because of this dismal approval rate and, of course, the black box behind those technologies.