Thank you, Madam Chair and members of this committee, for inviting the African Scholars Initiative, ASI-Canada, to make this submission on your study.
I will limit my opening remarks to two main issues: first, the differential outcomes on study visa decisions by IRCC relating to applications from Africa; and second, the growing use of artificial intelligence technology by IRCC in visa processing.
Data on study visa refusals from IRCC clearly show that countries in sub-Saharan Africa are most adversely impacted by differential outcomes on study visa decisions by IRCC. The Pollara report revealed that systemic bias, discrimination and racism account for this, from visa officers' outright reference to African countries as “the dirty 30” to the branding of Nigerians as corrupt and untrustworthy.
IRCC study visa policies have been designed in ways that make it ever more difficult for people from Africa to secure study visas to pursue education in Canada. In my appearance before this committee on February 8, 2022, I highlighted these discriminatory policies by comparing two visa application programs, the student direct stream, or SDS, and the Nigerian student express, or NSE, especially the differential or discriminatory financial requirements under the NSE program.
In addition, I will note the language requirement under the NSE program, which requires a Nigerian study visa applicant to undertake an English-language proficiency test to prove to the visa officer that they are proficient in the English language. This requirement is imposed notwithstanding that English is the only official language in Nigeria. It is the official language of instruction in all formal academic institutions in Nigeria. Foreign students from Nigeria are exempted from English-language proficiency testing by all academic institutions in Canada, but not by IRCC. These subtle, biased, discriminatory and differential study visa requirements inevitably result in adverse differential outcomes in decisions, not just for Nigeria but for Africa.
My second submission relates to the growing use of computer software and artificial intelligence technology by IRCC in visa processing. ASI-Canada is not opposed to some use of AI technologies by IRCC. IRCC has in its possession a great deal of historical data that can enable it to train AI and automate its visa application processes, but there are serious concerns here. External study of IRCC, especially the Pollara report, has revealed systemic bias, racism and discrimination in IRCC processing of immigration applications. Inevitably, this historical data in the possession of IRCC is tainted by that same systemic bias, racism and discrimination.
The problem is that the use of this tainted data to train any AI algorithm will inevitably result in algorithmic racism, that is, racist AI making immigration decisions. As an assistant professor of AI and law at the University of Calgary Faculty of Law, I have spent the last three years researching algorithmic racism, and I can confidently state that the concerns raised here are legitimate and real. Any use of AI technology by IRCC should be subject to external scrutiny, and IRCC should be subject to oversight that ensures and enhances transparency and fairness in its use of AI.
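The mechanism I have described can be illustrated with a deliberately simplified sketch. This is not IRCC's system, and the data below is synthetic and hypothetical: it shows only how a naive model trained on past decisions that refused one region far more often than another, for otherwise identical applicants, will reproduce that refusal pattern in its own predictions.

```python
# Illustrative sketch only, with synthetic data: past decisions in which
# otherwise identical applicants from region "A" were mostly refused and
# those from region "B" mostly approved. A naive model that simply learns
# the majority decision per region encodes the historical bias.

from collections import Counter

# Hypothetical historical decisions: (region, decision) pairs.
history = (
    [("A", "refuse")] * 80 + [("A", "approve")] * 20
    + [("B", "approve")] * 80 + [("B", "refuse")] * 20
)

def train(records):
    """'Train' by predicting the majority past decision for each region."""
    by_region = {}
    for region, decision in records:
        by_region.setdefault(region, Counter())[decision] += 1
    return {r: counts.most_common(1)[0][0] for r, counts in by_region.items()}

model = train(history)
print(model)  # {'A': 'refuse', 'B': 'approve'} — the past bias is reproduced
```

The point of the sketch is that nothing in the model's inputs measures an applicant's merit; the region alone determines the outcome, because the region alone predicted the outcome in the tainted training data.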
In conclusion, we recommend independent oversight of IRCC in two ways: one, an independent ombudsperson to oversee decisions of IRCC visa officers; and two, the establishment of an independent body of experts to oversee IRCC's use of advanced analytics and artificial intelligence technology in visa processing.
Thank you. I look forward to your questions on the issues that I have raised, as well as any other questions you may have on differential outcomes in IRCC decisions.