It's certainly the case that AI technologies, and perhaps even more so quantum technologies, are sensitive because of their potential application in security-related areas. What we have been doing within our programs is starting to implement tools and practices that help better protect Canadian research.
For example, over the last couple of years, we have been working with the university community and the security agencies to develop guidance that enables researchers to better understand the risk of their research being stolen by foreign actors. We have had a fair bit of discussion with university researchers and administrators about how to better safeguard their research. We put up a whole website called Safeguarding Your Research to better inform them of the risks, while being sensitive to the fact that success even in areas like AI and quantum requires that we be as open as possible but as secure as necessary, because collaboration is a very important part of making advances in those fields.
There have also been policy statements by the Minister of Innovation, Science and Industry, the Minister of Public Safety and the Minister of Health, who also has a role with respect to health research, asking the government to work with the university community to implement more stringent due diligence processes in federal granting programs, in particular partnership programs between industry and university researchers.
Those have been put in place for the NSERC Alliance program, which is the primary vehicle for natural sciences and engineering collaborations between university researchers and private sector actors. That's also a next step toward ensuring an appropriate understanding of who your partners are and what risks they might pose, so that researchers themselves remain in control of deciding with whom they share their research, and so that we do more to protect sensitive research from being lost.