To be clear, not all AI systems require a lot of energy. That was my initial point. We mix AI up with large language models.
Large language models, obviously, require a massive amount of energy and computing power, but that's not at all the AI we are working on. AI for industry doesn't typically require a lot of cloud services, energy or computing power because it's very basic AI. AI for industry, which is critical for productivity gains, is not raising any issues in terms of power or water resources.
Foreign players can absolutely be hosted in Canada with very limited computing power. When we talk about large language models—aside from Cohere—most of the players operating here are hyperscalers. They are the ones requiring a massive amount of energy.
That's why we always argue, from Scale AI's industry perspective, that it's nice to build data centres, but it's like building highways and having Korean cars on those highways. If you build data centres, you need to have a Canadian-based application to run on them so that this energy, which belongs to Canadians, at least serves the creation of Canadian IP and Canadian companies.