In preparation for this, I tried to look for evidence of where these impacts would be coming from. I wasn't able to find anything published that talks specifically about the gender-based impacts in Canada. I would add that this is a really important part of what's going on in these discussions about artificial intelligence, especially generative models: the biases they embed and reproduce.
I would also like to acknowledge that, when we're talking about whose voices are heard, it's important to recognize which voices these systems reproduce. This is really fantastic work. When you ask a generative AI model to depict a doctor, is it more likely to be male than female? It's the same thing when it comes to depicting.... If you describe someone from a different country, how does it reproduce certain key stereotypes?
To add to this, I clearly agree with you that there is a need to identify how automation and generative AI will impact jobs through an intersectional framework. That is clear investigative work that needs to be done. There is also a clear concern about the biases built into and baked into these technologies, which are being rolled out as solutions for workplace productivity.