From a research perspective, it's hard to directly correlate who took the training with what their research discoveries were. Sorry, but I'm a scientist, and correlation does not imply causation. That's a tough question to answer.
As for what I can tell you, I brought our little infographics to be handed out. They're infographics on "what is sex?" and "what is gender?", because I do think that's helpful. We also have a flyer about the training. I actually have some questions for you all, to see whether you know how to do sex- and gender-based analysis, and you can tell me whether answering them improves outcomes here in Parliament. We could do a little study there.
In answer to your question, I'll give you an example from the transgender youth survey: training and awareness about gender diversity have led to less stigmatization around expressing one's gender identity. One of our funded researchers did a survey looking at how transgender youth feel. Are they able to talk about their gender identity? Are they able to express it? Media coverage of that survey's results led schools to put in place inclusiveness policies, gender-diverse extracurricular groups, and support groups. There is also some evidence that this reduces dropout from schools and possibly even suicidal ideation and suicide.
I don't know if that was a good example. The training has only been in place for a few years. For the data I talked about, we administer a pretest and then a test after the training. For instance, at the beginning of the training, we might ask people whether, for a given gender-related variable, a practice is gender transformative, gender blind, or gender unequal. They'll say, "Oh my gosh," because they have no idea what that means. They get a score, and then they do the training. After the training, we see whether they respond correctly to those questions, so we can see whether knowledge improves. We also ask them how confident they feel, on a scale of zero to 10, that they could do SGBA, with zero meaning not at all confident and 10 meaning totally confident. At the beginning, most people say.... Well, I don't know what you would say. At the end, we see whether their score has improved.
Finally, we ask people to evaluate publications and protocols and to comment on the impact and knowledge translation of that evidence. We compare the before-and-after answers to see whether they can do that in an appropriate fashion. I could give you more examples of positive things, but I think it comes down to education, education, and education.