CBSA was dealing with a significant number of sexual misconduct cases. It was in the news. It was an issue that we were all seized with, given how serious it was.
We were briefed at a meeting with the executive team that the misconduct was increasing and that we needed a different way to solve this problem.
I was the chief information officer at the time. The president turned to me and asked me if there was a different way from a technology perspective, an innovative perspective, to solve this problem or make a difference, because everything we'd done so far wasn't making the difference that was required to stop this. I turned to Mr. MacDonald, who, again, was DG of innovation, and we were looking for innovation. We were looking for something different. We were looking for a different approach, because the traditional approaches around training and awareness weren't sufficient.
Mr. MacDonald took that and went away and found a company—that would be Botler AI—that would use artificial intelligence to provide a different avenue for potential victims: a conversation with an anonymous bot to help them determine whether they had actually suffered sexual misconduct, and to give them a very safe space. I found the solution very promising.
A demo was given to me and a number of my directors general, in which I saw a lot of promise in this technology and in this way of creating a safe space where victims could consult, get guidance and then decide on what actions they wanted to take.
I saw enough promise—