We are. We're using facial recognition to compare probe photos uncovered in an investigation against our Intellibook, which is our mug shot database.
There is a known set of issues around faces in different training sets. We selected the facial recognition technology we use because it is the least biased, but there are biases embedded in photography and in the photographic systems that are out there. There is a bias towards lighter faces: the imagery captures more of a detail range in lighter faces than in darker faces.
What we're doing is countering that bias by setting a hurdle rate below which we don't consider a result a match, so that where the technology is weaker, it does not disfavour racialized minorities, who generally have darker skin tones. We're also feeding that into a process whereby a match is not treated as an identification. The identity has to be corroborated by other methods.
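To make the process concrete, here is a minimal sketch of the two safeguards described: a hurdle rate below which a candidate is discarded, and treating anything above it as an uncorroborated lead rather than an identification. The function name, record IDs, and the threshold value are all illustrative assumptions, not actual operational details.

```python
# Illustrative sketch only: names and the threshold value are hypothetical.
HURDLE_RATE = 0.92  # scores below this are never considered a match

def screen_candidates(scores: dict[str, float]) -> list[str]:
    """Return candidate IDs whose similarity score clears the hurdle rate.

    Anything returned here is only an investigative lead; identity must
    still be corroborated by other methods before it is acted on.
    """
    return [cid for cid, score in scores.items() if score >= HURDLE_RATE]

candidates = {"record_1041": 0.95, "record_2077": 0.88, "record_3310": 0.93}
leads = screen_candidates(candidates)
print(leads)  # leads above the hurdle rate, still unconfirmed
```

The point of the threshold is that low-confidence results, where the technology is least reliable, are dropped entirely instead of being passed along as weak matches.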