Absolutely. Part of the challenge with this type of system is that, by relying on artificial intelligence assessment tools, you're able to do implicitly what you couldn't do directly. You couldn't necessarily say, “I'm not renting to you because you're indigenous,” but you could adopt an algorithm that relies on biased historical data and arrives at that same conclusion, without the transparency that would let someone challenge that type of decision explicitly. That's a very big problem as we move toward this broader set of assessment mechanisms. Again, facial recognition is a tool that enables the implementation of many of those mechanisms. The KTDI profile has many of those elements built into it.
June 16th, 2022.