Again, it's really about supporting the underlying infrastructure that makes the app work.
If we think about other apps we may be more familiar with, like Uber: you put in your address and, like magic, a car appears at your door and takes you to your destination. You don't even have to speak to the driver. All of that simplicity is supported by a whole lot of back-end technology, from GPS tracking to finding the fastest route to how you pay for the service, including splitting your bill. All of the underlying infrastructure that supports the simplicity of the user interface has to be built.
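To make that concrete, the "fastest route" piece alone is a real computation. Here is a toy sketch of that one step, using Dijkstra's shortest-path algorithm over a small, invented road graph; production routing engines are far more elaborate, and every node and edge weight below is made up for illustration.

```python
# Toy "fastest route" computation: Dijkstra's shortest path over a small,
# invented road graph. Edge weights are travel minutes.
import heapq

def fastest_route(graph: dict, start: str, goal: str) -> tuple[float, list]:
    """graph maps node -> list of (neighbour, travel_minutes)."""
    queue = [(0.0, start, [start])]  # (total minutes, node, path so far)
    seen = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for neighbour, minutes in graph.get(node, []):
            if neighbour not in seen:
                heapq.heappush(queue, (cost + minutes, neighbour, path + [neighbour]))
    return float("inf"), []

# Pickup at A, destination D, three possible intersections in between.
roads = {"A": [("B", 4), ("C", 2)], "C": [("B", 1)], "B": [("D", 5)]}
print(fastest_route(roads, "A", "D"))  # -> (8.0, ['A', 'C', 'B', 'D'])
```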
In the case of ArriveCAN, the app relied on a lot of infrastructure that is less obvious from what you see on screen. For example, if you upload your vaccination certificate, the app has to be able to read and verify that certificate, and it has to be able to do so in multiple languages. You might remember that Ontario, for example, changed its vaccination documentation halfway through all of this. The app has to read each document and verify that it's genuine.
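To give a sense of what "verify that it's genuine" can involve, here is a minimal sketch, assuming the proof of vaccination arrives as a signed token (for example, a JWS carried in a QR code, which is how SMART Health Cards work; a real verifier has additional steps, such as inflating the compressed payload). The issuer URL and the ISSUER_KEYS registry below are hypothetical. The multilingual reading step would typically be OCR (for example, pytesseract with lang="eng+fra"), which is omitted here.

```python
# Minimal sketch of the "is this a real document?" check, assuming the
# certificate is a signed JWT-style token. ISSUER_KEYS and the issuer URL
# are hypothetical placeholders, not real endpoints or keys.
import jwt  # PyJWT

ISSUER_KEYS = {
    # hypothetical: trusted issuer -> that issuer's public key (PEM)
    "https://health.example.gc.ca/issuer": "-----BEGIN PUBLIC KEY-----\n...",
}

def verify_certificate(token: str) -> dict | None:
    """Return the certificate's claims if it was signed by a trusted issuer."""
    try:
        # Peek at the issuer claim without trusting the payload yet.
        unverified = jwt.decode(token, options={"verify_signature": False})
        key = ISSUER_KEYS.get(unverified.get("iss", ""))
        if key is None:
            return None  # unknown issuer: reject
        # Verify the ES256 signature against that issuer's public key.
        return jwt.decode(token, key=key, algorithms=["ES256"])
    except jwt.InvalidTokenError:
        return None  # tampered with or malformed: not a real document
```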
Another piece of infrastructure the app had to support was its accessibility features, like the ability to go from text to speech. That is another machine learning application that was enabled in the app.
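As an illustration of that text-to-speech feature, here is a small sketch using the pyttsx3 Python library. In an actual mobile app this would be the platform's native speech and screen-reader APIs, and the voice-matching heuristic here is purely illustrative.

```python
# Illustrative text-to-speech: read a notice aloud in the user's language.
# A real mobile app would use the platform's native TTS APIs instead.
import pyttsx3

def speak(text: str, language_hint: str = "english") -> None:
    engine = pyttsx3.init()
    # Pick an installed voice whose name matches the requested language.
    for voice in engine.getProperty("voices"):
        if language_hint in voice.name.lower():
            engine.setProperty("voice", voice.id)
            break
    engine.say(text)
    engine.runAndWait()

speak("Your submission has been received.")
speak("Votre soumission a été reçue.", language_hint="french")
```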
It also had to be able to speak to other parts of CBSA's infrastructure and PHAC's infrastructure, verifying in real time the data you uploaded in the app, along with your boarding information and your travel information. Behind the simplicity of interfacing with the app, there were a number of pieces that needed to speak to each other.
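Here is a rough sketch of that kind of orchestration: one traveller submission fanning out to several back-end systems for real-time checks. Every endpoint URL and field name below is hypothetical, invented purely to illustrate the shape of the integration.

```python
# Hypothetical back-end orchestration: one traveller submission is checked
# against several systems. All URLs and field names are invented.
import requests

HEALTH_API = "https://health.example.gc.ca/verify"     # hypothetical
BORDER_API = "https://border.example.gc.ca/traveller"  # hypothetical

def verify_submission(traveller: dict) -> dict:
    # Cross-check the vaccination record with the public-health system.
    health = requests.post(
        HEALTH_API, json={"certificate": traveller["certificate"]}, timeout=10
    )
    # Cross-check declared travel details with border-services records.
    border = requests.post(
        BORDER_API, json={"booking_ref": traveller["booking_ref"]}, timeout=10
    )
    return {
        "vaccination_ok": health.ok and health.json().get("valid", False),
        "travel_ok": border.ok and border.json().get("match", False),
    }
```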