Thank you.
I'm a Canada Research Chair in communication policy and governance at McMaster University. Thanks so much for inviting me.
Today I want to focus on discoverability and algorithmic bias.
Governments around the world are working on measures to ensure that algorithms are accountable. There is a common misconception that streaming platforms recommend what the user wants to see. Platforms show us what they want to show us. They show us what will keep us watching ads, purchasing advertised products, contributing data and subscribing. Platforms are not neutral. They serve their business interests as well.
There are three types of bias I am concerned about. These biases can affect both users and content providers.
First, there could be bias if algorithms are used to select content for carriage on streaming services by predicting how many viewers the content will attract. A poor algorithmic prediction could sink a content provider's chances of being carried.
Second, there can be bias in the recommendation algorithms that users rely on to discover content. Recommendations often display popularity bias, recommending what is already popular and concentrating users' viewing on a smaller catalogue of content. This can be unfair to artists in the long tail and to users who prefer less popular content. It could also be unfair to Canadian content, including user-generated content.
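As a purely illustrative aside, the following minimal sketch is a hypothetical toy model, not any platform's actual system. It shows how a recommender that simply surfaces the most-viewed titles concentrates viewing on a small head of the catalogue; every number in it is an assumption made up for the illustration.

```python
# Purely hypothetical toy model: a shelf that always surfaces the most-viewed
# titles, showing how viewing concentrates on a small head of the catalogue.
import random

random.seed(0)

CATALOGUE_SIZE = 1000     # titles available
ROUNDS = 20               # times the shelf is refreshed
CHOICES_PER_ROUND = 5000  # simulated viewing choices per refresh
TOP_K = 10                # size of the recommended shelf

views = [1] * CATALOGUE_SIZE  # every title starts with one view

for _ in range(ROUNDS):
    # The shelf contains only the currently most-viewed titles.
    shelf = sorted(range(CATALOGUE_SIZE), key=lambda t: views[t], reverse=True)[:TOP_K]
    for _ in range(CHOICES_PER_ROUND):
        if random.random() < 0.8:
            choice = random.choice(shelf)              # most users pick from the shelf
        else:
            choice = random.randrange(CATALOGUE_SIZE)  # a few browse the long tail
        views[choice] += 1

top_share = sum(sorted(views, reverse=True)[:TOP_K]) / sum(views)
print(f"Share of all viewing captured by the top {TOP_K} titles: {top_share:.0%}")
```

Under these made-up parameters, roughly four-fifths of all viewing ends up on ten titles out of a thousand; the exact figure depends entirely on the assumptions, but the concentrating tendency is the point.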
Third, users' own biases can be amplified. Beyond users' biases for or against Canadian content, if users have a gender bias, for example, it could be amplified by recommendations that respond to past viewing habits. Such biases can form a feedback loop that spreads throughout the system.
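Again as a hypothetical illustration only, here is a small sketch of the feedback-loop point: a user starts with a mild 60/40 tilt between two genres, and a recommender that weights its suggestions by past viewing turns that mild tilt into a near-exclusive diet. The weighting rule is invented for the example.

```python
# Purely hypothetical toy model of a feedback loop: a mild initial preference
# is amplified because recommendations respond to past viewing.
history = {"genre_a": 6, "genre_b": 4}  # initial viewing history (60/40 tilt)

for round_number in range(1, 11):
    total = history["genre_a"] + history["genre_b"]
    share_a = history["genre_a"] / total
    # The recommender over-serves whichever genre already dominates
    # (squaring the shares exaggerates the existing lead).
    weight_a, weight_b = share_a ** 2, (1 - share_a) ** 2
    prob_a = weight_a / (weight_a + weight_b)
    # The user watches 10 more titles, in proportion to what is shown.
    history["genre_a"] += round(10 * prob_a)
    history["genre_b"] += 10 - round(10 * prob_a)
    print(f"Round {round_number}: genre A is now {history['genre_a'] / (total + 10):.0%} of history")
```

The specific weighting rule here is an assumption; the general point is that any rule feeding past behaviour back into future exposure can turn a small initial bias into a large one.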
Research in this area is still relatively new. CRTC intervention in the algorithms raises many difficult problems, and the CRTC may not be the first, most likely or best answer to those problems. The CRTC said today that it doesn't want to play that role, but the commission could still help bring such problems to light.
There are concerns that requiring discoverability could infringe upon freedom of expression. Streaming service user interfaces and recommendations may be forms of expression. If so, regulatory interventions could constitute a limit on that expression. There are legitimate concerns that promoting some content could mean demoting other content, among other things. Sometimes limits on expression are justified, but they must be justified. To understand whether any justification exists, or even simply to understand, we need data.
It may be that the best role for the CRTC will be to monitor and call attention to problems, not just with the discoverability of Canadian content but also with recommender biases relating to other Canadian values, so that civil society and others can intervene. The CRTC can only do that if it has data.
The bill's provisions on the provision and disclosure of information are important to the study and examination of discoverability algorithms and data, and to the CRTC's potential work with outside organizations on this. It may be necessary to require platforms to collect certain data to permit these examinations to happen. The information provision section under the bill's general powers could include the phrase “collection and provision”. That section could also name information on discoverability as information that the commission can demand.
I disagree with proposals that would allow a company to prevent the disclosure of information under proposed section 25.3.
The Canadian broadcasting system has often served dominant groups. It has also been open to change and improvement based on the work of civil society and others. We need to ensure that the discoverability mechanisms of online streaming platforms are also open to critique and change through public transparency, debate and data.
Thank you.