Evidence of meeting #8 for Canadian Heritage in the 45th Parliament, 1st session. (The original version is on Parliament’s site, as are the minutes.)

A recording is available from Parliament.

Before the committee

Erin Finlay  Legal Counsel, The Canadian Copyright Licensing Agency, Access Copyright
John Illingworth  Executive Director, Association of Canadian Publishers
Travis Croken  National Co-Chair, Wax Seal Productions, Canadian Authors Association
Diane Davy  Executive Director, Work in Culture, Cultural Careers Council of Ontario
Marie-Christine Morin  Executive Director, Fédération culturelle canadienne-française
Sven Buridans  Director, Innovation and Digital Partnerships, Fédération culturelle canadienne-française
Kevin Chan  Public Policy Director, Meta Platforms Inc.
Rachel Curran  Head of Public Policy, Meta Platforms Inc.
Brendan Ouellette  Co-Chair, Copyright Committee, Association of Canadian Publishers

The Chair Liberal Lisa Hepfner

Welcome to meeting number eight of the Standing Committee on Canadian Heritage.

Pursuant to Standing Order 108(2) and the motion adopted by the committee on Monday, September 22, 2025, the committee is meeting to study the effects of the technological advances in AI on the creative industries.

Today we have with us, from Access Copyright, Erin Finlay; from the Association of Canadian Publishers, John Illingworth and Brendan Ouellette; from the Canadian Authors Association, Travis Croken; from Cultural Careers Council Ontario, Diane Davy; and from the Fédération culturelle canadienne-française, Marie-Christine Morin and Sven Buridans.

Thank you for joining us.

From Meta Platforms, we have Kevin Chan and Rachel Curran.

It's nice to see you again.

We'll give each organization five minutes for opening remarks, starting with Erin Finlay from Access Copyright.

You have the floor.

Erin Finlay Legal Counsel, The Canadian Copyright Licensing Agency, Access Copyright

Madam Chair and members of the committee, thank you for inviting us to appear today. Access Copyright is a not-for-profit copyright collective that was founded in 1988. Since then, we have licensed the published works of more than 14,000 Canadian writers, visual artists and publishers, returning over $500 million in royalties to the creative ecosystem. These royalties ensure that Canadian creators and publishers are compensated for the use of their copyrighted material and can reinvest in new publications that inform, educate, entertain and reflect the diverse experiences of Canadians across this country.

Access Copyright supports the development of a strategy that fosters a fair, safe and ethical AI ecosystem, one that recognizes the importance of human ingenuity to our society and our Canadian culture. Like organizations across creative industries, we believe AI uses must be authorized, remunerated and transparent. You heard about ART previously in these meetings.

I want to focus on three key points today.

First, do not introduce new exceptions into the Copyright Act. AI models do not create from nothing. They copy from human creativity—from books, journals, magazines, newspapers, songs, images and countless other works. Text and data mining and the training of large language models engage creators' exclusive rights and require licences. Calls for new exceptions are both unfair and unnecessary. AI innovation can and should coexist with a system that incentivizes creators and protects their rights. However, big-tech companies and small-tech companies are profiting from the unauthorized use of creative works to train their AI systems. Now they're asking the government to legitimize that behaviour with new exceptions. AI is fast and powerful, but speed and scale cannot replace fairness, consent or respect for creators' rights. AI innovation cannot be a shortcut to ignoring the people who create, protect, sustain and proclaim our culture. After all, culture is what makes Canada Canada.

Second, enable a rights market. Introducing new exceptions would undermine emerging rights markets, create uncertainty and harm the creative industries. Voluntary licensing, by contrast, is feasible, desirable and already happening. We've seen multiple licensing deals between copyright owners and AI developers recently, including HarperCollins and Microsoft in November 2024; The New York Times and Amazon in July 2025; News Corp, Time, Axel Springer and others with OpenAI; and, most recently, the landmark Bartz v. Anthropic settlement coming out of the U.S. In addition, copyright collectives, such as Access Copyright, in the U.K., the U.S. and Australia are also offering voluntary collective licences for AI uses and training.

These examples prove that the voluntary licensing model works. We need a market built on voluntary licensing. Voluntary licensing lets creators retain control, receive fair remuneration and know when their works are used. It also provides AI companies with the rights they need to do the work they do. It makes innovation fair, sustainable, and legal, respecting the value of human creativity while enabling responsible AI development.

Third, require transparency. AI often operates as a black box. Creators cannot see if or how their works are used, and users cannot tell what is human-made versus AI-generated. Platforms must disclose both the works used in training and which outputs are AI-generated. Transparency allows creators to verify use, ensures accountability and builds trust. Without it, Canada's creative industries face information asymmetry in licensing their rights and impossible evidentiary burdens when enforcing copyright. Building transparency obligations offers a practical, balanced solution.

In conclusion, Canada can lead globally, but only if we insist on authorization, remuneration and transparency—no exceptions, no free riding and no guessing. Voluntary licensing protects creators, encourages innovation and ensures that AI evolves in a way that respects the human creativity at its core.

Thank you.

The Chair Liberal Lisa Hepfner

Thank you. Well done. You finished with plenty of time.

Next we turn to the Association of Canadian Publishers.

John Illingworth Executive Director, Association of Canadian Publishers

Madam Chair and committee members, I am grateful for your invitation and the opportunity to share the Canadian-owned book publishing sector’s views on this issue with you today.

As this committee is aware, the writing and publishing sector is currently locked in litigation around the world with developers of large language models, LLMs, over the unauthorized use of pirated book collections, or shadow libraries, in the training of their products. These shadow libraries contain hundreds of thousands of in-copyright titles, including thousands by Canadian creators. They offer AI companies an easy, expedient, unethical and arguably illegal route to make their models more robust and expressive. The result has been an unprecedented industrial-scale extraction of commercial value from the collective published work of humanity without any compensation flowing to its creators and facilitators, the authors and artists, and the businesses with whom they’ve partnered voluntarily to bring their work to the public.

In the United States, this litigation is beginning to translate into colossal settlements. In the Bartz v. Anthropic case, Anthropic has agreed to pay $1.5 billion U.S. to rights holders for its unauthorized use of their works. That amount hints at the scale of what has been misappropriated during the development of large language models. The real value across all AI developers is massively higher.

Books, especially those that have been through a traditional publishing process, are of tremendous value for AI training. A large language model is only as good as the works it has been trained on. These models are economically valuable not solely because of new technology; it is that combination of technological innovation and overlaying repositories of cultural expression that makes them powerful. It is unjust that the technological innovators should be rewarded while the cultural producers, without whom they would have no product to sell at all, are cut out of the deal.

We maintain that the use of a copyrighted work for the purpose of AI training is a licensable right. Canada’s publishers, and the authors without whom we would have no business, are ready to come to terms with the developers of AI. Not all authors will want to participate in such a transaction, and that is their right, but we have seen licensing models emerging in the U.S., the U.K., Australia and elsewhere, and we are ready to do the work necessary to ensure that Canada’s creators share in the wealth their artistry is already generating for the tech sector.

As such, we too implore this government and the opposition parties to avoid disrupting this emerging market. No new exceptions to copyright should be entertained. AI training must be based on those principles of authorization, remuneration and transparency.

Of course, there are impacts on our industry that go beyond copyright concerns. We are already witnessing one consequence of the advent of LLMs in a deluge of poor-quality, AI-generated books on major distribution platforms.

An Amazon.ca search for “Mark Carney biography”, for example, brings up a slew of purported biographies of our Prime Minister. Many have AI-generated cover art, and some rank higher in search results than his own Value(s). Not all of these books are selling, but some are, and the average consumer has no means of distinguishing a properly researched book from incoherent slop until they buy it. Putting transparency obligations on AI platforms to help Canadians identify what is AI-generated will help build consumer trust.

Finally, I'd like to raise the issue of competitiveness in the cultural industries. We have learned that some or all of the so-called big five publishers—the global corporations that produce the overwhelming majority of books sold in Canada but publish an elite minority of Canadian writers—are developing bespoke AI-powered tools in-house. Good for them. They should be doing that, but the fact that the Canadian-owned publishing sector, which is composed primarily of SMEs, is not in a position to make comparable investments in research and development means that an already uneven competitive playing field will become even more tilted against the domestic industry unless steps are taken to enhance our own capabilities.

This is a situation in which cultural sovereignty and AI sovereignty are closely linked. Canada’s domestic cultural industries—the businesses that do the hard work of discovering Canada’s writers and artists and putting them on the global stage—need a cultural AI strategy that centres the interests of Canadian creators and cultural workers. What would that strategy include? That’s up for debate, but we can make a few suggestions, including compensation for past, present and future use of copyrighted works; a legal framework that doesn’t undercut emerging rights markets; and selective investments that support the Canadian community, competitiveness and culture.

Our publishers need AI that works for them and helps them be stronger engines of Canadian culture, not AI that harvests their works as a way to replace them.

Thank you for your time today.

The Chair Liberal Lisa Hepfner

Perfect. Thank you, Jack.

We are moving on to the Canadian Authors Association.

Go ahead, Travis.

Travis Croken National Co-Chair, Wax Seal Productions, Canadian Authors Association

Thank you, Madam Chair and members of the committee, for the opportunity to contribute to your study on the impact of artificial intelligence on the creative sector.

My name is Travis Croken, and I'm an author and the national co-chair of the Canadian Authors Association.

Artificial intelligence is transforming the world around us, the creative sector included. While it offers tools that can assist artists, writers and other creatives, it also poses serious risks, threatening to undermine cultural diversity, intellectual property and ethical rights, and the creative market.

I will centre my remarks on three key points.

The first is intellectual and ethical rights. The Canadian Authors Association was created over 100 years ago, largely to advocate for the protection of authors and copyright. Here we are still discussing the same issues on a much larger scale, with little room for error. Copyright was created to protect creative works and to ensure that creatives can continue their work. Why pour time, heart and soul into a project if it can be disseminated without appropriate compensation and control?

The use of copyright-protected materials to train AI datasets counters the intended purposes of the Copyright Act. Worse, it limits the author's control over their work and its ethical use, and it hinders their compensation, all for a system that will later threaten their livelihood, oversaturate the market and potentially damage their reputation and style through mimicry of their voice or by using their words in a manner the author does not condone.

The second is the impact on the creative. Much like a painter's brush stroke, an author's voice, writing style and creative concepts are unique and create an identifiable brand for the author. Artificial intelligence can mimic an author's voice and can flood the market with books strikingly similar to their novels, creating an unjust competition for the author.

It also creates a great deal of ambiguity. If an author uses artificial intelligence in their novel, how does the reader discern which parts of the novel are created by the author and which parts are developed by artificial intelligence? This undermines the reader's confidence in the author. If artificial intelligence is used to create a novel and to directly copy from another author's work, who is liable? Is it the author, the publisher or the creator of the artificial intelligence system?

Further to this is the time sink that is created. Writing a novel can take years, including multiple drafts and edits, working with publishers and doing marketing and promotional tours. Unless the author is exceptionally well known, the royalties are not such that they can quit their regular jobs. Artificial intelligence has now added further steps to this process, including fighting for their rights and for fair compensation, trying to ensure their works are not used illicitly and trying to navigate this new and uncertain era. Further, the use of artificial intelligence in writing runs the risk of impacting an author's ability to create, diminishing the creative muscle as it is used less.

The third is cultural diversity. Canada thrives on its cultural identity, and it has fought hard over the years to protect it and to ensure Canadians are treated fairly in the market. If artificial intelligence systems are allowed to be trained on an author's work without their permission or knowledge, are allowed to diminish an author's financial gain from their work and are granted copyright-protected status where it is not needed—artificial intelligence systems do not require any incentive to create; they only need commands—we risk losing the creatives we hold so dear to our hearts.

If our authors are not protected and granted the security to defend their livelihoods, we run the risk that they stop creating, not out of spite but because of an inability to survive. This would leave a dearth in Canadian cultural diversity to be filled by foreign creators or by artificial intelligence systems that do not create anything new but simply recycle and reword what has already been created.

In conclusion, Canada should ensure that the guidelines and rules created to govern artificial intelligence ensure that our human creatives, our cultural heritage now and in the future and our ability to stay on the world stage of creation are protected as a distinct and valuable contribution to our economy, our culture and our future. Consent, fair compensation and transparency must be included in any governance created. We have a choice: Do we want the future legacy of our cultural heritage to be created by humans or by machines? It is my opinion that we have one opportunity to get this right. Artificial intelligence already moves at a daunting pace, and if we misstep now, it may be too far ahead of us to catch up.

Thank you.

I am looking forward to answering any questions you may have.

The Chair Liberal Lisa Hepfner

Thank you.

We are turning next to the Cultural Careers Council of Ontario.

You have the floor for five minutes, Ms. Davy.

Diane Davy Executive Director, Work in Culture, Cultural Careers Council of Ontario

Thank you, Madam Chair and members of the committee, for the opportunity to speak today.

My name is Diane Davy, and I am the executive director of Work in Culture, which is the popular name for the Cultural Careers Council Ontario, which is quite a mouthful.

Work in Culture is a non-profit arts service organization. Its mission is “to advance the careers of artists, creatives, and cultural workers from diverse lived experiences, and support the organizations that engage them, through entrepreneurial and business skills development and innovative research.” We are best known across the creative sector for our job board, which is the most popular arts and culture job board in the country. In addition to the job board, we develop and deliver a wide variety of training programs, both in-person and virtual, and do related research on an ongoing basis.

We recently published a report, “AI for Administration in Ontario's Creative Industries: A Snapshot of Current Use, Concerns, and Considerations”, which seeks to explore and understand the specific potential of AI tools for business operations and administration in Ontario's creative industries. The study asked how organizations and individuals in film and television, book and magazine publishing, music, and interactive digital media are using generative AI to streamline tasks and manage day-to-day demands and asked whether AI is helping to alleviate the pressure to do more with fewer resources, a pressure that faces all of us in the arts. The report focuses on how AI trends can be used to help the predominantly small and mid-sized enterprises that dominate the sector, but it also acknowledges the challenges and ethical concerns of working with tools that have been built using creative content without permission or recompense. We are supporters of strong copyright policies that ensure that rights holders maintain control over their works and receive the fair compensation that they deserve.

While the focus of our report is Ontario's creative industries, the findings are likely to resonate with cultural workers and small businesses that are navigating similar operational pressures across Canada. The research is intended to help creative organizations situate themselves within a rapidly evolving landscape, to gain insight into how their peers are approaching AI and to reflect on their own values, needs and readiness. At the same time, it supports a broader understanding for sector leaders of how AI is currently being used in practice and where knowledge gaps, barriers and opportunities remain.

Work in Culture, with its training mandate, specifically provides the following recommendations: build foundational AI literacy training to equip creative professionals with a baseline understanding of how AI systems work and their implications; support the development of workplace AI policies to help organizations create clear, responsible, ethical guidelines; and provide ongoing training on critical issues like data privacy, algorithmic bias and effective use strategies.

Since the release of the report, we have been getting more and more responses from the community on the need for this kind of training, along with concerns about the ethical issues. We will be presenting the report in person at an event on October 18 in Toronto—if anyone is there and would like to attend, let me know—and we expect additional feedback and insights at that time.

The Work in Culture team of four, which is typical of many of the small arts organizations in the community, recently offered itself up as a guinea pig in a pilot training program working with Skills for Change, an agency that works to enhance skill sets, opportunities and access to good work for newcomers and underserved groups across Canada. The pilot program, which combined a series of virtual modules created by Google with several in-person sessions by Skills for Change, has given us a model that we feel we can build and adapt for use across the Canadian creative community. We continue to look for opportunities, resources and partnerships to build, develop and deliver this much-needed training across the sector while we work to make appropriate use of AI's potential to enhance and augment our own internal capacity and help us serve our community. We would welcome the development of a national training strategy that would help our Canadian creative community make the best of the opportunities offered by AI within an ethical framework.

Thank you so much.

The Chair Liberal Lisa Hepfner

Thank you.

We will now hear from the representative of the Fédération culturelle canadienne-française.

Ms. Morin, you have the floor for five minutes.

Marie-Christine Morin Executive Director, Fédération culturelle canadienne-française

Madam Chair and members of the committee, my name is Marie‑Christine Morin, and I am the executive director of the Fédération culturelle canadienne-française, also known as the FCCF. I'm joined today by my colleague Sven Buridans, director of innovation and digital partnerships. I would like to thank the committee for inviting us to testify.

For nearly 50 years, the FCCF has been the national political voice of the artistic and cultural sector of the Canadian and Acadian francophonie. Our sector plays a major economic role in Canada, accounting for more than $5.8 billion in gross domestic product and generating more than 36,000 jobs across the country in 2022. That's how important it is for local economic development and job creation.

Sven Buridans Director, Innovation and Digital Partnerships, Fédération culturelle canadienne-française

Last month, we delved into the topic of artificial intelligence, or AI, at the All In event held in Montreal. Our exchanges with key players in the Canadian ecosystem confirmed two things: first, a genuine interest on the part of the technological community in cultural issues; second, the concerning realization that arts and culture are still absent from AI funding channels. Non-profit cultural organizations, which carry out a public interest mission, don't have access to funding programs like Scale AI's. By neglecting the arts and culture sector, Canada is missing out on a critical innovation hub and its creative, ethical and critical perspective on AI.

Earlier this month, we were also at the Mondiacult conference of UNESCO, the United Nations Educational, Scientific and Cultural Organization, in Barcelona, alongside the Coalition for the Diversity of Cultural Expressions, or CDEC. This historic meeting led to a final statement, signed by over 120 ministers of culture around the world. It lists AI as one of the priority areas of action for states. This statement is consistent with the vision of the CDEC and the FCCF. It commits states to promoting the discoverability of multilingual cultural content, protecting copyright and involving the cultural sector in AI policy development.

We are concerned about the Government of Canada's response to this positioning. No representative of cultural industries sits on the AI strategy working group, which was created by Canada on September 26. We ask for significant involvement of the cultural sector in the development of AI policies and systems.

In the meantime, we need to take action and equip artists and cultural organizations on the ground. This fall, the FCCF will launch its national digital strategy, Impulsion 2025-30, which will mobilize its network around four major initiatives: taking action on public policy; strengthening digital skills and capacities; developing new structuring alliances with federal institutions, Quebec and the world; and supporting research and innovation.

This strategy will position a culture of evidence as a common thread in our collective efforts to put arts and culture at the heart of issues of discoverability, infrastructure, digital sovereignty and, of course, artificial intelligence. However, this transformation will not be possible without clear and sustainable federal support. Current investments of $2.4 billion in AI need to go beyond the private sector. They must also support the francophone arts and culture sector. We are asking Canadian Heritage and federal cultural institutions to work with Innovation, Science and Economic Development Canada and Employment and Social Development Canada to make their innovation and training programs accessible to our sector.

4:55 p.m.

Marie-Christine Morin Executive Director, Fédération culturelle canadienne-française

We met recently with ministers Steven Guilbeault and Evan Solomon. Minister Solomon referred to a “Gutenberg moment”, saying AI is transforming our cultural markers, just like printing did for knowledge. He pointed out that culture is at the heart of Canadian identity, and we agree totally with him. Our creators and institutions must be trained, supported and equipped for this transformation to be inclusive.

Finally, it is essential to train AI models in French using representative data from our communities, to promote the diversity of francophone cultural expressions in Canada. To ensure the consistent, inclusive, open and safe development of artificial intelligence in support of creation, the cultural industry must be part of the conversation and the future digital direction.

Thank you for your attention. I will be happy to answer your questions.

The Chair Liberal Lisa Hepfner

Thank you.

Finally, we turn to Meta Platforms.

You have five minutes for your opening remarks.

Kevin Chan Public Policy Director, Meta Platforms Inc.

Thank you, Madam Chair.

My name is Kevin Chan. I am the director of public policy at Meta.

I'm here with my colleague, Rachel Curran.

Meta employs more than 3,000 people in offices across the country, including in our AI lab in Montreal. Most Canadians use at least one of our family of apps to share with family and friends, discover businesses and connect over things that interest them. Our apps empower hundreds of thousands of Canadian businesses, artists and creators every month to reach new audiences and grow. Approximately 98% of the Canadian businesses using our platforms are small businesses and 55% are female-led.

Meta is also investing significantly in foundation AI models and generative AI. For example, Llama, our AI foundation model, is the leading open-source model, with over one billion downloads today.

Just as anyone can freely use our family of apps to connect and create, we believe AI technology should be accessible to all. At Meta, we have employed an open-source approach. That means making our AI models like Llama freely available for anyone to download, use, modify and build upon so that researchers and businesses of all sizes can customize and deploy these technologies in any environment without restrictive licensing or costly barriers.

This approach ensures that innovation, safety and opportunity are distributed as widely as possible, not concentrated in the hands of a few.

Research suggests that AI has the potential to inject $180 billion annually into Canada's economy by 2030. We know that AI helps countries grow in a competitive global economy. We also know that, along with Canada's persistent productivity gap relative to other OECD nations, Canadian organizations lag behind their global counterparts when it comes to commercializing AI. If Canada is going to realize its full potential as an AI leader, it must create policies that prioritize innovation and encourage investors to seize the moment.

Open-source AI should be a key part of Canada's AI strategy. It gives Canadian governments, businesses, creators and indigenous communities access to world-class technology without the cost burdens. It increases transparency and safety and results in more widely distributed benefits. Most importantly, it is the key to building truly made-in-Canada AI solutions.

Open-source models will be important for Canada as we seek to build our own AI stack, because it helps address concerns about building independent AI capabilities. Meta’s Llama models, for instance, are free to download so that anyone can build innovative new applications on top of them while protecting data and privacy.

These models can be run on local infrastructure and do not require data to be hosted elsewhere or shared with us. Organizations that handle sensitive data, especially public sector organizations, need high degrees of security and often can’t send their data to closed models over cloud APIs.

In one example, the time-constrained North Dakota Legislative Council convenes every two years for 80 days. In their 2025 legislative session, they piloted an AI solution using Llama to help review and summarize more than 1,000 draft bills. The solution runs 100% on premises, on secure local hardware, to ensure maximum data security and control.

Here in Canada, we were pleased to recently learn that federal departments are already using Llama to power solutions that are made in Canada and are secure, with government data staying within the Government of Canada.

We believe that AI sovereignty does not mean closing ourselves off from using models and products developed elsewhere. It's sovereignty, not solitude, as the Minister of Artificial Intelligence has said. It means taking frontier technology, regardless of origin, and adapting and refining it to best suit Canadian interests. Canadians can benefit enormously from the investments that companies like Meta have made in open-source frontier models.

Artists, musicians and other creators use our AI tools as creative partners. These tools help to automate repetitive tasks like editing and building content calendars, and they help those same creators reach new and bigger audiences through personalized recommendations and content optimization. Our goal is to empower people to tell their stories, build their businesses and connect with their communities in new ways.

Thank you for your time. We look forward to working together for a prosperous Canada that brings the benefits of AI to everyone.

5 p.m.

The Chair Liberal Lisa Hepfner

Thank you, Mr. Chan.

We'll turn to our questioning now, starting with Mrs. Thomas for six minutes.

5 p.m.

Rachael Thomas Conservative Lethbridge, AB

Thank you.

My first question goes to the representatives from Facebook.

I'm curious. Can you expand on a statement you just made, Mr. Chan? You said that AI “increases transparency”. How is that, exactly?

5 p.m.

Kevin Chan Public Policy Director, Meta Platforms Inc.

I was referring to more open-source AI increasing transparency.

The way we have gone about ensuring that we democratize this technology is by making our models freely available to anybody. That allows anybody—a large government agency, a business or a not-for-profit—to download a version of the model. They can run tests on it locally. They can poke and prod it. We publish model weights so that people have a better understanding of the nature of the model. Of course, they can then take it and fine-tune it to customize it for their particular needs.

5 p.m.

Rachael Thomas Conservative Lethbridge, AB

Talk to me a bit about the specifics of how AI is currently being used by artists or creators within Meta platforms in order to support their efforts and reach further audiences.

5 p.m.

Kevin Chan Public Policy Director, Meta Platforms Inc.

I'm happy to take that one.

I just spent some time with a round table of creators and artists who are using AI. You may have seen, as well, that The New York Times had a recent article about how artists are using this.

If we think about technology as an enabling tool for creativity and creators, there has been a boom of new use cases, new ways of having creative outlets to bring to life new kinds of ideas and new kinds of sources of expression—

5:05 p.m.

Rachael Thomas Conservative Lethbridge, AB

I'm sorry, Mr. Chan. In the interest of time, I need you to get to specifics.

How are creators specifically using AI to advance themselves within Meta platforms?

5:05 p.m.

Kevin Chan Public Policy Director, Meta Platforms Inc.

There are two ways.

First, as I mentioned earlier, they are using our platform to grow their audiences and discover new fans and new communities. That is built into the nature of the platform.

Second, I think there are a lot of artists who are using open-source and closed-source models to integrate them into their art. There are lots of installations across the world where artists who are at the leading edge of their work are using data visualization tools powered by open-source models like Llama to showcase new and creative ways of expressing themselves.

5:05 p.m.

Rachael Thomas Conservative Lethbridge, AB

One of the interesting things taking place within Meta platforms, specifically on Instagram, is using AI for age verification. Talk to me a bit about how that's being done.

Rachel Curran Head of Public Policy, Meta Platforms Inc.

I'm happy to take that one.

We build AI systems that help us identify underage users on our platforms. They read signals from users, whether they be the content they're interacting with or their friend networks, and look at things like birthday posts, of course, with identifying information removed. The systems look at a variety of signals to determine whether someone is underage or not.

If we think a user is underage—under 18—we proactively place them in a much more restrictive experience that we have built for youth, and we require that they prove to us that they are over 18 before we let them out of it.

Our AI systems are helping us determine whether users are in the right experience for their age or not.

5:05 p.m.

Rachael Thomas Conservative Lethbridge, AB

What if they are over 18 and they need to prove that? How do they go about doing that?

5:05 p.m.

Rachel Curran Head of Public Policy, Meta Platforms Inc.

We have a variety of methods. We work with a company called Yoti in the U.K., which makes facial recognition technology to predict someone's age. People can also submit a piece of ID to us that proves they're over 18.

We really want to make sure that people are in the right experience for their age, so we go through a pretty rigorous process to identify them if we think they're underage.