Mr. Speaker, today I am going to speak about one of the online harms act's core purposes, and that is the protection of children. Our government will stop at nothing to ensure that kids in this country are safe, and this includes their online safety.
Our children spend many hours of their day watching online videos, chatting with their friends and posting snippets of their lives. Being online is integral to their lives and offers many benefits. It is a way for them to connect, learn and find entertainment. However, the online space is not always safe for children. We have rigorous toy standards to ensure that Canadian kids do not get hurt while playing. The Internet is the most complex and risky toy ever invented. It must have its own safety standards to protect kids from the harms embedded within social media platforms.
For too long, we have tolerated a system where social media platforms have off-loaded their responsibilities onto parents, expecting them to protect their kids from harms that platforms create and amplify. Until now, there have been no safety regulations for online platforms. Parents and kids do not know where to turn to get help when things go wrong online.
The bill would create a baseline standard for online platforms to keep Canadians safe. It would hold platforms accountable for the content they host.
Over the last several years, we have conducted extensive public consultations. A common theme was the vulnerability of children online and the pressing need to take steps to protect them. At the same time, the consultations highlighted a desire for a flexible, risk-based approach to online regulation. Bill C-63 would balance these two objectives.
I am disappointed to see the Conservatives discredit the hard work of the organizers, victims and survivors across the country who were consulted on the legislation. By refusing to support the bill, they are rejecting that experience and the reality that children are not safe online today. The bill was meticulously crafted to keep Canadians safe while ensuring that their rights are maintained.
The online harms act introduces a new duty to protect children. It requires platforms to integrate design features that protect children and to report on the measures they are taking to do so. The specific design features will be identified through open regulatory processes where all interested parties have a chance to be heard. This would ensure that the measures are fit for purpose, consider the latest research and evidence, and are workable for the social media services that must implement them. We believe this approach to protecting children respects the government's position of supporting a safe and inclusive digital space in Canada.
The online harms act would require operators of social media services to integrate design features that protect children, such as age-appropriate design. Bill C-63 does not opt for a prescriptive approach requiring the use of a specific technology, such as age verification; instead, it opts for a principle-based approach that can evolve with technology. The goal of age-appropriate design is to make the online user experience of children safer by decreasing the risk that they will encounter harmful content. This might include design features such as parental controls, default settings related to warning labels on content and safe search settings.
Age-appropriate design is useful because it is not a one-size-fits-all approach. It recognizes that a five-year-old and a 16-year-old interact with the online world differently, so they likely require different design features to improve the safety of their online experience. The digital safety commission would articulate these features through regulations after examining industry practices and available technology, as well as engaging with stakeholders and Canadians. This process would ensure that the subsequent regulations on design features that protect children are well-informed and in line with Canadians' expectations of privacy and digital expression.
Bill C-63 was crafted with special attention to freedom of speech, a charter right that the government will always protect. At each step, we made design choices with freedom of expression top of mind. Under the online harms act, the risk-based approach is anchored in a duty to act responsibly that requires platforms to create safer spaces online so that users are less likely to encounter harmful content. The duty to act responsibly seeks to ensure that services have adequate systems in place to limit the likelihood of users viewing harmful content.
Bill C-63 would also enhance the protection of children online by amending an act respecting the mandatory reporting of Internet child pornography by persons who provide an Internet service, the mandatory reporting act. The bill would strengthen reporting obligations under that act to help facilitate child pornography investigations. It would also allow for the centralization of reporting to a single law enforcement body, responding to a long-standing request from law enforcement and child advocates.
The duty to report would be triggered when the service provider has reasonable grounds to believe that their network is being or has been used to commit a child pornography offence. The reporting requirement would also be enhanced to require the provision of transmission data in any report where the service provider believes that the material is manifestly child pornography.
We recognize that children are spending more and more of their time on the Internet. Our goal is not to prevent children from having access to valuable information and a social experience online. Our goal is the opposite: to make the online environment as safe as possible for them to explore. The duties set out in the online harms act would be critical to accomplishing this goal.