Thank you, Mr. Chair.
I would also like to thank the witnesses who have done us the honour of being here today: Kevin Chan, Derek Slater, Neil Potts, Carlos Moje and Michele Austin. We would have liked Mark Zuckerberg to be with us, but he did not show up. We hope he will join us another time.
I listened very attentively to two propositions from Mr. Chan. I would like to offer a linguistic clarification for the interpreters: when I use the word “proposition”, I am referring to the English term “proposition”, not “proposal”.
In presenting the issues raised by his company, Mr. Chan said that resolving them was not Facebook's responsibility alone. We fully agree on that point.
On those same issues, he added that society must be protected from the consequences. Of course, these platforms have social benefits. However, today we are talking about the social unrest they cause; that is what concerns us more than ever.
Facebook, Twitter and YouTube were initially meant to be a digital evolution, but they have turned into a digital revolution. Indeed, they have led to a revolution in systems, a revolution against systems, a revolution in behaviour, and even a revolution in our perception of the world.
It is true that artificial intelligence today depends on the massive accumulation of personal data. However, that accumulation puts other fundamental rights at risk, because it relies on data that can be distorted.
Beyond commercial and profit considerations, would it not be timely for you to attempt a moral leap, or even a moral revolution? Having achieved this dazzling success, why not now focus much more on people than on the algorithm, provided that you first impose strict restrictions on yourselves in order to promote accountability and transparency?
We sometimes wonder whether you are as concerned when misinformation or hate speech occurs in countries other than China, or in places other than Europe or North America, among others.
It isn't always easy to explain why young people, or even children, are able to upload staged videos containing obscene scenes, insulting comments or swear words. We find this unacceptable. At times, this deviates from the intended purpose of these tools, from common rules and from accepted social norms.
We aren't here to judge you or to put you on trial, but rather to implore you to take our remarks into consideration.
Thank you.