Thank you, Mr. Chair.
My name is Matt Malone, and I am an assistant professor at the Thompson Rivers University Faculty of Law in Kamloops. Today I am attending the meeting in a personal capacity.
I am going to use my opening remarks to share my thoughts through a case study: the selective ban of TikTok on government-issued devices that was announced in February 2023. As the committee might recall, that selective ban was accompanied by a statement about concerns relating to privacy and security.
These stated concerns do not explain several things. First, they do not explain why the government waited five months to act on the underlying intelligence brief that warned about TikTok's practices. Second, they do not explain why the government continues to buy advertising on TikTok itself. Finally, they do not explain why the government has ignored the fact that TikTok is not the only app that retains user data in foreign jurisdictions and potentially shares it with foreign regimes.
As the Treasury Board Secretariat confirmed to me a couple of days before this hearing, none of the following apps are banned from download and use on government-issued devices: the Russian-affiliated VKontakte social media app, the Russian-affiliated Yandex app, and the Russian-affiliated Mail.ru app, as well as other social media apps, like Facebook, Instagram, Tinder, Snapchat, Bumble, Grindr, Truth Social, Gab and Discord, which was implicated in the 2022-23 Pentagon leaks and which Dr. Laidlaw noted does not have child safety protection measures in place.
As I recommended in a recent article—and as I'll take this opportunity to recommend again now to the President of the Treasury Board—I believe that a better privacy and security baseline would see the government ban all social media apps on government-issued devices, unless there is a strong business justification otherwise. It's crazy to me that the apps I just listed are not banned on government-issued devices. I also believe that the government should stop buying ads on all social media services.
Even with such bans in place, it is worth noting that federal privacy law places no meaningful constraints on data transfers to jurisdictions like Russia and China. An internal government brief that I obtained through the Access to Information Act notes that Bill C-27, the proposed privacy legislation currently before Parliament, avoided including any new or European-style restrictions on the transfer of personal information across borders, specifically out of deference to commercial interests. It is very telling that the privacy bill before Parliament is being stewarded by the industry portfolio in cabinet, not by a portfolio responsible for human rights, public safety or national security.
Like many social media apps, TikTok does deserve opprobrium for its privacy violations, data harvesting and narrative control practices, and for granting access to data despite assurances otherwise. Like other social media apps, it is a vector for online harm visited on young people. Its business model is focused on privacy-invasive, targeted advertising that exacerbates the mental health crisis affecting young people. The app's safety features for children are all easy to bypass.
Through various access to information requests, I have seen several internal briefings where Canadian government actors repeatedly identified these problems. I'm happy to talk about these.
However, it's important to note that the real culprit here is Canadian law, because it does not stop these practices for TikTok or any other social media service. As TikTok lobbyists appearing before this committee repeatedly underscored, TikTok's handling of Canadians' user data is governed by Canadian law. That's the problem. Canada's privacy laws fail to respect the rights and interests of individuals and collectives in the digital age. Enforcement is basically non-existent. At the federal level, the Office of the Privacy Commissioner has become skilled at announcing its investigations with fanfare, but it is very slow at actually investigating, as I learned from my own complaint about the ArriveCAN app, which was ultimately sustained.
Law enforcement has struggled to adapt to the new digital landscape as well. The RCMP's national cybercrime and fraud reporting system, which this committee recently heard about in glowing terms as part of this study, is actually two years behind schedule and still in beta testing. Its website says that it accepts only 25 complaints per day nationwide.
To give members another illustrative example, as I learned in a recent access to information request, the RCMP's cybercrime investigative team has only eight employees in all of Alberta. Here in British Columbia, where a young person was recently the victim of a tragic sextortion scheme carried out over social media, there are only four employees on the cybercrime investigation team for the entire province. There are none in Saskatchewan, Manitoba or any of the Maritime provinces.
With privacy and data protection legislation that deprives citizens of meaningful protection, government funding priorities deeply out of alignment with stated values and actual needs, and gaps in law and policy that the government shows no urgency to fill, the federal government's policies and practices pose significant challenges to addressing the real harms that we are seeing perpetrated on social media these days.
To wrap up, I want to thank the committee for its unexpected invitation.
I also want to give a particular shout-out of appreciation to the MP for Mississauga—Erin Mills for her leadership on this very important issue. I've been very impressed with her work on this file.
I look forward to answering, to the best of my abilities, any questions that the committee members might have.
Thanks.