Evidence of meeting #112 for Science and Research in the 44th Parliament, 1st Session. (The original version is on Parliament’s site, as are the minutes.)

A video is available from Parliament.

Also speaking

Pari Johnston  President and Chief Executive Officer, Colleges and Institutes Canada
Dylan Hanley  Executive Vice-President, U15 Group of Canadian Research Universities
Gabriel Miller  President and Chief Executive Officer, Universities Canada
Sarah Watts-Rynard  Chief Executive Officer, Polytechnics Canada
Alison Evans  President and Chief Executive Officer, Research Canada: An Alliance for Health Discovery
Ivan Oransky  Co-Founder, Retraction Watch

December 3rd, 2024 / 5:45 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

I think that it is worth trying. That's why I referred to DORA—the Declaration on Research Assessment. In fact, one of its main recommendations is not to use the impact factor.

I'm often asked—and your question is a fair one in this regard—what we should replace it with. I would argue that we should not replace it with a metric. There's a well-known sort of “law” that any metric can be gamed and will eventually be gamed.
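For reference (this is the standard Journal Citation Reports definition, not something stated in the testimony): a journal's impact factor for a given year is the average number of citations received that year by the items it published in the previous two years.

```latex
\[
\mathrm{IF}_y(J) \;=\;
\frac{\text{citations received in year } y \text{ by items published in } J \text{ during } y-1 \text{ and } y-2}
     {\text{number of citable items published in } J \text{ during } y-1 \text{ and } y-2}
\]
```

Both parts of the ratio are open to influence: coercive citation of the kind discussed later in this meeting inflates the numerator, while narrowing what counts as a "citable item" shrinks the denominator. That is one concrete sense in which such a metric can be gamed.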

I think we need to get back to the basics of reading several papers, engaging with the literature and engaging with the work of the particular group, researcher or department that is being assessed. I think that a qualified group of researchers and others can do that.

The Chair Liberal Valerie Bradford

Thank you. That's our time.

We'll turn now to Mr. Cannings for six minutes, please.

Richard Cannings NDP South Okanagan—West Kootenay, BC

Thank you.

I'm going to turn to Mr. Oransky as well, right off the bat.

Coincidentally, today I got a news notification on my phone from the journal Science about yet another fraud in science involving peer review. Hackers got into the system and posed as scientists writing favourable reviews. It is a big problem.

As I mentioned in earlier testimony, somewhere between one and eight million papers are published every year. It's an absolute tsunami of papers, and the volume has increased dramatically in recent years; you'd probably know the exact rate. I know colleagues of mine who are simply refusing to review papers anymore, because otherwise it would be all they did.

I guess you've been talking about some of the ways we can get around this and try to reduce the problem. Part of it, as you say, is the pressure to publish in quantity and perhaps to game the system on quality.

We've all been talking about DORA here and there. I just wanted to give you some more time to speak to that initiative: how it works and how we perhaps should be using it, rather than other measures, to assess the quality of the science produced.

5:45 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

Sure. Thank you.

I want to be clear that while we have reported on DORA and I'm familiar with it, I in no way speak for DORA or any of the signatories.

It's essentially a manifesto. There's actually another one called the Leiden Manifesto, which does something a bit different, but is getting at the same problem. It's looking at what is known as bibliometrics—I think that term came up previously in this hearing—and whether or not that is a good way to measure or to assess research.

I would also note that there are a number of very good bibliometrics scholars in Canada and around the world. In Canada particularly, Vincent Larivière in Montreal has done a lot of important work in this area. I might commend his work, and perhaps his testimony, to you in the future, if he hasn't already appeared.

In a short period of time, it's difficult to really go into detail about DORA and the others, but the general idea is that other metrics, if need be, or simply other ways to assess research—we heard about some of those previously on this panel—should be considered. For example, impact can be measured by whether or not research makes a difference; in other words, whether it literally has an impact. Has it been cited in policy documents? Has it led to change? Has it led to better outcomes?

This is a very downstream way to measure the impact of research. I would also argue there are other ways to measure whether a particular piece of research, or a body of findings more generally, has contributed to what we know about the universe, biology or neuroscience. All of those, if we need to replace metrics such as the impact factor—again, we all need heuristics and we all rely on them—are ways to do that.

Richard Cannings NDP South Okanagan—West Kootenay, BC

Is there any duty here, on the part of the publishers of scientific works, to police their own publications, to investigate how their publications may be gamed and to make sure they're publishing high-quality work? We all know there's a gradient of quality across publications and publishing houses. Is there some responsibility on their part to make sure that quality is maintained?

5:50 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

I would certainly argue that there is. I would also note that the publishing industry is largely unregulated. It's been very interesting to watch which regulators, particularly in the U.S.—with which, obviously, I'm most familiar—have actually pursued settlements, whether sanctions or even civil findings, against publishers. So far, it has mostly been the Federal Trade Commission, over false advertising claims, as opposed to the bodies you would hope would act: the health and funding agencies that would be particularly concerned.

Publishers, I think, could now also face scrutiny from agencies like the U.S. Securities and Exchange Commission—obviously there are agencies like that around the world—because a number of them are publicly traded. That might offer another lever. Essentially, though, right now they respond very well to public shaming when they're on the front pages of newspapers. Until quite recently, however, they have not taken what I consider the necessary steps to police the literature and clean it up.

Richard Cannings NDP South Okanagan—West Kootenay, BC

Do you think it takes articles like the one we saw about Elsevier today in Science to shame the company into cleaning up its act?

5:50 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

Of course, I can't speak for them, but I've seen over the 14 years that Adam Marcus and I have been running Retraction Watch that with more and more attention comes more and more cleaning up, and more retractions. I think there's a clear relationship there. There's also more ability for what we call “sleuths”, the heroic people who find issues in the literature, to do something, in other words, to make those findings public. I think the combination of those things has had an important effect, but there's much more to be done.

The Chair Liberal Valerie Bradford

Thank you. We're over our time.

We now turn to MP Tochor for five minutes, please.

5:50 p.m.

Conservative

Corey Tochor Conservative Saskatoon—University, SK

Thank you, Chair.

Thank you to our witnesses.

To carry on the questions for Retraction Watch, let's make another journal famous here: the International Journal of Hydrogen Energy. A paper was published in it, and the paper itself states:

As strongly requested by the reviewers, here we cite some references [[35], [36], [37], [38], [39], [40], [41], [42], [43], [44], [45], [46], [47]] although they are completely irrelevant to the present work.

That's 13 citations in the paper. The researcher was pressured by a reviewer to add citations that had nothing at all to do with the paper itself. How often does something like this happen?

5:50 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

Something that blatant, in other words, where we actually see the evidence for it, is still fairly rare. What is much more common, and probably far more common than anyone would like to admit, is pressure on authors, sometimes even from editors of journals who want to increase their impact factor—something we just heard about, of course. They don't quite come out and say this. Instead, review recommendations go back to authors in letters that say, “Well, we would really appreciate it if,” or “It would be better if you cited a paper from our journal,” or something like that.

Then it gets even more complex, and a little harder to track, when you have these—and I used this phrase in my testimony—“citation cartels” that people actually organize as citation rings. Again, we don't know exactly how often it happens, but if you were to speak to a group of researchers, I doubt that any of them, if they were being honest—and I would like to think they would be—would say that they've never had an experience where someone in some way pressured them to cite their work, whether it was a reviewer, an editor or someone else.

This, again, is a natural outgrowth, if you will, a completely predictable response. People respond to incentives, to knowing that your h-index needs to be higher. What's a good way to do that? Make sure you are cited more often.
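For reference, and not part of the testimony: the h-index is the largest number h such that a researcher has at least h papers with at least h citations each. A minimal sketch, using hypothetical citation counts, shows both the computation and how a few well-placed citations move the metric:

```python
def h_index(citations):
    """Largest h such that at least h papers have >= h citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank   # this paper still supports a larger h
        else:
            break      # sorted order: no later paper can qualify
    return h

# Hypothetical citation counts for one author's six papers.
papers = [9, 7, 5, 4, 4, 1]
print(h_index(papers))                   # 4

# One extra citation per paper, say from a cooperative citation ring,
# lifts the index from 4 to 5.
print(h_index([c + 1 for c in papers]))  # 5
```

This is why pressure to be cited, from whatever direction it comes, translates directly into the number a hiring or funding committee sees.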

5:55 p.m.

Conservative

Corey Tochor Conservative Saskatoon—University, SK

I'll switch gears a bit.

First off, thank you for the work you do. It's a public service, not just for Americans but for an international audience, that helps ensure we have the best possible science out there.

One thing you shared that was a little troubling is how few retractions we're catching here in Canada. Now, is this a good sign that research in Canada is healthy, or is it perhaps that our research isn't subject to as much scrutiny as in other countries?

5:55 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

I'll say something that perhaps would have been controversial some years ago: There should be more retractions from Canada. I don't mean any disrespect to your great nation; there should be more retractions from the United States of America as well. I could go on. The fact is that it's good news that we're finding them. There are fields and, in fact, sometimes journals that.... We heard the case of Jonathan Pruitt earlier, in the previous panel. It's pretty bad news when this misconduct happens. I believe the number of retractions.... I could double-check our database again. What's worse news, though, is how long it took to adjudicate. That's one lesson from that story.

However, here's some good news: A group of researchers from around the world got together and said they don't want people like Jonathan Pruitt to do any more collateral damage than they already have. This led to a lot of retractions, but also to protection for the researchers who were victims of Jonathan Pruitt.

I think all of these stories are complex. I am frequently asked, “What about this field? What about that field? What about this country?” I say that if there are fewer retractions, it's because people aren't looking. I trust things more when I see more retractions. Maybe that's easy for me to say, given my work, but I actually think that's an important way to think about it.

5:55 p.m.

Conservative

Corey Tochor Conservative Saskatoon—University, SK

I have a follow-up question.

If you have a government that would like to promote a false narrative, how much money do you think they would need to pour into research to make it—depending on the claim—seem real? How expensive would it be for an entity to fund research with the purpose of miseducating the public?

5:55 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

That's a good question. I'm going to answer hypothetically, obviously.

I think it's actually trivial, if you want to, for example, fund a large body of research. I'm not even sure you would need the funding. You could do other things to make sure that work is published, cited and eventually ends up in policy documents, guidelines and regulations.

I'm not talking about any particular area or field here, but I think it is not that hard to move a field in a particular direction, based on pushing on publishing levers.

The Chair Liberal Valerie Bradford

Thank you very much. That's our time.

Now we will turn to MP Kelloway.

You have the floor for five minutes.

Mike Kelloway Liberal Cape Breton—Canso, NS

Thank you very much, Chair.

I'm going to try to get questions to all of you, but I have five minutes, so we'll see where we go from here.

Ms. Watts-Rynard, I really appreciate your bringing up polytechnics and the difference between how research entities measure university research compared with polytechnic research. Where I'm from, the polytechnic is Nova Scotia Community College. I appreciate your making the distinction about the criteria used and how these need to change.

MP Lobb used an example of research that can help move bricks and construct different things, and I think that's of value. I also think the humanities have value. However, I think they're apples and oranges. I'm not sure what the purpose is behind, for example, studying unpaid work by women in Bogotá, but I could take a guess if I drilled down deeper. It might be to do a comparison between Canada and Bogotá. That's just an assumption on my part. It might be to learn best practices. Again, I'm assuming, because I haven't drilled down on it.

However, I want to go back to the applied research side.

I think there's a huge sandbox for us in terms of applied research. You talked about some of the recommendations, but I want to drill down.

What is the most important recommendation—and we'll have many—that you want to see when you open our report, as it relates to polytechnics getting the investment they need to help Canadians and—it's okay to say this—the world?

5:55 p.m.

Chief Executive Officer, Polytechnics Canada

Sarah Watts-Rynard

Maybe where I would start is to say that what I'd really like to see in the report is a desire to start thinking about research funding in a way that is much more evenly distributed across different kinds of research and different kinds of results. That is not to say it's unimportant to have research of all types, or to be thinking about principal investigator-led research of the kind that is happening at universities across the country and around the world.

I think the problem is that there's not an equal or even a realistic amount of emphasis placed on taking that knowledge and actually disseminating it to the companies, the individuals and the organizations across the country and around the world that can use that research.

I use artificial intelligence as an example. Canada is a leader in the theory behind artificial intelligence, but the application in small and medium-sized companies—and in large companies—is minimal. There is a leap to be made between that investigator-led primary research and how it then expands into the ecosystem. That's really where I'd like to see the committee put some emphasis.

6 p.m.

Liberal

Mike Kelloway Liberal Cape Breton—Canso, NS

Thank you for that.

I have one quick question, and then I'll go on to Mr. Oransky.

I don't know if it's still the same, but back in my day, in university settings, the principal investigator held the intellectual property. At the community college where I worked, that wasn't the case; it was held by the institution. Is that still the same across Canada, or is it specific to Nova Scotia?

6 p.m.

Chief Executive Officer, Polytechnics Canada

Sarah Watts-Rynard

With applied research, the intellectual property largely rests with the business partner. While it is created in collaboration, it's vested in the business partner, so they're able to turn it into a product or a service without any encumbering IP being held by the institution.

6 p.m.

Liberal

Mike Kelloway Liberal Cape Breton—Canso, NS

I think that's an important distinction that we need to make at this committee.

Mr. Oransky, I only have about a minute according to my phone. I appreciate the work you do, sir. I really do.

You talked about policing things and cleaning them up in terms of the people who are fudging numbers or outright lying. Beyond shame, which does carry an economic penalty and an intellectual penalty, is there anything we should be doing by way of other punitive measures for institutions or for principal investigators who do something they're quite aware they should not be doing?

6 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

There are some frameworks for that. A number of years ago, we looked at how many researchers like that had faced criminal sanctions. At the time, it was about one a year over the previous 40 years, so it is quite rare. Some would argue that should change.

There are also sanctions that sound somewhat administrative or bureaucratic, under which universities can be denied milestone payments or ongoing payments if they are not complying with certain regulations or certain checks.

In the U.S., there are the Office of Research Integrity and the NSF Office of Inspector General. It's complicated, so I may be overstating the case, but sanctions can be issued by these offices such that people can lose their funding.

6 p.m.

Liberal

The Chair Liberal Valerie Bradford

Thank you. That's the end of our time there.

Now we will turn to MP Blanchette-Joncas for two and a half minutes.

Maxime Blanchette-Joncas Bloc Rimouski-Neigette—Témiscouata—Les Basques, QC

Thank you, Madam Chair.

Mr. Oransky, we know that, in Canada, only 15 universities receive 80% of the funding and that only 20% of researchers receive 80% of the total funding.

Why do some organizations choose not to sign on to the San Francisco Declaration on Research Assessment, commonly known as DORA? In your opinion, what benefits do they gain from maintaining a system that favours impact factors and university rankings at the expense of a more inclusive approach?

6 p.m.

Co-Founder, Retraction Watch

Dr. Ivan Oransky

If I understood correctly, unfortunately, I think your question in some ways answers itself, in the sense that the rich become richer and the powerful remain powerful.

An impact factor, just like any of the other metrics that are used, can be used essentially to cement and consolidate funding and, I would even argue, power—but that is actually another reason why these are such problematic metrics.

Again, researchers at those institutions tend to be cited more often anyway, and then they can just double down on that, so they have no particular incentive to change. I would credit those institutions that are wealthy in terms of funding and that have signed the declaration or taken other measures, because I think it signals real intellectual honesty and a willingness to change that might benefit everyone, instead of just continuing the Matthew effect.
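The Matthew effect mentioned above has a standard toy model: preferential attachment, in which each new citation goes to a paper with probability proportional to the citations that paper already has. A minimal sketch (all numbers hypothetical, not drawn from any real dataset) contrasts that dynamic with citations handed out uniformly at random:

```python
import random

def simulate(num_papers=100, num_citations=10_000, preferential=True, seed=42):
    """Distribute citations, optionally weighting by current counts."""
    rng = random.Random(seed)
    counts = [1] * num_papers  # seed count so every paper can be chosen
    for _ in range(num_citations):
        weights = counts if preferential else None
        target = rng.choices(range(num_papers), weights=weights)[0]
        counts[target] += 1
    return sorted(counts, reverse=True)

for mode in (True, False):
    counts = simulate(preferential=mode)
    share = 100 * sum(counts[:10]) / sum(counts)
    label = "rich get richer" if mode else "uniform chance"
    print(f"{label}: top 10% of papers hold {share:.0f}% of all citations")
```

Under the rich-get-richer rule, early leads compound; under uniform chance they do not. That compounding is the dynamic that citation-based metrics feed back into funding decisions.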