Utrecht Professor of Jurisprudence Ton Hol
Research integrity isn’t yet at the forefront of everyone’s mind, says UU professor of Jurisprudence Ton Hol, the main author of a report on this subject for a network of European universities. Universities should work on that. Scientists’ knowledge of research methods and statistics leaves room for improvement, the integrity requirements for research should become stricter, and the atmosphere at universities should become more open, so that difficult issues in research can be discussed.
“A case like Diederik Stapel’s is an exception,” Ton Hol says. He’s referring to the professor of Psychology at Tilburg University, who was exposed in 2011 for deliberately fabricating data in order to gain prestige with his research. “Much more often, the problem is that scientists are just a little too lax with their data, and draw conclusions too quickly. As universities, you need to be able to prevent that, by urging students as early as the Bachelor’s phase to be careful with their data, and by creating an atmosphere in which scientists dare to share their doubts with each other.”
The professor of Jurisprudence has steadily become an expert in the area of research integrity. He’s chairman of the committee on research integrity of the universities of Utrecht and Tilburg, and has presided over committees for controversial integrity-related cases. Examples include the case of Tilburg professor Arie de Ruijter, who was accused of PhD fraud, and recently the University of Amsterdam asked him to investigate alleged plagiarism in the academic orations of its former rector Dymph van den Boom. He’s also co-author of the Netherlands Code of Conduct for Research Integrity, and chairman of a committee on research integrity of the League of European Research Universities (Leru). In January 2020, his committee issued an advisory report that has since been embraced by all member universities.
What was the reason for this Leru advice?
“A few years ago, it became evident that the authority of science was under more and more pressure. Diederik Stapel, for instance, was a well-known scientist with a beautiful CV filled with publications. So when it turned out he was making up data, that wasn’t exactly a great advertisement for science. You’d also see, outside of the university, that people would start to question the outcomes of scientific research, for instance research on climate change. In response to all this, Leru established a workgroup to identify what the root of the issue of research integrity is, and what we, as European universities, can learn from each other.
“We also wanted to respond to new developments. Open science is amazing in its transparency, but it also brings certain risks along with it, for instance around when you can and cannot share data.”
What are some of the conclusions you drew?
“We concluded that intentional fraud within science is truly an exception. Deliberately tampering with data rarely happens. What does happen is an increase in the number of complaints about sloppiness in science.
“There are several causes for this. Often, the researcher’s knowledge of methods and statistics is insufficient. That causes them to quickly draw conclusions that the data can’t actually support. In those cases, researchers get more out of the data than is actually there. You’d want scientists to have their statistical analyses checked more often, or to dare to talk to colleagues when they’re having doubts.
“Pressure to publish also plays a role. You need results to be eligible for grants, so it’s very tempting to skip that extra check if the results seem promising. That pressure needs to be relieved.”
Is research integrity more important in research done on behalf of external parties?
“Universities obtain a lot of extra funding for independent research done on behalf of third parties; on average, that can account for as much as 25 percent of a project’s financing. Commissioned research in itself is not a problem, and is even desirable in terms of the social value of science. But in this type of research, you need to be transparent and justify extremely well how you reach your conclusions. It’s important to guard against being influenced by desires of the commissioner that you as a scientist can’t actually meet, for instance by embellishing research results.
“You also have to prevent even the appearance of a conflict of interest. Take, for instance, the case of the Philip Morris foundation, which wanted to finance a study on tobacco smuggling. It was a methodologically sound study that was going to be conducted by renowned scientists, but people were afraid the tobacco industry would use the results for its own gain, for instance to push for lower taxes on tobacco.
“That means that as a university, you need to be aware of the consequences of research and simultaneously view academic freedom as being of paramount importance. There’s some tension there.”
What can be improved, according to the Leru committee?
“In the first place, universities have to be really clear about the requirements that studies have to meet. That norm differs per discipline, and in some places, those requirements could be made more explicit.
“Furthermore, scientists should always be able to justify their research. That means that they can explicate how they obtained their data, and that wherever possible, they can share their data. In principle, research should be replicable. Still, we’re aware that you can never guarantee that the results will be exactly the same if you do replicate a study. Say you’ve got a questionnaire. It might be that people will give slightly different answers in summer than they would during winter, because weather influences how people experience or view things.
“Scientists should also have a thorough knowledge of research methods and statistics. Explicit attention to related skills, such as data visualisation, is also important, to prevent images from presenting data in a skewed manner.
“Finally, researchers should know their limitations. This is especially important in multidisciplinary research, where each discipline has its own expertise. A legal expert tends to know little about econometrics, and it’s good to realise that when you’re collaborating. Attention needs to be paid to all this in academic education as well as during PhD tracks.”
To what extent does open science play a role in all this?
“Open science makes research more transparent, making it more easily accessible for society. That’s also a matter of good research practices and ethics. Researchers don’t just share scientific insights, but often data as well. But there’s a downside to this as well. What are the boundaries for sharing data? Look at privacy for example. Are the respondents identifiable? There’s also the risk of third parties taking your data and running with it. Think, for instance, of commercial organisations. You don’t want others to free-ride on your research. That’s why we say: make research as open as possible, and as closed as necessary.”
The report states that you’re opposed to anonymous complaints. But isn’t there often an imbalance of power within science, and don’t you need whistleblowers in that case?
“Definitely. It’s good, and often courageous, when people blow the whistle, but they shouldn’t do so anonymously. There’s a reason whistleblowers are protected and shouldn’t suffer repercussions from blowing the whistle: that protection compensates for the imbalance of power. So why is it undesirable to complain anonymously? If someone files a complaint against you as a scientist, that has an enormous impact. Your reputation is at stake. People lie awake at night over things like that. You should at least have the right to defend yourself, and that’s impossible against an anonymous complainant. Sometimes it turns out that a complainant’s goal is to harm the other person, for instance because the two were in conflict with each other.
“I draw a parallel with the legal system here. If you’re on trial as a suspect in a criminal case, you need to know who the witnesses are so you can defend yourself. Perhaps people testified because they wanted to exact revenge, which can taint their testimony.
“In rare cases, boards can decide to have a case investigated after an anonymous tip: for instance, when the case has made the news, when the investigation can be done without the anonymous complainant, and when it concerns something very important.”
To what extent should universities keep the issues behind closed doors – or publicise them?
“Because complaints procedures have such great impact, and in most cases no severe issues are brought to light, it’s important that cases are treated confidentially. Research reports are published in anonymised form for that reason; in the legal system, it’s the same with verdicts. You can see what happened and how the university dealt with it, while the people in question remain anonymous. Unfortunately, this doesn’t always work that way in practice. Often, people can still link an anonymised complaint to a specific person. And once the media has reported on a case, universities sometimes feel forced to make the whole thing public.”
“There’s also some discussion about whether universities should perhaps disclose everything in all cases. Sweden, for instance, is an example of a country where cases are publicised with names and details intact. I’m not a proponent of that system, because these cases often don’t concern serious matters. When names are exposed, the general public often misses the nuance, and reputations are quickly at stake: where there’s smoke, there’s fire, people think. That’s also why newspapers tend to be cautious about mentioning the full names of suspects in criminal court cases.
“However, there might be situations in which it’s important to notify another university, where someone has since taken up a position, of a breach of research integrity. That’s a choice the university board makes, and it’s a difficult one, considering privacy protection. But the main rule should be to be very cautious with publicity surrounding cases. Before you know it, it won’t be the committees on research integrity making well-informed judgments about these issues, but the general public, and that’s not necessarily a good thing.”
Like recently, with the plagiarism affair surrounding the former Amsterdam rector Dymph van den Boom, where you were a member of the committee?
“I don’t usually talk publicly about cases I advise on, but I’ll make an exception here. Yes, she’d already been vilified by the media before any research had been conducted. In our committee, we called it ‘trial by media’. For all sorts of reasons, this is a very complicated case that could’ve been handled via the regular complaints procedure. In this case, we pointed out (a pattern of) inaccuracies in citations, but didn’t reach the conclusion that there was any plagiarism at play: she didn’t present the work of others as her own, especially when you take into account the background of these speeches. You can see how, in media publications after the fact, some very selective citations were taken from our advice. That’s harmful to everyone.”
When you take stock, what would you say the status of research integrity in the Netherlands is?
“There’s a lot of attention to the topic in the Netherlands. Look at Utrecht, where we’ve already done a lot of work. Since 2014, PhD candidates have had to take an oath, and new employees have to sign a scientific code of conduct. We’ve got a teaching fellow who studies how research integrity can be incorporated into education. There’s also been a steady build-up of expertise in this area; there’s good reason someone from Utrecht was asked to join this workgroup.
“It’s remarkable that we handle research integrity differently in Europe than they do in the United States. In Europe, we’ve got more of a reactive stance: we respond to wrongs when someone complains. In the US, they’re more proactive: they check research even if there are no complaints. I don’t think that would fit our culture very well. We tend to assume a level of trust amongst ourselves, and in my opinion, that should remain the way it is. I do think it’s advisable to have something like this on a voluntary basis. It’d be good if someone did random checks of the statistical analyses in studies. In that sense, there’s a lot we could learn from the US, where they’ve been working on research integrity much longer than we have in Europe.”
By Ries Agterberg, DUB
Translation Indra Spronk