Photo: Robert Suurmond (own picture)
Chic conferences, polished presentations and glossy journals. The scientific world seems like one big success story. “Seems”, because reality, of course, is different. In this series, Observant will look for the mistakes, the setbacks, the slip-ups, the unexpected turns. Because those too, or maybe even especially, are science. Today: Robert Suurmond, researcher at the School of Business and Economics.
According to Suurmond, the “Facebookisation” of science is a clearly visible trend in current academic culture. “Researchers only share things that are going well on social media, especially on LinkedIn or Twitter. New collaborations, successful funding applications, newspaper interviews, publications in journals. We don’t post about it when a journal rejects our paper.”
This also goes for studies with “no results” – studies whose findings are “not significant” or, in other words, statistically unconvincing. “An editor of a renowned journal in my field, management studies, allegedly once said, ‘If there are no results, you didn’t think your hypothesis through.’ Nonsense, of course – as if you can know exactly what the results will be beforehand. But it does touch on a fundamental point: each study should result in something interesting one way or another.”
If it doesn’t, most researchers will no longer try to write an article about it, says Suurmond. “Unless you can put a spin on it by ignoring the main relationship and focusing on another relationship that did happen to be significant. If you do this, you must explicitly state in the article that you are deviating from your original plan, but no one does that. This is an example of ‘hypothesising after results are known’, also known as HARKing.”
Suurmond studies supply chain management: how can organisations in a chain use each other’s expertise to improve products or services? “Take, for example, the automotive industry, which makes use of its suppliers’ knowledge in producing electric car models. This happens more often in Japan and Europe than in the US. This might be why we do have an electric Nissan, but no e-Ford.”
Suurmond wondered if the same applied to services. Concretely: do institutions like UM utilise the knowledge of their catering or cleaning companies, for example? And does this improve the quality of the services? Does it lead to a more sustainable or varied menu, or to more eco-friendly cleaning products?
Together with Facility Management Nederland, Suurmond sent a survey to one hundred “purchasing organisations” (including UM) and, for each of them, to one of the organisations that provides them with services.
His goal was to look at a hundred pairs, but the results were disappointing. “I received only fifty responses, all sent in by purchasing organisations. I suspect that the service providers didn’t participate because of competitive considerations. The catering industry and the cleaning industry in the Netherlands consist of a few major players that know each other and regularly take over contracts from each other. Although the study was anonymous, they wanted to keep their cards close to their chests. We limited ourselves to the purchasers instead, finding, among other things, that a good relationship with the catering or cleaning company leads to better services.”
All in all, the study failed, says Suurmond. “We didn’t answer the original question, even if we did manage to get something out of it. It was part of my PhD dissertation and it cost a lot of time and money, so I felt ethically obliged to turn it into an article. It won’t become a top publication, but it is interesting enough.”
A waste of time
Would he ever return funding if a project turned out to be unfeasible? A cardiologist at Utrecht University told newspaper De Volkskrant that he had returned 1.3 million euros to the Dutch Heart Foundation. “I think that took a lot of guts. I hope that I would do the same, but you can’t say that with certainty. In practice, you’ll still try to get something out of it, even if the money wasn’t intended for that. You’ve already put so much effort into getting that funding.”
He did recently see a funding application for the Dutch Research Agenda, which a consortium of several universities had been working on, fall through. “We’d been working on it for a year – organising meetings, writing proposals, you name it. We ultimately had to conclude that our idea – making the agricultural food production chain more sustainable – didn’t lend itself to the Dutch Research Agenda. Its Transport and Logistics section focuses on technical, smarter solutions, whereas we were looking for logistical improvements, like fewer links in the chain and better coordination.”
And working together in a consortium doesn’t necessarily pay off. “As the number of partners increases, it becomes more and more difficult to reach agreement, also because everyone values their own input. Our collaboration wasn’t a success and while it’s certainly a shame that the project was discontinued, I don’t think of it as a waste of time. We formulated challenges that are still relevant and may lead to new studies at a later time.”
Journal of “failed” science
Many things go wrong in science, as they do everywhere. But why are the failures, setbacks or dead ends in research rarely exposed? Is it because of the tremendous amount of pressure on researchers to be successful? Is that why failure is a taboo in science?
“We have unrealistically high expectations of researchers”, says recently graduated historian of science Martijn van der Meer. “If failure were a little more accepted in the scientific world, the work environment would immediately be a lot healthier and more pleasant. Sometimes failure is necessary to achieve something beautiful.”
Van der Meer is one of the master’s students from Utrecht University who founded the Journal of Trial and Error (JOTE). This open-access journal embraces negative, non-significant results rather than shying away from them.
The point of the journal, which first appeared in November, isn’t to glorify “sloppy science”, says Van der Meer. Papers with incorrect statistics, improper data collection or sloppy writing are rejected. All articles go through a rigorous peer-review process and first appear online as preprints.
The journal isn’t receiving a lot of submissions yet. “People have plenty of articles in their desk drawers”, suspects Van der Meer, “but they have to be brave enough to submit them. Some might be worried that being published in a journal of ‘failed’ science wouldn’t look good on their CV.” And that’s exactly the problem JOTE aims to address. (HOP)