Scientific scepticism

Information overload has led to a dramatic rise in false information. Nathan Goodey asks whether we should be sceptical of the information we read.

Major breakthroughs seem like a common occurrence these days. In the last two months alone, NASA has revealed that water exists on Mars, the WHO has shown that processed meats can increase the likelihood of cancer, and scientists have developed a self-healing, flexible sensor that mimics the properties of human skin. These examples represent only a fraction of the thousands of papers published across the globe. Some boast work that Einstein would be proud of, others provide evidence to back up existing theories – but how many contain incorrect data? Motives and opinions differ everywhere you look, and knowing what to believe can be difficult.

We live in a society where information is easily accessible, at any place and any time. We read other people’s work and other people’s opinions, and we often take their account to be true. It is easy to forget that this is not necessarily the case. When it comes to science, we must place every ounce of trust in the people who collected the data. Without having done the research ourselves, we can only blindly judge what we agree or disagree with. The truth is that, most of the time, we don’t know any better. We take scientific results to be justified – we have no reason to believe otherwise. In all honesty, believing these ‘professionals’ is the very essence of faith.

Take yourself back to 2011, when disbelief quickly spread through the scientific community after OPERA scientists released results that appeared to show neutrinos travelling faster than the speed of light[1]. General relativity was suddenly under question, with no definitive proof that the data was correct. Unsurprisingly, the results were found to be wrong, the product of an equipment fault. What was simply an honest mistake led to the circulation of false information and unwarranted media attention. I believe the results should have been reviewed more carefully before being announced.

As well as honest mistakes, fraudulent data is a serious problem in science. In June 2014, two stem cell papers were retracted after several other groups failed to replicate their results[2]. Similarly, a paper on ‘chopstick nanorods’ was withdrawn in August 2013 because an image showed rods that had been copied and pasted into position using computer software[3]. Justice was served when the PhD student who had tampered with the results had his doctorate revoked, but questions remain over how the information was released in the first place. Problems exist throughout the publishing process that allow such erroneous papers to be printed. For instance, peer reviewers read and assess these papers voluntarily, sometimes with very little relevant background in the topic or understanding of the science. Unless the forgery is glaringly obvious, a reviewer cannot be expected to spot it without replicating the experiment – which they cannot be expected to do. Sometimes, if a paper concerns a major topic or discovery, a journal may go ahead and publish it even if questions surround it: editors would not want to reject a paper that could enhance the journal’s reputation or bring financial reward, or to see it aid a rival publication if turned away.


Who to trust? Flawed nanorod images[3]

All or nothing

Making up data seems irresponsible, mindless and dangerous, so why do some people go to such extreme measures? Money, reputation and personal gain all play major roles. Projects need financing, which is more likely to arrive with ‘successful’ results. For some, it is worth the risk of manipulating data if it means securing the funding to continue their research. Standing out for investment is also difficult when you are competing with research groups from across the globe, which adds the pressure of publishing conclusions by specific deadlines. Results can end up being neglected, with only those deemed important selected for publication, and under time constraints it may be impossible to double-check raw data before it goes out.

The Schön scandal is a good example of just how easily data can be created, manipulated and spread around the world. Jan Hendrik Schön worked in condensed matter physics and was hired by Bell Labs in 1997[4]. His findings on semiconductor materials were published in Science and Nature and received worldwide attention – yet no one could reproduce his results. By 2001 he was publishing one paper every eight days; at that rate, alarm bells should have been ringing. Eventually his results were exposed as forgeries when someone compared graphs from two of his papers and discovered they were identical. Following a formal investigation, the majority of his papers were retracted and his doctoral degree was revoked for dishonourable conduct. This remains one of the biggest known cases of data fraud to date. Although the scandal did bring to light that peer review needed reassessment, can the publication of dozens of completely fraudulent papers be justified? One can only guess that they were printed on the strength of his reputation rather than the work itself.

Concluding that fake information is solely the fault of scientists would be naïve, however. Large corporations are just as likely to manipulate data for their own benefit and greed. In 2006, the consumer goods company Procter & Gamble (P&G) was in dispute with the researcher Aubrey Blumsohn, who was refused access to drug trial data needed to verify conclusions published under his name[5]. Evidence had been concealed, and the conclusions reflected what the company wanted rather than what was found. Of course, corporations want to project a positive image, and any negatives are swept under the rug; it is more cost-efficient to alter the outcomes than to adjust the products.

Judgement day

It is a relief that faulty data has been discovered and rectified – but the worry is that it happened at all. While some bad results are honest mistakes, falsification leads to wasted time, misconceptions and difficulty establishing the truth. It should not be so easy to publish fake results, and such situations should be avoided at all costs.

With the vast amount of information we see every day – through education, television, the internet and so on – it will only become more difficult to know what to believe. Unlike the small few who know the ins and outs of a specific subject, the vast majority of us are left to decide whether to trust the data and conclusions of those who undertook the work. It is easy to be sceptical of what we read, and to an extent that is a good thing. Asking why and how, and not always taking something to be true, can enhance our understanding, but being suspicious of everything we read is counterproductive – common sense should be used where possible.

I expect that only more cases of incorrect, fraudulent and manipulated data will arise in the near future and that attempts to fix it, most likely through the re-evaluation of peer review, will be undertaken. However, I cannot see a simple method of removing the problem. It is inevitable that mistakes are going to be made. What is important is that we are careful, we stay vigilant and we don’t believe everything we read.



[2] D. Cyranoski, ‘Papers on “stress-induced” stem cells are retracted’, Nature News, doi:10.1038/nature.2014.15501



