A search of the PubMed database, which indexes scholarly biomedical articles, reveals that 997,508 articles were published in the year 2011, which amounts to roughly 2,700 articles per day. Since the database does not include all published biomedical research articles, the actual number is probably even higher. Most biomedical researchers work in defined research areas, so perhaps only 1 percent of published articles are relevant to their research. As an example, the major focus of my research is the biology of stem cells, so I narrowed the PubMed search to articles containing the expression "stem cells." I found that 14,291 such articles were published in 2011, which translates to an average of 39 articles per day (assuming that one reads scientific papers on weekends and during vacations, which is probably true for most scientists). Many researchers also have two or three areas of interest, which further increases the number of articles one needs to read.
Needless to say, it has become impossible for researchers to read all the articles published in their fields of interest; if they did, they would have no time left to conduct experiments of their own. To avoid drowning in this information overload, researchers have developed multiple strategies for surviving and navigating the published data: relying on the recommendations of colleagues, focusing on articles published in high-impact journals, perusing only articles directly related to one's own work, or reading only articles that have been cited or featured in major review articles, editorials or commentaries. As a stem cell researcher, I can use these strategies to narrow down the stem cell articles I ought to read to a manageable three or four a day. However, scientific innovation is fueled by the cross-fertilization of ideas, and the most creative ideas often arise from combining seemingly unrelated research questions. The challenge for me is therefore not only to stay informed about important developments in my own areas of interest but also to learn about major developments in other scientific domains, such as network theory, botany or neuroscience, because discoveries in such "distant" fields could inspire innovative approaches in my own work.
In order to keep up with scientific developments outside of my area of expertise, I have begun to rely on high-quality science journalism, which can be found in selected print and online publications and on science blogs. Good science journalists accurately convey complex scientific concepts in simple language, without oversimplifying the actual science. This is easier said than done, because it requires a solid understanding of the science as well as excellent communication skills. Most scientists are not trained to communicate with a general audience, and most journalists have had very limited exposure to actual scientific work. To become good science journalists, either scientists have to be trained in the art of communicating results to non-specialists or journalists have to acquire the scientific knowledge pertinent to the topics they want to write about. The training of science journalists requires time, resources and good mentors.
Once they have completed their training and start working as science journalists, they still need adequate time, resources and mentors. When writing about an important new scientific development, good science journalists do not just repeat the information provided by the researchers or contained in the press release of the university where the research was conducted. Instead, they perform the necessary fact-checking to ensure that the provided information is indeed correct. They also consult the scientific literature and other scientific experts to place the new development in the context of existing research. Importantly, science journalists then analyze the new development, separating the actual scientific data from speculation and pointing out the limitations and implications of the work. Science journalists also write for a very broad audience, which poses its own challenge. Their readership includes members of the general public interested in new scientific findings, politicians and members of the private sector who may base political and economic decisions on scientific findings, patients and physicians who want to stay informed about innovative new treatments and, as mentioned above, scientists who want to know about new scientific research outside their area of expertise.
Unfortunately, I do not think it is widely appreciated how important high-quality science journalism is and how much effort it requires. Limited resources, constraints on a journalist's time and the pressure to publish sensationalist articles that exaggerate or oversimplify the science in order to attract a larger readership can all compromise the quality of the work. Two recent examples illustrate this. The first was the so-called controversy surrounding Jonah Lehrer, a highly respected and popular science journalist who was found to have fabricated quotations, plagiarized and oversimplified research. The case of Jonah Lehrer was a big shock for me. I had enjoyed reading a number of his articles and blog posts, and at first it was difficult for me to accept that his work contained so many errors and so much evidence of misconduct. Boris Kachka recently wrote a very profound analysis of the Jonah Lehrer controversy in New York Magazine:
Lehrer was the first of the Millennials to follow his elders into the dubious promised land of the convention hall, where the book, blog, TED talk, and article are merely delivery systems for a core commodity, the Insight.
The Insight is less of an idea than a conceit, a bit of alchemy that transforms minor studies into news, data into magic. Once the Insight is in place -- Blink, Nudge, Free, The World Is Flat -- the data becomes scaffolding. It can go in the book, along with any caveats, but it's secondary. The purpose is not to substantiate but to enchant.
Kachka's expression "Insight" describes our desire to believe in simple narratives. Any active scientist knows that scientific findings tend to be more complex and difficult to interpret than anticipated. There are few simple truths or "Insights" in science, even though part of us wants to seek out these elusive simple truths. The metaphor that comes to mind is the German expression "der innere Schweinehund," which literally translates to "the inner swine dog." The expression may evoke the image of a chimeric pig-dog beast created by a mad German scientist in a Hollywood World War II movie, but in Germany this expression is actually used to describe a metaphorical inner creature that wants us to be lazy, seek out convenience and avoid challenges. In my view, scientific work is an ongoing battle with our inner swine dog. We start experiments with simple hypotheses and models, and we are usually quite pleased with results that confirm these anticipated findings, because they allow us to be intellectually lazy. However, good scientists know that more often than not, scientific truths are complex, and that we need to force ourselves to continuously challenge our own scientific concepts. Usually this involves performing more experiments, analyzing more data and trying to interpret data from many different perspectives. Overcoming this intellectual laziness requires work, but most of us who are passionate about science enjoy these challenges and seek out opportunities to battle our inner swine dog instead of succumbing to a state of perpetual intellectual laziness.
When I read Kachka's description of why Lehrer was able to get away with his fabrications and oversimplifications, I realized that it was probably because Lehrer gave us the narratives that we wanted to believe. He provided "Insight" -- portraying scientific research in a false shroud of certainty and simplicity. Even though many of us look forward to overcoming intellectual laziness in our own work, we may not be used to challenging our inner swine dog when we learn about scientific topics outside our own areas of expertise. This is precisely why we need good science journalists who challenge us intellectually by avoiding oversimplifications.
The other recent case of poor science journalism -- and one that is equally instructive -- occurred when the widely circulated Japanese newspaper Yomiuri Shimbun reported in early October 2012 that researcher Hisashi Moriguchi had transplanted induced pluripotent stem cells into patients with heart disease. This was quite a sensation, because it would have been the first transplantation of this kind of stem cell into human patients. For those of us in the field of stem cell research, this came as a big surprise and did not sound very believable, because the story suggested that the work had been performed in the United States, and most of us knew that obtaining approval for using such stem cells in clinical studies would have been very challenging. However, many people who were not acquainted with the complexities of using stem cells in patients may well have believed the story. Within days it became apparent that the researcher's claims were fraudulent. He claimed to have conducted the studies at Harvard, but Harvard stated that he was not currently affiliated with the university and that there was no evidence of any such studies ever being conducted there. His claims about how he derived the cells and how little time it supposedly took him to perform the experiments were also debunked.
This was not the first incident of scientific fraud in the world of stem cell research, and it unfortunately will not be the last. What makes this incident noteworthy is how the newspaper Yomiuri Shimbun responded to its reporting of these fraudulent claims. It removed the original story from its site and issued public apologies for the poor reporting. The English-language version of the newspaper listed the mistakes in an article entitled "iPS REPORTS--WHAT WENT WRONG / Moriguchi reporting left questions unanswered." These mistakes included inadequate fact-checking of the researcher's claims and affiliations by the reporter and a failure to consult other scientists on whether the findings sounded reasonable. Interestingly, the reporter had identified some red flags and concerns:
--Moriguchi had not published any research on animal experiments.
--The reporter had not been able to contact people who could confirm the iPS cell clinical applications.
--Moriguchi's affiliation with Harvard University could not be confirmed online.
--It was possible that different cells, instead of iPS cells, had been effective in the treatments.
--It was odd that what appeared to be major world news was appearing only in the form of a poster at a science conference.
--The reporter wondered if it was really possible that transplant operations using iPS cells had been approved at Harvard.
The reporter sent an e-mail outlining these concerns to three others, including another news editor in charge of medical science, on the same day, and the reporter's regular updates on the topic were shared among them.
The science reporter said he felt "at ease" after informing the editors about these dubious points. After receiving explanations from Moriguchi, along with a video clip and other materials, the reporter sought the opinion of only one expert and came to believe the doubts had been resolved.
In spite of these red flags, the reporter and the editors decided to run the story. They gave in to intellectual laziness and the desire for a sensational story instead of tediously following up on all the red flags. They had a story about a Japanese researcher making a groundbreaking discovery in a very competitive area of stem cell research, and this was a story their readers would probably love. For this unprofessional conduct, the reporter and the editors received reprimands and penalties. Another article in the newspaper summarizes the punitive measures:
Effective as of next Thursday, The Yomiuri Shimbun will take disciplinary action against the following officials and employees:
--Yoshimitsu Ohashi, senior managing director and managing editor of the company, and Takeshi Mizoguchi, corporate officer and senior deputy managing editor, will each return 30 percent of their remuneration and salary for two months.
--Fumitaka Shibata, a deputy managing editor and editor of the Science News Department, will be replaced and his salary will be reduced.
--Another deputy managing editor in charge of editorial work for the Oct. 11 edition will receive an official reprimand.
--The salaries of two deputy editors of the Science News Department will be cut.
--A reporter in charge of the Oct. 11 series will receive an official reprimand.
I have mixed feelings about these punitive actions. It is commendable that the newspaper apologized without reservations or excuses and listed its mistakes. The reprimands and penalties also show that the newspaper takes its science journalism seriously and recognizes the importance of high professional standards. The penalties were more severe for the editors than for the reporter, which may reflect the fact that the reporter did consult with the editors, and they decided to run the story even though the red flags had been pointed out to them. My concern is that punitive actions alone will not solve the problem, and they leave many questions unanswered. Did the newspaper evaluate whether its science journalists and editors had been appropriately trained? Did the science journalist have the time and resources to conduct his or her research in a conscientious manner? Importantly, will science journalists be given adequate resources and protected from the pressures and constraints that encourage unprofessional science journalism? We do not know the answers to these questions, but providing the infrastructure for high-quality science journalism is probably going to be more useful than punitive action alone. We can also hope that media organizations around the world learn from this incident, recognize the importance of science journalism and put mechanisms in place to ensure its quality.