Anyone who has ever browsed the seemingly endless archives of Google Scholar, ProQuest, or PubMed knows that a wealth of research is readily available to the public. Such works can be prohibitively complex for the layperson, however, and in the spirit of promoting equal access and openness, many have taken it upon themselves to simplify items of particular interest for the public’s benefit. The risk of such simplification is that, lacking the expertise professionals possess, the general public may do itself more harm through misunderstanding.
Most abstracts contain something of a disclaimer: a series of phrases along the lines of “Our study suggests _________. Given this result, more research into _________ should be done to see if _________.” These lines emphasize the difference between correlation and causation, acknowledge the likely presence of bias, and recognize uncertainty; in short, they clarify the work’s intermediate nature. Most researchers understand that few conclusions offer a definitive solution to a problem, but that disclaimer is often lost in translation when research is simplified for the public.
Several years ago, Dr. Hannah Gardener of the University of Miami saw her work on a possible diet soda-stroke link wildly overpublicized because its disclaimer was overlooked. In an interview with Salynn Boyles of WebMD, Gardener stated: “It was just an abstract presented at a meeting. … We are still working on the analysis. I don’t think the level of press attention it received would have been warranted even if it was a published paper.” Gardener’s situation is a clear example of how the public can overreact based on an oversimplified understanding. In her article, Boyles emphasizes Gardener’s feeling that “the headlines often got it wrong by leaving the impression that her research proved the diet soda-stroke connection,” even though many reports actually stated that “the findings were preliminary and did not prove that diet sodas cause stroke.” This explanation highlights the media’s role in public misinterpretation, and sensational headlines can indeed misrepresent findings. However, the media cannot be held liable: headlines are written with the assumption that the article will then be read, and ultimately it is the audience’s lack of follow-through that is to blame for situations like these.
The problem does not end with headlines, though. Some forget that while a study may suggest a correlation between factors, those factors may not have been analyzed under natural circumstances, or even with human subjects. For instance, a widely publicized 2005 experiment that found a correlation between aspartame and cancer in rats tested levels of consumption so high that they were not directly applicable to anyone drinking fewer than a thousand cans of diet soda a day. Similarly, older studies found a correlation between saccharin and cancer in rats, but differences between rat and human physiology nullified the association’s relevance to humans. Both cases required a thorough understanding of the research to see that it did not prove artificial sweeteners harmful.
Our problem lies not in the existence of oversimplified research, but in the public’s chronically shortened attention span. Individuals do harm when they make choices based on sensational headlines rather than nuanced explanations, or when they act on one paper without considering the dozens that take the opposite position. For their part, researchers engaging with the public must choose their words carefully. Promoting research, perhaps even “dumbing it down” for the public, seems the right thing to do. But as with any other right, it comes with the understanding that the public must respond with intellectual responsibility.

Layla Stahr is a Brevia staff writer. She can be reached at email@example.com.