Don’t Board the Roller Coaster of Media Headlines

(Or “How to avoid re-traumatization by critically assessing medical research”)

 

Bereaved parents have survived what most consider the unimaginable⁠—the death of a child. And when that death is unexplained, a peculiar kind of grief lingers. What did I miss? Will it happen again? As bereaved parents, we push these feelings down and learn to keep them at bay only after a long, painful journey toward acceptance. But one thing that often rockets them back to the surface is an unexpected media headline with striking implications: “Researchers found the cause of SIDS!” “A simple blood test at birth is the answer!”

Since my daughter died almost 25 years ago, I have seen SIDS “solved” in the media at least a few dozen times. The reality is that Sudden Infant Death Syndrome (SIDS) and Sudden Unexplained Death in Childhood (SUDC) are heterogeneous categories, meaning there are many causes of sudden infant and child death that still elude our scientific understanding. The claim that any single research finding is “the cause” or “the answer” is preposterous.

The poet Emily Dickinson wrote, “Hope is the thing with feathers / That perches in the soul / And sings the tune without the words / And never stops at all.” We all need hope. And parents bereaved by unexplained infant and child deaths yearn to live to see the day the mystery behind their tragedy is finally made clear. The prospect of that clarity is one of the most tantalizing things to dangle in their view.

As a bereaved mother, a scientist, and the president of an organization dedicated to supporting parents bereaved by unexplained child death, I find that such media headlines make me angry⁠— really angry. I immediately think of the stress that bereaved families will face. The public reads these headlines with innocent hope that they are true. Readers may be so excited that they forward the story to the one bereaved parent they know, to share the great news. They think it will be the best part of that parent’s day. “Look! Researchers have found the answer to what you have been searching for!”

Sigh… deep breath…

The bereaved parent may receive one of these messages⁠—or many more⁠—depending on how widespread the media coverage is. They understand the good intentions of those sharing the alerts. But inside, the bereaved parent is often in knots as the roller coaster that ensues lurches into high gear.

“Is this true? Oh my God, is this what happened to my child? How did I not know? Were they in pain? Did they suffer? I can’t believe I didn’t check on them sooner. Can I test my other kids for this? What if they have it too? I can’t survive losing another child. Can this be prevented? What does this mean?”

The roller coaster can then continue into a dark spiral tunnel.

“Should I not have more children? Will this happen again? Did I cause this in my child? I missed my prenatal vitamins a few times. I know that was bad. I shouldn’t have had coffee. I need to find my husband/wife. Can I test for this now in my child who died? Maybe I should call the medical examiner and ask? Were they in pain? How could I have let this happen? Is this the answer? Is this true? I think I am going to vomit….”

And this is the source of my anger. Misinformation hurts. It is irresponsible, and it retraumatizes those who are most vulnerable. It is dangerous, and it undermines the adoption of the public health messaging that keeps our children safe.

To avoid this roller coaster, my wish is that bereaved parents adopt some of the guidelines below. They will save some heartache, and bereaved parents have already had more than their share.

  • Let the roller coaster pass. Wave it good riddance and choose not to engage. It is the best way to honor your self-care.
    • IGNORE the headlines. If people reach out to you with a dramatic headline, thank them for sending it and explain that you haven’t read the paper yet to evaluate the finding.
    • If it sounds too good to be true, it often is. Avoid the first wave of media articles. If those went too far, responsible journalists will usually follow up with a more accurate, measured perspective on what the study can teach us and why the earlier reports overreached.
    • Weigh the source: is the outlet reputable for high-quality journalism? Don’t assume that something is valid just because it is shared widely on social media. On the contrary, maintain a healthy skepticism.
  • Go to the original source: the actual published study. Yes, you can. It’s not like years ago, when you needed a paid journal subscription to read most articles. At a minimum, read the abstract (for example, on PubMed at pubmed.ncbi.nlm.nih.gov). This is a short summary, usually under 300 words, at the beginning of the published research paper that presents its key findings. Does it match up with the media statements?
  • Consider the study’s question, design, methods, analysis, and conclusions. If that sounds complicated, we can break it down.
    • The research question: a great research question is often described with the acronym FINER (Feasible, Interesting, Novel, Ethical, and Relevant). Did the authors clearly present the question guiding their study?
    • Study design: different kinds of studies offer different levels of evidence. Case series and case-control studies provide weaker evidence than cohort studies, randomized clinical trials, or systematic reviews.
    • Methods: these can be hard to assess if you are not familiar with the specific science, but one simple takeaway is to look at the sample size, the number of subjects studied. In general, the larger the number of subjects, the more compelling the results when differences are found. After all, plain luck and chance can never be removed from science. If you toss a penny 10 times, we know the probability of heads or tails on each toss is 50/50, but sometimes you might still get 8 out of 10, or even 10 out of 10, heads. It’s the same with research: we need to consider whether the results happened by chance. This is one reason it is so important for other researchers to replicate a study: do they get the same findings too?
    • Analysis/Conclusions: as human beings, we all have biases. Great research studies work to limit these biases in their design and should explain how in the methods. The analysis is where the researchers choose how to evaluate the data, and there are always many approaches one can take. Note that a significant difference between two groups may be mathematically real but not meaningful in the real world. For example, if a new medication is found to lower systolic blood pressure by only 1 mmHg, is it worth trying when it may have side effects, or when it is expensive? Does that minimal reduction in blood pressure make it worthwhile?
  • Read the actual paper (try not to be intimidated; you can do this!) and then compare its actual findings to the media reports. Do the conclusions line up with the headlines? Many medical journals now support articles written in plain language, so you may understand more of the content than you anticipate. Also check: is the journal peer reviewed? If so, the editors had subject-matter experts critically review the draft of the paper and were involved in its approval. Reviewers will often ask the authors to update the paper before it is published to clarify the findings and the information presented.
  • There is no such thing as a “perfect” research study; they all have limitations. Read what the researchers acknowledge as their limitations, usually found in the section of the paper called the “Discussion.” The media tend to gloss over limitations because they undercut the dramatic headline. Respectable researchers cite their limitations clearly and help the reader understand the findings in that context.
  • Next steps: most authors end their paper with their final conclusions and what they feel is needed next to keep moving the science forward. You will often see phrases like “needs to be replicated in a larger study sample” (in other words, take these findings with a grain of salt). Science is hard. Science is slow. And as humans, we understand that patience can be very difficult, especially when the subject is so personal and we are desperate for answers.
  • If none of the above is a good match for you, reach out to a trusted friend or medical professional who can help you critically review the study and discuss it with you. Or email info@sudc.org for help.
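For readers who like to see the numbers, the coin-toss point above can be checked directly: the chance of getting 8 or more heads in 10 fair tosses is about 5.5%, and even 10 straight heads is about 1 in 1,024. Uncommon, but far from impossible. This short Python sketch is my own illustration (the function name is arbitrary, not from any cited study) using the standard binomial formula:

```python
from math import comb  # binomial coefficient n-choose-k (Python 3.8+)

def prob_at_least(k, n=10, p=0.5):
    """Chance of at least k heads in n tosses of a coin that lands heads with probability p."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

print(f"P(>=8 heads in 10 tosses) = {prob_at_least(8):.3f}")   # 0.055, about 1 in 18
print(f"P(10 heads in 10 tosses)  = {prob_at_least(10):.4f}")  # 0.0010, exactly 1 in 1024
```

With 100 tosses instead of 10, getting 80% or more heads becomes vanishingly unlikely, which is the intuition behind why larger sample sizes make chance findings much less plausible.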

Anyone can do their homework and critically appraise research, really! In this age of real-time news that often bombards us, it is a great investment in yourself. It allows you to be more confident the next time you see a fantastic headline.

So, instead of jumping on that roller coaster, I hope you will instead enjoy a peaceful stroll in the park.

Written by:
Laura Gould
President, SUDC Foundation
Research Scientist, SUDC Registry and Research Collaborative

References:

  1. A simplified approach to critically appraising research evidence. Nurse Researcher. 2021;29(1):32-41. doi:10.7748/nr.2021.e1760

  2. Grady D, Browner W, Newman T, Cummings S, Hulley S. Designing Clinical Research. 4th ed. ISBN 978-1-60831-804-9.

  3. Elsevier Author Services. FINER: A Research Framework. https://scientific-publishing.webshop.elsevier.com/research-process/finer-research-framework/


