
Clinical studies: Finding the value and ignoring the noise

Just over 40 years ago, the New England Journal of Medicine published a five-sentence letter to the editor [1] that would help ignite one of the worst public health tragedies of our time. Written by a Massachusetts physician and graduate student, the short letter merely observed that out of thousands of hospitalized patients who received an opioid, there were only four “cases of reasonably well documented addiction.” The letter was not the result of a clinical trial or a double-blind study designed to determine the addictive nature of opioids; it was a brief observation, offered without evidence or data analysis to support it.

Over the course of the next several decades, this single letter would be cited more than 600 times, and in more than 400 of those instances it was referenced as evidence that opioid addiction was rare [2]. In the decades that followed, drug companies would use this brief correspondence as a ‘clinical permission slip’ during their sales calls to providers, and a drug crisis would catch fire and sweep through the United States. This story is only one sobering example, but it makes a critical point: when we interpret scientific data or clinical studies for potential cures or treatments, we cannot take shortcuts. We must do the work of learning what the data is telling us, and what it is not.

From both a medical and data perspective, we’re light years ahead of where we were in 1980. A surgeon in one hospital can now operate on a patient hundreds of miles away via remote capabilities. Medications have moved well beyond simply acting on specific receptors; we now have drugs that can target specific genes in our bodies. And while our data capacity was once confined to 80 kilobytes on a floppy disk, we can now hold terabytes on a single flash drive.

Yet, despite our progress, the realities of today still tempt us to consume our information in quick soundbites or via social media posts of 280 characters or fewer. In our ‘need it now’ culture, we often expect data and new information to be distilled into the most simplified form, converted into a catchy headline, or reduced to a binary output. Recently, I’ve been amazed at how many times I’ve clicked on a link from a financial website touting the latest results of a vaccine clinical trial, only to find that the actual clinical data has yet to be published. We must be careful not to give in to careless consumption of information, risking misinterpretation of clinical data and missing the key information it offers, even when the conclusion is not the answer we hoped for.

We typically trust the healthcare and scientific communities to be at the front lines of discovering new treatments and gaining new insights into how our bodies are impacted by food, exercise, stress, sleep, medications and more. However, not all clinical data is created equal. That foundational notion was emphasized more times than I can recall during my pharmacy school training, whether in our biostatistics course or on clinical rotations. Scientific data requires intentional, focused scrutiny. It is critical to understand the methodology, the measurements, and the statistical outcomes and conclusions the researchers are suggesting. Does the outcome of the study really offer the value and impact claimed in the news headline?

Let’s take food as an example. We typically hear headlines about how wine may enhance cardiovascular health, how chocolate supposedly prevents cancer, or how coffee might protect liver health. News organizations have caught on that we all love to hear how our favorite foods may actually bring us good health and longer lives, so food studies frequently make the headlines. Researchers will often review food intake data from a questionnaire alongside specific health-related outcomes, hoping to discover correlations that reach statistical significance – that is, associations that would be unlikely to arise by chance alone if no real relationship existed. However, even statistical analysis is not foolproof, and results need to be verified and reproduced in additional studies.

To highlight this, some data researchers compiled food questionnaires from 54 participants, along with a survey of various questions about their personal preferences [3]. It turns out that even with such a small number of completed surveys, they were able to reach a “statistically significant” conclusion (i.e., a result that would be unlikely to occur by chance if no real relationship existed) that people who add salt to their diets also had a positive relationship with their internet service provider. That may sound ridiculous, and that’s because it is ridiculous: test enough food-and-preference combinations and a few will cross the significance threshold by chance alone. The researchers were intentional in offering this absurd example to make the point that we need to take a step back, properly review clinical studies, understand how to look for real value, and bring our common sense to the exercise.
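For the technically curious, here is a minimal simulation sketch in Python showing how easily pure noise can produce a “significant” finding. This is not the researchers’ actual analysis: the sample size of 54 matches the article, but the number of survey questions, the yes/no answer format, and the use of a Pearson correlation test are illustrative assumptions.

    import numpy as np
    from scipy.stats import pearsonr

    # Simulate a small survey: 54 respondents answering unrelated yes/no
    # questions (e.g., "Do you add salt?", "Do you like your internet
    # provider?"). Every answer is random noise -- no real relationships.
    rng = np.random.default_rng(seed=0)
    n_respondents = 54   # matches the sample size in the article
    n_questions = 100    # illustrative assumption
    answers = rng.integers(0, 2, size=(n_questions, n_respondents))

    # Test every pair of questions at the conventional p < 0.05
    # threshold, the usual bar for "statistical significance."
    significant, total = 0, 0
    for i in range(n_questions):
        for j in range(i + 1, n_questions):
            _, p = pearsonr(answers[i], answers[j])
            total += 1
            if p < 0.05:
                significant += 1

    # Nothing here is truly related, yet roughly 5% of the 4,950 tests
    # will clear the threshold anyway.
    print(f"{significant} of {total} pairs were 'significant' by chance")

Run enough comparisons and spurious “discoveries” are guaranteed, which is exactly why findings need to be verified and reproduced before anyone acts on them.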

At Sedgwick, our pharmacy program works hard to understand the clinical data we review so that our services bring real value to our clients. Each week, our clinical leadership team meets with our clinical pharmacists and pharmacy interns to review the latest studies and published findings related to health issues that impact workers’ compensation and our injured workers. We discuss study methodologies and measurements, examine the clinical analysis and conclusions, and weigh the findings against additional studies that may have been published on the same or similar topics. Overall, does this data offer value that we can in some way apply to the benefit of our clients and their injured workers? Some of the questions we’ve asked during our recent meetings include:

  • Does the latest pain reliever approved by the FDA provide any true value to patients?
  • What is the latest data on cannabis and its ability to address pain and/or anxiety?
  • What do the clinical studies really tell us about hydroxychloroquine and its ability to treat COVID-19?
  • What can we glean from new guidelines on chronic pain?

These represent only a few examples of the type of research and analysis we dig into each week. We understand that continually learning from new data and clinical literature is critical to the value our program provides, and that we cannot afford to take a headline at face value. Behind that headline is either something we can add to our library of insights or something to set aside. Either way, at Sedgwick, we will continue to stay on top of new clinical research so our clients can continue to have confidence in the value we bring to them and their injured workers.

Sources:

[1] Porter, Jane, and Hershel Jick. "Addiction Rare in Patients Treated with Narcotics." New England Journal of Medicine, vol. 302, no. 2, 1980, p. 123. Massachusetts Medical Society, doi:10.1056/nejm198001103020221.

[2] Leung, Pamela T.M., et al. "A 1980 Letter on the Risk of Opioid Addiction." New England Journal of Medicine, vol. 376, no. 22, 2017, pp. 2194-2195. Massachusetts Medical Society, doi:10.1056/nejmc1700150.

[3] Aschwanden, Christie. "You Can’t Trust What You Read About Nutrition." FiveThirtyEight, 2016, https://fivethirtyeight.com/features/you-cant-trust-what-you-read-about-nutrition/.
