The question about real-world evidence is no longer whether data collected during routine clinical processes can be useful, but rather how that data can best be used.
“In recent years we have seen several examples of large, international comparative effectiveness studies showing that on a global scale you can do useful, methodologically rigorous studies with good-quality data that are truly informative for clinical practice,” said Mikhail Kosiborod, MD, Cardiologist at Saint Luke’s Mid America Heart Institute and Professor of Medicine at the University of Missouri-Kansas City School of Medicine. “Real-world evidence can also be very informative for policymakers and regulators. There’s enormous interest in the contributions real-world evidence can bring.”
But with that enormous interest comes enormous questions.
“The question turns on what sort of conclusions and inferences clinicians, policymakers, and payers can make based on day-to-day data. It comes down to association versus causation,” said Hertzel C. Gerstein MD, MSc, FRCPC, the Population Health Institute Chair in Diabetes Research, Director of the Division of Endocrinology and Metabolism, Director of the Diabetes Care and Research Program, and Deputy Director of the Population Health Research Institute at McMaster University in Hamilton, Ontario, Canada.
“There are good arguments that unless you control conditions of data collection, you can never have confidence that what you are seeing is actually the result of therapy versus something else they happen to be doing at the same time,” he continued. “How much confidence can you have when you see that people who drink diet soda may be heavier than people who don’t drink much diet soda? Is it because people who are heavier tend to drink more diet beverages? Or does diet soda really make them heavier?”
Drs. Kosiborod and Gerstein will take part in Saturday’s Current Issues debate, “What Are the Contributions of the Evolving Real-World Evidence?” The session begins at 4:00 p.m. in N-Hall E (North, Exhibition Level).
Misconceptions about what real-world evidence can and cannot do abound, Dr. Kosiborod said, and opinions tend to polarize quickly. Clinical trialists sometimes dismiss real-world evidence as being biased and unreliable.
At the same time, clinical trials cannot answer every important clinical question. Trials are too expensive, take too long, and, most importantly, are too selective to mount for every relevant question.
“If you look at trials of SGLT2 inhibitors, for example, the majority of our real-world patients would not qualify for inclusion,” Dr. Kosiborod said. “That raises important questions about generalizability. Clinical trials have excellent internal validity, but inclusion and exclusion criteria are frequently quite strict, such that external validity is often questionable. That’s where real-world evidence can be helpful. It can complement clinical trial data in important ways.”
Regulators recognized the limitations of clinical trials years ago and began to require post-marketing studies to evaluate the safety of newly approved therapeutic agents in real-world patient populations. These post-marketing studies are based on voluntary reporting and designed to detect possible safety signals not seen in clinical trials. But they cannot determine whether the detected signals are caused by the therapy or some other factor, Dr. Gerstein said.
“There are a confusing number of conflicting messages saying that real-world data is either good or bad,” Dr. Gerstein said. “It’s not necessarily a black-and-white issue. This debate will help us all to have some clarity about the limitations of clinical trial evidence and real-world evidence as we scrutinize the information we are faced with every day as we care for patients.”