Evidence-Based Medicine: What Premeds Should Know
A look at the curricula of different medical schools in the U.S. reveals that most, if not all, programs incorporate evidence-based medicine, or EBM, in their teaching. Med school applicants may be asked to discuss their understanding of EBM and its implications for patient care in secondary applications or during an interview.
Recognizing this trend, some med school applicants are trying to gain a better grasp of EBM and demonstrate their understanding of the topic in their applications. But what exactly is EBM, and how can aspiring doctors develop skills in it?
A few basic pointers can help premed students better grasp the concept and prepare to use it as they embark on their careers as physicians.
What Is Evidence-Based Medicine?
EBM refers to the practice of basing medical decisions on the best available evidence from research studies. Stated more simply, it is medical care that is up to date.
Every year, thousands of new research studies are conducted about different medical conditions, their diagnoses and their treatments. Such studies are designed to judge how well a diagnostic test, drug or surgery works and whether it is safe. Practicing medicine in an evidence-based fashion means relying on findings from these research studies – the evidence – when deciding when to order a specific test or prescribe a certain medication.
While the idea of EBM seems quite simple, it is not always adopted in clinical practice. Take the simple example of viral upper respiratory infections, or URIs, which we collectively refer to as the common cold. We have known for years that antibiotics do not offer any benefit in the treatment of most URIs.
However, in clinical practice, antibiotics continue to be prescribed frequently for URIs, both in the U.S. and globally. This overuse drives antibiotic resistance, which in turn leads to hard-to-treat infections and avoidable deaths.
Medical practice is not always evidence-based for many reasons. In the case of antibiotics for URIs, one theory is that patients push for antibiotics and doctors feel a need to oblige.
Another reason why EBM may not be adopted in practice is that research findings are not always clear-cut. Two studies looking at the same treatment may have contradictory findings. It can also be difficult to assess how robust a study is based on its design and whether its findings are reliable.
The most robust studies are randomized controlled trials, in which patients are randomly assigned to a treatment group or a placebo group; the two groups are then followed over time and their health outcomes compared. These studies provide more convincing evidence than observational studies, which lack randomization.
For example, if you want to study whether eating vegetables can reduce blood pressure, you can design an observational study where you compare blood pressure in people who regularly eat vegetables to those who don’t. If you find vegetable-eaters have lower blood pressure, it may be tempting to conclude that vegetables reduce blood pressure. However, it may be that people who eat vegetables are more health-conscious and therefore exercise more. You can’t know for sure if the reason for the lower blood pressure is vegetables, exercise or another factor.
In contrast, in a randomized controlled trial, people are randomly assigned to eat vegetables or not eat vegetables. Thus, each group will contain a mix of people with different exercise habits, making it less likely that exercise, rather than vegetables, explains any difference in blood pressure.
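The balancing effect of randomization can be seen in a toy simulation. The sketch below is purely illustrative, with made-up numbers: it assumes a population where exercisers are more likely to also eat vegetables, then compares how an observational study and a coin-flip randomized assignment compose the vegetable-eating group.

```python
import random

random.seed(0)

# Hypothetical population of 10,000 people; about half exercise regularly.
population = [{"exercises": random.random() < 0.5} for _ in range(10_000)]

# Observational "study": health-conscious exercisers are assumed more
# likely to also eat vegetables, so the vegetable group over-represents them.
obs_veg = [p for p in population
           if random.random() < (0.8 if p["exercises"] else 0.3)]

# Randomized trial: a coin flip decides who eats vegetables, so the
# vegetable group mirrors the whole population's exercise rate.
rct_veg = [p for p in population if random.random() < 0.5]

def exercise_rate(group):
    """Fraction of a group that exercises regularly."""
    return sum(p["exercises"] for p in group) / len(group)

print(exercise_rate(obs_veg))  # well above the population's ~50%
print(exercise_rate(rct_veg))  # close to the population's ~50%
```

In the observational group, any blood-pressure difference could reflect exercise as much as vegetables; in the randomized group, exercise is spread evenly, so it cannot easily masquerade as a treatment effect.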
To produce higher-level evidence that doctors can access more easily, researchers conduct systematic reviews. A systematic review gathers the individual published studies on a topic, evaluates each study's design, distills the evidence and draws conclusions.
Over time, as research evidence accumulates in a field, experts in that field come together to develop clinical guidelines. These guidelines further distill the evidence from research into algorithms that physicians can employ to make decisions about how to treat patients using the latest evidence.
How to Develop Skills in Evidence-Based Medicine
Some studies from international settings and from the field of nursing suggest that a higher level of education is associated with greater use of evidence-based approaches in patient care. By extension, a desire for constant learning and growth may also make aspiring doctors more prepared for a future career that incorporates EBM.
To practice EBM, one must constantly be learning. This is why it helps for students going into medicine to develop lifelong learning habits.
Many of our students who have participated in research report that this involvement increased their curiosity, their desire to constantly learn and their resourcefulness in finding answers to research questions.
While all types of research can be valuable, getting involved in clinical research as an undergraduate can be an especially valuable way to develop an understanding of how clinical studies are designed and how to interpret them. Courses in epidemiology and biostatistics as an undergrad can also give premed students the basic framework for understanding published clinical studies.
How to Discuss Evidence-Based Medicine in Medical School Applications
When writing your personal statement or other essays, reflect on experiences that demonstrate your inherent curiosity, your desire to always learn and your ability to think critically.
For example, if you have participated in research, don’t just explain your research. Describe how the research equipped you with skills that may allow you to incorporate EBM into your daily practice when you become a doctor.
In the end, practicing evidence-based medicine requires the humility to know that your knowledge is limited and the curiosity to constantly build on that knowledge base. With these two qualities, humility and curiosity, you can embark on a medical career ready to build the foundations of an evidence-based practice.