Checking the Box on Research in Medical Education

It’s easy to get into a checklist mentality when it comes to medical and premedical education. One full year of organic chemistry – check. Take the boards – check. Get some clinical experience – check. Become a great doctor – check?

As I have just finished my research month (incidentally, a requirement of my school), I can’t help but think about the checklist mentality when it comes to research as well. Though this is naturally most apparent at schools like mine that require scholarly activity, my sense is that it is still prevalent among students from schools that don’t particularly emphasize research, once they become interested in “more competitive” specialties like orthopedic surgery or dermatology. Research experience, basic science research especially, is assumed to be the best way (besides great test scores, of course) to distinguish yourself to a residency committee. And though I have always enjoyed research and appreciated how this emphasis means that more research opportunities are available to me, I’ve always been perplexed by it: why is research such a highly valued part of medical education?

Medicine, it seems, has always latched onto basic science research as the surest way to teach our young people critical thinking and reasoning skills. There is a need for us to recognize that knowledge is not static, that it has accumulated and evolved through years of hard work and sifting through messy data. As my professors are fond of telling us, much of what we learn in the classroom has been discovered since they graduated from medical school. They had to learn it all for themselves, through “lifelong learning,” diligence and self-study. Even now, no matter how far the science seems to have come in the last few years (it surely felt that way decades ago, just as it does now), the so-called facts we’re made to memorize in school may still be disproven, requiring us to change our practice to serve our patients better. So we have to be ready for that. Doing research, then, is a way to prepare us for it, while also contributing in some small way to the work of creating new knowledge.

And yet, I can’t help feeling that doing research as a student is at once too little and too much. Depending on where you are or how dedicated to research you are, if you aren’t sticking around to get a PhD, you end up spending a few months to a year doing research. Somehow, that is supposed to counterbalance the years of memorizing facts and First Aid. Without previous experience, a few months is simply not enough time to develop a new mentality, to appreciate how fundamentally different research is from your schoolwork, and to begin to understand how to apply it to your practice. (Looking back on my first few months of research, I’m pretty sure all I managed to do was break a lot of agarose gels. I was very young then, but still, the thought of spending my time learning those skills now is unimaginable.)

Without sufficient time and reflection, there’s a great risk that research becomes nothing more than an annoying break from your studying routine, a lot of pain and hard work with little reward to show for it, and another box to check on your way through medical school. In fact, if one focuses on how research builds out one’s CV, the work of research becomes instead the work of chasing the publication, jumping through all the necessary hoops to turn a result into a manuscript, rather than stopping to understand what the result really means or doing what you can to give that result some sort of impact.

Because the truth is, when I think about where I learned how to be a critical thinker, I only think sometimes of the years I spent in a basic science research lab. Instead, more often than not, I think about analyzing texts in college, trying to understand what some ancient (or not so ancient) philosopher really meant, and growing to comprehend how to take that understanding and apply it to reality. I think about the summer internship I spent in New York, advocating for natural birth and poring over the literature on cesarean section and VBAC and realizing just how little the best scientific journals really know. It’s true that in all likelihood doing bench research started me on this journey of questioning everything. It helped me to understand just how messy the world of scientific discovery is, and just how important it is to appreciate the process. But that’s also because I happened to do that earlier than most. Who is to say how I would have viewed my research experience if I had read Plato and Aristotle before I ever touched a pipette?

At this point in my studies, I can say that research is much more to me than just a way to advance my critical thinking skills. It’s a way for me to pursue my passions, to see what community health work looks like from an academic perspective, and to develop my skills and knowledge in the science and art of quality improvement. But I’ve been lucky: I can (and do) see other medical students in research projects that are a much worse match for their interests, simply because they feel obligated (or are institutionally required) to do some research. Even for myself, I could see the summer having ended very differently if the project that best fit my goals and interests had not been couched in academic terms but instead belonged to the world of entrepreneurship or community program development. Some would rightly argue that even a new business or community program could be turned into scholarly work, but I would still wonder whether the paradigm of scholarship is the best standard for judging all our accomplishments as doctors-to-be.