If at First You Don’t Succeed…
Viewpoint articles are written by members of the SPH community from a wide diversity of perspectives. The views expressed are solely those of the author and are not intended to represent the views of Boston University or the School of Public Health. We aspire to a culture where all can express views in a context of civility and respect. Our guidance on the values that guide our commitment can be found at Revisiting the Principles of Free and Inclusive Academic Speech.
In clinical research, it is rare that we actually get a second bite of the apple. Yet this is exactly what happened in the case of the Mobile Continuing Medical Education (mCME) Project.
When I first took my infectious disease boards some years ago, it was old-school: me, a textbook, and a yellow highlighter pen. When I retook them 10 years later, the whole process had shifted online and was instead built around clinical vignettes with multiple-choice questions and lots of linked readings. Frankly, it was a much better way to study than the yellow highlighter method. And since I’m a global health researcher, I quickly started to contemplate whether a similar approach could be adapted to work in low-resource settings, using cell phones instead of a laptop computer.
This idea would likely have remained nothing more than an idea were it not for a fortuitous meeting, facilitated by one of my graduate students, with a delegation of public health practitioners from Vietnam. They were interested in developing an mHealth intervention, but wanted to do so in a rigorous, scientific, evidence-based way. What they needed was an idea for the project itself. Since the CME-via-cell-phones idea had been percolating in my thinking for some time, I pitched it, and they accepted. We agreed to write a grant—which, astonishingly, was accepted by an NIH study section on the first review. And so the mCME Project was born.
The concept was simple: In similar fashion to how I studied for the ID boards, we created an intervention that delivered SMS text messages to Vietnamese primary care clinicians. One group received these as medical factoids—factual statements about things that any given primary care practitioner ought to know. A second group received the same information, but phrased instead as multiple-choice questions. A third group served as a comparison and did not receive the intervention. At baseline and endline, all three groups were assessed using a standardized medical exam. Our hope was that either of the two intervention groups would outperform the control group on the endline exam.
Participation rates started and remained high, particularly among those who received the daily quiz questions. And at endline, when we did focus groups, the enthusiasm for the intervention was palpable: The clinicians loved the approach. They found it simple to use, helpful, and fun, and were excited that people were paying attention to them and to their professional educational needs.
However, in our final analysis, the intervention proved a complete dud. No increase in test scores whatsoever. Alas.
Fortunately, this was not quite the end. After several days of crying into our tea, we started the process of trying to understand why the intervention had failed. What we determined proved pivotal. Specifically, we learned that the intervention subjects took the content of the daily messages quite literally, assuming that the information in the SMS messages was all they needed to know. As an example, suppose a daily question was, “What is the first-line therapy for Type II diabetes?” Knowing that the answer is metformin is helpful, but it is only a start. We had hoped that the question would alert participants to the fact that metformin and Type II diabetes are important topics that merit further study—that they would go beyond the daily question and seek out more information on the topic generally, a process we termed “lateral learning.”
But that is not what the participants did. Instead, they studied the SMS questions as mini textbooks, and hence did not broaden their medical knowledge. Of note, we had carefully designed the standardized exams and daily quiz questions so that they covered the same domains of information but never repeated the same questions. The goal was, after all, to see whether participants had learned, not whether they had memorized answers.
This is where we got our second bite at the apple. During this period, our funder, the Fogarty International Center at the National Institutes of Health, granted us a supplemental award. While only $50,000, this was still enough to repeat the experiment with an improved version of the intervention, adapted based on the feedback and data we had gathered in the initial study.
For mCME v2.0, this time tested among a group of HIV clinicians in Vietnam, the intervention worked like a charm. Self-study behaviors significantly improved among the intervention participants—evidence that they were engaging in lateral learning—and this translated into improved exam scores compared with the control participants. As a further bonus, intervention participants reported improved job satisfaction.
Our team was thrilled by this, but our partners at the Vietnamese Ministry of Health (MOH) were downright ecstatic. This was a scientific win, for sure, but it also took a big step toward solving the MOH’s problem of how to provide CME to its medical workforce with very limited financial resources. We are already in negotiations about how to take this approach nationwide and move toward incorporating the strategy into routine practice in Vietnam.
The motto of the School of Public Health is “Think. Teach. Do.” In this one study, we hit all three of those goals. This is an intervention that is quite literally about thinking and teaching. And the “do” part emerges out of the fact that the whole process was done in close partnership with the Vietnamese MOH, with the expectation that this should lead to policy changes.
If there is a moral here, it is the importance of validating good ideas (even quirky ones inspired by having to suffer through board examinations). But these results also emphasize how critical it is that we learn the lessons that our data and experiences tell us. Without that humility, we could easily have given up after mCME v1.0 and never taken that second bite of the apple. But we did, and now we know something new and useful that is likely to improve and save lives.
Christopher Gill is an associate professor of global health.