Thursday, September 10, 2015

3-Hospital Quality- and Performance-Improvement CME Project With Systems-Change Support for Diabetes Care, Featuring CE Measure Editor Derek Dietze

This performance-improvement/continuing medical education (PI-CME) study by Lapolla and colleagues at three mid-sized, regional hospitals in the United States focused on clinician behaviors that can be documented in patient charts as “validated metrics of diabetes care.”

When you look at this example of a multi-center performance-improvement initiative with educational and institutional change supports, you’ll see that most of the intervention consisted of visible reminders of the change initiative and its measures, graphs and data showing performance trends, and a PI specialist serving as leader or champion of the campaign in each hospital. Even though the researchers chose hospitals that were willing to invest in change, they still incorporated the following implementation tools and collaborative approaches as necessary for change: "In the design and implementation of this program, we applied recognized PI principles and developed a dedicated working group to evaluate, monitor, and disseminate data, provide timely feedback, monitor outliers, attend to project management details, and maintain support of institutional leadership. We encouraged physicians' engagement by minimizing their time requirements, soliciting their input throughout the initiative, sharing meaningful data, and taking an 'improvement-oriented' approach rather than 'mandating change.'” [Emphases added.]
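
The study does not describe the working group’s actual tooling, but a minimal sketch may help make “monitor outliers” and “provide timely feedback” concrete. Everything below is invented for illustration: the hospital names, metric names, audit rates, and the z-score cutoff are all hypothetical.

```python
# Hypothetical sketch: how a PI working group might track chart-audit
# metrics per hospital and flag outliers for timely feedback.
# All data, names, and thresholds are invented for illustration.
from statistics import mean, stdev

# Quarterly chart-audit results: fraction of audited charts meeting
# each validated diabetes-care metric, by hospital (made-up numbers).
audits = {
    "Hospital A": {"a1c_documented": [0.62, 0.68, 0.74],
                   "insulin_protocol": [0.55, 0.61, 0.70]},
    "Hospital B": {"a1c_documented": [0.58, 0.57, 0.59],
                   "insulin_protocol": [0.60, 0.66, 0.71]},
    "Hospital C": {"a1c_documented": [0.65, 0.71, 0.78],
                   "insulin_protocol": [0.30, 0.33, 0.35]},
}

def flag_outliers(audits, metric, z_cut=1.0):
    """Flag hospitals whose latest rate sits well below the group mean."""
    latest = {h: rates[metric][-1] for h, rates in audits.items()}
    mu, sd = mean(latest.values()), stdev(latest.values())
    return [h for h, r in latest.items() if sd and (r - mu) / sd < -z_cut]

for metric in ("a1c_documented", "insulin_protocol"):
    for hospital in flag_outliers(audits, metric):
        print(f"Feedback needed: {hospital} is an outlier on '{metric}'")
```

The point of the sketch is the workflow, not the statistics: a small, regularly refreshed dataset plus a simple outlier rule is enough to tell a PI specialist where to direct feedback each cycle.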

I also want to point out the article’s reminder of the American Medical Association’s recommendation to use a three-stage PI-CME structure, “comprising assessment of current performance (Stage A), design and implementation of strategies to improve performance (Stage B), and evaluation of PI efforts (Stage C).” Any outcomes project or program evaluation that sets its goals only after educational content and resources are nearly final faces great challenges for later measurement. Permit me to include related content from the May 2015 Almanac article that Erik Brady and I wrote for the series, “Beginner’s Guide to Measuring Educational Outcomes in CEhp”: "A common error in assessment item-writing is the construction of assessment items that focus on a minor or trivial data point found in the content. This practice is particularly common in two cases: first, when assessment items are written from finished content that offers too little material for assessment; and second, when the minimum score a learner needs to request educational credit dictates the number of items on a tool, causing planners to test trivial points in their desperation to hit an arbitrary quota."

Because assessment items are optimally designed to assess how well a learning objective has been met, aligning each learning objective with an assessment item helps ensure that your items focus on the key points of the activity content and that the content consistently supports learners’ achievement of the educational and performance objectives. Thus, when Lapolla et al planned the educational content and reinforcing health-systems supports around well-established metrics, they ensured that all desired outcomes were mapped to hospital-specific care gaps before they started. That mapping allowed the research team to provide “timely feedback” and to manage the project better at each institution by identifying outliers in the PI datasets. A previously featured article in this month’s “Back to School” Tweet Fest described how effective it was to incorporate VTE prophylaxis therapies into order sets. Notably, we see the same in this study: “the PI specialists at all 3 participating hospitals saw marked improvement once order sets included the metrics.”
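
To show what that pre-launch mapping might look like in practice, here is a small sketch of an alignment check. The study does not publish its planning instruments; the care gaps, metric names, objectives, and item IDs below are all hypothetical.

```python
# Hypothetical sketch of the alignment check described above: confirm,
# before content is final, that every targeted care gap maps to a
# metric, a learning objective, and at least one assessment item.
# All names and IDs are invented for illustration.
blueprint = [
    {"care_gap": "A1c not checked on admission",
     "metric": "a1c_documented",
     "objective": "Order A1c for hyperglycemic inpatients",
     "assessment_items": ["Q1", "Q4"]},
    {"care_gap": "Sliding-scale-only insulin orders",
     "metric": "insulin_protocol",
     "objective": "Initiate basal-bolus insulin per protocol",
     "assessment_items": []},  # not yet covered -> should fail the check
]

def unmapped(blueprint):
    """Return care gaps lacking a metric, objective, or assessment item."""
    return [row["care_gap"] for row in blueprint
            if not (row.get("metric")
                    and row.get("objective")
                    and row.get("assessment_items"))]

problems = unmapped(blueprint)
if problems:
    print("Fix alignment before launch:", ", ".join(problems))
```

Running a check like this before content is locked avoids the item-writing trap quoted above, where finished content offers too little material to assess.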

Look at this article for its insights into quality improvement, implementation science, and educational methods. It is also a model of clear, concise writing for a research report—especially remarkable given the number of study authors. Finally, note that Derek Dietze, Editor-in-Chief of CE Measure, participated in this large PI-CME study on improving care practices for diabetes, one of the most challenging epidemiological problems in the United States.

References cited:
Lapolla J, Morrice A, Quinn S, et al. Diabetes management in the hospital setting: a performance improvement continuing medical education program. CE Meas. 2013;7(1):54-60. doi:10.1532/CEM08.12103. http://www.cardenjenningspublishing.com/journal/index.php/cem/article/view/116. Accessed September 10, 2015. [FREE full text]
Brady ED, Binford SH. How to write sound educational outcomes questions: a focus on knowledge and competence assessments [series: “Beginner’s Guide to Measuring Educational Outcomes in CEhp”]. The Almanac. 2015;37(5):4-9. http://www.acehp.org/p/do/sd/topic=216&sid=811. Accessed September 10, 2015. [Full text]
Raffini L, Trimarchi T, Beliveau J, Davis D. Thromboprophylaxis in a pediatric hospital: a patient-safety and quality-improvement initiative. Pediatrics. 2011;127(5):e1326-e1332. doi:10.1542/peds.2010-3282. http://pediatrics.aappublications.org/content/127/5/e1326.full.pdf+html. Accessed September 8, 2015. [FREE full text (Pediatrics final version)]