
Friday, September 18, 2015

CS2day: Award-Winning, 9-Collaborator, Performance-Improvement CME With an Outcomes-Based Evaluation Model

I saved the best for the last entry in the Back to School Tweet Fest. The Cease Smoking Today (CS2day) initiative cannot be ignored in a series about effective educational interventions in changing practice and improving quality of health care. An entire 2011 supplement of the Journal of Continuing Education in the Health Professions (JCEHP) reports on the complex CS2day educational program and its findings, with six research articles [1-6] and three forum articles [7-9] written by multi-institutional teams among the nine initiative partners. This study was awarded the Alliance for CME (now ACEhp) Award for Outstanding CME Collaboration in 2009 (see PDF pages 15-18 of www.acehp.org/d/do/150), and was presented in a 2012 CME Congress poster (P50: http://www.cmecongress.org/wp-content/uploads/2012/05/CME-Congress-2012-Abstracts.pdf). The study boasts collaboration among universities, professional societies, accredited CME providers, ACEhp presidents and conference chairs, CME directors at academic medical centers, the JCEHP Editor-in-Chief, and other published researchers [1,10] who carefully define the educational program’s framework and collaboration model in the new quality improvement paradigm of CME called for by the Institute of Medicine in 2010 [11].

The CS2day initiative is so big that this blog post cannot feature just one article reporting it. I will focus on the introductory editorial [10], written by medical education expert Donald Moore, and two study articles addressing (a) the development of competencies to assess needs and outcomes [3] and (b) the educational and patient health outcomes data themselves [4]. I hope you will do as Moore recommends when questioning what you can take from articles describing “a huge project with significant funding,” which is to ask, “What are the general principles that I can identify in these articles and how can I use them in my CME practice[?]” [10].

In my previous post, I noted the difficulties of using PI-CME to change patient health outcomes in a condition posing a major public health challenge: the COSEHC study addressed cardiometabolic risk factors and saw performance and patient health improvements. The CS2day initiative faced the same challenge, and happily also reported performance change and a change in patient health outcomes: smoking cessation. Moore nicely summarizes the challenge of connecting Level 5 performance changes among clinicians to Level 7 changes in public health outcomes: “All of us want to improve the health of the public in some way, but our approaches … may prevent us from having the impact that we wish to have. The [CS2day] articles … suggest there might be another approach that we should consider to address the important public health issues that surround but do not seem to be impacted by our CME programs” [10; emphasis added].

The articles in the JCEHP supplement are organized around four themes [10], to which I have added detail from the articles themselves: 
a) Collaboration is challenging but worth doing if guidelines are set and a formative evaluation of the collaboration against known success factors is carried out [1,2,5]
b) Best-practice CME includes an outcomes orientation that connects learning and performance objectives from the needs assessment to the outcomes assessment in a valid framework to support content in all educational activities [3-6]
c) A public health focus can lead to development of CME/CEhp activities with a translational or implementation science function that transcends what can happen when education addresses only a practice gap [7]
d) Standards and competencies for CEhp and members of the CEhp profession help initiatives meet the principles and characteristics of the IOM report’s expectations [8,9,11] 

The two featured research articles [3,4] function together as the Methods and Results sections of a typical IMRAD-structured paper, but each is extensive enough to stand alone and inform CEhp professionals. McKeithen et al describe the need to establish clinical competency statements related to supporting smoking cessation; the clinical guidelines that informed performance expectations; “the 5 A’s” of support for smoking cessation (Ask, Advise, Assess, Assist, and Arrange); the 14 competencies and 8 performance outcome measures that map onto the 5 A’s algorithm; and the collaboration of clinical and educational experts to develop outcomes tools comprising “a comprehensive set of measures at Levels 3 through 6” [3].

The summative outcomes data are extensively reported by Shershneva et al, whose “evaluation of a collaborative program” is presented as “translating” the outcomes framework into practice [3,4]. Defining the program’s desired outcomes across Levels 1 to 6* was seen as useful for facilitating agreement among stakeholders; guiding the evaluation process; gathering data from multiple activities and collaborators in a central repository; and studying the effects of the mechanisms that link education to outcomes [4]. Thanks to effective planning, the researchers were also able to add to the literature on instructional design in CEhp by comparing performance outcomes across two groups of activity types: a) live PI activities using either a collaborative or a practice-facilitator model and b) self-directed-learning PI activities.

Also worth reading are additional insights about using the Success Case Method (SCM) to determine whether and why educational interventions succeed [6]. In the CS2day reporting, the SCM allowed the research team to conclude with remarkable confidence that “the PI activities were a primary and proximal cause of improvement in clinical practice” [4]. Moore notes that “the results were impressive: physicians integrated a new guideline into their practices and many patients stopped smoking” [10]. The guideline integrated into practice through the CS2day initiative was a “heavily researched evidence-based practice guideline published by the U.S. Agency for Healthcare Research and Quality,” due to be updated in 2008, the year after this collaborative initiative began [1].

Finally, a comment: In CEhp, change data are often seen as valid only when educational and program interventions remain unchanged until an activity expires, even when a formative assessment shows that changes are needed. This attitude can leave participating clinicians with suboptimal educational opportunities and leave stakeholders in the educational design frustrated. Using formative program evaluations to improve the CS2day initiative as it ran, with the changes acknowledged, is in my opinion better than a pure pre/post comparison of an activity in which valuable investments are never updated when the data indicate they should be. If the CME/CEhp profession is to help clinicians link medical care to public health through disease prevention, accountability for quality, and more, then educational design should respond to the data collected during long, large interventions.

The CS2day initiative is a model study in educational and performance improvement methods for a challenging public health problem. Please read the study articles if you have print or online access to JCEHP, for I have only scratched the surface of the initiative's methodology, results, and rationales in the limited confines of this space. 

* Note: In this study, “Learning” was used as Level 3 and included knowledge and clinical skill (competence) measures, while “Performance,” which included commitment-to-change (CTC) queries, was used as Level 4. Thus Level 5 was “Patient Health Status” and Level 6 was “Population Health Status.”

References cited: 
1. Olson CA, Balmer JT, Mejicano GC. Factors contributing to successful interorganizational collaboration: the case of CS2day. J Contin Educ Health Prof. 2011;31(Suppl 1):S3-S12.
2. Ales MW, Rodrigues SB, Snyder R, Conklin M. Developing and implementing an effective framework for collaboration: the experience of the CS2day collaborative. J Contin Educ Health Prof. 2011;31(Suppl 1):S13-S20.
3. McKeithen T, Robertson S, Speight M. Developing clinical competencies to assess learning needs and outcomes: the experience of the CS2day initiative. J Contin Educ Health Prof. 2011;31(Suppl 1):S21-S27. http://www.ncbi.nlm.nih.gov/pubmed/22190097. [Featured Article]
4. Shershneva MB, Larrison C, Robertson S, Speight M. Evaluation of a collaborative program on smoking cessation: translating outcomes framework into practice. J Contin Educ Health Prof. 2011;31(Suppl 1):S28-S36. http://www.ncbi.nlm.nih.gov/pubmed/22190098. [Featured Article]
5. Mullikin EA, Ales MW, Cho J, Nelson TM, Rodrigues SB, Speight M. Sharing collaborative designs of tobacco cessation performance improvement CME projects. J Contin Educ Health Prof. 2011;31(Suppl 1):S37-S49.
6. Olson CA, Shershneva MB, Brownstein MH. Peering inside the clock: using success case method to determine how and why practice-based educational interventions succeed. J Contin Educ Health Prof. 2011;31(Suppl 1):S50-S59.
7. Hudmon KS, Addleton RL, Vitale FM, Christiansen BA, Mejicano GC. Advancing public health through continuing education of health care professionals. J Contin Educ Health Prof. 2011;31(Suppl 1):S60-S66.
8. Balmer JT, Bellande BJ, Addleton RL, Havens CS. The relevance of the Alliance for CME competencies for planning, organizing, and sustaining an interorganizational educational collaborative. J Contin Educ Health Prof. 2011;31(Suppl 1):S67-S75.
9. Cervero RM, Moore DE. The Cease Smoking Today (CS2day) initiative: a guide to pursue the 2010 IOM report vision for CPD. J Contin Educ Health Prof. 2011;31(Suppl 1):S76-S82.
10. Moore DE. Collaboration, best-practice CME, public health focus, and the Alliance for CME competencies: a formula for the new CME? J Contin Educ Health Prof. 2011;31(Suppl 1):S1-S2. http://www.ncbi.nlm.nih.gov/pubmed/22190095. [Featured Editorial]
11. Institute of Medicine (IOM) Committee on Planning a Continuing Health Professional Education Institute. Redesigning Continuing Education in the Health Professions. Washington, DC: The National Academies Press; 2010. http://books.nap.edu/openbook.php?record_id=12704. Accessed September 17, 2015.

MeSH “Major” Terms for the 3 Featured Articles (common items italicized)
McKeithen et al [3]: Benchmarking; Clinical Competence; Education, Medical, Continuing/methods; Needs Assessment; Outcome and Process Assessment (Health Care)/organization & administration; Practice Guidelines as Topic/standards; Smoking Cessation/methods; Tobacco Use Disorder/prevention & control
Shershneva et al [4]: Benchmarking/methods; Clinical Competence/standards; Health Personnel/classification; Health Personnel/psychology; Health Personnel/statistics & numerical data; Interprofessional Relations; Outcome Assessment (Health Care)/organization & administration; Program Evaluation; Smoking Cessation/methods; Tobacco Use Disorder/prevention & control
Moore [10]: Benchmarking; Clinical Competence; Delivery of Health Care, Integrated; Education, Medical, Continuing/methods; Interinstitutional Relations; Public Health

Thursday, September 17, 2015

Patient-Health Effects of a Performance-Improvement CME Educational Intervention to Control Cardiometabolic Risk in the Southeastern U.S.

Many of you who know me might recall that I moved from the Northeast to the Southeast U.S. some years back. As I learned about the people and culture of the Southeast, I saw many dietary and lifestyle factors that confer increased risk for cardiovascular diseases and diabetes; indeed, this part of the United States is known as “The Stroke Belt.” The Consortium for Southeastern Hypertension Control (COSEHC) initiative reported by Joyner et al sought to improve control of these risk factors through a performance-improvement continuing medical education (PI-CME) activity [1]. It somehow seems fated that I report this study, because the lead author is based at Wake Forest University, in the same North Carolina city where I have lived these many years. The PI-CME initiative itself was conducted with several primary care physician practices in Charleston, South Carolina, designated as a COSEHC Cardiovascular Center of Excellence; a comparable practice group served as a control. Results were reported to Moore’s Level 6 (patient health outcomes) [2]. 

The intervention included many overlapping and reinforcing elements that we would expect to see in a major initiative on a major health concern: using the plan-do-study-act (PDSA) model, researchers worked to “improve practice gaps by integrating evidence-based clinical interventions, physician-patient education, processes of care, performance metrics, and patient outcomes.” The intervention design included an action plan to involve medical assistants and nurses in patient-level tasks and education; patient chart reminders; patient risk stratification; and sharing of physicians’ feedback on successful practice changes with other participating practices. 

Because patient health outcome indicators were used to define the educational effectiveness of the PI-CME initiative, the selection of measures is important to understanding the study findings. The research team used cardiometabolic risk factor treatment targets for 7 lab values as recommended by 3 sets of evidence-based guidelines (JNC-7, ATP-III, and ADA). The team set a more aggressive target for low-density lipoprotein cholesterol (LDL-C) because many patients had multiple risk factors for cardiometabolic diseases and coronary heart disease risk “can exist even in the absence of other risk factors.” Researchers investigated changes in patient subgroups: “diabetic, African American, the elderly (> 65 years), and female patient subpopulations and in patients with uncontrolled risk factors at baseline.” The authors note that the average patient in both the intervention and control groups was clinically obese; other baseline health indicators were also similar. 

Now to results, gathered at 6 months to assess changes in patients' cardiometabolic risk factor values and control rates from baseline. The abstract summarizes findings as follows [1]:
Only women receiving health care by intervention physicians showed a statistical improvement in their cardiometabolic risk factors as evidenced by a -3.0 mg/dL and a -3.5 mg/dL decrease in mean LDL cholesterol and non-HDL cholesterol, respectively, and a -7.0 mg/dL decrease in LDL cholesterol among females with uncontrolled baseline LDL cholesterol values. No other statistical differences were found.

I want to discuss some factors that could explain the limited change seen in this study. First, outcomes were measured just 6 months into the educational initiative; this interval is barely adequate for assessing clinicians’ performance change, and even achieved performance changes were unlikely to produce significantly different lab values in patients whose elevated risks reflect years of health-related habits. Interestingly, there was also less room for improvement: patients in both groups had higher baseline risk-control rates than those seen at the U.S. national level, and patients in the intervention group had even higher baseline risk-control rates than patients in the physician control group.

The study did appear to narrow noted performance gaps regarding gender disparities in care. The authors cite 4 studies pointing out suboptimal treatment intensification to control LDL-C in female vs. male patients, and even physician bias or inaction toward female patients. Thus the improved patient outcome data for LDL-C and non-HDL cholesterol among women treated by physicians in the intervention group indicate a narrowing of established gaps in attitude (Level 4) and/or performance (Level 5).

Here in “The Stroke Belt,” any effort to control cardiometabolic risk factors must include population-level initiatives and patient education, which I have seen state governments, public health departments, recreation centers, and schools undertake at many levels. Two items stand out as affecting the COSEHC report’s findings: the study tried to measure changes in patient health indicators too soon after the intervention, and the researchers held themselves to the high standard of measuring Level 6 for a health concern that also needs interventions among patients and the public that were not considered here. Indeed, because physicians’ feedback on successful changes during the initiative was shared across practices, we know that Level 4 and 5 (competence and performance) changes were achieved. The authors should be commended on their work to tackle this public health concern through a PI-CME initiative.

Finally, I want to mention that Joyner et al cite two studies by others whom I am humbled to name as colleagues. First, Sara Miller and others at Med-IQ (a team often featured in Don Harting’s earlier posts in this Back to School campaign) published with PJ Boyle on improving diabetes care and patient outcomes in skilled-care (long-term-care) communities [3]. Second, Joyner et al cite the article by Shershneva, Olson, and others [5] that was featured in this blog on September 11, 2015, the same day I reported on the release of the landmark NHLBI SPRINT study results [4]. The Joyner article noted the Shershneva team’s finding that “process mapping led to improvement in [a majority of CVD] measures” [1].

References cited:
1. Joyner J, Moore MA, Simmons DR, et al. Impact of performance improvement continuing medical education on cardiometabolic risk factor control: the COSEHC initiative. J Contin Educ Health Prof. 2014;34(1):25-36. http://onlinelibrary.wiley.com/doi/10.1002/chp.21217/abstract. [Featured Article]
2. Moore DE, Green JS, Gallis HA. Achieving desired results and improved outcomes: integrating planning and assessment throughout learning activities. J Contin Educ Health Prof. 2009;29(1):1-15.
3. Boyle PJ, O’Neil KW, Berry CA, Stowell SA, Miller SC. Improving diabetes care and patient outcomes in skilled-care communities: successes and lessons from a quality improvement initiative. J Am Med Dir Assoc. 2013;14(5):340-344.
4. NHLBI. Landmark NIH study shows intensive blood pressure management may save lives: lower blood pressure target greatly reduces cardiovascular complications and deaths in older adults [press release]. NHLBI Website. http://www.nih.gov/news/health/sep2015/nhlbi-11.htm. Accessed September 11, 2015.
5. Shershneva MB, Mullikin EA, Loose A-S, Olson CA. Learning to collaborate: a case study of performance improvement CME. J Contin Educ Health Prof. 2008;28(3):140-147. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2782606/. [See blog post on this previously featured article at http://fullcirclece.blogspot.com/2015/09/todays-landmark-nhlbi-sprint-study.html]
MeSH “Major” Terms of Featured Article [1]:
Education, Medical, Continuing/organization & administration; Metabolic Syndrome X/prevention & control; Models, Educational; Physicians, Family/education; Quality Improvement