Volume 53 Number 1, 2016
Pages 137–146
Abstract — Successful organizational improvement processes depend on application of reliable metrics to establish targets and to monitor progress. This study examined the utility of the Pain Care Quality (PCQ) extraction tool in evaluating implementation of the Stepped Care Model for Pain Management at one Veterans Health Administration (VHA) healthcare system over 4 yr and in a non-VHA Federally qualified health center (FQHC) over 2 yr. Two hundred progress notes per year from VHA and 150 notes per year from FQHC primary care prescribers of long-term opioid therapy (≥90 consecutive days) were randomly sampled. Each note was coded for the presence or absence of key dimensions of PCQ (i.e., pain assessment, treatment plans, pain reassessment/outcomes, patient education). Generalized estimating equations controlling for provider and facility were used to examine changes in PCQ items over time. Improvements in the VHA were noted in pain reassessment and patient education, with trends in positive directions for all dimensions. Results suggest that the PCQ extraction tool is feasible and may be responsive to efforts to promote organizational improvements in pain care. Future research is indicated to improve the reliability of the PCQ extraction tool and enhance its usability.
Key words: chart extraction, chart review, chronic pain, organizational improvement, pain, pain care, pain management, primary care, quality indicators, Veterans.
Chronic pain poses a substantial burden on the health of the U.S. population. Estimates suggest that over 100 million Americans experience persistent pain [1–2], with higher prevalence among Veterans [3] as well as medically underserved populations [4]. Among Veterans treated at Veterans Health Administration (VHA) primary care clinics, 50 percent report persistent pain [3,5]. A recent study in a large Federally qualified health center (FQHC) found that 40 percent of all adult ambulatory visits involved patients with chronic pain [6]. In addition, the costs of chronic pain are estimated to exceed $600 billion in medical expenses and lost productivity [7]. Although specialized multidisciplinary pain treatment is necessary and effective, particularly for more complex patients [8–9], access to these services is limited, and such intensive care is often not needed [7,10]. Thus, most patients with chronic pain are treated by a primary care provider (PCP); however, PCPs face organizational and administrative barriers to providing effective care [11], receive limited training in pain management [12–13], express low confidence in their ability to care for such patients [14–17], and hold reservations regarding treatment of chronic pain. Studies suggest that there is wide variability in PCPs' adherence to guidelines for pain management [18–20], and documentation of comprehensive pain care plans and specific treatment provided is poor [21–22].
Effective models of pain management in primary care have been developed. The most widely promoted evidence-based model is the Stepped Care Model for Pain Management (SCM-PM). The model, advocated by the American Academy of Pain Medicine [23], is the basis for the VHA's national pain management strategy [24–29]. It emphasizes an individualized, stepwise approach to pain management as patients increase in complexity and/or fail to achieve treatment goals with more conservative interventions [30]. Although several studies have demonstrated the potential for quality improvement initiatives to increase the quality of pain management, such initiatives are limited by a lack of well-established quality measures and benchmarks to measure their effect [6,22,30–32]. Recently, our group developed and validated a new tool for extracting information from electronic health records (EHRs) on the quality of documentation of pain and pain management [33]. Three dimensions of pain care quality were targeted, namely pain assessment (e.g., assessment of functioning and pain interference), treatment plans (e.g., patient education), and pain reassessment (i.e., assessment of outcomes). The current study was designed to further examine the psychometric properties of the measure, focusing on its responsivity to change in the context of a 5 yr performance improvement project that promoted implementation of the SCM-PM, with particular emphasis on improved management of patients receiving long-term opioid therapy. Here we examine outcomes in one multisite VHA healthcare system, with replication and cross-validation of the utility of this measurement approach in another multisite FQHC that was conducting a similar SCM-PM–based quality improvement initiative.
The Department of Veterans Affairs Connecticut Healthcare System (VACHS) is composed of two academically affiliated VHA medical centers and six community-based outpatient clinics. About 50,000 Veterans receive care within VACHS annually. In addition to primary care services provided by an interdisciplinary team consistent with VHA's Patient Aligned Care Team model of care [34], VACHS PCPs and patients have access to a range of specialty pain management services, including rehabilitation, mental health, pain medicine, and complementary and integrative approaches.
Project Step was a 5 yr study designed to examine the adoption and implementation of SCM-PM throughout VACHS, with a particular emphasis on improvements to pain management in the primary care setting and appropriate referral to secondary specialty care [28]. From 2009 to 2012, a wide range of pain management-focused interventions were implemented, including policy and practice guidelines, templates in the EHR, increased access to complementary and alternative medicine providers, a rapid performance improvement workshop, a primary care pain workgroup, a PCP peer support group, and a wide range of PCP educational opportunities (a grand rounds series, Web access, round table meetings with pain specialty care, case-based interactive training, and workshops on improving patient communication).
Community Health Center, Inc. (CHCI) is a multisite FQHC located in Connecticut. CHCI provides comprehensive primary care services, including medical, behavioral, and dental care, in 12 primary care health centers across the state as well as nearly 200 additional sites in schools and homeless shelters. CHCI cares for over 130,000 medically underserved patients in the state. Over 60 percent of CHCI patients are racial/ethnic minorities; over 90 percent are below 200 percent of the Federal poverty level, 60 percent are covered by Medicaid or state insurance, and 22 percent are uninsured.
Project STEP-ing Out was a 3 yr quality improvement initiative designed to improve pain care quality by applying the SCM-PM in a manner similar to Project Step and evaluating the effectiveness of the model outside the VHA setting [6]. The project included implementation of a variety of organizational interventions that were introduced from 2010 to 2012. These included structured data collection and documentation tools, new standard policies for the management of chronic opioids, establishment of a chronic opioid "dashboard," annual pain-specific continuing medical education, integration of behavioral health pain management interventions, increased access to complementary and alternative medicine providers, and Project ECHO (virtual specialty consultation using video conferencing) [35].
All data for this study were extracted from primary care progress notes in the VACHS or CHCI EHRs. A random sample of 200 patient progress notes was selected for each of the four years at VACHS (July 2008 through June 2012) and 150 notes for each of the two years at CHCI (January 2011 through December 2012). Progress notes were eligible for patients enrolled for care at VACHS or CHCI who had received 90 consecutive days or more of prescription opioid medications for pain from a PCP within the study year. Patients prescribed opioids for cancer pain, for substance use disorder, for other nonpain uses, solely by a specialty-care or other nonprimary care provider, or outside of the specified setting (VACHS or CHCI) were not included. Notes from any time during the study year were eligible, rather than only notes from the period during which the patient was prescribed opioids. Within VACHS, the number of patients who received 90 consecutive days or more of opioid medications for pain was 552 for year 1, 596 for year 2, 578 for year 3, and 535 for year 4. Within CHCI, there were 1,058 patients who received ≥90 consecutive days of opioid medications for pain in 2011 and 1,308 in 2012.
Details of the development and characteristics of the chart abstraction tool have been previously published [33]. The tool contains 12 indicators grouped into three domains: pain assessment, pain treatment, and reassessment. Pain assessment targeted information gathered by the PCP to help with diagnosis and treatment, including assessment of the presence of pain, the source of pain, or the effect on patient functioning, and a review of any recent pain tests or diagnostics. Pain treatment included entering a consult for pain-related specialty services (e.g., chiropractic, pain medicine clinic, physical therapy), ordering a diagnostic test, prescribing a medication, documenting a specific plan for treatment, and/or providing education/information. Pain reassessment addressed whether PCPs checked in with patients about the effectiveness of current pain treatments and whether pain and/or functioning have changed since the previous visit. For each progress note, the rater read all available clinical notes and data fields for that date. The rater then coded whether each individual indicator was present or absent. A comprehensive coding manual was developed at each site detailing operational definitions for each domain and individual indicators, guidelines for coding specific frequently occurring content in PCP notes, and specific examples of cases that met or did not meet criteria for each indicator. Raters were trained by a physician or psychologist with extensive clinical experience with the specified EHR. Training included reading and reviewing the coding manual and addressing questions and discussing distinctions and variations, guided coding of example notes, and review of notes coded by both the trainer and the rater. For notes coded by both the trainer and rater, inconsistent codes were reviewed directly and resolved through consensus by consulting the coding manual or discussing with other members of the research team. Additional reliability checks were performed randomly after initial training to avoid drift. The prior study by our group found that the measures of interrater reliability ranged from κ = 0.56 to 1.00 [33].
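As an illustration only, the sketch below (in Python) shows one way the per-note binary coding described above might be represented; the domain grouping follows the text, but the individual indicator names are paraphrased assumptions rather than the tool's exact item labels.

```python
# Illustrative (hypothetical) representation of the PCQ extraction tool's
# domains and per-note binary coding; indicator names paraphrase the text
# and are not the tool's exact item labels.
PCQ_INDICATORS = {
    "pain_assessment": [
        "pain_presence",       # presence of pain documented
        "pain_source",         # likely source/etiology of pain
        "effect_on_function",  # effect of pain on patient functioning
        "review_of_tests",     # review of recent pain tests/diagnostics
    ],
    "pain_treatment": [
        "consult_ordered",     # pain-related specialty consult entered
        "diagnostic_ordered",  # diagnostic test ordered
        "medication_ordered",  # pain medication prescribed
        "treatment_plan",      # specific plan for treatment documented
        "patient_education",   # education/information provided
    ],
    "pain_reassessment": [
        "reassessment",        # effectiveness of current treatment reassessed
    ],
}

def blank_note_record(note_id: str) -> dict:
    """Return an all-absent (0) coding record for one progress note."""
    record = {"note_id": note_id}
    for items in PCQ_INDICATORS.values():
        for item in items:
            record[item] = 0  # rater sets to 1 if the indicator is documented
    return record
```

A rater working from the coding manual would set each field to 1 when the corresponding indicator is documented anywhere in the note and associated data fields for that date.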
Each of the randomly selected progress notes was examined by a trained research assistant using the chart abstraction coding manual. Cases were excluded for the following reasons: (1) the patient did not have a routine primary care visit with their PCP within the 1 yr time period; (2) the only primary care encounter was one in which the patient saw only the nurse and not the PCP; (3) the only primary care encounter was an unscheduled or urgent, rather than routine, visit, which would be focused on the acute presenting problem and would not necessarily include assessment of pain; or (4) the only primary care encounter was an initial rather than follow-up visit with the PCP, for which many of the extraction items, such as reassessment of pain and review of assessments, would not be possible. For VACHS, there were a total of 689 included progress notes over the four years. For CHCI, there were 300 included progress notes.
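For concreteness, these exclusion rules can be expressed as a simple filter, as in the sketch below; the field names on the hypothetical `visit` record are assumptions about how encounter metadata might be organized, not the actual EHR schema.

```python
def note_is_codable(visit: dict) -> bool:
    """Apply the exclusion rules to a patient's sampled primary care encounter.

    `visit` is a hypothetical record with boolean fields describing the
    encounter; the rules assume this is the patient's only primary care
    encounter in the study year, and the field names are illustrative only.
    """
    if not visit.get("routine_pcp_visit_in_year", False):
        return False  # (1) no routine primary care visit with the PCP in the year
    if visit.get("nurse_only", False):
        return False  # (2) patient saw only the nurse, not the PCP
    if visit.get("unscheduled_or_urgent", False):
        return False  # (3) urgent visit; pain assessment not expected
    if visit.get("initial_visit", False):
        return False  # (4) initial visit; reassessment items not applicable
    return True
```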
Analyses were conducted separately for each setting. Generalized estimating equations with logit link and autoregressive covariance matrices were used to estimate the proportion of charts coded for the presence of each Pain Care Quality (PCQ) extraction tool item for each year and type of facility (health center or community-based outpatient clinics in the VACHS and small [1–2 PCPs], medium [3–4 PCPs], or large [5 or more PCPs] clinics in the CHCI). Analyses controlled for PCP and the repeated measures within PCP in each of the years of observation. Planned follow-up contrasts evaluated the linear trend over the four years. All analyses were conducted using SPSS 19.0 (IBM Inc; Armonk, New York). Cohen kappa was used to evaluate interrater reliability in the VACHS.
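Although all analyses were run in SPSS, a minimal sketch of an analogous model in Python (statsmodels and scikit-learn, assumed available) is given below for illustration; the file and column names (`pcq_extraction.csv`, `item_present`, `year`, `facility_type`, `provider_id`) are assumptions about how the extracted data might be organized, not the study's actual dataset.

```python
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf
from sklearn.metrics import cohen_kappa_score

# Hypothetical long-format data: one row per coded progress note, with the
# binary PCQ item (item_present), study year, facility type, and provider ID.
notes = pd.read_csv("pcq_extraction.csv")

# Order repeated notes within each provider for the autoregressive structure.
notes["note_seq"] = notes.groupby("provider_id").cumcount()

# GEE with logit link and autoregressive working covariance, clustering
# repeated notes within provider, analogous to the models described above.
model = smf.gee(
    "item_present ~ C(year) + C(facility_type)",
    groups="provider_id",
    data=notes,
    time="note_seq",
    family=sm.families.Binomial(),          # logit link
    cov_struct=sm.cov_struct.Autoregressive(),
)
result = model.fit()
print(result.summary())

# Interrater reliability (Cohen kappa) on double-coded notes, assuming a
# separate file with each rater's 0/1 code for the same notes.
double = pd.read_csv("pcq_double_coded.csv")
print(cohen_kappa_score(double["rater1_present"], double["rater2_present"]))
```

A planned linear-trend contrast over study years could then be tested on the fitted coefficients (e.g., via `result.t_test`), mirroring the follow-up contrasts described above.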
Table 1 presents measures of reliability and outcome estimates for each PCQ extraction outcome within the VACHS sample. The sample was predominately male (96.7%), and the mean age was 62.5 yr. Reliability measures (Cohen kappa) were based on 114 cases over the four years. Kappa indices ranged from 0.50 to 1.0 and were somewhat lower for the intervention outcomes (0.63–1.00 for assessment outcomes and 0.50–0.87 for intervention outcomes).
Controlling for provider and treatment location, there were significant changes over the four years for provider assessment of function, review of recent tests and diagnostics, ordering of pain-related consults, documentation of a specific treatment plan, pain education, and reassessment. Evaluating the linear trend over the four years, there were significant increases for documentation of review of recent tests and diagnostics (p = 0.001), pain medication prescriptions (p = 0.03), and reassessment (p = 0.005) and decreases for consult orders (p = 0.02) and specific pain treatment plan (p = 0.001). There was a marginally significant trend for increasing assessment of presence of pain (p = 0.05) and decreasing orders for diagnostics (p = 0.06). Documentation of pain education was higher in years 2 and 4 than years 1 and 3 (p = 0.002). Pain intensity ratings increased significantly over the four years (p = 0.006 for the linear trend).
There were significant differences in documentation by medical facility type, with higher rates of presence of pain (p = 0.006), patient function (p = 0.02), pain source (p = 0.001), review of tests and diagnostics (p = 0.001), specific pain treatment plan (p = 0.001), and reassessment (p = 0.03) in health centers than in the community-based outpatient clinics. There were no significant interactions of medical facility type and year.
Table 1. Interrater reliability* and estimated rates of documentation for each PCQ extraction item by study year (2008–2009, N = 174; 2009–2010, N = 175; 2010–2011, N = 160; 2011–2012, N = 180), with p-values for change over time.
*Based on 114 notes double coded for reliability.
Table 2 presents measures of reliability and outcome estimates for each PCQ extraction outcome within the CHCI sample. The sample was predominately female (59.7%), and the average age was 49.5 yr. Kappa values at CHCI were not calculated due to limited resources and unavailability of additional raters. During the training process, a research assistant and a senior researcher who had previously worked at the VACHS with Project Step and contributed to the development of the PCQ tool reviewed sample patient progress notes separately and compared their findings until they reached 10 consecutive cases for which they had 100 percent consensus in coding.
Controlling for provider and treatment location, there were no significant changes over the two years of evaluation. There were significant differences in documentation by medical facility type. Large facilities had lower documentation of presence of pain (p = 0.005) compared with small and medium facilities. Small facilities had greater documentation of pain education compared with medium and large facilities (p = 0.006). There were no significant interactions of medical facility type and year.
The VHA has established an evidence- and population-based SCM-PM as its single standard of pain care [28–29]. Despite growing empirical support and enthusiasm for the SCM-PM, there currently exists no methodology for evaluating the degree to which this new standard has been implemented. To address this gap, we have defined the key dimensions of PCQ as pain assessment, treatment (including pain education), and reassessment [33]. Our definition is informed by VHA policy that established these key dimensions as standards of pain management, including standards for assessing outcomes and quality, and benchmarks for provider competencies and expertise. Our team has completed foundational work to develop reliable and valid metrics for assessing these key dimensions of PCQ using chart review to extract the data from the EHRs. Ratings of interrater agreement over the study period were consistent, ranging from 0.50 to 1.0, with two values indicating "fair," five indicating "good," and five indicating "excellent" reliability [36–37]. There was greater reliability for the overall domains of pain assessment and treatment planning and lower reliability for most of the individual items [38]. This study extends these findings in significant ways by providing further evidence of the reliability and responsivity to change of this measure in a VHA setting while replicating its usability in a non-VHA community-based integrated healthcare setting.
At VACHS, a number of PCQ components were responsive to change over time and type of clinic. Findings showed evidence of improved PCP assessment of patient functioning, review of tests and diagnostics, pain education provision, medication prescription, and pain reassessment. These findings are consistent with Cleeland et al., who found improvement in provider pain management documentation over an 8 mo rapid improvement process across five primary care sites in VHA [30]. There was also evidence of a decline in ordering of pain specialty consultations (such as pain medicine, rehabilitation, and chiropractic). However, these findings contrast with a recent evaluation of VACHS EHR data among recipients of opioid prescriptions for noncancer pain, which found increased referral to and use of complementary and alternative treatments for pain, such as chiropractic and physical therapy [38]. It is possible that the chart extraction item is too broad and may be masking differing trends across types of referrals.
Although interrater agreement ranged from "fair" to "excellent," two items had only "moderate" or "fair" overall reliability (medication ordered, kappa = 0.51, and specific pain plan, kappa = 0.50) [36–37]. The reliability of medication ordered may have been attenuated by the high overall prevalence of medication orders, because kappa is depressed when an item's base rate is very high or very low [39]. Of note, although the measure of reliability for medication orders was lower, the prevalence was consistently high across all years for VACHS (>94%) and 100 percent across both years within CHCI. This suggests that for patients on long-term opioids, this item may not be responsive to change over time, likely because medications (opioids or nonopioids) will continue to be prescribed for these patients. It may also be possible that coders had a difficult time distinguishing whether medications commonly used for pain management, such as antidepressants and anticonvulsants, were being prescribed for this indication or for a mental health or other comorbid condition. Similarly, the specific pain treatment plan item was intended to capture documentation of plans for any prescribed activities for treating pain, including medications. However, coders may have had difficulty determining whether alternative treatments mentioned in a note were being presented as part of the pain treatment plan. Operationally defining this item was difficult, and additional specification and training may be needed to improve reliability, or it may be more practical to separate the item into components that are more reliably coded. Thus, the lower reliability of these items may have affected the evaluation of change over time.
Although most of the results were consistent with improvement in PCP pain care management, some findings were equivocal. There was an increase in medication prescriptions for pain. Although our methods did not evaluate whether the increases were in opioid or nonopioid pain medications, it is noteworthy that the pool of patients receiving opioid medications over the 4 yr study period appears to have decreased, and our recent evaluation of VACHS EHR data [38] showed an overall increase in nonopioid prescriptions and decrease in high-dose opioids over the same time frame, indicative of overall improved pain care. One somewhat troubling finding was a decrease in documentation of a specific treatment plan for pain management. Given the increase in pain education and decrease in consultations, it is unclear whether this change represents less provider clarity in patient treatment or whether providers are exploring new treatment options that may require additional evaluation. Additional qualitative evaluation of providers' treatment processes would be informative.
Within the Federally qualified CHCI setting, the PCQ extraction tool was found to be usable and provided information both on commonly documented components of pain care quality (assessment of pain presence and source, medication prescriptions, and specific pain treatment plans) and on components with low documentation (assessment of function, review of tests and diagnostics, pain education, ordering of consultations and diagnostics, and reassessment). The findings noted in VACHS were not replicated in the 2 yr evaluation in this setting, although estimates were nominally in the expected direction, suggesting that the time period may have been insufficient to evaluate the effect of the wide array of interventions implemented.
Although the study was not designed or powered to examine PCQ documentation by type of facility in either setting, we judged that such analyses might be informative. The absence of an interaction between facility type and time suggests that changes over time were similar across facility types. The pain management-focused interventions in both VACHS and CHCI were implemented across facility types, with a number of features designed to involve providers from all types of facilities. In VACHS, quality of documentation appeared to be better in the larger health centers than in the community-based outpatient clinics across the four years. In contrast, documentation was greater in smaller CHCI clinics than in larger ones. These findings suggest that performance improvement efforts may be enhanced by taking facility characteristics into account.
Several limitations of this study are important to acknowledge. First, the findings were based on opioid prescribing in primary care only; thus, patients prescribed opioids solely by a specialty care or other nonprimary care provider, or prescribed opioids outside of the study settings, were not included. Based on other analyses on this topic, we believe that these are likely a small number of patients, but we do not know how much they differ from patients in the current study. Second, the findings are based on documentation of pain management within the EHR rather than observation of provider behavior. As has been noted by Krebs et al. [22], this likely represents an underestimate of provider pain management activities. Thus, the interventions implemented in both settings may have affected provider documentation behavior without affecting underlying pain management behavior, or the reverse. Third, despite the large number of PCP encounters extracted and coded each year, there were indications of substantial variability in several of the measures. For example, in the VACHS setting, for all measures that showed significant changes over time (functioning, review, planning, education, consult ordered, and reassessment), there were numeric changes from year 2 to year 3 inconsistent with the overall trend over the four years. Increasing the number of encounters extracted and coded would improve measurement precision for evaluating changes within providers over time, as would standardizing the number of encounters sampled per PCP. Similarly, although ratings of interrater reliability were generally in the moderate to excellent range, measurement error may have limited power to evaluate the effect of the implemented interventions on provider pain management behavior, including the nonsignificant effects in CHCI.
Finally, responsivity of the measure to performance improvement efforts might be improved by assessing quality of pain care as continuous rather than multiple dichotomous measures. We considered an overall measure of PCQ as the sum of the individual components. However, it is unclear whether the components should be equally weighted to measure overall PCQ, whether improved PCQ is based on decreases in some components and increases in others, and whether some components may be better measured as continuous or ordinal measures rather than simply categorical. Further evaluation and development of measures that address component importance, direction, and degree are warranted.
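As one illustration of such a summary measure, a weighted sum of the binary indicators could be computed per note, as in the sketch below; the indicator names and equal weights are placeholders (assumptions), since, as noted above, the appropriate weighting and direction of each component remain open questions.

```python
# Hypothetical summary PCQ index for one coded note: a weighted sum of the
# binary indicators. Equal weights are placeholders; appropriate weights and
# directions would need to be established empirically.
ITEM_WEIGHTS = {
    "pain_presence": 1.0, "pain_source": 1.0, "effect_on_function": 1.0,
    "review_of_tests": 1.0, "consult_ordered": 1.0, "diagnostic_ordered": 1.0,
    "medication_ordered": 1.0, "treatment_plan": 1.0,
    "patient_education": 1.0, "reassessment": 1.0,
}

def pcq_summary(record: dict) -> float:
    """Return the weighted sum of indicators coded present (1) in one note."""
    return sum(w * record.get(item, 0) for item, w in ITEM_WEIGHTS.items())
```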
Our study provides evidence of the potential utility of our PCQ extraction tool in the context of organizational efforts to promote implementation of an evidence-based SCM-PM specifically targeting improvements in management of patients receiving long-term opioid therapy for chronic pain. Overall, results suggest some improvements in the quality of pain care at two institutions over several years of investigation, although the magnitude of these changes is modest and inconsistent across setting and quality indicator. Although our results encourage the use of the PCQ extraction tool in similar efforts, there are several implications for further development. First, raters found it challenging to code some subdomains reliably, and efforts are underway to refine the operational definitions and coding manual to improve interrater agreement. Second, the current version of the measure is limited by the binary (present vs absent) nature of the coding process. Future efforts to permit reliable coding of degrees or levels of quality for each of the indicators would likely prove useful. Third, it similarly may be useful to examine a summary index of PCQ based on the individual domains of the measure. Finally, the current manual record review approach is extraordinarily time consuming and labor intensive. Development of an automated approach using machine learning and natural language processing promises to yield increased reliability and utility of the measure. Taken together, a more efficient, reliable tool with readily interpretable summary indices could likely improve the responsivity of the measure to change and encourage its use in similar pain-relevant organizational improvement efforts.
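As a rough illustration of the automated direction mentioned above, the sketch below shows a simple bag-of-words text classifier (scikit-learn, assumed available) that could be trained on notes already labeled by human raters for a single PCQ indicator; this is a toy baseline under those assumptions, not the natural language processing approach under development.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

def build_indicator_classifier():
    """Bag-of-words baseline for one binary PCQ indicator (e.g., reassessment)."""
    return make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # unigram/bigram features
        LogisticRegression(max_iter=1000),
    )

def cross_validated_accuracy(note_texts, labels):
    """Estimate agreement with human raters' 0/1 codes via cross-validation.

    note_texts: list of progress-note strings; labels: human-coded 0/1 values
    for the indicator (both would come from the manual chart review).
    """
    clf = build_indicator_classifier()
    return cross_val_score(clf, note_texts, labels, cv=5, scoring="accuracy").mean()
```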