Relationship Between Low Quality-of-Care Scores and HMOs’ Subsequent Public Disclosure of Quality-of-Care Scores
This article appeared in the September 25, 2002 edition of the Journal of the American Medical Association (JAMA)
Context
Public disclosure of quality data on health maintenance organizations (HMOs) might improve public accountability, inform consumer decision making, and promote quality improvement. But, because disclosure is voluntary, some HMOs could subvert these objectives by refusing to release unfavorable data.
Objective
To determine the association between HMO quality of care and withdrawal from public disclosure of quality-of-care data in the subsequent year.
Design and Setting
Retrospective cohort study of administrative and quality-of-care data on HMOs from the National Committee for Quality Assurance (NCQA) annual Quality Compass databases for 1997, 1998, and 1999, including Health Plan Employer Data and Information Set (HEDIS) quality scores.
Main Outcome Measure
One-year rates of HMO withdrawal from public disclosure of HEDIS scores for plans in the highest and lowest tertiles of HEDIS scores, adjusted for method of data collection and plan model type.
Results
Of the 329 HMOs that publicly disclosed HEDIS scores in 1997, 161 plans (49%) withdrew from public disclosure in 1998. Of the 292 HMOs that disclosed their scores in 1998 (including 130 newly participating plans), 67 plans (23%) withdrew from public disclosure in 1999. Plans whose scores ranked in the lowest-quality tertile were much more likely than plans ranking in the highest-quality tertile to withdraw from public disclosure in 1998 (odds ratio [OR], 3.6; 95% confidence interval [CI], 2.1-7.0) and 1999 (OR, 5.7; 95% CI, 2.7-17.7).
Conclusion
Compared with HMOs receiving higher quality-of-care scores, lower-scoring plans are more likely to stop disclosing their quality data. Voluntary reporting of quality data by HMOs is ineffective; selective nondisclosure undermines both informed consumer decision making and public accountability.
JAMA. 2002;288:1484-1490
Employers,[1] government purchasers of health insurance,[2] individual consumers,[3] and lawmakers[4,5] are seeking more information on the quality of health care. Recently, the President’s Commission on Consumer Protection and Quality in the Health Care Industry called for widespread public disclosure of quality data by all health care provider organizations including health plans.[6] Public disclosure is seen as a way to enhance informed consumer decision making,[7] promote quality improvement,[8-10] and increase health plans’ accountability for health care delivery.[6,11-14]
Public disclosure of data on quality by health maintenance organizations (HMOs), except those enrolling Medicare patients, is voluntary. In 1998, only 32.5% of all HMOs disclosed their scores on the National Committee for Quality Assurance (NCQA) Health Plan Employer Data and Information Set (HEDIS) measures,[15] the most widely used set of quality indicators. If health plans that refuse to disclose quality data provide inferior care, publicly available data would overstate the average quality of HMO care nationally and result in a distorted picture of how a given plan that discloses quality data compares with that average. Selective nondisclosure could also undermine public accountability and quality improvement efforts by weakening the impetus to improve quality.
Despite the importance of this issue, no peer-reviewed studies have examined the relationship of HMO quality to willingness to disclose quality scores. We linked data for multiple years from the NCQA’s annual Quality Compass databases to determine if withdrawal from public disclosure of HEDIS scores was related to an HMO’s HEDIS performance 1 year earlier.
METHODS
Study Sample
The NCQA currently uses HEDIS measures, a standardized set of clinical quality indicators, as the principal clinical criteria for its HMO accreditation program. Health plans voluntarily submit these data to the NCQA. The NCQA lists HEDIS scores of individual HMOs in its annual Quality Compass database, designed for use by health insurance purchasers and consumers. Until recently, the NCQA allowed plans to decline public disclosure of their HEDIS scores, yet remain fully eligible for NCQA accreditation. Plans may also disclose data privately (eg, to large purchasers) but refuse public disclosure.
To determine which HMOs that disclosed HEDIS scores in 1997 (the “1997 cohort,” n = 329) or 1998 (the “1998 cohort,” n = 292) withdrew from public disclosure in the subsequent year, we linked the 1997[16] (the first year of use of HEDIS version 3.0), 1998,[17] and 1999[18] Quality Compass databases (which reflect plan characteristics and performance in 1996, 1997, and 1998, respectively). We used a “link-file” database provided by the NCQA to assist in tracking plans in the Quality Compass databases from year to year, since name changes were common. In addition to identifying plans that withdrew from public disclosure, we identified plans that merged or closed from one year to the next. We also identified HMOs that newly began public disclosure in 1998 or 1999. Last, we telephoned each HMO that we identified as having withdrawn from public disclosure to confirm the plan’s identity and whether it had changed its name, merged with another plan, or closed. For the single plan that had closed, we obtained the date of closure from its parent company.
Data Collection
The NCQA requires HMOs to follow a detailed guide that defines each HEDIS measure and specifies standards for data collection. Plans may garner data from administrative records (administrative method) or supplement the administrative method with chart reviews (hybrid method).
For each quality indicator, the plan first draws a sample from the target population (eg, for mammography, women aged 52-69 years continuously enrolled in the HMO for at least 1 year). The HMO then searches administrative records (eg, payment or radiology files) to determine if the intervention occurred within a set time frame (eg, 2 years for a mammogram). If no evidence of the intervention is found, the HMO may choose to search for exclusions (eg, a history of bilateral mastectomy). For the hybrid method, when administrative records fail to give evidence either of the intervention or an exclusion, the plan reviews sampled patients’ charts for such evidence. The HEDIS score is calculated as the number of patients who received the intervention divided by the number of eligible patients. The hybrid method, used by most plans for most measures, usually results in higher quality scores.
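The two data-collection methods can be illustrated with a minimal sketch; the record layout and sample data below are hypothetical and are not part of the HEDIS specification:

```python
# Sketch of a HEDIS-style rate calculation (hypothetical record layout).
# Administrative method: count only interventions found in claims/records.
# Hybrid method: fall back to chart review when administrative records show
# neither the intervention nor an exclusion.

def hedis_rate(sample, method="administrative"):
    eligible = 0
    numerator = 0
    for patient in sample:
        if patient["excluded_admin"]:          # e.g., bilateral mastectomy in claims
            continue
        if method == "hybrid" and patient["excluded_chart"]:
            continue                           # exclusion found on chart review
        eligible += 1
        if patient["intervention_admin"]:      # e.g., mammogram in radiology files
            numerator += 1
        elif method == "hybrid" and patient["intervention_chart"]:
            numerator += 1                     # intervention found on chart review
    return numerator / eligible if eligible else None

sample = [
    {"excluded_admin": False, "excluded_chart": False,
     "intervention_admin": True,  "intervention_chart": False},
    {"excluded_admin": False, "excluded_chart": False,
     "intervention_admin": False, "intervention_chart": True},
    {"excluded_admin": False, "excluded_chart": True,
     "intervention_admin": False, "intervention_chart": False},
]
print(hedis_rate(sample, "administrative"))  # 1/3
print(hedis_rate(sample, "hybrid"))          # 2/2 = 1.0
```

Because the hybrid method both rescues interventions documented only in charts and removes chart-documented exclusions from the denominator, it tends to yield the higher scores noted above.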
The NCQA’s Quality Compass databases contain information on other health plan characteristics, including ownership status (ie, investor-owned vs not-for-profit). When data on ownership status were missing, we determined it by consulting InterStudy’s HMO Directory[19,20] or by telephoning the plan.
Designation of HMO Quality
We assessed HMO quality using all HEDIS measures listed under the NCQA’s rubric, “effectiveness of care.” This rubric encompasses 13 distinct measures for the 1997 cohort, 4 of which are rates for individual childhood immunizations (measles-mumps-rubella, hepatitis B, diphtheria-pertussis-tetanus, and oral poliovirus), and 1 of which is a rate for completion of all 4 of these childhood immunizations. For the 1998 cohort, vaccinations against varicella and Haemophilus influenzae type b were added as measures and are included in the rate for completion of all recommended childhood immunizations that year. To avoid giving undue weight to childhood immunizations, we analyzed only the combined immunization rate, yielding 9 HEDIS scores for each plan. We ranked HMOs by quality in 2 ways. First, we ranked HMOs according to their score on each of the 9 HEDIS measures separately. Second, we ranked HMOs based on the average of ranks for all individual HEDIS measures for which the plan submitted data. For this latter analysis, we included only plans reporting scores on at least 5 of the 9 HEDIS measures. When more than 1 plan reported the same score, we assigned these plans the same rank. We then divided the plans into tertiles on the basis of their quality ranks. All analyses were performed using SAS.[21]
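The ranking scheme can be sketched as follows; the plan names and scores are invented for illustration, and tied scores share a rank, as in the text:

```python
# Sketch of the ranking procedure: rank plans on each measure, average the
# ranks across the measures each plan reported, then split into tertiles.
# Plan names and scores are hypothetical.

def rank_with_ties(scores):
    """Map each plan to its rank (1 = best); equal scores share a rank."""
    ordered = sorted(set(scores.values()), reverse=True)
    rank_of_score = {s: i + 1 for i, s in enumerate(ordered)}
    return {plan: rank_of_score[s] for plan, s in scores.items()}

def average_ranks(measures):
    """measures: {measure: {plan: score}}; returns {plan: mean rank}."""
    totals, counts = {}, {}
    for scores in measures.values():
        for plan, rank in rank_with_ties(scores).items():
            totals[plan] = totals.get(plan, 0) + rank
            counts[plan] = counts.get(plan, 0) + 1
    return {p: totals[p] / counts[p] for p in totals}

def tertiles(avg_rank):
    """Split plans into highest-, middle-, and lowest-quality thirds."""
    ordered = sorted(avg_rank, key=avg_rank.get)  # best (lowest mean rank) first
    n = len(ordered)
    return ordered[: n // 3], ordered[n // 3 : 2 * n // 3], ordered[2 * n // 3 :]

avg = average_ranks({
    "mammography":  {"Plan A": 80, "Plan B": 70, "Plan C": 70},  # B and C tie
    "beta_blocker": {"Plan A": 90, "Plan B": 60, "Plan C": 95},
})
top, middle, bottom = tertiles(avg)
```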
Outcomes
Our primary outcome was withdrawal from public disclosure of HEDIS scores 1 year after a previous public disclosure. We defined withdrawal as either (1) a failure to submit any HEDIS scores to NCQA or (2) submission of HEDIS scores but refusal to allow public disclosure. Plans that disclosed even a single HEDIS score or that merged and disclosed pooled HEDIS scores were not considered to have withdrawn. We excluded from our analysis the single plan that closed.
Statistical Analysis
For each of the 9 separate HEDIS measures, we classified plans by whether their scores fell in the highest, middle, or lowest tertile of the 329 plans publicly disclosing data in 1997. We then calculated for each quality tertile the proportion of plans that withdrew from disclosure 1 year later and used the χ2 test to compare the proportions withdrawing in the highest and lowest tertiles. We report 2-tailed P values for all tests.
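As a concrete illustration of this comparison, the χ2 statistic for a 2 × 2 table of quality tertile by withdrawal status can be computed in closed form; the counts below are hypothetical, not the study’s data:

```python
# Pearson chi-square test for a 2x2 table (pure-Python sketch, no continuity
# correction). Rows: quality tertile; columns: withdrew vs continued disclosure.
# The counts used in the example are hypothetical.

def chi_square_2x2(a, b, c, d):
    """Return the chi-square statistic for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    # Closed form for a 2x2 table: n*(ad - bc)^2 / ((a+b)(c+d)(a+c)(b+d))
    num = n * (a * d - b * c) ** 2
    den = (a + b) * (c + d) * (a + c) * (b + d)
    return num / den

# Hypothetical example: 60 of 110 lowest-tertile plans withdrew,
# vs 30 of 110 highest-tertile plans.
stat = chi_square_2x2(60, 50, 30, 80)
# Compare against 3.84, the critical value for chi-square with 1 df at alpha = .05.
print(stat, stat > 3.84)
```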
We repeated this analysis using the 1998 cohort (the 292 plans disclosing data in 1998), classifying plans according to their quality ranks in 1998 and comparing withdrawal rates 1 year later among plans in the highest- and lowest-quality tertiles.
Thus, each analysis examined whether the quality rank in a given year predicted the likelihood of publicly disclosing quality scores in the subsequent year.
We also used multiple logistic regression to estimate the adjusted odds ratio (OR) for withdrawal from public disclosure for HMOs in the lowest vs highest tertile of average plan rank for all 9 measures combined. We considered plan characteristics (model type, geographic location, and method of data collection) as potential covariates. The final multivariate models included only those variables that showed a significant univariate association (P<.05) with the outcome in both cohort years. In addition, because collecting data by the hybrid method produces higher HEDIS scores than the administrative method,[22] we controlled for the method of data collection in all multivariate models.
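A crude (unadjusted) version of this estimate can be sketched directly from a 2 × 2 table using the log-based (Woolf) confidence interval; the study’s ORs were instead adjusted via logistic regression, and the counts below are hypothetical:

```python
import math

# Unadjusted odds ratio with a Woolf (log-based) 95% CI from a 2x2 table.
# a, b = withdrew / continued disclosure among lowest-tertile plans;
# c, d = the same among highest-tertile plans. Counts are hypothetical.

def odds_ratio_ci(a, b, c, d, z=1.96):
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

or_, lo, hi = odds_ratio_ci(60, 50, 30, 80)
print(f"OR = {or_:.1f} (95% CI, {lo:.1f}-{hi:.1f})")
```

Because the CI is constructed on the log scale and back-transformed, it is asymmetric around the OR, as in the intervals reported in the Results.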
Because our previous research had shown that investor-owned plans achieve lower HEDIS scores than not-for-profit plans, we also explored the interrelationships among ownership status, tertile of average rank on HEDIS scores, and the likelihood of withdrawal from public disclosure with 2 × 2 contingency tables and χ2 tests of significance for both the 1997 and 1998 cohorts. Specifically, we compared investor-owned and not-for-profit plans with regard to the percentage of plans in the lowest tertile of average HEDIS rank and the percentage of plans that withdrew from public disclosure. Last, we compared plans in the upper and lower tertiles of average HEDIS rank with regard to the percentage of plans that withdrew from public disclosure among investor-owned and not-for-profit plans separately.
Finally, to quantify the clinical significance of differences in quality between the highest- and lowest-quality plans, we calculated the mean (SD) rates for each indicator for the highest- and lowest-quality tertile.
RESULTS
Characteristics of the Health Plans
The majority of HMOs in both the 1997 and 1998 cohorts were investor-owned and were independent practice associations or mixed model type plans. Plans in both cohorts were most commonly located in the South Atlantic, Mid Atlantic, and East North Central regions.
HMO Withdrawal From Public Disclosure of HEDIS Scores
A total of 329 HMOs allowed public disclosure of their HEDIS scores in 1997. Of these plans, 161 (49%) withdrew from public disclosure the following year. In 1998, 292 plans allowed public disclosure of their HEDIS scores. This cohort consisted of 162 plans (after mergers) that allowed public disclosure in 1997, plus 130 newly participating plans. Of these 292 plans, 67 (23%) withdrew from public disclosure in 1999. For both the 1997 and 1998 cohorts, just over half of all plans that withdrew from public disclosure continued to submit HEDIS scores to NCQA.
HMO Quality Rank and Withdrawal From Public Disclosure of HEDIS Scores
HEDIS scores among HMOs that allowed public release of their quality data varied widely in both 1997 and 1998. Absolute differences in mean HEDIS scores of plans in the lowest and highest tertiles ranged from 15.6 to 42.3 percentage points in the 1997 cohort and from 14.6 to 37.5 percentage points in the 1998 cohort. For example, the mean immunization completion rate for 13-year-olds in the 1997 cohort was 74.7% for plans in the highest-quality tertile, but only 32.4% for plans in the lowest-quality tertile.
Health maintenance organizations in the lowest tertile were significantly more likely to withdraw from public disclosure than plans in the highest tertile for 7 of the 9 measures in the 1997 cohort and for 6 of 9 measures in the 1998 cohort. Plans in the lowest tertile were 1.6 to 2.7 times more likely to withdraw from public disclosure than plans in the highest tertile in the 1997 cohort and 2.2 to 7.0 times more likely to withdraw in the 1998 cohort. For 7 of the 9 indicators in the 1997 cohort, more than half of plans in the lowest-quality tertile withdrew from public disclosure of HEDIS scores the subsequent year. Withdrawal rates for the 1998 cohort were somewhat lower; nonetheless, for 8 of 9 indicators, at least 25% of plans in the lowest-quality tertile withdrew the subsequent year.
Health maintenance organizations in the lowest tertile of overall quality (average rank for all 9 HEDIS measures) were more likely to withdraw from public disclosure than plans in the highest tertile in both the 1997 (OR, 3.6; 95% confidence interval [CI], 2.1-7.0) and 1998 (OR, 5.7; 95% CI, 2.7-17.7) cohorts after adjustment for the method of data collection and plan model type (the only plan characteristic consistently correlated with plan withdrawal in univariate analyses).
In our analyses according to plan ownership status, investor-owned plans were more likely than not-for-profit plans to be in the lowest-quality tertile in both the 1997 (RR = 3.4; 95% CI, 1.9-5.8) and 1998 (RR = 1.9; 95% CI, 1.3-2.7) cohorts and to withdraw from public disclosure in both the 1997 (RR = 5.7; 95% CI, 2.6-12.2) and 1998 (RR = 1.3; 95% CI, 0.7-2.5) cohorts, although this difference was not statistically significant for the latter cohort. It appears, however, that poor quality rather than profit status per se was the primary determinant of withdrawal from public disclosure. The poorest-quality plans were more likely to withdraw from disclosure than the best-quality plans among both investor-owned plans (RR = 1.5; 95% CI, 1.1-2.1), and not-for-profit plans (RR = 2.2; 95% CI, 0.5-10.4) in the 1997 cohort. Similar results were obtained in the 1998 cohort for both investor-owned (RR = 2.7; 95% CI, 1.1-6.6) and not-for-profit plans (RR = 20.0; 95% CI, 2.8-149.8).
COMMENT
While the total number of HMOs that publicly disclosed HEDIS quality scores changed little each year from 1997 to 1999, the composition of this group changed substantially. Forty-nine percent of plans in the 1997 cohort and 23% of plans in the 1998 cohort withdrew from public disclosure 1 year later. Quality scores varied substantially among HMOs, and lower-scoring plans were much more likely to withdraw.
No previous peer-reviewed studies have examined the relationship between HMO quality scores and withdrawal from participation in public disclosure of scores in the HEDIS program. Previous yearly NCQA reports have documented that nondisclosing plans score poorly. However, our longitudinal analyses provide a quite different view than these reports based on cross-sectional data. Our approach encompasses plans that drop out of the HEDIS program entirely, in addition to those that refuse disclosure. For example, a 1998 NCQA report that provided data on nondisclosing plans included only 88 of the 161 (nondisclosing and drop-out) plans we analyzed.
We also delineate, for the first time, the shifting cohort of HEDIS participants and disclosers. Many plans are disclosers one year and nondisclosers the next (or vice versa). Hence the manipulation of the HEDIS monitoring system is more pervasive than is apparent from the NCQA’s cross-sectional comparisons. The differences in analytic approaches also give rise to quite different interpretations. The NCQA suggests that the better cross-sectional performance of disclosing plans is evidence that their quality monitoring system is working. In contrast, our longitudinal data imply that gaming of the system is so extensive as to potentially undermine the quality monitoring process.
Why do HMO executives at lower-scoring plans choose to withdraw from reporting of HEDIS scores? They might believe that their plan’s low HEDIS scores result from inadequate data collection methods that could understate true quality. Perhaps some suspect that their plan will suffer from biased comparisons since not all plans’ data were audited, especially in the earlier years of the HEDIS program. They may become aware of such issues only after disclosing HEDIS scores at least once. Some executives may regard the costs of data collection as too high,[23-25] which could explain why some higher-scoring plans withdrew from HEDIS participation. But costs cannot explain most withdrawals; about half of the plans withdrawing from public disclosure still collected and submitted HEDIS scores.
The most likely explanation for our findings is that many plans withdraw because they fear (or know) that they will score low again. Low scores might place such plans at a marketing disadvantage, especially if nondisclosure carries little stigma. Regardless of the explanation, however, our results imply that voluntary disclosure of quality data, the primary national mechanism for HMO quality oversight, is failing to meet its stated goals of informing consumer decision making, providing incentives to improve quality, and increasing public accountability.
The NCQA’s HEDIS program represents the most comprehensive and influential quality assessment tool[26] for HMOs (or, indeed, for any health care sector). HEDIS measures are standardized and subject to external audit to verify the data collection and calculation process.[22] Yet, the selective withdrawal by lower-scoring plans means that enrollees, purchasers, and the public often cannot monitor a plan’s quality over time. Furthermore, it implies that the average quality of HMO care in the United States is unknowable. Average published HEDIS scores could improve even if the actual average quality were stable or even deteriorating. Hence, HEDIS scores cannot be used as an accurate barometer of HMOs’ attainment of specific health goals for the nation.[11,27,28]
The variation in HEDIS scores that we observed between the highest- and lowest-scoring plans has substantial clinical relevance. For example, receiving a beta-blocker after a myocardial infarction reduces the risk of cardiovascular death and nonfatal reinfarction by 22% and 27%, respectively.[29,30] Yet in 1998, a patient surviving a myocardial infarction was only half as likely to receive this medication if enrolled in a health plan in the lowest compared with the highest tertile of quality.
Voluntary disclosure allows HMOs to use the HEDIS program as a marketing tool, sacrificing its value as a quality assessment and improvement tool. When scores are high, plans can disclose them and take advantage of consequent marketing benefits. When scores are low, plans can withdraw from public disclosure. Indeed, until recently, HMOs that refuse to publicly disclose their quality scores were fully eligible for NCQA accreditation; only 4% of HMO applications for accreditation were rejected in 1998.[31]
Investor-owned plans were somewhat more likely to withdraw from public reporting. However, poor quality was associated with withdrawal from public disclosure among both investor-owned and not-for-profit plans. Apparently, the increasingly competitive health care marketplace drives health plans (irrespective of ownership status) to control data release to maximize competitive advantage.
Lack of disclosure is not the only challenge to the HMO quality oversight process. Patients appear to have difficulty understanding quality data[32,33] and use such data infrequently when selecting a health plan.[34,35] Many employers offer only 1 health insurance option,[36] foreclosing patient choice. Even large employers make only limited use of quality data,[1,8] instead selecting health plans primarily on the basis of cost.[1,37] Indeed, in the current health care market, evidence that public disclosure of quality data improves quality is equivocal.[9,10,38,39] Therefore, improving accountability and encouraging quality improvement would require, at the very least, that quality data be presented in a patient-friendly format, that patients be offered a choice of health plans, and that both patients and large purchasers make purchasing decisions based on quality rather than price. Without publicly available data on quality, however, achieving these goals would accomplish little.
Our findings should also be viewed in the context of the broader debate on public disclosure of quality data by all types of health care provider organizations. Like HMOs, physicians and hospitals have opposed mandatory disclosure of performance data. Improvement of quality and accountability may ultimately depend on forthright disclosure of quality data at all levels of the health care system.
Our study has several limitations. First, since data on nondisclosing plans were, by definition, unavailable to us, we used a plan’s performance 1 year earlier as a proxy for current performance. Plans know their current HEDIS scores before having to decide whether or not to disclose them. Thus, it seems likely that low-scoring plans that improved would choose to continue disclosing, while those that did not would be more likely to withdraw from disclosure. Hence, our study may underestimate the relationship between low scores and withdrawal from disclosure. Second, we cannot exclude the possibility that unmeasured plan characteristics such as geographic dispersal of medical provider sites or differences in data systems could systematically influence our results. Third, although our data show that lower-scoring plans are more likely to withdraw from disclosure, we have no direct data on HMO executives’ reasoning regarding this decision. Last, only HMOs are currently eligible to submit HEDIS scores and receive NCQA accreditation. Whether the selective reporting of quality data we observed would apply to fee-for-service insurance or other facets of the health care system is unknown.
Few industries whose impact on health rivals HMOs’ are as free of public oversight. Airlines and car manufacturers are required to disclose standardized data on the safety of their products. Our findings suggest that voluntary quality reporting by HMOs will not create the preconditions for effective quality oversight. Reporting and public disclosure of HEDIS and other meaningful quality data by HMOs should be mandatory.
Author/Article Information
Author Affiliations
Department of Medicine, Cambridge Hospital and Harvard Medical School, Cambridge, Mass (Drs McCormick, Himmelstein, Woolhandler, and Bor) and Public Citizen Health Research Group, Washington, DC (Dr Wolfe).
Corresponding Author and Reprints
Danny McCormick, MD, MPH, Cambridge Hospital, 1493 Cambridge St, Cambridge, MA 02139 (e-mail: danny_mccormick@hms.harvard.edu).
Author Contributions
Study concept and design: McCormick, Himmelstein, Woolhandler, Wolfe, Bor.
Acquisition of data: Himmelstein, Woolhandler, Wolfe.
Analysis and interpretation of data: McCormick, Himmelstein, Woolhandler, Wolfe.
Drafting of the manuscript: McCormick, Himmelstein, Woolhandler.
Critical revision of the manuscript for important intellectual content: McCormick, Himmelstein, Woolhandler, Wolfe, Bor.
Statistical expertise: McCormick, Woolhandler.
Administrative, technical, or material support: Woolhandler, Bor.
Study supervision: McCormick, Himmelstein, Woolhandler, Wolfe, Bor.
Previous Presentation
This work was presented at the 2001 Annual Meeting of the Society of General Internal Medicine, San Diego, Calif, May 5, 2001.
REFERENCES
[1] Gabel JR, Hunt KA, Hurst K. When Employers Choose Health Plans: Do NCQA Accreditation and HEDIS Data Count? New York, NY: Commonwealth Fund; 1998.
[2] California Office of Statewide Health Planning and Development. Annual Report of the California Hospital Outcomes Project. Sacramento, Calif: Office of Statewide Health Planning and Development; 1993.
[3] Princeton Survey Research Associates. Americans as Health Care Consumers: The Role of Quality Information. Menlo Park, Calif: The Kaiser Family Foundation; 1996.
[4] Miller T. Managed care regulation: in the laboratory of the states. JAMA. 1997;278:1102-1109.
[5] Noble AA, Brennan TA. Stages of managed care regulation: developing better rules. J Health Polit Policy Law. 1999;24:1275-1305.
[6] Final Report. Washington, DC: The President’s Advisory Commission on Consumer Protection and Quality in the Health Care Industry; 1998.
[7] Epstein AM. Public release of performance data: a progress report from the front. JAMA. 2000;283:1884-1886.
[8] Hibbard JH, Jewett JJ, Legnini MW, Tusler M. Choosing a health plan: do large employers use the data? Health Aff (Millwood). 1997;16:172-180.
[9] Mukamel DB, Mushlin AI. Quality of care information makes a difference: an analysis of market share and price changes after publication of the New York State Cardiac Surgery Mortality Reports. Med Care. 1998;36:945-954.
[10] Hannan EL, Kilburn H, Racz M, Shields E, Chassin MR. Improving the outcomes of coronary artery bypass surgery in New York State. JAMA. 1994;271:761-766.
[11] Harris JR, Caldwell B, Cahill K. Measuring the public’s health in an era of accountability: lessons from HEDIS. Am J Prev Med. 1998;14:9-13.
[12] Rosenbaum S. Negotiating the new health system: purchasing publicly accountable managed care. Am J Prev Med. 1998;14:67-71.
[13] Longo DR, Land G, Schramm W, Fraas J, Hoskins B, Howel V. Consumer reports in health care: do they make a difference in patient care? JAMA. 1997;278:1579-1584.
[14] Leatherman S, McCarthy D. Public disclosure of health care performance reports: experience, evidence and issues for policy. Int J Qual Health Care. 1999;11:93-105.
[15] Farley DO, McGlynn EA, Klein D. Assessing Quality in Managed Care: Health Plans Reporting of HEDIS Performance Measures. New York, NY: The Commonwealth Fund; 1998.
[16] NCQA’s Quality Compass Data Base 1997. Washington, DC: National Committee for Quality Assurance; 1997.
[17] NCQA’s Quality Compass Data Base 1998. Washington, DC: National Committee for Quality Assurance; 1998.
[18] NCQA’s Quality Compass Data Base 1999. Washington, DC: National Committee for Quality Assurance; 1999.
[19] The InterStudy Competitive Edge: HMO Directory 9.1. St Paul, Minn: InterStudy Publications; 1999.
[20] The InterStudy Competitive Edge: HMO Directory 10.1. St Paul, Minn: InterStudy Publications; 2000.
[21] SAS Software Version 8.1. Cary, NC: SAS Institute; 2000.
[22] Spoeri RK, Ullman R. Measuring and reporting managed care performance: lessons learned and new initiatives. Ann Intern Med. 1997;127(8, pt 2):726-732.
[23] Roper WL, Cutler CM. Health plan accountability and reporting: issues and challenges. Health Aff (Millwood). 1998;17:152-155.
[24] Eddy DM. Performance measurement: problems and solutions. Health Aff (Millwood). 1998;17:7-25.
[25] Bodenheimer T, Casalino L. Executives with white coats – the work and world view of managed-care directors: the second of two parts. N Engl J Med. 1999;341:2029-2032.
[26] Epstein AM. Performance reports on quality – prototypes, problems and prospects. N Engl J Med. 1995;333:57-61.
[27] US Public Health Service. Healthy People 2000: National Health Promotion and Disease Prevention Objectives. Washington, DC: US Dept of Health and Human Services; 1990.
[28] Schneider EC, Riehl V, Courte-Wienecke S, Eddy DM, Sennett C. Enhancing performance measurement: NCQA’s road map for a health information framework. JAMA. 1999;282:1184-1190.
[29] Yusuf S, Wittes J, Friedman L. Overview of results of randomized clinical trials in heart disease, I: treatments following myocardial infarction. JAMA. 1988;260:2088-2093.
[30] Yusuf S, Peto R, Lewis J, Collins R, Sleight P. Beta blockade during and after myocardial infarction: an overview of the randomized trials. Prog Cardiovasc Dis. 1985;27:335-371.
[31] Bodenheimer T. The American health care system – the movement for improved quality in health care. N Engl J Med. 1999;340:488-492.
[32] Hibbard JH, Jewett JJ. Will quality report cards help consumers? Health Aff (Millwood). 1997;16:218-228.
[33] Hibbard JH, Sofaer S, Jewett JJ. Condition-specific performance information: assessing salience, comprehension, and approaches for communication quality. Health Care Financ Rev. 1996;18:95-109.
[34] Chernew M, Scanlon DP. Health plan report cards and insurance choice. Inquiry. 1998;35:9-22.
[35] Tumlinson A, Bottigheimer H, Mahoney P, Stone EM, Hendricks A. Choosing a health plan: what information will consumers use? Health Aff (Millwood). 1997;16:229-238.
[36] Gabel JR, Ginsburg PB, Hunt KA. Small employers and their health benefits, 1988-1996: an awkward adolescence. Health Aff (Millwood). 1997;16:103-110.
[37] Sisk JE. Increasing competition and the quality of health care. Milbank Q. 1998;76:687-707.
[38] Marshall MN, Shekelle PG, Leatherman S, Brook RH. Public release of performance data: what do we expect to gain? a review of the evidence. JAMA. 2000;283:1866-1874.
[39] Schneider EC, Epstein AM. Use of public performance reports: a survey of patients undergoing cardiac surgery. JAMA. 1998;279:1638-1642.
Quality Reporting Dropouts in 1998 and 1999*
1998
Aetna Health Plans of California, Inc. (Loma Linda)
Aetna U.S. Healthcare – Delaware
Aetna U.S. Healthcare – New Jersey
AvMed Health Plan – Ft. Lauderdale Plan Area
AvMed Health Plan – Gainesville Plan Area
AvMed Health Plan – Jacksonville Plan Area
AvMed Health Plan – Miami Plan Area
AvMed Health Plan – Orlando Plan Area
AvMed Health Plan – Tampa Plan Area
BlueChoice
CIGNA HEALTHCARE OF AZ – TUCSON
CIGNA HEALTHCARE OF CLEVELAND
CIGNA HEALTHCARE OF CO – DENVER
CIGNA HEALTHCARE OF CONNECTICUT, INC.
CIGNA HEALTHCARE OF DE – WILMINGTON
CIGNA HEALTHCARE OF FL – JACKSONVILLE
CIGNA HEALTHCARE OF FL – ORLANDO
CIGNA HEALTHCARE OF FL – TAMPA
CIGNA HEALTHCARE OF GA – ATLANTA
CIGNA HEALTHCARE OF IL – CHICAGO
CIGNA HEALTHCARE OF KANSAS CITY, MISSOURI
CIGNA HEALTHCARE OF LOUISIANA – BATON ROUGE
CIGNA HEALTHCARE OF MA – SPRINGFIELD
CIGNA HEALTHCARE OF MID ATLANTIC
CIGNA HEALTHCARE OF NC – RALEIGH
CIGNA HEALTHCARE OF NEW YORK
CIGNA HEALTHCARE OF NO CALIFORNIA – SAN FRANCISCO
CIGNA HEALTHCARE OF NO. LA – SHREVEPORT
CIGNA HEALTHCARE OF NORTH NEW JERSEY
CIGNA HEALTHCARE OF OH – CINCINNATI
CIGNA HEALTHCARE OF OH – COLUMBUS
CIGNA HEALTHCARE OF OK – TULSA
CIGNA HEALTHCARE OF OKLAHOMA
CIGNA HEALTHCARE OF PA – PHILADELPHIA
CIGNA HEALTHCARE OF SAINT LOUIS
CIGNA HEALTHCARE OF SOUTH FLORIDA
CIGNA HEALTHCARE OF SOUTH NEW JERSEY
CIGNA HEALTHCARE OF TN – MEMPHIS
CIGNA HEALTHCARE OF TN – NASHVILLE
CIGNA HEALTHCARE OF TX – DALLAS
CIGNA HEALTHCARE OF TX – HOUSTON
CIGNA HEALTHCARE OF UTAH – SALT LAKE CITY
CIGNA HEALTHCARE OF VIRGINIA
CIGNA HealthCare of Arizona, Inc. – Staff Model
CIGNA HealthCare of Arizona-Private Practice Plan
CIGNA HealthCare of San Diego
CIGNA PRIVATE PRACTICE – LOS ANGELES IPA
ChoiceCare
Companion HealthCare Corp.
DayMed Health Maintenance Plan, Inc.
Delmarva Health Plan, Inc.
Exclusive Healthcare, Inc. – Nevada
Family Health Plan, Inc.
Foundation Health, A California Health Plan
Foundation Health, a Florida Health Plan, Inc.
Health Net
Health Options of Florida, Inc.
HealthPlus – Washington State
Healthsource HMO of New York (Patient’s Choice)
Healthsource Indiana, Inc.
Healthsource North Carolina
Healthsource South Carolina
Humana Health Care Plans – (Arizona)
Humana Health Plan of Ohio, Inc.
Humana Health Plan of Texas (Corpus Christi)
Humana Health Plan of Texas (San Antonio)
Humana Health Plans (Chicago)
Humana Healthcare Plans (Lexington)
Humana Healthcare Plans (Louisville)
Humana Healthcare Plans (Louisville-KPPA)
Humana Medical Plan Inc. (Central Florida)
Humana Medical Plan Inc. (Daytona/Jacksonville)
Humana Medical Plan Inc. (South Florida)
Humana Medical Plan, Inc. (Milwaukee)
Humana Medical Plan, Inc. (Tampa)
Humana Prime Health Plan (Kansas City)
Kaiser Foundation Health Plan, CHP
M.D. Health Plan
MDNY Healthcare, Inc.
Managed Health, Inc.
NYLCare Health Plans of the Southwest, Inc.
NYLCare of the Mid-Atlantic
NYLCare of the New York Region
PACC Health Plans
PCA Health Plans of Florida – North
PCA Health Plans of Florida – South
PCA Health Plans of Florida Central
PCA Health Plans of Texas, Inc.
Prudential Health Care Plan of California, Inc.
Prudential Health Care of Florida, Inc. – S. Florida
Prudential HealthCare – Amarillo
Prudential HealthCare – Arkansas
Prudential HealthCare – Atlanta
Prudential HealthCare – Austin/Central Texas
Prudential HealthCare – Central Florida
Prudential HealthCare – Central Ohio
Prudential HealthCare – Charlotte
Prudential HealthCare – Colorado
Prudential HealthCare – Corpus Christi
Prudential HealthCare – Houston
Prudential HealthCare – Indiana
Prudential HealthCare – Jacksonville
Prudential HealthCare – Kansas City
Prudential HealthCare – Memphis
Prudential HealthCare – Mid Atlantic
Prudential HealthCare – Nashville
Prudential HealthCare – New York, New Jersey, Conn.
Prudential HealthCare – North Texas
Prudential HealthCare – Northern Ohio
Prudential HealthCare – Oklahoma City
Prudential HealthCare – Pennsylvania – Delaware
Prudential HealthCare – Raleigh/Durham
Prudential HealthCare – Richmond
Prudential HealthCare – San Antonio
Prudential HealthCare – SouthWest Ohio/Northern Ke
Prudential HealthCare – St. Louis
Prudential HealthCare – TAC -(Beaumont, C Christi,
Prudential HealthCare – Tampa Bay
Prudential HealthCare – Topeka
Prudential HealthCare – Tulsa
QualMed Philadelphia Health Plan, Inc.
Rush Prudential Health Plans
San Luis Valley HMO
Southern Health Services, Inc.
Trigon Blue Cross Blue Shield
Virginia Mason – Group Health Alliance, Inc.
1999
Aetna U.S. Healthcare – Eastern/Central Pennsylvania
AmeriHealth HMO Delaware
Antero Healthplans
BCWA-Blue Choices, MSCCare
BlueLincs HMO
Community Health Plan of Ohio
Compcare – Wisconsin
Exclusive Healthcare, Inc. (Kansas City)
Exclusive Healthcare, Inc. – Dallas
Group Health Plan, Inc. – St. Louis
Health Alliance Medical Plans, Inc.
Health Network of Colorado Springs, Inc.
Health Partners of the Midwest
HealthAssurance (Eastern Region)
HealthPartners Health Plans Phoenix
HealthPlus, MSC-PrimeCare
Healthsource Massachusetts – CIGNA
HomeTown Health Plan
Independent Health Western New York
InterValley Health Plan
Kaiser Foundation Health Plan of Texas
M-Care
M.D. Health Plan
Mutual of Omaha of South Dakota and Community Hlth
NYLCare of the Gulf Coast
PHN-HMO, Inc.
PHS, Inc. Tristate
Physicians Health Plan of South Michigan, Inc.
Physicians Health Plan of South Michigan, Inc. Plus
Physicians Health Plan of Southwest MI, Inc.
Premier Blue
Presbyterian Health Plan, Inc.
Principal Health Care of Delaware, Inc.
Priority Health – Michigan
QualChoice
SelectCare HMO
Southern Health Plan, Inc. (The Apple Plan)
UHC – Dallas – POS
United HealthCare Choice Plus Plan
United HealthCare Inc. (Oregon)
United HealthCare Insurance Co. (Colorado)
United HealthCare Insurance Co. (NY) – Syracuse
United HealthCare Select +/Choice + (Kansas City)
United HealthCare Select Plus
United HealthCare Select of KY.
United HealthCare of Arizona
United HealthCare of Colorado Choice/Select
United HealthCare of Florida, Inc. (Orlando)
United HealthCare of Georgia, Inc.
United HealthCare of Ohio, Inc. (Cleveland)
United HealthCare of Upstate New York, Inc.
United HealthCare of Utah, Inc.
United Select/Choice (Kansas City)
*All data from the National Committee for Quality Assurance’s Quality Compass and Link-File databases.