
ORIGINAL RESEARCH article

Front. Public Health, 09 June 2020
Sec. Public Health Policy
This article is part of the Research Topic Hospitals’ Benefit to the Community: Research, Policy and Evaluation

Volatility and Persistence of Value-Based Purchasing Adjustments: A Challenge to Integrating Population Health and Community Benefit Into Business Operations

  • 1Department of Health Services Management, Rush University, Chicago, IL, United States
  • 2Department of Health Policy and Management, University of Pittsburgh, Pittsburgh, PA, United States
  • 3Department of Health Management and Policy, Saint Louis University, Saint Louis, MO, United States
  • 4Department of Internal Medicine, Rush University, Chicago, IL, United States
  • 5Bryan Memorial Hospital, Lincoln, NE, United States

With the passage of the Deficit Reduction Act of 2005 and the Patient Protection and Affordable Care Act in 2010, Medicare's Inpatient Prospective Payment System (IPPS) began a transition to value-based purchasing (VBP) that rewards or penalizes hospitals based on patient satisfaction, clinical processes of care, outcomes, and efficiency metrics. However, hospital-level volatility vs. persistence in value-based payments year-over-year could result in unpredictable cash flows that negatively influence investment behavior, drive underinvestment in community benefit/population health management initiatives, and make management of the factors that drive the VBP adjustment more challenging. To evaluate the volatility and persistence of hospital VBP adjustments, we examine the VBP adjustments and associated domain scores for the 2,547 hospitals that participated in the program from 2013 to 2016. The sample includes urban (74%), teaching (29.1%), system-affiliated (46.5%), and not-for-profit (63.6%) facilities. Volatility was measured using basic descriptive statistics, relative risk ratios, and a fixed-effect, autoregressive dynamic panel model with standard errors clustered at the facility level. There is substantial change in a given facility's total VBP score, with an average standard deviation of 10.74 (on a 100-point scale), driven by significant volatility in all domains but particularly in the efficiency and outcomes metrics. Relative risk ratios have dropped substantially over the life of the program, and there is low persistence of VBP scores from one period to the next. Findings indicate that hospitals that receive a positive adjustment in 1 year are almost as likely to receive a negative adjustment as a positive adjustment the following year. Furthermore, using a fixed-effect dynamic panel model that controls for autocorrelation, we find that only 13.5% of a facility's prior year IPPS adjustment (positive or negative) carries forward to the next year. The low persistence makes investment in population health management and community benefit more challenging.

Introduction

National healthcare expenditures have grown from $146 per person in 1960 to $11,172 per person in 2018. Over the same period, the percentage of the gross domestic product (GDP) devoted to healthcare has grown from 5% to 17.7%. The Centers for Medicare and Medicaid Services (CMS) projects that healthcare spending will continue to grow at rates that outstrip projected inflation, reaching just under $17,000 per person (almost 20% of GDP) within the next 7 years (1).

Payers, both public and private, have responded to this expense growth by altering incentives, manipulating benefits, increasing cost sharing, and limiting provider networks, all in an attempt to constrain risk and expense growth rates. More recently, there has been a movement toward accountable care organizations, shared savings programs, and value-based payments. There is also increasing attention being paid to community benefit reporting and the promise of community benefit and population health management (2, 3).

Not a new concept, population health focuses on “the health of a population as measured by health status indicators and as influenced by social, economic, and physical environments, personal health practices, individual capacity and coping skills, human biology, early childhood development, and health services. As an approach, population health focuses on interrelated conditions and factors that influence the health of populations over the life course, identifies systematic variations in the patterns of occurrence, and applies the resulting knowledge to develop and implement policies and actions to improve the health and well-being of those populations” (4). The interplay between the social determinants of health, the larger environment, and population health is well-documented by Kindig and Stoddart (5), McAlearney (6), the World Health Organization, and a host of more recent research (7). What is not clear is how healthcare payers and systems can successfully initiate and sustain population health efforts while enhancing long-term viability and support within the new payment framework.

The Centers for Medicare and Medicaid Services (CMS) is moving more Medicare fee-for-service (FFS) reimbursements to an alternative payment model (APM) basis and aims to grow that percentage to 50% (8, 9). To date, CMS has instituted programs that identify and disseminate best practices, established bundled payments for comprehensive episodes of care, held providers responsible for total cost of care and overall quality, and established pay-for-performance (P4P) rewards and penalties for provider performance relative to preset metrics. With the passage of the Deficit Reduction Act of 2005 and the Patient Protection and Affordable Care Act in 2010, Medicare's Inpatient Prospective Payment System (IPPS) began an APM transition to value-based purchasing (VBP). The VBP legislation and subsequent CMS rules are intended to move hospitals from a payment system in which facilities are financially rewarded for volume to a P4P system that accounts for patient experiences, adherence to predetermined clinical protocols, health outcomes, and cost efficiency in the delivery of care.

Though voluntary in 2012, participation in the VBP program became mandatory in 2013 for hospitals receiving IPPS payments and meeting the minimum number of cases, surveys, or measures required to calculate the adjustment (psychiatric, rehabilitation, long-term care, children's, and cancer hospitals are exempt). The program works by adjusting the Diagnosis-Related Group (DRG) base rate up or down relative to performance on predetermined measures. Adjustments started at ±1% of IPPS hospital-specific base rates in 2013 and have increased by 0.25 percentage points per year. In 2017, the program put up to 2% of the Medicare IPPS at risk, the maximum at-risk percentage for the program. Because the program is revenue neutral, increases in the hospital base rate are equally offset by decreases at other hospitals, with the average adjustment centered on zero. Those facilities that deviate the most positively or negatively from the mean receive the largest positive and negative IPPS adjustments.
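As a rough illustration of these mechanics, the Python sketch below encodes the at-risk schedule described above and applies it to a hypothetical base payment; the dollar figure and the hospital's performance share are illustrative assumptions, not CMS data.

```python
# Illustrative sketch of the VBP at-risk schedule described above.
# Only the schedule (±1% in 2013, growing 0.25 percentage points per year
# to a 2% cap in 2017) comes from the text; the payment amount is hypothetical.

def max_at_risk_pct(year: int) -> float:
    """Maximum share (in percent) of the IPPS base rate at risk under VBP in a given year."""
    return min(1.0 + 0.25 * (year - 2013), 2.0)

def adjusted_payment(base_drg_payment: float, performance_factor: float, year: int) -> float:
    """Apply a VBP adjustment.

    performance_factor is the hospital's realized share of the maximum
    adjustment, between -1 (full penalty) and +1 (full reward).
    """
    adjustment_pct = performance_factor * max_at_risk_pct(year) / 100.0
    return base_drg_payment * (1.0 + adjustment_pct)

# A hypothetical hospital earning 60% of the maximum upward adjustment in 2014
# on a $10,000 base DRG payment:
print(max_at_risk_pct(2014))                 # 1.25 (% at risk in 2014)
print(adjusted_payment(10_000, 0.60, 2014))  # ≈ 10075.0 (a +0.75% adjustment)
```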

In an environment where profit margins are already thin, ranging from 3 to 5% depending on hospital ownership, location, and teaching status (10, 11), fluctuations in the IPPS can have a direct and immediate impact. Moreover, the impact of the Medicare changes can then be compounded by commercial payers who tend to use Medicare payments and associated adjustments as a baseline for contractual language and payments.

While some prior work has examined the magnitude of the VBP adjustments and the associated relationships with quality and hospital profitability (12–14), this article attempts to quantify the volatility and persistence of VBP adjustments for participating facilities in the early years of the program. Previous research has addressed quality outcomes of the VBP program and the magnitude of the VBP adjustments; however, this study is the first to investigate the volatility and persistence of VBP payments since the inception of the program. Volatility in P4P payments results in unpredictable cash flows that negatively influence investment behavior (15, 16) and make management of the factors that drive the VBP adjustment more challenging (17, 18). In the balance of this article, we provide a brief review of the VBP/P4P literature before presenting: (1) an examination of the volatility of VBP adjustments as well as the components that influence the composite score, (2) a calculated measure of persistence that quantifies how much of a facility's prior year adjustment carries forward to the next year, and (3) a discussion of the implications of payment volatility and persistence for community benefit and population health.

Literature Review

Prior literature on earnings persistence within the healthcare sector is largely absent. Outside of the sector, substantial efforts have been devoted to the relationship between measures of persistence and methods of improving security pricing (19–21), the negative impacts of earnings volatility on investment behavior (15, 16), the higher costs of capital and lower capital investment associated with low earnings persistence (22), and the impact on accounting accruals (23).

The prior research on the effects of P4P programs, including systematic reviews, is more robust, but the findings are mixed (24). Some studies find no difference in health outcomes (25), whereas others have documented improvements in composite measures of quality (26). More recently, an evaluation of the Quality Incentive Program, the Medicare VBP program associated with end-stage renal disease, notes substantial improvement in clinical process measures (27). Briesacher et al. (28) also found that P4P increased access and improved outcomes in nursing facilities but increased costs. Several survey studies have shown P4P initiatives to be cost effective; however, the associated interventions have tended to be narrowly focused. Among the more narrowly defined P4P initiatives, Armour and Pitts found that physician bonuses/withholds reduced outpatient expenditures by 5% (29). Existing literature shows that the cost-effectiveness of a program appears to depend on the design of the interventions and incentives (30).

Despite the potential for adjustments of up to 1.75% in 2016, early evaluations of the VBP adjustment indicate that over 74% of hospitals nationally experience a change in IPPS reimbursement of <0.50% (12, 13, 31). Financially, earlier work did not find a significant relationship between VBP adjustments and facility profitability in the early years of the program, and there was no apparent change in quality of care (14, 32). More recently, Ryan et al. (33) found that there was no significant relationship between the aggregate VBP adjustment and improvements in patient experience or quality of care metrics. In some cases, favorable VBP adjustments reflect poor performance on metrics that are costly to improve being offset by savings in expense-related metrics (34). There has also been no relationship between bond rating and the factors that influence the VBP adjustment, with the exception of Medicare spending per beneficiary (MSPB) (35). From a bond perspective, Rangnekar et al. (35) found that high levels of MSPB, which result in downward VBP adjustments, are associated with favorable bond ratings, which decrease the borrowing costs for facilities. Ironically, hospitals that operate more efficiently to improve their VBP adjustments will hurt their bond ratings in the process, resulting in a reduced ability to secure funding for furtherance of the organizational mission. Not surprisingly, hospitals that are affiliated with systems, are able to learn from others, and have a degree of market control achieve better VBP adjustments than their counterparts (36). There also appears to be a significant and negative relationship between particular hospital lines of business and the hospital VBP adjustment; trauma-certified hospitals consistently score worse on VBP metrics (37).

Overview of P4P Payment Incentives for Hospitals

Unlike some prior P4P payment incentives that employ more targeted performance metrics and incentives, the VBP adjustment to the IPPS utilizes a wide variety of factors. In 2013, the first year of the program, the VBP adjustment was driven only by patient satisfaction and clinical process scores, with 70% of the adjustment driven by the clinical process score. As the program matured, outcomes and efficiency metrics were added to the overall calculation and accounted for 40 and 25% of the overall adjustment, respectively. As detailed in Table 1, the 2013–2016 adjustments fall into four domains: person and community engagement, clinical processes of care, safety, and efficiency, each scored on a 0–100 scale (facilities at the top of the distribution receive a score of 100 and those at the bottom receive a score of 0). The 2016 program split 23 distinct measures across all four domains. The content of these VBP domains is highly varied, ranging from patient satisfaction with nurse communication and the cleanliness of the facility to central line–associated bloodstream infections and spending per Medicare beneficiary^1.

Table 1. Value-based purchasing domains, measures, and weighting 2013–2016.
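To make the weighting concrete, here is a minimal Python sketch of how unweighted 0–100 domain scores could combine into a weighted total performance score. Only the outcomes (40%) and efficiency (25%) weights come from the text above; the split of the remaining 35% and the example domain scores are illustrative assumptions rather than values taken from Table 1.

```python
# Sketch of how 0-100 domain scores combine into a weighted total performance score.
domain_weights_2016 = {
    "person_community_engagement": 0.25,  # illustrative assumption
    "clinical_process": 0.10,             # illustrative assumption
    "outcomes": 0.40,                     # stated in the text
    "efficiency": 0.25,                   # stated in the text
}

def total_vbp_score(unweighted_scores: dict, weights: dict) -> float:
    """Weight each 0-100 domain score and sum to a 0-100 composite."""
    return sum(weights[d] * unweighted_scores[d] for d in weights)

example_scores = {  # hypothetical hospital, unweighted 0-100 score per domain
    "person_community_engagement": 55,
    "clinical_process": 70,
    "outcomes": 40,
    "efficiency": 20,
}
print(total_vbp_score(example_scores, domain_weights_2016))  # ≈ 41.75
```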

Methods

Data

VBP adjustments and the respective unweighted domain scores for all hospitals in the United States were gathered from CMS for years 2013–2016. For descriptive and analytic purposes, hospital characteristics were gathered from HCRIS data (Hospital Form 2552-10) and matched on the unique provider identification. Inclusion criteria required VBP adjustments for all 4 years of the program and resulted in 2,547 hospitals and 7,641 observations. Hospital characteristics of those in the analysis are included in Table 2 and include 742 teaching facilities, 1,184 facilities with system affiliations, 1,620 not-for-profit facilities, and 1,886 urban facilities. The analysis is limited to the 2013–2016 time frame due to reporting changes instituted by CMS. In more recent years, CMS substantively changed how domain measures were shared such that weighted and unweighted composite scores by domain were discontinued and replaced with a metric-specific number between 1 and 10. One of this study's limitations is that the newer scores do not carry the same resolution as the prior scale (0–100), are not aggregated at the domain level, and, as a result, limit the study's framework to the earlier years of the program.

Table 2. Hospital sample composition (n = 2,547).

Measures

The volatility of the IPPS adjustments was measured as the standard deviation and the relative standard deviation (standard deviation/mean) of each hospital's final VBP score (scored 0–100) prior to a financial adjustment being tied to that score. The process was repeated for the associated VBP domains over the 2013–2016 time frame as long as the domain contributed to the final IPPS adjustment. Since unweighted domain scores are based on a 0–100 percentile achievement for all years in the study, they did not require standardization. However, the financial VBP adjustments increased from ±1 to ±1.75% over the 2013–2016 time frame and required standardization. To account for the change, every facility adjustment was divided by the total potential adjustment in the respective year, yielding a standardized adjustment between −100 and 100% for every year in the sample. For example, a facility that received an upward adjustment of 0.75% in 2014 would receive a standardized score of 60% (0.75/1.25); that is, the facility captured 60% of the maximum potential upward adjustment for that period.
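A minimal sketch of this standardization, assuming a long-format table of raw adjustments with hypothetical column names and toy values:

```python
import pandas as pd

# Each facility's actual VBP adjustment is divided by the maximum possible
# adjustment for that year, yielding a standardized score between -1 and +1
# (i.e., -100% to +100% of the potential adjustment).
max_adjustment = {2013: 1.00, 2014: 1.25, 2015: 1.50, 2016: 1.75}  # % at risk by year

df = pd.DataFrame({
    "provider_id": ["A", "A", "B", "B"],
    "year": [2014, 2015, 2014, 2015],
    "vbp_adjustment_pct": [0.75, -0.30, 0.10, 0.90],  # raw IPPS adjustment, in %
})

df["standardized_adjustment"] = df["vbp_adjustment_pct"] / df["year"].map(max_adjustment)
# A +0.75% adjustment in 2014 maps to 0.75 / 1.25 = 0.60 (60% of the maximum).
print(df)

# Facility-level volatility of the total VBP score is computed analogously:
# the standard deviation (and SD/mean) of each hospital's 0-100 score across years.
```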

Relative risk measures were also calculated for the 2013–2014, the 2014–2015, and the 2015–2016 time frames as an additional volatility metric. These metrics measure the relative risk of receiving a positive VBP adjustment where receiving a positive adjustment in the prior year is treated as the exposure. All measures of volatility are presented in Table 3.
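The relative risk calculation could be sketched as follows, treating a positive prior-year adjustment as the exposure; the toy data and column names are hypothetical, and the real analysis uses the 2013–2016 CMS adjustments.

```python
import pandas as pd

def relative_risk(df: pd.DataFrame, prior_year: int, current_year: int) -> float:
    """Risk ratio of a positive adjustment, with a positive prior-year adjustment as exposure."""
    prior = df[df["year"] == prior_year].set_index("provider_id")["positive"]
    curr = df[df["year"] == current_year].set_index("provider_id")["positive"]
    exposed = curr[prior[prior].index]      # hospitals positive in the prior year
    unexposed = curr[prior[~prior].index]   # hospitals not positive in the prior year
    return exposed.mean() / unexposed.mean()

# Toy example: four hospitals observed in two consecutive years.
toy = pd.DataFrame({
    "provider_id": ["A", "B", "C", "D"] * 2,
    "year": [2013] * 4 + [2014] * 4,
    "positive": [True, True, False, False, True, False, True, False],
})
print(relative_risk(toy, 2013, 2014))  # (1/2) / (1/2) = 1.0 in this toy data
```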

\text{Standardized Adjustment}_{it} = \text{Intercept} + \beta_1\,\text{Standardized Adjustment}_{i(t-1)} + \boldsymbol{\beta}_n\,(\text{vector of time-invariant hospital characteristics})_i + \varepsilon_{it} \quad (1)

Persistence of the VBP adjustment was measured as the β1 coefficient associated with a lagged, standardized VBP adjustment in a time series analysis (a dynamic panel model with maximum likelihood estimation) where time-invariant hospital characteristics are fixed (Equation 1). The standardized adjustment for a given period t and facility i serves as the dependent variable. The standardized score from the hospital's prior period (t-1) and a vector of time-invariant, hospital-specific characteristics serve as the independent variables. Standard errors were clustered at the facility level to adjust for within-facility correlations after Durbin–Watson tests indicated some small autocorrelations (38). The fixed effect and between group analysis of persistence is presented in Table 4. The analysis was also repeated with the unique provider identification serving as the fixed effect and found no differences in the estimates or standard errors.
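A simplified sketch of Equation (1) follows. It approximates the dynamic panel estimation with an OLS regression that includes hospital fixed effects and facility-clustered standard errors; the synthetic data and variable names are illustrative, and this is not the authors' exact maximum likelihood estimator.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Build a small synthetic panel: 50 hospitals x 4 years of standardized adjustments.
rng = np.random.default_rng(0)
hospitals = range(50)
years = [2013, 2014, 2015, 2016]
panel = pd.DataFrame(
    [(h, y, rng.uniform(-1, 1)) for h in hospitals for y in years],
    columns=["provider_id", "year", "std_adj"],
)

# Lag the standardized adjustment within each hospital (the regressor in Equation 1).
panel = panel.sort_values(["provider_id", "year"])
panel["lag_std_adj"] = panel.groupby("provider_id")["std_adj"].shift(1)
panel = panel.dropna(subset=["lag_std_adj"])

# Hospital fixed effects absorb time-invariant characteristics (system affiliation,
# urban, teaching, ownership), which the paper examines separately in the
# between-group analysis reported in Table 4.
fit = smf.ols("std_adj ~ lag_std_adj + C(provider_id)", data=panel).fit(
    cov_type="cluster", cov_kwds={"groups": panel["provider_id"]}
)
print(fit.params["lag_std_adj"])  # persistence estimate (the paper reports ~0.135 on real data)
```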

Table 3. Relative risk and average standard deviation of hospital total and domain scores.

Table 4. Dynamic panel regression with fixed effects.

Results

The VBP adjustments are, by design, closely centered on zero, with an average of 51.3% of facilities receiving a positive adjustment between 2013 and 2016. Over the 4 years of the study, facilities have experienced substantial variation in their total VBP score. The average facility has a standard deviation of 10.74 and an associated relative standard deviation of 0.2438. Although each domain has variation that is not inconsequential, the deviations in total score appear to be driven by volatility in the efficiency and outcomes domains, which have standard deviations of 22.12 and 16.11, respectively.

Relative risk calculations also indicate substantial variation in VBP scores and the associated payment adjustments. Facilities that received a positive adjustment in 2013 were 3.159 times as likely as other facilities to receive a positive adjustment in 2014. That positive association greatly attenuated over the subsequent years. Between 2014 and 2015, the same calculation yielded a relative risk of 1.499. By 2015–2016, a relative risk of 1.0118 indicates that facilities that received a positive adjustment in the prior year were almost as likely to receive an IPPS penalty as a reward.

The persistence measure associated with VBP adjustments reinforces the volatility metrics. In the fixed-effects model, where hospital characteristics and autocorrelation are controlled for, a β1 of 1 would indicate that a facility received the same standardized adjustment in the current year that it received in the prior year. In short, persistence would be high since hospitals would receive the same standardized VBP adjustment from one period to the next. The prior year adjustment is a significant predictor of current adjustments (P < 0.0001), and our model estimated a β1 parameter of 0.135. On average, only 13.5% of a hospital's prior VBP financial penalty or reward carries forward to the next period.

When examined on a between-group basis, all of the time-invariant characteristics are significant predictors of the persistence of the VBP adjustment. Facilities that are affiliated with a system, designated as urban, or classified as teaching institutions maintain, on average, slightly less of their VBP adjustments than their peers. Not-for-profit firms maintain slightly more of their VBP adjustment. While all characteristics are significant at the <0.01 level, the differences are not operationally significant. The largest between-group difference is between urban and non-urban facilities, where the parameter estimate is −0.0646. To put this in context, a −0.0646 parameter estimate indicates that urban facilities maintain roughly one-tenth of one percentage point (−0.0646 × potential adjustment of 0.0175 in 2016 = −0.0011) less of their VBP adjustment than non-urban facilities.

Discussion

Hospitals participating in the VBP program have experienced significant volatility in their total VBP score and a lack of persistence in the associated IPPS adjustments. Hospitals that receive a positive adjustment in 1 year are now almost as likely to receive a penalty in the next. The lack of consistency from one period to another makes both financial planning and process management more challenging.

As discussed in earlier works (14, 39), because the VBP adjustment is designed to be revenue neutral with adjustments centered on zero, it is difficult for hospitals to differentiate themselves from others participating in the system. To maximize their IPPS payment, facilities must achieve significantly better outcomes, patient satisfaction, and adherence to clinical processes at a lower cost per beneficiary. While above-average achievement in multiple domains is possible, regression to the mean and/or above-average performance in one domain offset by below-average performance in another results in relatively tight clustering around zero (32), with over 74% of facilities receiving a bonus or penalty of <0.50%.

The costs of compliance and metric improvement are also important to note. CMS expanded the VBP program to include 23 separate metrics and added a new safety domain (weighted at 20% of the overall score) in 2017. As metrics and domains are added, there are at least two direct impacts on hospitals. First, the relative weight of any given metric and the impact it can have on the overall IPPS adjustment diminish. The financial weight and resulting adjustments are spread over more domains and metrics. Second, as items are added to the evaluation protocol, hospitals must implement methods of tracking and improving those metrics^2. Given the volatility of the scores and adjustments, the investment to report and improve may not generate a return or result in improved quality.

It is not clear that each domain and metric reinforce each other. For example, to score well on the MSPB metric, a facility would be interested in cost control and utilization management tools. As suggested by Das et al. (34), those cost control or utilization management tools may make clinical staff less available for patient interaction and drive down satisfaction scores. It is also conceivable that cost considerations could influence other domains.

From a community benefit perspective, the “two-canoe” problem of population health initiatives becomes more pronounced under the hospital VBP methodology. Providers have one foot in a canoe operating in the traditional volume-based system, which incentivizes providing more frequent and more expensive care, while the other foot is attempting to occupy a canoe operating in a value-based environment, where there are efforts to constrain costs and reduce care that has little to no marginal benefit (40). As long as payment systems deploy diverging incentives with low persistence and high volatility, health systems and providers will have a difficult time investing in community benefit and population health management. The findings of this study, coupled with the lack of a significant relationship between quality improvements and VBP adjustments (33), seem to support the diverging incentives put forth by others.

There is also a “wrong pocket” problem: a mismatch between those who invest and those who accrue the benefits of improved health. Providers and facilities may make significant investments of time and money to improve the health and well-being of the communities they serve, but those benefits do not necessarily accrue to the investing providers. Efforts may be effective and even result in fewer patient visits or reduced facility occupancy, which has a negative impact on the bottom line. Not only are healthcare providers incurring community benefit expenses, but they are also negatively impacted by a reduction in volume and frequency. It is also possible that the benefits of population health initiatives accrue to competing or nearby providers that did not make the investment.

In the case of a shared savings program, the reduction in volume may be offset by payouts from the program. However, the programs themselves transfer significant health status risk (the probability of falling ill and needing care) and medical care risk (the cost of providing care) without an associated risk premium. Of the 32 Pioneer ACOs that pursued shared savings programs with CMS, only 8 remain, with only 6 receiving a positive payout at an average rate of 1.37% of benchmark expenditures (41). There appears to be a mismatch between the upside of these programs and the risk being borne.

Conclusion

Although there was great interest in the initial years of the VBP program, each year has offered little financial reward for provider organizations (60% of hospitals won or lost <0.5%) (42), and what little opportunity exists comes at significant cost. Even for providers who earn them, VBP payments are not consistent, leaving organizations unable to plan on receiving them over time and thereby hampering strategic planning and future investment decisions. Moving forward, additional research should investigate the relationships between hospital characteristics and quality metrics. Specific attention should be paid to the impact of cost control metrics (MSPB) on patient satisfaction, adherence to clinical guidelines, and outcomes. As the program continues to mature and administrations reevaluate current healthcare legislation, thought should also be given to: (1) increasing the financial incentive to influence behavior, (2) moving to a forced distribution of the VBP adjustment such that more facility revenue is at risk for poor performance, and/or (3) narrowing the number of metrics included in the program to focus facility efforts. An alternative method of improving safety and quality may be a more targeted approach that sets facility-specific performance targets (43).

Payment adjustments continue to be tightly distributed around zero, with the majority of hospitals receiving an adjustment of less than half of a percentage point. This means that facilities may make investments in population health or value-based care but not realize any downstream payments from CMS. The resulting volatility of cash flows discourages investments to improve population health, community health, and value metrics.

Data Availability Statement

The datasets generated for this study may be made available upon request to the corresponding author.

Author Contributions

All authors listed have made a substantial, direct and intellectual contribution to the work, and approved it for publication.

Conflict of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Footnotes

1. ^The hospital VBP program has continued to evolve and change. The 2020 domains largely remain the same, with slight nomenclature changes. The number of metrics within those domains has been reduced to 20 and includes: catheter-associated urinary tract infection, central line-associated bloodstream infection, C. difficile infection, methicillin-resistant Staphylococcus aureus bacteremia, elective delivery prior to 39 completed weeks gestation, surgical site infection for colon surgery and abdominal hysterectomy, AMI 30-day mortality, heart failure 30-day mortality, pneumonia 30-day mortality, total hip arthroplasty and/or total knee arthroplasty, Medicare spending per beneficiary, communication with nurses, communication with doctors, responsiveness of hospital staff, communication about medicines, hospital cleanliness and quietness, discharge information, three-item care transition, and overall rating of the hospital.

2. ^It is important to note that not only have some metrics within domains changed but the weighting of each domain has continued to evolve. In 2020, the impact of each of the four domains on the hospital VBP adjustment is equally weighted.

References

1. Centers for Medicare and Medicaid Services. National Health Expenditure Data (2020). Available online at: https://www.cms.gov/Research-Statistics-Data-and-Systems/Statistics-Trends-and-Reports/NationalHealthExpendData (accessed June 2, 2020).

2. Internal Revenue Service. 2009 Instructions for Schedule H (Form 990) (2009). Available online at: http://www.irs.gov/pub/irs-pdf/i990sh.pdf

3. Internal Revenue Service. IRS Exempt Organizations Hospital Compliance Project Final Report. Available online at: www.irs.gov/pub/irs-tege/frepthospproj.pdf

4. Government of Canada. What is the Population Health Approach? (2012). Available online at: https://www.canada.ca/en/public-health/services/health-promotion/population-health/population-health-approach.html (accessed March 3, 2019).

5. Kindig D, Stoddart G. What is population health? Am J Public Health. (2003) 93:380–3. doi: 10.2105/ajph.93.3.380

6. McAlearney AS. Population Health Management: Strategies to Improve Outcomes. Chicago, IL: Health Administration Press (2003).

7. WHO Commission on Social Determinants of Health. Closing the Gap in a Generation: Health Equity Through Action on the Social Determinants of Health. Geneva: World Health Organization (2008).

8. CMS.gov. Better Care. Smarter Spending. Healthier People: Paying Providers for Value, Not Volume (2015). Available online at: https://www.cms.gov/Newsroom/MediaReleaseDatabase/Fact-sheets/2015-Fact-sheets-items/2015-01-26-3.html (accessed November 9, 2016).

9. Rajkumar R, Conway PH, Tavenner M. CMS—engaging multiple payers in payment reform. JAMA. (2014) 311:1967–8. doi: 10.1001/jama.2014.3703

10. Kutscher B. Fewer Hospitals Have Positive Margins as They Face Financial Squeeze. Modern Healthcare (2014).

11. Turner J, Broom K, Elliott M, Lee J-F. A decomposition of hospital profitability: an application of DuPont analysis to the US market. Health Serv Res Manag Epidemiol. (2015) 2:1–10. doi: 10.1177/2333392815590397

12. Centers for Medicare and Medicaid Services. Hospital Compare (2015). Available online at: http://www.medicare.gov/hospitalcompare/search.html (accessed June 28, 2015).

13. Werner RM, Dudley RA. Medicare's new hospital value-based purchasing program is likely to have only a small impact on hospital payments. Health Aff. (2012) 31:1932–40. doi: 10.1377/hlthaff.2011.0990

14. Turner JS, Broom KD, Counte MA. Is there a relationship between value-based purchasing and hospital profitability? An exploratory study of Missouri hospitals. Health Serv Res Manag Epidemiol. (2015) 2:1–11. doi: 10.1177/2333392815606096

15. Lamont O. Cash flow and investment: evidence from internal capital markets. J Finance. (1997) 52:83–109. doi: 10.1111/j.1540-6261.1997.tb03809.x

16. Minton BA, Schrand C. The impact of cash flow volatility on discretionary investment and the costs of debt and equity financing. J Financ Econ. (1999) 54:423–60. doi: 10.1016/S0304-405X(99)00042-2

17. Anderson EG, Fine CH, Parker GG. Upstream volatility in the supply chain: the machine tool industry as a case study. Prod Operat Manag. (2000) 9:239–61. doi: 10.1111/j.1937-5956.2000.tb00136.x

18. Crum M, Poist R, Christopher M, Holweg M. "Supply Chain 2.0": managing supply chains in the era of turbulence. Int J Phys Distrib Log Manag. (2011) 41:63–82. doi: 10.1108/09600031111101439

19. Anthony JH, Ramesh K. Association between accounting performance measures and stock prices: a test of the life cycle hypothesis. J Account Econ. (1992) 15:203–27. doi: 10.1016/0165-4101(92)90018-W

20. Kormendi R, Lipe R. Earnings innovations, earnings persistence, and stock returns. J Business. (1987) 60:323–45. doi: 10.1086/296400

21. Richardson SA, Sloan RG, Soliman MT, Tuna I. Accrual reliability, earnings persistence and stock prices. J Account Econ. (2005) 39:437–85. doi: 10.1016/j.jacceco.2005.04.005

22. Francis J, LaFond R, Olsson PM, Schipper K. Costs of equity and earnings attributes. Account Rev. (2004) 79:967–1010. doi: 10.2308/accr.2004.79.4.967

23. Sloan RG. Do stock prices fully reflect information in accruals and cash flows about future earnings? Account Rev. (1996) 71:289–315.

24. Eijkenaar F, Emmert M, Scheppach M, Schoffski O. Effects of pay for performance in health care: a systematic review of systematic reviews. Health Policy. (2013) 110:115–30. doi: 10.1016/j.healthpol.2013.01.008

25. Jha AK, Joynt KE, Orav EJ, Epstein AM. The long-term effect of premier pay for performance on patient outcomes. N Engl J Med. (2012) 366:1606–15. doi: 10.1056/NEJMsa1112351

26. Lindenauer PK, Remus D, Roman S, Rothberg MB, Benjamin EM, Ma A, et al. Public reporting and pay for performance in hospital quality improvement. N Engl J Med. (2007) 356:486–96. doi: 10.1056/NEJMsa064964

27. Centers for Medicare and Medicaid Services. National Impact Assessment of Medicare Quality Measures. Baltimore, MD: Centers for Medicare and Medicaid Services (2012).

28. Briesacher BA, Field TS, Baril J, Gurwitz JH. Pay-for-performance in nursing homes. Health Care Financ Rev. (2009) 30:1–13.

29. Armour BS, Pitts MM. Physician financial incentives in managed care. Dis Manag Health Outcomes. (2003) 11:139–47. doi: 10.2165/00115677-200311030-00001

30. Van Herck P, De Smedt D, Annemans L, Remmen R, Rosenthal MB, Sermeus W. Systematic review: effects, design choices, and context of pay-for-performance in health care. BMC Health Serv Res. (2010) 10:247. doi: 10.1186/1472-6963-10-247

31. Kahn CN, Ault T, Potetz L, Walke T, Chambers JH, Burch S. Assessing Medicare's hospital pay-for-performance programs and whether they are achieving their goals. Health Aff. (2015) 34:1281–8. doi: 10.1377/hlthaff.2015.0158

32. US Government Accountability Office. Hospital Value-Based Purchasing: Initial Results Show Modest Effects on Medicare Payments and No Apparent Change in Quality-of-Care Trends (GAO-16-9). Washington, DC: USGAO (2015).

33. Ryan AM, Krinsky S, Maurer KA, Dimick JB. Changes in hospital quality associated with hospital value-based purchasing. N Engl J Med. (2017) 376:2358–66. doi: 10.1056/NEJMsa1613412

34. Das A, Norton EC, Miller DC, Ryan AM, Birkmeyer JD, Chen LM. Adding a spending metric to Medicare's value-based purchasing program rewarded low-quality hospitals. Health Aff. (2016) 35:898–906. doi: 10.1377/hlthaff.2015.1190

35. Rangnekar A, Johnson T, Garman A, O'Neil P. The relationship between hospital value-based purchasing program scores and hospital bond ratings. J Healthcare Manag. (2015) 60:220–31. doi: 10.1097/00115514-201505000-00011

36. Spaulding A, Edwardson N, Zhao M. Hospital value-based purchasing performance: do organizational and market characteristics matter? J Healthcare Manag. (2018) 63:31–48. doi: 10.1097/JHM-D-16-00015

37. Spaulding A, Hamadi H, Martinez L, Martin TJ, Purnell JM, Zhao M. Hospital value-based purchasing and trauma-certified hospital performance. J Healthcare Qual. (2019) 41:39–48. doi: 10.1097/JHQ.0000000000000147

38. Maas CJM, Hox JJ. Robustness issues in multilevel regression analysis. Stat Neerl. (2004) 58:127–37. doi: 10.1046/j.0039-0402.2003.00252.x

39. Miller HD. Making value-based payment work for academic health centers. Acad Med. (2015) 90:1294–7. doi: 10.1097/ACM.0000000000000864

40. Wasfy JH, Ferris TG. The business case for population health management. Prim Care. (2019) 46:623–9. doi: 10.1016/j.pop.2019.07.003

41. Centers for Medicare and Medicaid Services. Advance Payment ACO Model (2020). Available online at: https://innovation.cms.gov/initiatives/Advance-Payment-ACO-Model/ (accessed May 20, 2020).

42. Morse S. CMS Will Pay $1.9 Billion to Hospitals in Value-Based Payments for Inpatient Care (2020). Available online at: https://www.healthcarefinancenews.com/news/cms-will-pay-19-billion-hospitals-value-based-payments-inpatient-care

43. Unruh L, Hofler R. Predictors of gaps in patient safety and quality in U.S. hospitals. Health Serv Res. (2016) 51:2258–81. doi: 10.1111/1475-6773.12468

Keywords: value-based payment, population health, community benefit, healthcare financing, payment methodologies

Citation: Turner JS, Broom KD, Johnston KJ, Howard SW, Freeman SL and Englund T (2020) Volatility and Persistence of Value-Based Purchasing Adjustments: A Challenge to Integrating Population Health and Community Benefit Into Business Operations. Front. Public Health 8:165. doi: 10.3389/fpubh.2020.00165

Received: 06 February 2020; Accepted: 17 April 2020;
Published: 09 June 2020.

Edited by:

Simone Rauscher Singh, University of Michigan, United States

Reviewed by:

Mei Zhao, University of North Florida, United States
William Edson Aaronson, Temple University, United States

Copyright © 2020 Turner, Broom, Johnston, Howard, Freeman and Englund. This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

*Correspondence: Jason S. Turner, jason_s_turner@rush.edu

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.