The Costly Effects of an Outdated Organ Donation System

OPOs and Quality Assurance and Performance Improvement Plans

Introduction

The Centers for Medicare and Medicaid Services (CMS) require all Organ Procurement Organizations (OPOs) to produce Quality Assurance and Performance Improvement (QAPI) plans. These plans are intended to be a “comprehensive, data-driven [program] designed to monitor and evaluate performance of all donation services,” and be used to “demonstrate performance improvement.”1

While all OPOs are required to have QAPIs, the majority of OPOs still fail to meet performance standards set by CMS, which calls the efficacy of QAPIs into question. This report explores opportunities for CMS to dramatically strengthen its QAPI process.

In practice, the use of QAPIs has historically been hampered by severe data issues at OPOs, including a well-documented reliance on self-reporting, lack of transparency, and lack of oversight by either CMS or the United Network for Organ Sharing (UNOS), which currently operates as the Organ Procurement and Transplant Network (OPTN) contractor.

However, QAPIs anchored in objective data and a transparent public accounting of relative performance and quality of clinical care have the potential to drive real performance improvement across the organ procurement system. The recent OPO Final Rule creates an opportunity for CMS to make reforms to its QAPI process and ensure they finally achieve their intended impact.

More specifically, with key measures included and proper oversight in place, QAPIs could be used to hold OPOs accountable to higher performance standards, leading to thousands more organs procured and transplanted—and lives saved—every year.2

Current use of QAPIs

Since 2006, CMS has required that all OPOs create, annually review, and maintain a QAPI program. The requirements for QAPI programs were revised under the November 2020 Final Rule, which states that OPOs are expected to “implement a comprehensive data-driven QAPI program to monitor and evaluate their performance.”3

Specifically, CMS currently reviews each QAPI to determine that the OPO:

  1. Has comprehensive policies and procedures in place;
  2. Monitors processes to ensure compliance with policies;
  3. Tracks performance to ensure that improvements are sustained;
  4. Reviews donor, family and/or staff complaints; and
  5. Records minutes of meetings, committees, and formal QAPI activities.4

In practice, the mere existence of a QAPI is where the requirements stop. CMS provides guidance about the standard components of a QAPI,5 but has not historically issued requirements for what specific metric definitions should be used in a QAPI, who should create the plan, or how specifically an OPO might use it to improve performance. Similarly, there is no process for CMS to evaluate the sufficiency of an individual OPO’s QAPI, and, relatedly, no meaningful consequence for an OPO’s failure to adhere to its QAPI. A former CMS staffer we spoke with told us that QAPIs are “just paperwork [OPOs] have to do.”

The purpose of QAPIs

CMS’s goal should be for QAPIs to become public, data-driven improvement plans. This goal is achievable now that CMS has finalized regulations to evaluate OPOs based on objective data.

Specifically as it relates to QAPIs, the November 2020 Final Rule dictates that OPOs must not rely “on any single source of information to conduct self-assessments of their performance and should be employing a variety of information as part of a comprehensive QAPI program.”6 This means OPOs will be expected to include “a range of data and activities for this purpose that will inform and drive performance.”7 It also introduces the opportunity for improved measures based on the CMS “CALC” metric.

OPO performance will be assessed annually based on revised donation and transplantation outcome measures, with OPOs assigned to tiers based on a performance threshold set at the lowest rates among the top 25% of OPOs.8 OPOs will be expected to review these outcome measures as part of a QAPI, and must revise the QAPI if their performance does not meet the performance threshold.

If an OPO cannot improve within the four-year certification cycle, it will be decertified and the Designated Service Area (DSA) will be opened to competition from other OPOs.

QAPIs are not achieving their potential

Despite these important steps towards QAPI improvements, researchers heard from experts that several issues hold back the implementation of many QAPIs, including:

  • Lack of standardized data collection. OPOs can use misleading and self-serving data measures, and can even create benchmarks and definitions that fit the existing data. The variation in data collection exacerbates existing opacity in OPO performance and operations, creating major challenges in accurately assessing OPO performance. For instance, if one OPO’s definition of “authorization” is unique to their organization, any associated QAPIs will not be able to be externally validated or benchmarked for improvement.
  • Lack of transparency. Currently, OPOs are not required to make QAPI plans publicly available, and CMS does not publish detailed data on OPO performance.9 This means that major issues and areas for performance improvement are harder for researchers, media, patient advocates, hospitals, and other stakeholders to identify.
  • Lack of preparedness. QAPIs can be a valuable way for OPOs to identify, prepare for, and respond to rapidly evolving and unexpected events, such as the COVID-19 pandemic. However, we heard that OPOs do not maintain a strong culture of using QAPIs to prepare for changing circumstances and events.
  • Lack of accountability and consequences. Consistently, experts we spoke with described a system where OPOs face almost no consequences for poor performance. Theoretically an OPO can be decertified by CMS, but this has never happened, despite severe performance failures and fatal lapses in patient safety.

To ensure high performance, the goal of a QAPI should be to help an OPO understand its deficiencies and address them. However, as currently structured, an OPO can meet the requirement of simply having a QAPI without any broader understanding of the importance or accuracy of the data being collected, or how to use it to improve performance; this indicates that QAPIs are not set up to achieve the intended purpose of helping OPOs continuously improve.

We were told that OPOs can present their QAPI to a CMS auditor in a variety of ways, because there is no set standard for what a QAPI should look like. This means that one OPO might develop a robust strategic plan, while another may offer CMS a single spreadsheet listing generic goals. One OPO Executive reported that an example OPO QAPI listed six goals, and every single one started with the word “maintain.” Rather than being data-driven and quality-focused, the goals this Executive saw were essentially made of “rainbows and butterflies.” Other interviewees researchers spoke with used words like “soft” and “fluffy” to describe the work of OPO Quality Assurance Departments.

With more than half of OPOs failing to meet tier one performance standards, at least half of all QAPIs are failing to produce sufficient quality control and performance improvement.

Variation and data quality

Without a list of set measure definitions, it is up to each OPO to decide how to collect and analyze the data included in the QAPI. Researchers heard several interviewees talk about how OPOs can “pick and choose” how to include metrics by creating definitions and benchmarks that fit existing data. Variations in data collection and quality create major downstream effects in developing an accurate understanding of why OPOs are successful — or not. Understanding such drivers of OPO high- or poor-performance would help CMS oversee its OPO contractors, as well as empirically inform future criteria for competition for OPO DSAs.

OPO executives themselves may not even realize that differing definitions can hide a wide variety of practices across OPOs. OPOs apply different standards for identifying potential donors, make different decisions about when to approach a family, maintain different relationships with hospitals that drive (or diminish) referrals, and often provide varying oversight and guidance for front-line staff, all of which leads to distortions in data collection and analysis. Without clarity from CMS about how OPOs should define each measure and use this data for QAPIs, these distortions remain opaque and difficult to correct. Furthermore, without knowing the specific processes used to collect data, CMS cannot easily compare OPOs on specific process points, nor can CMS ensure that OPOs are effectively and equitably serving all of their patient communities.

One major data issue is that OPOs have historically used inconsistent and potentially misleading denominators to calculate metrics. The denominator is the population figure an OPO uses to calculate measures like donor approach rate or conversion rate.

For example, an OPO might claim they approached 280 out of 300 potential donors. But if an underperforming OPO is not receiving every potential donor patient referral possible because of its failure to develop a positive relationship with a partner hospital, the real number of potential donor patients in the DSA will not be reflected in the QAPI. It has been well-documented that OPOs may also choose not to include certain categories of potential donor patients, which often include older patients,10 rural patients,11 patients of color,12 HIV+ patients,13 or medically-complex patients, despite the legal mandate that OPOs must evaluate every patient referred to their care. Poor data collection from the start creates inaccurate calculations and misleading interpretation for every subsequent metric, masking poor performance and inequitable care delivery.

One former CMS staffer confirmed that this is a rampant practice among OPOs. OPOs “mis-report the denominator because they claim there are less potential donor [patients]. If you don’t go out and identify potential donor [patients], your success rate is higher because you only approached people who would consent and be ‘good donors’…And so they don’t count in the denominator because they weren’t considered potential donor [patients] — if [the OPO] didn’t see it, it didn’t happen.” While CMS has corrected for misleading denominators in performance measures via the 2020 OPO Final Rule, it must now also make concomitant adjustments to its QAPI process to ensure that OPOs are held accountable throughout the entirety of their contracts, rather than just every four years.14
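The denominator effect described above can be made concrete with a short sketch. The figures are hypothetical, mirroring the 280-of-300 example earlier in this section:

```python
# Hypothetical figures for illustration only; not actual OPO data.
def approach_rate(approached: int, potential_donors: int) -> float:
    """Share of potential donor patients whose families were approached."""
    return approached / potential_donors

# An OPO reports 280 approaches against a self-defined pool of 300
# "potential donors," yielding a rate above 90%.
self_reported = approach_rate(280, 300)

# If an objective, population-based count showed 560 potential donor
# patients in the DSA (e.g., because referrals were undercounted), the
# same 280 approaches yield only a 50% approach rate.
objective = approach_rate(280, 560)

print(f"Self-reported denominator: {self_reported:.0%}")
print(f"Objective denominator:     {objective:.0%}")
```

The same numerator looks like high performance or poor performance depending entirely on the denominator, which is why standardized, externally validated population counts matter.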

Additionally, if an OPO only counts the incidence of an event, they may not understand the quality of that event. For instance, if an OPO only counts whether a potential donor’s family authorized donation, they may not understand the quality of that conversation or collect adequate feedback from the family or hospital. Without the data to properly understand where they may be going wrong, OPO executives are unlikely to meaningfully improve their family approaches. This may help explain why, after controlling for increases in donors due to factors outside of OPO influence, such as population growth and increased donors resulting from the opioid and gun epidemics, OPOs have actually gotten less effective over the past 10 years.15

One reason that QAPI measures have been difficult to compare across OPOs is that even measures that were internally reliable have not been valid across OPOs. Without a valid measure for performance comparison, some OPO QAPIs have been unable to connect care quality with performance outcomes. In the past, a QAPI measure such as percent onsite response within 60 minutes of hospital referral would have been anchored only to year-over-year data from the individual OPO, without the ability to associate a threshold of compliance with the sample measure with the OPO’s relative performance ranking. In other words, OPOs have historically used internally consistent but externally invalid measures in their QAPIs.

Consequences of inaccurate data

Researchers routinely heard from interviewees that OPOs don’t trust each other’s data, making it difficult to identify accurate national or regional benchmarks for their own performance and severely limiting the potential for sharing best practices (the lack of valid measures for performance comparison is likely a factor here). One OPO researcher we spoke with identified this as a major issue because it impacts CMS’s ability to understand shared practices among OPOs. Currently, CMS cannot use QAPI data to identify shared practices between all OPOs in the same tier, let alone compare rates at various points in the procurement process before transplant.

Not being able to identify commonalities makes it easier for OPOs to claim issues stem from a unique hospital or a unique population. Without a data-driven understanding of best practices, an OPO may be truly unaware that their practices fall outside of industry norms. This is compounded by the fact that each hospital only works with one OPO, as the OPO holds a monopoly in the DSA. For example, hospital staff may not be aware that the 50-hour case times of their OPO are not normal, or anywhere close to the best practice. OPO performance patterns have been examined across hospitals and hospital systems, but at this time, such data is not commonly used by OPOs nor the OPTN contractor to inform hospitals, regulators, or the public about identifiable patterns of poor OPO practice.16

A lack of accountability allows OPOs to fail

While OPOs are required to maintain a QAPI and provide evidence to CMS that they continually review this plan, such as sharing meeting minutes from a QAPI review, in practice it is treated as a paperwork requirement and a box for CMS auditors to check off. Several interviewees told researchers that CMS auditors do not have the training to fully understand what OPOs do, because they are the same staff who audit sites with different procedures and standards, like blood banks. One OPO consultant noted that much of their audit time was spent explaining to the CMS auditor what OPOs actually do.

This creates a fundamental barrier to CMS providing accurate oversight.

Without in-depth training on OPO operations, CMS staff cannot be expected to accurately audit an OPO’s performance or understand the components of a successful QAPI. When asked why CMS does not invest more resources in OPO performance and auditing, one former CMS staff member told researchers that organ procurement “is small potatoes” for CMS, compared to the money CMS spends on organizations like nursing homes and acute care facilities. However, this ignores not only the lives lost due to OPOs’ failures, but also the tremendous downstream financial costs, which contribute to $36 billion/year in Medicare expenditures on patients with end stage renal disease.17

The urgency of reforms to strengthen CMS’s oversight of OPOs is further compounded by the severe failures of other oversight bodies in the organ donation industry. For example, UNOS (which currently holds the contract for the Organ Procurement and Transplantation Network, OPTN) has consistently shown itself to be reluctant to act, even when it is aware of potentially life-threatening issues.18 Most alarmingly, in emails obtained by the Senate Finance Committee, UNOS’s then-CEO even joked that UNOS is “an overgrown homeowners association”;19 UNOS’s Membership and Professional Standards Committee (MPSC) recently came under scrutiny for failing to remediate fatal patient safety lapses at OPOs.20

The case of LiveOnNY

The historical failures of CMS’s performance improvement plans have fatal consequences. For example, LiveOnNY, the OPO for New York City, was threatened with decertification twice in four years for severe performance failures, yet still continues to operate.21

Rather than decertify the OPO, however, CMS placed it on a “Performance Improvement Plan,” while allowing it to continue to operate without any functional consequences, including that the OPO’s CEO—despite a sustained record of management failure—was allowed to remain in place. This plan was never made public, nor was any empirically-supported rationale for which elements were and were not included within it.

However, recent investigative reporting in The Markup “obtained [an] audit [of LiveOn, which was part of the performance improvement plan], dated March 2019, which found deep, systemic problems ranging from poor training and undefined performance standards to a sweeping lack of urgency and missed donation opportunities.”22

The reporting also cited specific quotes from the audit, including that “One of the most concerning trends that emerged during our assessment was the conscious decision to allow LiveOnNY staff to leave cases where patients appeared brain dead and the family was interested in organ donation” and that “There is a history of hospitals expressing concern over service delivery from LiveOnNY”.

As noted in the Washington Post, CMS had placed the OPO on at least three “corrective action plans” since 2012. Despite such plans, over that period the OPO “has consistently registered one of the poorest performances in the nation,” and “ranked as the country’s second-worst OPO [in 2017].”23 In a July 2020 letter to Secretary Azar, Representatives Katie Porter and Karen Bass criticized CMS’s reliance on performance improvement plans, writing that “patients do not have years to wait,” and “there is no reason to have confidence that performance improvement plans actually lead to OPO improvement or better results for patients.”24

These concerns proved to be well-founded. According to the most recent CMS data available, as of 2020, LiveOn ranked 54th of 57 OPOs in the country,25 and had one of the lowest rates of recovery of Black donors in the U.S.26 Clearly, CMS’s performance improvement plans were ineffective, and must be reformed to be anchored in objective data and transparency, and to carry consequences if its objectives are not met. This would be in line with the way many states use healthcare performance improvement plans and make them publicly available.27 OPOs are uniquely allowed to operate without an acceptable level of transparency and accountability.

The future of QAPIs: Recommendations for action

Under the November 2020 Final Rule, QAPIs must be revised when OPO performance falls below the defined threshold. If an OPO cannot meet performance standards for its next certification cycle, it risks losing the OPO’s existing DSA to a higher performing OPO. This makes data-driven QAPIs a vital part of OPO operations moving forward.

With key changes and updated metrics, QAPIs could be used to accurately evaluate OPO performance and ensure that OPOs are held to standards that are truly serving patients. For example, in collaboration with external researchers, data-driven interventions at the Indiana Donor Network OPO created a 44% increase in organ donors in just one year, driven by a 57% increase in the number of potential donor families approached.28

These significant improvements were possible at an OPO that was ranked 51st out of 58 OPOs,29 suggesting that accurate data analysis and qualitative performance improvement practices do have the ability to bring improvement to even some of the lowest performing OPOs. Earlier research has highlighted that this improvement was driven by heightened public scrutiny and oversight, underscoring the importance of transparency in OPO performance data coupled with systemic pressures to perform.30

Moving forward, replicating turnarounds of failing OPOs through enforceable, data-driven, public improvement plans is precisely what CMS’s goal should be for QAPIs.

Require accurate, standardized data collection

A meaningful QAPI needs to be built on sound data. Without accurate, standardized data, QAPIs are fundamentally not serving the OPO, let alone hospitals and patients.

The first step in improving QAPIs is that CMS must provide set definitions for each process point and require standardized data measures based on population counts.

Every single expert researchers spoke with made this recommendation. With a set definition and an aggregated national number that can be broken down by OPO, CMS staff and researchers would be able to clearly identify when attrition happens and would be able to develop accurate comparisons between OPOs.

For a list of specific measures OPOs should be required to use, see Appendix A.

QAPIs should be linked to CALC

The new lever for more effective, more informative, and more accurate QAPIs is the creation of a reliable, valid external measure of OPO performance: the CMS metric, referred to as cause, age, and location-consistent (CALC) deaths. Using CALC, OPOs can track the effectiveness of changes to clinical practice with changes in objective performance ranking. This enables process and outcome QAPIs to be both reliable and valid, creating opportunities for QAPIs to be used more effectively and expansively than they have in the past.

Additionally, CALC offers regulators and OPOs the ability to construct QAPI measures as rates instead of counts, measuring conversion and aspects of clinical care against rates of donation-consistent deaths. Previously, OPO QAPIs were anchored to increases in raw numbers, and did not appropriately account for changes in the number, composition, and location of patient deaths within a service area.

Previously published research has established that the number of donors recovered per 100 CALC deaths provides a reliable measure of OPO performance. Critically, CALC-based performance measures are of equal discriminatory power31 to those with much higher data reporting burdens, and provide objective and reproducible data in a variety of population compositions.32 Performance rates, measured as recovered donors and transplanted organs per 100 CALC deaths, also provide valuable context to other domains of transplantation, such as differences in center-level organ import and utilization practices.33
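The rate construction described above can be sketched in a few lines. All figures below are hypothetical, for illustration only:

```python
# Illustrative sketch of a CALC-based rate; numbers are hypothetical.
def donors_per_100_calc(donors_recovered: int, calc_deaths: int) -> float:
    """Recovered donors per 100 cause, age, and location-consistent deaths."""
    return 100 * donors_recovered / calc_deaths

# A raw count of donors can mask changes in the underlying pool of
# donation-consistent deaths. Two years with the same donor count:
year_1 = donors_per_100_calc(donors_recovered=120, calc_deaths=2000)  # 6.0
year_2 = donors_per_100_calc(donors_recovered=120, calc_deaths=2500)  # 4.8

# The count is flat, but the rate shows performance declining as the
# service area's CALC deaths grew.
print(year_1, year_2)
```

This is why a rate anchored to an objective external denominator can detect a performance decline that a count-based QAPI goal (e.g., “maintain 120 donors”) would miss entirely.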

Several considerations support the use of a combination of internal, OPO-reported process data measures and objective assessments of endpoint (recovered donors, transplanted organs) performance using the CALC denominator. First, peer-reviewed research has studied data already reported by OPOs to the OPTN contractor.34 In this research, OPO-reported measures of activity at individual hospitals were found to correlate with CALC-based assessments of performance. Secondly, the CALC metric has shown sensitivity to changes in OPO practice. In a separate peer-reviewed study,35 steady state and improved performance of a single OPO, as measured using the CALC denominator, correlated strongly with signals from OPO data on approaches and authorizations within specific patient subgroups.

In the rulemaking period preceding the release of the 2021 CMS Final Rule, a critique of the CALC data was that the source, Multiple Cause of Death (MCOD) Data from the National Center for Health Statistics, reports information on a 12 to 16 month delay. Critics asserted that this made MCOD data ‘outdated’ at the time of their release. In practice, this has been shown to be inconsequential, as trends in area-level deaths are generally gradual in nature and predictable from other public health and demographic trends. Published evidence has shown excellent fidelity with short-term, seasonally-adjusted predictions of CALC deaths at the level of OPOs’ base service populations (DSAs).36 Short-term area-level predictions should provide practicable base denominators on which to assess changes in overall OPO performance (donors recovered and organs transplanted), and have sufficient reliability for QI purposes in the absence of unanticipated changes to patterns of mortality. These forecasts can provide near-real-time backstops to OPO internal process measures, correlating performance over periods with known denominators (actual MCOD data) against periods with forecast denominators. Together, the combination of sources can produce a high-level assessment of performance trajectory and a more detailed depiction of areas of strength and weakness in practice.
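As a toy illustration of the forecasting idea above, a seasonal-naive projection scaled by year-over-year growth can extend lagged MCOD counts forward. This is a minimal sketch under stated assumptions (hypothetical quarterly figures, a deliberately simple method), not the published forecasting model cited in the text:

```python
# Toy sketch: project next year's quarterly CALC deaths for one DSA from
# two years of lagged MCOD data. NOT the published method; all figures
# are hypothetical.

# Quarterly CALC deaths (hypothetical): year 1 Q1-Q4, then year 2 Q1-Q4.
history = [500, 520, 480, 510,
           515, 535, 495, 525]

def forecast_next_year(quarterly: list) -> list:
    """Seasonal-naive forecast: repeat last year's quarterly pattern,
    scaled by the year-over-year growth in the annual total."""
    year1, year2 = quarterly[:4], quarterly[4:]
    growth = sum(year2) / sum(year1)  # gradual trend assumption
    return [q * growth for q in year2]

forecast = forecast_next_year(history)
print([round(q, 1) for q in forecast])
```

Because area-level mortality trends are gradual, even a crude projection like this stays close to eventual actuals, which is the intuition behind using forecast denominators as a stopgap until the lagged MCOD data arrive.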

QAPIs should be publicly available

In a system historically devoid of meaningful oversight, the lack of access to QAPIs and their data is a glaring issue.37 QAPIs should be made publicly available. A community has the right to know how their healthcare providers are failing them—particularly when those healthcare providers are funded by taxpayer dollars.

Numerous tools do exist for publicly evaluating and reviewing other types of care providers, such as the Consumer Assessment of Healthcare Providers and Systems (CAHPS) survey that collects patient feedback. This survey is used to create detailed, publicly available ratings of healthcare providers on Care Compare, a CMS public dashboard.38 OPOs must be held to the same standard of transparency as any other healthcare provider.

Ensure the right staff develop, have access to, and use the QAPI

An organizational culture of safety and performance improvement must start at the top. OPO leadership must be engaged in creating and using the QAPI, and can cultivate a culture of safety and performance improvement through key measures such as:

  • Maintaining a “transparent, non-punitive approach to reporting and learning from adverse events,” close calls, and near misses
  • Seeing problems as organizational and operational, not as issues with individual employees
  • Establishing a baseline performance measure on safety culture39, 40

Leadership must be held to performance improvement by an informed and engaged Board. This is a standard practice at hospitals, where Board reviews of leadership are based explicitly on safety and quality measures.41 Tools such as the World Management Survey Instrument have been used to assess hospital performance and safety practices, creating a baseline from which to improve.42

Additionally, every department at an OPO should be involved in creating the QAPI. An OPO will not be able to make sustainable improvements without involving staff throughout the performance improvement process.

One OPO Executive we spoke with meets with each of their department heads to understand that department’s data and the resources they need to meet their goals. Each department then creates its own QAPI, resulting in a 30-page executive summary shared with the OPO leadership and Board. This QAPI is reviewed in monthly meetings, with additional meetings to discuss strategic planning. Given the dire necessity of their work, this is the level to which OPOs should be held when creating and implementing a QAPI.

Not only should every department be engaged in creating the QAPI, but every staff member should have access to the plan (which should be public). In order for staff across an organization to have equitable access to data, the QAPI must be developed and written in a way that is as accessible to entry-level employees as it is to Board Members.

In addition to employees, OPOs must be required to share the QAPI with transplant centers and hospital partners. Hospitals should see the same data that an OPO sees. OPOs should not be allowed to manipulate process data to keep key performance indicators opaque to hospital partners.

QAPIs should incorporate patient feedback

With access to transparent, accessible data about OPO performance, patients and their families will be able to make more informed decisions about working with an OPO. In general, a community must be able to see how their OPO operates in comparison to others, ultimately highlighting any areas of care disparity. Research has consistently shown that OPOs fail to provide equitable services for vulnerable populations, such as those who are rural, older, HIV+, or identified as an historically underserved racial or ethnic group;43 transparent QAPI data could allow researchers to better pinpoint these issues.44

Additionally, if QAPIs incorporated patient and family feedback and were more transparent, patients may have increased assurance that their complaints and experiences are being heard. Currently, there is a strong selection bias within OPO survey practices, as the patients and families who remain involved with an OPO often do so because they had a positive experience. OPOs are not adequately reaching and getting feedback from those families who did not consent for donation or who had a poor experience. Yet these can be some of the most vital experiences for identifying areas for performance improvements.

Currently, patients and families do have the option to register a complaint with UNOS, though most families are unlikely to know this. Furthermore, UNOS’s process for evaluating such complaints has a deeply questionable track record, with the UNOS CEO himself characterizing the review process for complaints related to patient care and safety as “like putting your kid’s artwork up at home. You value it because of how it was created rather than whether it’s well done. Only in this case, we persuade ourselves that it is well done anyway.”45

Overall, the process lacks the level of transparency that could help a patient or family trust that their complaint is being addressed, and it is paramount that CMS take meaningful actions to rectify this. Incorporating patient and family feedback into a QAPI and making that QAPI public would go far to build (or rebuild) trust with the very individuals OPOs are meant to serve.

CMS should utilize QAPIs when evaluating competition

QAPIs carry the potential to not only improve OPO performance and restore patient trust in the system, but can also be used to strengthen other CMS regulatory efforts related to OPOs, such as informing CMS’s criteria for competition for OPO DSAs.

Specifically, under the 2020 Final Rule, CMS will be evaluating competition for any DSA served by an underperforming OPO. CMS has estimated that between two and five OPOs will apply for each DSA opened for competition.46 When evaluating an OPO application, CMS should incorporate QAPIs as a marker of competitiveness. An OPO should be required to submit its QAPI documents when applying for an open DSA.

As an OPO Executive told researchers, any OPO with a detailed strategic plan should be able to clearly state its plans, staffing needs, expected donation and transplant measures, and the estimated cost of this work. With a robust QAPI program, these details should be already available.

CMS has improved system-wide performance through performance measures and QAPI development models at the sector level before; skilled nursing facility patients and nursing home patients have both benefited from CMS strengthening performance measures and defining priority areas for QAPIs, with widespread adoption by organizations.47 In fact, recently published peer-reviewed research from a nationally representative survey of more than 1,000 skilled nursing facilities indicates that an average of 13 QI initiatives per organization led to widespread, measurable improvements under CMS standards, even for those facilities primarily serving rural and/or minority patients.48

Conclusion

With the right regulations, review mechanisms, and standardized data in place, QAPIs could be a valuable tool for effectively assessing OPOs and ensuring performance improvement. When OPOs are held to rigorous standards, the organ procurement system as a whole could see major improvements.

There is nothing unique about OPOs that means they should be allowed to operate outside the standards of other healthcare providers. For decades, these organizations have been underperforming with little to no consequence, while causing thousands of preventable deaths each year, and disproportionately harming patients of color.49

As a 2022 bipartisan Senate Finance report concluded: “From the top down, the U.S. transplant network is not working, putting Americans’ lives at risk.”49 Patients and families have a right to know and understand what will happen to themselves and their loved ones after death, and those waiting for a transplant deserve transparency and accountability that they do not currently have. OPOs must be held accountable to these patients and families.

Appendix A: Recommended QAPI Data Points

| Data Point | Source | Information Provided | Availability |
| --- | --- | --- | --- |
| Organ utilization: organs recovered for transplant and transplanted, by donation service area (DSA) | Scientific Registry of Transplant Recipients (SRTR) | Data on the number of organs, by type, that are authorized and successfully transplanted | Public |
| Number of recovered organ donors by race/ethnicity (%), by DSA | SRTR | Data on the demographics of organ donor patients; provides insight for researchers tracking procurement and transplantation equity | Public; suppressed if <10 patients annually within DSA |
| Number of vented patients referred to OPO care, stratified by demographic | OPO; currently collected by OPTN contractor on the DNR | Data on the demographics of potential organ donor patients; provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; counts the number of patient interactions for regulators | Public; suppressed if <10 patients annually within DSA |
| Number of patients meeting referral criteria without appropriate hospital referral (adverse event) | OPO; currently collected by OPTN contractor on the DNR | Data on the demographics of potential organ donor patients; provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; counts adverse events for regulators | Not public at DSA level (small number of patients); should be public when aggregated at the hospital system/transplant system level |
| Onsite patient evaluation (sometimes called “referral response”) rate | OPO; currently collected at the organizational level but not reported to OPTN | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data for regulators | Public |
| Onsite response to referred patient within 60 minutes | OPO; currently collected at the organizational level but not reported to OPTN | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data for regulators; allows hospitals to validate OPO quality of care for referred patients | Public |
| Rate of referred patients per 100 CALC deaths (using seasonally adjusted forecasts of CALC deaths) | OPO; CMS denominator | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of OPO referral criteria for regulators | Public |
| Rate of referred patients ruled medically suitable | OPO; currently collected by OPTN contractor on the DNR | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of OPO referral criteria for regulators | Public |
| Rate of ruled medically suitable patients per 100 CALC deaths (using seasonally adjusted forecasts of CALC deaths) | OPO; CMS denominator | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of OPO referral criteria for regulators | Public |
| Rate of referred patients with approach (stratified by age, race/ethnicity, and OPTN cause of death) | OPO; only the authorization outcome (yes/no) is currently collected on the DNR, but all OPOs collect these data at the organizational level | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of selection for approach for regulators; allows hospitals to validate OPO quality of care for referred patients | Public; suppressed if <10 patients annually within DSA |
| Rate of family approach per 100 vented referrals | OPO (only the authorization outcome is currently collected on the DNR, but all OPOs collect these data at the organizational level); CMS denominator | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of selection for approach for regulators | Public |
| Rate of family authorization per 100 approaches (stratified by age, race/ethnicity, and OPTN cause of death) | OPO; only the authorization outcome (yes/no) is currently collected on the DNR, but all OPOs collect these data at the organizational level | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of approach for regulators; allows hospitals to validate OPO quality of care for referred patients | Public |
| Rate of “huddles” with hospital provider for all medically suitable patients | OPO; all OPOs collect these data at the organizational level | Provides insight for researchers tracking procurement and transplantation equity, quality of care, access to care, and coordination of care; provides quality-of-care data for regulators; allows hospitals to validate OPO quality of care for referred patients | Public |
| Number of organs recovered, by type | OPTN (DDR), SRTR | Data on the number of organs, by type, that are authorized and successfully transplanted | Public |
| Rate of recovered organs transplanted, by type | OPTN (DDR), SRTR | Data on the number of organs, by type, that are authorized and successfully transplanted | Public |
| Rate of hospital compliance with appropriate referral of patients (i.e., number of patients referred appropriately / number of patients eligible to be referred, per OPO death record review) | OPTN, SRTR, OPO (available on the DNR, which currently characterizes timeliness but does not describe compliance with clinical triggers) | Provides insight for researchers tracking procurement and transplantation equity, quality of care, access to care, and coordination of care; provides quality-of-care data, the effectiveness of OPO referral criteria, and associated data for regulators; allows hospitals to validate OPO quality of care for referred patients | Public |
| Rate of family decline by category (stratified by age, race/ethnicity, and OPTN cause of death) | OPO; only the authorization outcome (yes/no) is currently collected on the DNR, but all OPOs collect these data at the organizational level | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of approach for regulators; allows hospitals to validate OPO quality of care for referred patients | Public; suppressed if <10 patients annually within DSA |
| Rate of potential donor patients authorized but not recovered | OPO; DDR and DNR | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of donor management for regulators | Public |
| Rate of non-allocated organs by category of refusal | OPO, OPTN | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of allocation and donor management for regulators | Public |
| Count of refusal codes associated with procurement adverse events | OPO, OPTN | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of allocation and donor management for regulators | Upon request |
| Rate of discarded organs among all recovered organs, by type | OPO, OPTN, DDR | Provides insight for researchers tracking procurement and transplantation equity, quality of care, and access to care; provides quality-of-care data and measures the effectiveness of allocation and donor management for regulators | Public |
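Several of the rate measures above share two mechanics: a per-100-CALC-deaths denominator and suppression of small annual counts (<10 patients within a DSA). A minimal Python sketch of those two mechanics follows; this is an illustration, not an official CMS formula, and the function names and example figures are hypothetical:

```python
# Illustrative sketch only -- not an official CMS calculation.
# The "<10 patients annually within DSA" threshold is the suppression
# rule stated in the table above; all other specifics are assumptions.

def rate_per_100_calc_deaths(referred_patients: int, calc_deaths: float) -> float:
    """Referred patients per 100 CALC deaths (seasonally adjusted forecast)."""
    if calc_deaths <= 0:
        raise ValueError("CALC-death denominator must be positive")
    return 100.0 * referred_patients / calc_deaths

def suppress_small_count(annual_dsa_count: int, threshold: int = 10):
    """Return None (suppressed) for any annual DSA count below the public-reporting threshold."""
    return annual_dsa_count if annual_dsa_count >= threshold else None

# Hypothetical DSA: 450 referrals against a forecast of 600 CALC deaths.
rate = rate_per_100_calc_deaths(450, 600)  # 75.0 referrals per 100 CALC deaths
```

Suppression returns `None` rather than zero so that a withheld small count cannot be mistaken for a measured absence of patients.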
Appendix B: Recommended QAPI Requirements

| Item | Requirements | Department |
| --- | --- | --- |
| **Organ Donation and Transplant Metrics** | | |
| Process Data50 | For a complete list of recommended QAPI data points, see Appendix A. Data must be collected across every OPO department. | Quality Assurance Department compiles, analyzes, and ensures accuracy |
| **Quality Assurance Department Reports** | | |
| Safety Incident Review | In concert with Hospital and Family Services, the Quality Assurance Department should prepare a report including: a review of any safety incidents reported by hospital, transplant center, or family partners;51 a summary of responses made to any safety incident; and updates to any previously reported safety incidents, including any new or updated protocols. | Quality Assurance; Hospital and Family Services |
| Preparedness Statement | Working with every OPO department, the Quality Assurance Department should prepare a report including: the current state of OPO preparedness in response to or during unexpected events, such as the COVID-19 pandemic;52 and updates to any protocols made to increase preparedness. | Quality Assurance; Department Leadership |
| Performance Improvement Projects | The Quality Assurance Department should report on: results of ongoing performance improvement projects; and upcoming performance improvement projects, incorporating and highlighting opportunities for staff review and comment,53 the process for developing and evaluating improvement projects and the criteria for determining priority,54 and the schedule of monthly QAPI review meetings.55 | Quality Assurance |
| **Community** | | |
| Registration and Outreach | Provide updates on: current rates of community donor registration; details of upcoming outreach events; the success of previous outreach events (e.g., number of new donors registered); and details of ongoing outreach efforts and campaigns. | Family Services; Hospital Services; Communications/Press Office |
| Patient and Family Feedback | The OPO should provide updates and data gathered from ongoing surveys of donor families,56 covering key measures such as: the responsiveness, respect, and courtesy shown by staff;57 the speed and frequency with which families were able to get help and have questions answered; how thoroughly staff explained the donation process;58 and how families felt about donation after the process was complete.59 | Family Services |
| Hospital and Transplant Center Feedback | The OPO should provide updates and data gathered from ongoing surveys of hospital and transplant center staff, covering key measures such as: the responsiveness, respect, and courtesy shown by staff; response times after a potential referral; and effectiveness ratings of specific steps within the donation process (e.g., donor evaluation, communication during referral). | Hospital Services |
| **Organization** | | |
| Organizational Chart | The Human Resources Department should provide: an organizational chart; and a summary of responsibilities for each department and/or key staff position. | Human Resources |
| Staff Training | The Human Resources Department should provide: upcoming and/or ongoing staff training opportunities and requirements; the current state of staff training (i.e., number of staff in orientation, number of staff who have not met or renewed current training requirements); and a review of updated policies or protocols. | Human Resources |
| Staff Feedback | The Human Resources Department should provide aggregated data from orientation and exit surveys. | Human Resources |
| **Executive Summary** | | |
| Strategic Planning Statement | The Strategic Planning Statement should reflect the OPO’s current standing according to CMS; future and current projects focused on improving organ donation and transplant; and future or current expansion plans (e.g., competing for a neighboring DSA). | Executive |
| Financials | The financial statement should reflect the current financial position of the OPO and any details required on publicly available tax documents, broken down by department. Financial statements should include a disclosure of any potential conflicts of interest among Board members and executives. | Executive |
| **Definitions and Resources** | The OPO should provide definitions and relevant details for all metrics included in the QAPI, and should provide staff and the public with commonly useful resources, such as the Scientific Registry of Transplant Recipients and links to CMS and OPTN guidelines and bylaws. | Shared; Human Resources |
| **Published Format** | The OPO should make QAPI documents available to every staff member, providing the most updated data available on a monthly basis.60 The OPO should review the QAPI at monthly staff meetings and quarterly Board meetings, and provide meeting minutes covering the QAPI review for CMS auditors. The OPO must make QAPIs publicly available when they are submitted to CMS.61 | Communications |

Notes

  1. “§ 486.348 Condition: Quality assessment and performance improvement (QAPI).” Centers for Medicare and Medicaid Services, 2020. 

  2. “A System in Need of Repair: Addressing Organizational Failures of the U.S.’s Organ Procurement and Transplantation Network.” United States Senate, 2020. 

  3. “Medicare and Medicaid Programs; Organ Procurement Organizations Conditions for Coverage: Revisions to the Outcome Measure Requirements for Organ Procurement Organizations.” Department of Health and Human Services, Federal Register, December 2020. 

  4. “New Organ Procurement Organization (OPO) Survey Protocol and Guidance Revisions in Appendix Y of the State Operations Manual (SOM).” Department of Health and Human Services, Center for Clinical Standards and Quality/Quality, Safety & Oversight Group, 2018. 

  5. “§ 486.348 Condition: Quality assessment and performance improvement (QAPI).” Centers for Medicare and Medicaid Services, 2020. 

  6. “Medicare and Medicaid Programs; Organ Procurement Organizations Conditions for Coverage: Revisions to the Outcome Measure Requirements for Organ Procurement Organizations.” Department of Health and Human Services, Federal Register, December 2020. 

  7. “Medicare and Medicaid Programs; Organ Procurement Organizations Conditions for Coverage: Revisions to the Outcome Measure Requirements for Organ Procurement Organizations.” Department of Health and Human Services, Federal Register, December 2020. 

  8. “Organ Procurement Organization (OPO) Conditions for Coverage Final Rule: Revisions to Outcome Measures for OPOs CMS-3380-F.” Centers for Medicare and Medicaid Services, November 2020. 

  9. Under the Final Rule, CMS will publish a tiered ranking of OPO performance, with tiers established by the lowest performance of the top 25% of OPOs. 

  10. “Results of a data-driven performance improvement initiative in organ donation.” American Journal of Transplantation, 2020. 

  11. “Procurement characteristics of high- and low-performing OPOs as seen in OPTN/SRTR data,” American Journal of Transplantation, 2021. 

  12. “Rejecting bias: The case against race adjustment for OPO performance in communities of color,” American Journal of Transplantation, 2020. 

  13. “Potential donor characteristics and decisions made by organ procurement organization staff: Results of a discrete choice experiment,” Transplant Infectious Disease, 2021; “Barriers experienced by organ procurement organizations in implementing the HOPE act and HIV-positive organ donation,” AIDS Care, 2022. 

  14. Per the 2020 OPO Final Rule, OPOs are evaluated on 4-year contract cycles, and only performance failures in the 4th year of any contracting cycle can merit decertification. 

  15. “Not all organ donation increases equal OPO improvement,” The Costly Effects of an Outdated Organ Donation System, 2020. 

  16. “Procurement characteristics of high- and low-performing OPOs as seen in OPTN/SRTR data,” American Journal of Transplantation, 2021. 

  17. Graphic: Organ Procurement Money Flow, The Costly Effects of an Outdated Organ Donation System, 2020. 

  18. “Transplant monitor lax in oversight.” LA Times, 2006. 

  19. “Hearing Memo: A System in Need of Repair: Addressing Organizational Failures of the U.S.’s Organ Procurement and Transplantation Network,” United States Senate, 2020. 

  20. “70 deaths, many wasted organs are blamed on transplant system errors.” Washington Post, 2022. 

  21. Letter from Senator Todd Young to the Honorable Seema Verma, Administrator for Centers for Medicare & Medicaid Services, 2019. 

  22. “New York’s organ collection agency, nation’s second-largest, threatened with closure.” Washington Post, 2018. 

  23. “Letter to the Honorable Alex Azar, Secretary, and the Honorable Seema Verma, Administrator.” Representatives Katie Porter and Karen Bass, 2020. 

  24. Data via CMS, visualized at OPOdata.org. See “LiveOn NY (NYRT),” OPOData, 2020. 

  25. Data from 2019 (most recent publicly available). “Organ donation recovery rates worse for people of color, data show.” Axios, 2021. Note: LiveOn experienced a change in leadership in 2021; data for 2021 will be released in spring 2022. 

  26. “Performance Improvement Plan - Mass General Brigham.” Massachusetts Health Policy Commission, 2022. 

  27. “Results of a data-driven performance improvement initiative in organ donation.” American Journal of Transplantation, 2020. 

  28. Note: Two OPOs merged on January 1, 2021, so there are now 57 OPOs. However, at the time of the Indiana Donor Network intervention, there were 58 OPOs. 

  29. “Oversight Gaps and Conflicts,” The Costly Effects of an Outdated Organ Donation System, 2020; “Public discourse and policy change: Absence of harm from increased oversight and transparency in OPO performance,” American Journal of Transplantation, 2021. 

  30. “Addressing Critiques of the Proposed CMS Metric of Organ Procurement Organ Performance: More Data Isn’t Better,” Transplantation, 2020. 

  31. “Rejecting bias: The case against race adjustment for OPO performance in communities of color,” American Journal of Transplantation, 2020. 

  32. “Examining utilization of kidneys as a function of procurement performance,” American Journal of Transplantation, 2022. 

  33. “Procurement characteristics of high- and low-performing OPOs as seen in OPTN/SRTR data,” American Journal of Transplantation, 2021. 

  34. “Results of a data-driven performance improvement initiative in organ donation.” American Journal of Transplantation, 2020. 

  35. “Public discourse and policy change: Absence of harm from increased oversight and transparency in OPO performance,” American Journal of Transplantation, 2021. 

  36. See: “Current Reliance on Media for Accountability,” in “Oversight Gaps and Conflicts.” The Costly Effects of an Outdated Organ Donation System, 2020. 

  37. For additional examples, see: HEDIS Measures (National Committee for Quality Assurance); Leapfrog Ratings (The Leapfrog Group). 

  38. The Joint Commission has recommended tools such as: Surveys on Patient Safety Culture (Agency for Healthcare Research and Quality) and Safety Attitudes and Safety Climate Questionnaire (Center for Healthcare Quality and Safety) 

  39. For additional information on developing a culture of safety, see: “The essential role of leadership in developing a safety culture.” The Joint Commission, 2017. 

  40. “Leadership Role in Improving Safety.” Agency for Healthcare Research and Quality, 2019. 

  41. “Hospital Board And Management Practices Are Strongly Related To Hospital Performance On Clinical Quality Metrics.” Health Affairs, 2015. 

  42. “Results of a data-driven performance improvement initiative in organ donation,” American Journal of Transplantation, 2020; “Procurement characteristics of high- and low-performing OPOs as seen in OPTN/SRTR data,” American Journal of Transplantation, 2021; “Rejecting bias: The case against race adjustment for OPO performance in communities of color,” American Journal of Transplantation, 2020; “Potential donor characteristics and decisions made by organ procurement organization staff: Results of a discrete choice experiment,” Transplant Infectious Disease, 2021; “Barriers experienced by organ procurement organizations in implementing the HOPE act and HIV-positive organ donation,” AIDS Care, 2022. 

  43. “Inequity in Organ Donation.” The Costly Effects of an Outdated Organ Donation System, 2020. 

  44. “Hearing Memo: A System in Need of Repair: Addressing Organizational Failures of the U.S.’s Organ Procurement and Transplantation Network.” United States Senate, 2020. 

  45. “The Lowdown on Organ Procurement Organizations.” National Law Review, 2021. 

  46. CMS.gov: QAPI Description and Background; “Taking a Look at Systematic Quality Improvement Using the Five QAPI Elements,” CMS Compliance Group, 2012. 

  47. “Nursing Home Responses to Performance-based Accountability: Results of a National Survey,” Journal of the American Geriatrics Society, 2020. 

  48. For data on preventable deaths, see: OPOdata.org. The Costly Effects of an Outdated Organ Donation System, 2022. 

  49. UNOS Hearing Confidential Memo (PDF for release): “A System in Need of Repair: Addressing Organizational Failures of the U.S.’s Organ Procurement and Transplantation Network,” 2020. 

  50. For further details on some of these recommended metrics, see: “Ad Hoc Systems Performance Committee Report.” OPTN, 2019. 

  51. Cases should be summarized anonymously. If OPO policies preclude sharing this information publicly, then OPO should provide aggregated data as to the incidence of specific types of safety reports. OPO must still provide information as to the responses taken and any newly updated protocols. 

  52. Preparedness measures could include: steps taken to physically protect facilities during extreme weather events; updated pandemic-specific measures; community investments made to support families and transplant recipients during times of economic hardship (e.g., increased funding for transplant family accommodations during an economic recession); and responses to updated clinical guidelines or new regulations from CMS or OPTN. 

  53. Quality Assurance Department must actively solicit staff input, especially when determining the priority level of a given performance improvement project. 

  54. For a QAPI process model, see: “Meeting the Center for Medicare & Medicaid Services Requirements for Quality Assessment and Performance Improvement: A Model for Hospitals.” Journal of Nursing Care Quality, 2006. 

  55. Ideally, staff from every department should be present at QAPI review meetings. Additionally, OPOs should provide an alternate mechanism for collecting ongoing staff feedback, such as surveys or report sharing from departmental meetings. 

  56. These surveys must be available to both families who consented for donation and those who did not. 

  57. This measure should include any staff position a family or patient interacts with. For example, in the HCAHPS survey for hospital care, interactions with nurses and doctors are rated separately. 

  58. It is crucial that families provide informed consent and fully understand the donation process. OPOs should collect a rating about the degree to which families felt they understood the consent process. 

  59. Measures such as how likely a family is to remain involved with the OPO (e.g., through volunteer work) can be an additional indicator of the overall quality of their experience. 

  60. Some measures, such as national rankings, may not be updated monthly. OPOs should provide the most recent data, in lieu of monthly updates, where appropriate. 

  61. Like an Annual Report, a QAPI must be made public at least once annually. Previous QAPI documents should be available for review upon request. 

Research supported by Arnold Ventures and Schmidt Futures in partnership with Organize and the Federation of American Scientists.

© 2024
Arnold Ventures
Schmidt Futures
FAS
Organize