The Role of the Crossmatch in Kidney Transplantation: Past, Present and Future

Immunogenetic characterization of the transplant recipient with a crossmatch is used to minimize graft loss by detecting preformed antibodies. The use of increasingly sensitive tests, including the flow cytometry crossmatch (FCXM), has been accompanied by the near elimination of hyperacute rejection. We reviewed associations of crossmatch results with kidney graft outcomes in contemporary practice, and in several instances updated our past publications with more recent data. Recent United States registry data for transplants performed with a reported positive crossmatch demonstrate immediate graft loss rates of ≤1.3% in FCXM+ recipients and ≤3.6% in complement-dependent cytotoxicity crossmatch positive (CDCXM+) recipients. One-year graft survival was reduced by ≤6.4% in FCXM+ versus FCXM– recipients, and by ≤11.5% in CDCXM+ versus CDCXM– recipients. Five-year graft survival was reduced by ≤10.2% in FCXM+ versus FCXM– recipients, and by ≤8.7% in CDCXM+ versus CDCXM– recipients. A possible explanation for the markedly lower graft loss risk with crossmatch positive transplants in modern practice may be selection of recipients with low anti-HLA antibody titers. Although good correlation between the virtual crossmatch and the actual crossmatch has been demonstrated, the outcome significance of positive virtual/negative actual and negative virtual/positive actual crossmatches is not clearly established. Post-transplant demonstration of the persistence or appearance of donor-specific antibody is of value in prognostication, but its utility for adjustment of therapy is uncertain. In summary, contemporary data suggest that, among selected transplants performed, the impact of a positive crossmatch may be relatively small compared to other accepted clinical factors. Further work is warranted to determine, prospectively, under what circumstances crossmatch positive transplants can proceed with safety.

*Corresponding author: Ralph J. Graff, 3635 Vista at Grand Blvd., St. Louis, MO, USA 63110-0250, Tel: 314-577-8647; Fax: 314-268-5126; E-mail: ffmdrj@slu.edu

Received December 01, 2011; Accepted January 11, 2012; Published January 13, 2012

Citation: Graff RJ, Duffy B, Xiao H, Radell J, Lentine KL (2012) The Role of the Crossmatch in Kidney Transplantation: Past, Present and Future. J Nephrol Therapeutic S4:002. doi:10.4172/2161-0959.S4-002

Copyright: © 2012 Graff RJ, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.


Introduction
Since Murray's successful transplant of an allogeneic kidney in 1959 [1], renal transplantation has evolved from an experimental technique into an accepted modality for the treatment of end-stage renal disease with more than 310,000 renal transplants having been performed by 2010 [2]. This achievement constitutes a classical example of multidisciplinary collaboration: the description of the rejection of canine allografts by a surgeon [3], the appreciation by geneticists that the targets of rejection are inherited [4,5], the elucidation of the mechanism of allograft rejection by biologists [6], and the adaptation of this knowledge to the bedside by clinicians [1,7].
The two mechanisms employed to minimize transplant loss from rejection are the suppression of the recipient immune response with medications and immunogenetic characterization of the recipient. Early immunosuppressive medication regimens allowed transplantation of un-sensitized recipients, but transplantation of sensitized recipients was associated with immediate and early graft rejection. The most aggressive anti-rejection drug regimens available could not save these kidneys. The utilization of a complement dependent microcytotoxicity crossmatch (CDCXM) (an assay that measures cell-bound antibody by its ability to bind complement and cause cell lysis) allowed the identification of recipient pre-sensitization to the donor kidney [8], as well as the recognition of the association between a CDCXM+ result and immediate graft loss [9], providing an avenue for its avoidance.
Over the last 50 years, as a result of improvements in immunosuppressive drug regimens and immunological evaluation techniques, kidney transplant outcomes have greatly improved. The objective of this report is to review the evolution of crossmatch technique and its associations with kidney transplant outcomes. Specifically, we review kidney graft outcome data in CDCXM, flow cytometry crossmatch (FCXM) and virtual crossmatch (VXM) negative (–) and positive (+) kidney transplant recipients. In several instances we updated our prior analyses using more recent Standard Transplant Analysis and Research (STAR) files provided by the Organ Procurement and Transplantation Network (OPTN). We also review the role of antibody reduction therapy and post-transplant monitoring.

The Evolution of Crossmatch Technology
Because rejection-associated graft loss was observed in CDCXM– recipients [9], the initial CDCXM was refined to improve immunologic characterization of the recipient through the use of separated T and B lymphocyte target cells [10], the addition of wash steps [11], extended incubation [12], and the use of anti-human globulin (AHG) augmentation [12,13]. Flow cytometry technology was adapted to create an FCXM [14]. The flow cytometer measures cell-bound antibody with a fluorescing label, the amount of antibody being quantified by fluorescence intensity. The unit of intensity is called a channel, and the difference between control and experimental cells is called the channel shift. Although each laboratory sets its own criteria for a positive test, a 40 channel shift for T cells and an 80 channel shift for B cells are generally considered positive. Finally, solid phase technology was developed with the ability to accurately identify anti-HLA antibodies and applied as a virtual crossmatch (VXM) [15-17]. These advances allowed the detection of low titers of antibody that were previously undetectable.
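To make the channel-shift criteria concrete, the decision rule described above can be sketched in a few lines of code. The function name and the 40/80 channel cutoffs are illustrative only; as noted, each laboratory validates its own thresholds.

```python
def classify_fcxm(t_shift, b_shift, t_cutoff=40, b_cutoff=80):
    """Classify a flow cytometry crossmatch (FCXM) from channel shifts,
    i.e. test-cell fluorescence minus the negative control.

    The 40-channel (T cell) and 80-channel (B cell) cutoffs are the
    commonly cited values; real laboratories set their own criteria.
    """
    t_result = "T+" if t_shift >= t_cutoff else "T-"
    b_result = "B+" if b_shift >= b_cutoff else "B-"
    return t_result + b_result

# A serum producing a 55-channel T cell shift and a 30-channel
# B cell shift would be read as T positive, B negative:
print(classify_fcxm(55, 30))  # T+B-
```

A result such as "T-B+" from this sketch corresponds to the T–B+ FCXM groups discussed later in the outcome analyses.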
Through the years, there has been an evolution toward the use of more sensitive crossmatch technologies, as shown by analysis of data from the United States (U.S.) OPTN registry for kidney transplants performed from 1987 through 2005 [21]. An updated analysis of the OPTN STAR files of transplants performed from 1995 through 2009 for this article demonstrates that the use of FCXM including both T and B cell reactivity as the most sensitive test increased from 17% to 58.3% (Figure 1). Simultaneously, the use of T cell AHG CDCXM and B cell crossmatch as the most sensitive test decreased from 37.5% to 26.6%, and the use of less sensitive techniques alone declined more markedly, from 45.5% to 15.1%. During the same period there has been an evolution from cellular to solid phase antibody screening, described below in the "virtual crossmatch" section. On October 25, 2006, the United Network for Organ Sharing (UNOS) began requiring specificity information to identify unacceptable antigens, thus encouraging the use of single antigen beads (SAB) [34].

Evolution in Kidney Transplant Outcomes
In their 1969 review, Patel and Terasaki reported outcomes of 413 transplants performed in 15 U.S. transplant centers [9]. Twenty-four of 30 (80%) recipients transplanted with a CDCXM+ result lost their grafts immediately, one graft was lost at 3 months, 2 grafts were surviving at 3 months post-transplant, and 3 grafts were surviving more than 3 months after transplant. In contrast, only 4 of 168 (2.4%) recipients transplanted with CDCXM–/panel reactive antibody (PRA)– results suffered immediate graft loss. As a consequence of that report, the presence of a positive crossmatch has generally been considered a contraindication to kidney transplantation, although some transplants do proceed after positive crossmatch results, especially when the crossmatch is performed by the most sensitive techniques [2].
Analyses of outcomes of recipients transplanted with positive crossmatch results have shown improvement since the classical study of Patel and Terasaki. In Gebel, Bray and Nickerson's 2003 review of 23 reports, the median one-year graft survival reduction associated with CDCXM+ and/or FCXM+ results was 12% among first transplant recipients and 35% among re-transplant recipients [18]. In a single center study with longer follow-up, Mahoney et al. reported that of 22 transplants performed after FCXM+ results, 12 were lost in the first two months, but the remaining 10 were still functioning at two years [19]. As an outgrowth of these reports, a commonly held belief developed that patients transplanted after a positive crossmatch who avoided early graft loss faced no greater long-term risks than patients transplanted after a negative crossmatch.
In an analysis of OPTN registry data for transplants performed in 1995-2007, Graff et al. [20] found contrary results. Specifically, they observed that FCXM+ compared to FCXM– results were associated with a 4-12% reduction in five-year graft survival, depending on the type of donor and the lymphocyte target used, and that patients transplanted with FCXM+ results continued to show decreased graft survival beyond the first year [20]. In a subsequent study, a detrimental effect was again observed in years 1 to 5 after transplant, but no detrimental effect was seen in the 5 to 10 year period [23]. These studies are described in more detail in a subsequent section.

The Role of Sensitive Techniques in Crossmatch Negative Recipients
Theoretically, a more sensitive test would identify a positive crossmatch not identified by a less sensitive test and result in better outcomes. The benefit of the use of more sensitive crossmatch techniques was addressed by Salvalaggio et al. [21,22] in an analysis of OPTN registry data for transplants from 1987 through 2005 [21]. By multivariate Cox regression, compared with T-AHG CDCXM–/B– crossmatch results, T–B– FCXM results were associated with a significantly lower incidence of acute rejection during the first year after transplant (aOR=0.85, P<0.0001). Five-year graft survival after transplant with T–B– FCXM (82.6%) was modestly better than after T-AHG CDCXM–/B– crossmatch (81.4%, P=0.008) or T-AHG CDCXM (81.1%, P<0.0001), but on adjusted analysis was significantly different only among recipients of deceased donor kidneys and patients aged >60 years. An updated analysis of OPTN registry data from 1995-2009, performed by the authors for this article, showed similar results (Table 1). Thus, more sensitive techniques have had little effect on the outcomes of recipients with negative crossmatch results.

Contemporary Outcomes in Crossmatch Positive Recipients
As noted, the presence of a positive crossmatch has generally been considered a contraindication to kidney transplantation; nevertheless, patients with FCXM+ results are transplanted not infrequently and, less frequently, transplants proceed with CDCXM+ results [18,24]. Although some transplants performed with a positive crossmatch result may have been inadvertent, others may have been performed purposefully, the risk of transplanting with a positive crossmatch result being considered less than that of remaining on dialysis. Of those transplanted purposefully after a positive crossmatch, some may have received antibody reduction therapy that did not remove all antibody. Current OPTN STAR files do not include information that would allow the identification of positive crossmatch recipients treated with antibody reduction therapies.

Outcomes in FCXM+ recipients
In a 2008 report, Lentine et al. [22] examined OPTN registry data from January 1995 through November 2007 to characterize 5-year outcomes of 66,590 kidney transplants performed after FCXM [22]. Outcomes of FCXM+ transplants are shown in Table 3. Based on target (T cell, B cell or un-separated lymphocytes) and test results (negative, positive, weak positive, not measurable), outcomes could be divided into 14 groups, only three of which (T+B+, T+B not measurable and T–B+ FCXM) showed consistently reduced graft survival compared with T–B– FCXM results (Table 2). The T–B–, T–B weak positive, and T–B+ FCXM groups were particularly revealing.

Potential Reasons for Markedly Improved Outcomes
Certainly, in current studies and probably in the Patel-Terasaki study [9], transplants performed after positive crossmatch results reflect a small minority of all potential crossmatch positive transplant recipients. In comparing current results with those of Patel and Terasaki, an important consideration is the evolution of selection factors in the decision to proceed with crossmatch positive transplantation. In the Patel-Terasaki report, of the 413 transplants, 92% were first transplants, 80% were PRA–, 62% were male and 65% were LD recipients, a distribution associated with relatively good outcome. The report did not offer a breakdown for patients transplanted with CDCXM+ results. In a recent study reviewing OPTN registry outcomes, 1995-2009, in 25,699 transplant recipients, 85% were first transplants, 15% were PRA–, 59.5% were female and 40% were LD recipients. Patients with a most recent PRA >50%, re-transplant recipients, and female DD recipients were significantly over-represented among CDCXM+ recipients. The distribution of all patients in the current study, and particularly of the patients transplanted with CDCXM+ results, is associated with relatively poor outcome [24]. Thus patient demographics do not appear to explain the improved outcomes of the current era. It should be noted that these traits define groups with reduced opportunities for transplantation, suggesting that centers are willing to accept inferior outcomes in order to expand transplant access to disadvantaged patients.
The greatest change in outcome in crossmatch positive transplant recipients since 1969 has been the virtual elimination of immediate graft loss [20,23,24] (Table 3). Although no pathological information was presented, it has been assumed that the major cause of immediate graft loss in those transplanted with a positive crossmatch in the Patel and Terasaki series [9] was hyperacute rejection. Clearly, improved immunosuppressive regimens have played a major role in reducing graft loss beyond the perioperative period, but it seems unlikely that they would be able to prevent hyperacute rejection. Although improved surgical and preservation techniques certainly have played a role in the reduction of immediate graft loss since 1969, they cannot explain the difference in graft loss between crossmatch positive and negative recipients in the 1969 study. A possible explanatory factor for the markedly lower risk of hyperacute rejection with crossmatch positive transplants in modern practice may be selection of recipients with low anti-HLA antibody titers. The relatively insensitive CDCXM technique of 1969 undoubtedly required the presence of high titers of antibody to show a positive reaction, in contrast to today's sensitive crossmatch techniques, which have undergone multiple modifications to increase sensitivity. Although titer data are not available in either the Patel-Terasaki study or current OPTN records, the previously noted good outcomes in recipients with weak FCXM+ results, which could be considered low titer, could support the hypothesis that titer has an effect on outcome. In reports dealing with antibody reduction therapy, both Gloor et al. [25] and Montgomery et al. [26] have reported poorer outcomes when the recipient had a CDCXM+ result (indicative of a relatively high antibody titer) than when donor-specific antibody (DSA) was present with a CDCXM– result (indicative of a relatively lower titer).
These observations will be considered in more detail in the next section.

Antibody Reduction Therapy
If titer is a determining factor in the effect of antibody on outcome, then eliminating, reducing or modulating antibody might have a beneficial effect on kidney transplant outcome. Antibody reduction/modulation therapy has been used to treat many immunologically related diseases (reviewed in [27]). With this background, protocols have been initiated to reduce/modulate antibody levels in sensitized potential kidney transplant recipients. Early protocols utilized immunoadsorbents and plasma exchange [28]. Intravenous immunoglobulin (IVIG) [29-31], anti-B cell agents [29,31] and, transiently, splenectomy [25,26] were later added. The protocols at most centers treated potential LD recipients who were crossmatch positive with their prospective donors, and the protocol of at least one center treated potential DD recipients with positive crossmatches [32]. Although all centers using antibody reduction protocols attempted to render potential recipients antibody-free, most transplanted patients with residual antibody [25,26,31,32].

The Cedars-Sinai program transplanted 45 DD and 31 LD HLA-sensitized recipients between July 2006 and February 2009 with IVIG and rituximab therapy [33]. Although class I PRA was reduced by 12.6%, class II PRA by 10% (P=0.01), and the T cell FCXM channel shift by 125, many of the recipients had residual donor-specific antibody at the time of transplantation. Patient survival at two years was 100% for LD recipients and 90% for DD recipients. Graft survival at two years was 90% for LD recipients and 80% for DD recipients.
The Hopkins program reported attempted desensitization (plasmapheresis and IVIG, with transient use of splenectomy) of 215 patients, 211 of whom received LD kidneys between February 1998 and December 2009 [26]. Patient survival was compared to that of demographically matched dialysis patients. The 1-, 3-, 5- and 8-year patient survivals of 90.6%, 85.7%, 80.6% and 80.6% in the desensitized group were clearly better than the 91.1%, 67.2%, 51.5% and 30.5% patient survivals in the dialysis group. Stratification of transplant recipients receiving antibody reduction therapy showed the best outcomes in recipients who were FCXM– and DSA+, as defined by antibody screen at the beginning of therapy, intermediate outcomes in FCXM+/CDCXM– recipients, and worse outcomes in CDCXM+ recipients. Overall patient survival was clearly better than for those waiting for a crossmatch negative kidney or those remaining on dialysis. No graft outcome data were offered in that publication.
The University of Maryland reported on 41 plasmapheresis- and IVIG-treated recipients and 41 crossmatch negative control recipients transplanted between February 1999 and October 2006 [31]. The authors deemed that the difference in graft survival at one year (7.7%) was acceptable. The five-year graft survival in the treated group was 69.4%, compared with 89.9% in the control group, a difference of 11.2%. Among the treated transplant recipients, those who were T–B– FCXM at the time of transplant had an 87% five-year graft survival, compared to a 53% graft survival for recipients who were FCXM+ at the time of transplant. Thus the deterioration of five-year outcome in the antibody reduction treated group was limited to those with residual antibody at the time of transplant.
The Mayo Clinic program reported outcomes on 189 patients transplanted between April 2000 and September 2007: 51 T+ AHG CDCXM and 37 T– AHG CDCXM/FCXM+ recipients with channel shifts >300, treated with combinations of plasmapheresis, IVIG, rituximab and transient splenectomy, and 30 T– AHG CDCXM/FCXM+ recipients with channel shifts <300 and T–B– FCXM recipients who were untreated [25]. Recall that a 40 channel shift for T cells and an 80 channel shift for B cells are generally considered to constitute a positive test. Although there was a significantly higher rate of graft loss in treated T+ AHG CDCXM recipients (24 out of 56) compared to the other groups (HR 7.71, P=0.0001), the differences between the treated T– AHG CDCXM/FCXM+ recipients with channel shifts >300 (2 out of 37), the untreated T– AHG CDCXM/FCXM+ recipients with channel shifts <300 (1 out of 30) and the T–B– FCXM recipients (0 out of 70) were not significant (P=0.57), once again showing better outcomes in recipients with presumably lower amounts of antibody.

The Role of Virtual Crossmatch
In contrast to the crossmatch, which is designed to identify the presence of antibodies in a given serum directed at the antigens of a particular donor, the antibody screen is designed to survey all antibodies present in a given serum, as indicated by its reactivity with a panel of antigen-bearing targets, either lymphocytes or an artificial platform. Antibody screening reports the presence or absence of antibody and, if present, the percent of the panel with which the serum reacts (panel reactive antibody, or PRA). Because each target contains multiple antigens, which antibodies are present cannot be known directly. By using large, diverse panels, the antibodies present can be indirectly identified from the serum reactivity pattern. This technique had limited correlation with the crossmatch. As with crossmatch technology, antibody screening also has evolved, making available increased specificity and sensitivity. The modifications described for the CDCXM were also used for antibody screening. This was followed by the development of "solid phase technology", i.e. the ability to solubilize HLA antigens and bind them to the wells of plastic trays [15] and to beads [16]. Further refinements allowed the synthesis of HLA antigens and the attachment of each antigen to a separate SAB [17]. The presence of antibody adhering to SAB is measured with a fluorescing label, and fluorescence is quantified as mean fluorescence intensity (MFI).
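As a simple illustration of the screening arithmetic described above, the percent PRA is just the fraction of panel targets with which a serum reacts. The following is a hypothetical sketch, not laboratory software:

```python
def percent_pra(panel_reactions):
    """Panel reactive antibody (PRA): the percentage of a panel of
    antigen-bearing targets with which a serum reacts.

    panel_reactions is one boolean per panel member (True = reactive).
    """
    if not panel_reactions:
        raise ValueError("empty panel")
    return 100.0 * sum(panel_reactions) / len(panel_reactions)

# A serum reacting with 12 of 60 panel targets has a PRA of 20%:
reactions = [True] * 12 + [False] * 48
print(percent_pra(reactions))  # 20.0
```

Note that, as the text explains, a PRA computed this way says only how broadly sensitized a patient is; because each panel target carries multiple antigens, identifying which antibodies are present requires the reactivity pattern across a large, diverse panel or, today, SAB testing.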
With the use of solid phase antibody screening in general, and SAB in particular, concordance with crossmatch results has improved, allowing the antibody screen to accurately predict crossmatch results and leading to the coining of the term VXM. The correlation was good enough that in 2006, UNOS deemed that the presence of antibodies in a patient's serum against antigens of a potential donor makes that patient ineligible to be crossmatched with that donor [34]. Nevertheless, because a patient with a high titer of anti-HLA antibody but no antibody directed at the antigens of a prospective donor has a high chance of a negative crossmatch, more high-PRA patients are being crossmatched and transplanted [35].
The follow-up period on the VXM data is relatively short and associated with conflicting reports. Levine et al. [36] found no detrimental effect on outcome associated with the presence of DSA when the MFI was <2000, and Morris et al. [37] and Lazarova et al. [38] found no outcome difference in crossmatch negative recipients with and without DSA. Conversely, Lefaucheur et al. [39] reported decreased graft survival in recipients with a negative crossmatch and the presence of DSA, as well as correlations between the level of MFI, antibody-mediated rejection and graft loss. Caro-Oleas et al. [40] reported decreased graft survival in recipients with a negative crossmatch and the presence of DSA, but no correlation between graft loss and MFI, with no mention of antibody-mediated rejection. Amico et al. [41] found that DSA+ recipients exhibiting antibody-mediated rejection showed reduced graft survival, while those not exhibiting antibody-mediated rejection did not. Clearly, SAB technology has increased our ability to identify antibody specificity. In order for this information to translate into an accurate VXM, the complete donor HLA phenotype must be known, including HLA-A, -B, -Cw, -DR, -DQ, and -DP [41].
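The VXM logic described in this section, comparing SAB-identified antibody specificities against the complete donor HLA phenotype, can be sketched as follows. The function, the data layout, and the 2000 MFI cutoff (echoing the threshold discussed by Levine et al.) are illustrative assumptions, not a clinical algorithm; as noted above, MFI thresholds and their outcome significance remain center-specific and unsettled.

```python
def virtual_crossmatch(recipient_sab, donor_typing, mfi_cutoff=2000):
    """Sketch of a virtual crossmatch (VXM).

    recipient_sab : dict mapping HLA antigen -> MFI from single
                    antigen bead (SAB) screening
    donor_typing  : the complete donor HLA phenotype
                    (HLA-A, -B, -Cw, -DR, -DQ, -DP) as a set

    Returns the donor-specific antibodies (DSA) at or above the MFI
    cutoff; an empty result is a negative VXM. The default cutoff of
    2000 MFI is illustrative only.
    """
    return {antigen: mfi for antigen, mfi in recipient_sab.items()
            if antigen in donor_typing and mfi >= mfi_cutoff}

# Hypothetical example: anti-A2 is donor-specific and above cutoff,
# anti-B44 is donor-specific but below cutoff, anti-DR4 is not
# donor-specific, so only anti-A2 is reported as a DSA.
sab = {"A2": 8500, "B44": 1200, "DR4": 3100}
donor = {"A2", "A24", "B44", "B7", "Cw7", "DR7", "DR11", "DQ2", "DP4"}
print(virtual_crossmatch(sab, donor))  # {'A2': 8500}
```

The sketch makes the text's closing point concrete: if the donor typing set is incomplete (a missing DQ or DP typing, for example), a real DSA can be silently missed, which is why the complete donor phenotype is required for an accurate VXM.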

Post-transplant Monitoring
As noted throughout, following successful transplantation some recipients tolerate their allografts, while others suffer rejection episodes that require augmented therapy. The ultimate evidence for rejection is kidney biopsy evidence and deterioration of graft function. Scientists have sought a test less invasive than biopsy and with earlier recognition than deterioration of function, with the expectation that early recognition of rejection would allow more effective treatment [42-44]. Although a detailed discussion of these techniques is beyond the scope of this review, it should be noted that a correlation exists between the presence of DSA and rejection (particularly antibody-mediated rejection) and graft loss. This is true for recipients developing new DSA post-transplant [45] and for those with pre-transplant DSA whose DSA persists post-transplant [46,47]. Interestingly, recipients with DSA at the time of transplant whose DSA disappears post-transplant do well [47]. Pre-transplant demographics have not been useful in distinguishing those whose DSA will disappear from those whose DSA will persist [47]. Under most circumstances, donor cells are not available, obviating the use of the classical crossmatch, but solid phase antibody screening has been an effective method of demonstrating DSA.

Conclusion
Since the landmark study of Patel and Terasaki, the outcomes of patients transplanted with positive crossmatch results have greatly improved. A possible explanation may be selection of recipients with low anti-HLA antibody titers. Although the VXM adds information on the presence of DSA, we have only sparse data on the outcome implications of such results when the actual crossmatch is negative or "borderline" positive. In some centers, both of these circumstances result in the elimination of such a potential recipient from consideration for transplant without further testing. Centers employing antibody reduction protocols report good early outcomes, although one program reports decreased five-year graft survival in recipients transplanted with residual DSA after completion of antibody reduction therapy. Post-transplant demonstration of the persistence or appearance of DSA is of value in directing monitoring. Current technology must be modified, or new technology developed, to differentiate transplants with acceptable immunologic risk from those with unacceptable risk. Further work to determine, prospectively, under what circumstances crossmatch positive transplants can proceed with safety is warranted.