BACKGROUND
The number of allogeneic stem cell transplants (SCTs) performed in the United States in patients older than 20 years has increased steadily, from about 7,500 in 1994-1995 to more than 13,500 in 2010-2011 (1). Donor identification remains a constant challenge: only 30% of patients who need allogeneic SCT have a matched sibling donor. The National Marrow Donor Program (NMDP) and its cooperative international registries list about 16 million volunteer donors. It is estimated that 75% of white patients, but only 16% of African American and other minority patients, will be able to find a suitably matched unrelated donor (MUD) and proceed to transplantation (2). Mismatched related (often haploidentical) donors, cord blood (CB), or mismatched unrelated donors (MMUDs), with either peripheral blood (PB) or bone marrow (BM) as the graft source, are potential options for patients in need of an SCT but lacking a matched related donor (MRD) or unrelated donor.
Using CB as the graft source offers many advantages: CB units are easy to collect, with little or no risk to the mother or newborn. Cord blood units can be obtained rapidly for 80% to 95% of patients 20 years and older across all races, and for almost 100% of younger patients (2). This is particularly advantageous when an urgent transplant is required. Owing to the rapid procurement of CB units, patients can undergo CB transplantation (CBT) 4 to 5 weeks earlier than those receiving SCT from a MUD (3). Further, CBT is associated with a low risk of infection transmission, requires less-stringent human leukocyte antigen (HLA)–matching criteria, and carries a relatively lower risk of graft-versus-host disease (GVHD) with preserved graft-versus-malignancy effects. However, it is associated with a greater risk of graft rejection, delayed engraftment, and delayed immune reconstitution, leading to a heightened risk of infection and nonrelapse mortality (NRM) (4,5,6,7). Many of the adverse outcomes noted after CBT are attributed to the naïveté of CB T lymphocytes and the low numbers of total nucleated cells (TNCs) and CD34+ cells in the graft.
CELL DOSE AND HUMAN LEUKOCYTE ANTIGEN MATCHING
The outcomes of CBT depend on the cell dose and the degree of HLA match (8). The TNC dose available for CBT is a fraction of what is typically used in the PB or BM setting. The median TNC dose obtained from a BM harvest is about 3 × 10^8 TNCs/kg, and granulocyte colony-stimulating factor (G-CSF)–mobilized PB can yield a median of 7 × 10^8 TNCs/kg (9). In contrast, about one-fourth of CB units contain less than 0.25 × 10^8 TNCs/kg, and two-thirds of units contain between 0.25 × 10^8 and 1 × 10^8 TNCs/kg (8). A minimum cell dose of 2 × 10^7 TNCs/kg is typically recommended for successful engraftment after CBT.
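To make the arithmetic concrete, the following minimal sketch (in Python) computes the per-kilogram TNC dose a single banked unit delivers to recipients of different weights. The unit content and recipient weights are hypothetical illustrative values, and the threshold is the 2 × 10^7 TNCs/kg minimum quoted above; this is an illustration of the arithmetic, not a clinical rule.

    # Minimal sketch of the cell-dose arithmetic described above.
    # The unit TNC content and recipient weights are hypothetical.
    MIN_TNC_PER_KG = 2e7  # minimum of 2 x 10^7 TNCs/kg quoted above

    def tnc_dose_per_kg(unit_tnc_total: float, weight_kg: float) -> float:
        """Return the TNC dose per kilogram of recipient body weight."""
        return unit_tnc_total / weight_kg

    unit_tnc = 1.2e9  # a unit banked with 1.2 x 10^9 TNCs (assumed value)
    for weight in (20, 50, 80):  # pediatric through adult recipients
        dose = tnc_dose_per_kg(unit_tnc, weight)
        print(f"{weight} kg: {dose:.2e} TNCs/kg, adequate={dose >= MIN_TNC_PER_KG}")

The same unit that comfortably exceeds the threshold for a 20-kg child falls below it for an 80-kg adult, which is why cell dose, rather than unit availability, is the binding constraint in adult CBT.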
The HLA-matching criteria between CB units and the recipient are less stringent than for other donor sources. Whereas unrelated adult donors are selected to be closely matched to recipients at HLA-A, -B, -C, and -DRB1 by high-resolution testing (10,11), CB units are selected using lower-resolution (antigen-level) HLA typing for HLA-A and -B and allele-level typing for HLA-DRB1 (12). In a study of single CBT, Barker et al showed that recipients of 6/6-matched CB units had the lowest transplant-related mortality (TRM) regardless of dose, followed by recipients of 5/6-matched units with a TNC dose greater than 2.5 × 10^7/kg or 4/6-matched units with a TNC dose greater than 5.0 × 10^7/kg, and then by recipients of 5/6-matched units with a lower TNC dose (<2.5 × 10^7/kg) (8,13). These findings support the notion that both TNC dose and HLA-matching level should be taken into account in CB unit selection.
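The tiered rule just described lends itself to a simple ranking function. The sketch below is a rough illustration of those Barker et al tiers, not a validated selection algorithm; the fourth tier (for match/dose combinations the study did not report) and all candidate-unit values are assumptions.

    # Hedged sketch of a unit-ranking rule based on the Barker et al tiers
    # quoted above; tier 4 covers combinations the study did not report
    # (an assumption made only so the function is total).
    def barker_tier(hla_matches: int, tnc_per_kg: float) -> int:
        """Rank a CB unit; a lower tier corresponds to lower observed TRM."""
        if hla_matches == 6:
            return 1  # 6/6 match: lowest TRM regardless of cell dose
        if hla_matches == 5 and tnc_per_kg > 2.5e7:
            return 2
        if hla_matches == 4 and tnc_per_kg > 5.0e7:
            return 2  # a higher cell dose offsets the extra mismatch
        if hla_matches == 5:
            return 3  # 5/6 match with TNC dose < 2.5 x 10^7/kg
        return 4  # outside the reported tiers (assumption)

    # Hypothetical candidate units as (HLA matches out of 6, TNCs/kg)
    candidates = [(5, 2.0e7), (6, 1.8e7), (4, 6.0e7)]
    best_first = sorted(candidates, key=lambda unit: barker_tier(*unit))
    print(best_first)  # the 6/6 unit ranks first despite its lower dose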
Although the standard HLA-matching criteria do not require high-resolution typing at class I antigens in CBT, a recent study by the Center for International Blood and Marrow Transplant Research (CIBMTR) and Eurocord reported better outcomes in single CBT after myeloablative conditioning (MAC) with improved allele-level matching at 4 HLA loci (-A, -B, -C, and -DRB1) (14). The investigators showed that the frequency of neutrophil recovery was lower in recipients of units mismatched at three to five alleles, but not at one or two alleles, compared with recipients of HLA-matched units. Nonrelapse mortality was higher with units mismatched at one to five alleles than with matched units. This retrospective study confirmed the clinical importance of selecting better HLA allele-matched units for single CBT, an observation already well described for BM and PB progenitor cell transplantation. The effect of HLA matching by high-resolution testing after double CBT (dCBT) is unclear and should be investigated.
SINGLE VERSUS DOUBLE CORD BLOOD TRANSPLANTATION
The relatively low number of progenitor cells in a single CB unit, resulting in delayed hematopoietic recovery and engraftment failure, limited the use of CBT in adults. Most adults do not have access to a single CB unit containing the recommended nucleated cell dose of 2.5 × 10^7 TNCs/kg (15). To overcome the cell-dose limitation, investigators pioneered, and confirmed the feasibility of, an approach in which two partially HLA-matched CB units are used to augment the progenitor cell dose when a single unit is considered inadequate (15). A recent CIBMTR analysis investigated the relative risks and benefits of transplanting double CB units compared with an adequately dosed single CB unit. The investigators observed no differences in clinical outcomes after dCBT or adequately dosed single CBT. The two approaches had comparable probabilities of neutrophil engraftment by day 42 (78% [95% CI, 72%-83%] vs 81% [95% CI, 74%-88%]; P = .83) and of platelet recovery at 6 months (68% [95% CI, 62%-74%] vs 63% [95% CI, 53%-72%]; P = .34). There were no differences in grade III or IV acute GVHD (18% [95% CI, 11%-26%] vs 14% [95% CI, 10%-19%]; P = .64), 2-year probabilities of chronic GVHD (31% [95% CI, 26%-37%] vs 24% [95% CI, 15%-34%]; P = .27), treatment-related mortality (hazard ratio [HR] 0.91; P = .63), risk of relapse (HR 0.90; P = .64), or overall mortality (HR 0.93; P = .62) (16).
A unique feature of dCBT is evidence of mixed chimerism from both CB units during the initial posttransplant period (17). In the early post-dCBT period (day +21), both CB units contribute to hematopoiesis in 40% to 50% of patients, but by day +100 one unit predominates in the vast majority (18,19). The factors leading to unit dominance are not well defined. It is, however, recognized that dominance is not associated with the TNC or CD34+ cell dose, HLA match, gender match, ABO type, or the order in which the CB units are infused (15,18,19,20,21). This lack of evidence is a major limitation of dCBT; identifying predictive factors for unit dominance would optimize CB unit selection algorithms by allowing the selection of two CB units with a high probability of long-term engraftment.
CONDITIONING REGIMENS
High-intensity MAC regimens are reserved for young and otherwise-fit patients who can tolerate the associated regimen-related morbidity. Such regimens lead to a lower risk of relapse at the expense of higher TRM compared with reduced-intensity conditioning (RIC) regimens.
One of the largest registry studies, comparing single-unit CB (n = 165) with PB (n = 888) or BM (n = 472) transplants performed with MAC regimens in adults with acute leukemia from 2002 through 2006, showed promising outcomes with CBT. Total body irradiation (TBI) constituted part of the preparative regimen in about half of the patients in the CB group and about two-thirds in the comparator groups. Although significantly more patients had fully HLA-matched PB or BM grafts (70%) than fully matched CB grafts (6%), the rates of disease-free survival (DFS) and relapse were similar among the groups, while the risks of acute and chronic GVHD were significantly lower with CBT. TRM was similar for CBT and mismatched PB or BM grafts, but higher for CBT than for fully matched PB (HR 1.62; 95% CI, 1.18-2.23; P = .003) or BM transplants (HR 1.69; 95% CI, 1.19-2.39; P = .003). This was offset by a significantly lower incidence of chronic GVHD with CBT compared with fully matched PB (HR 0.38; 95% CI, 0.27-0.53; P = .001) or BM transplants (HR 0.63; 95% CI, 0.44-0.90; P = .01) (4). Therefore, in the absence of matched PB or BM donors, CBT potentially offers better outcomes than mismatched alternative-donor transplants.
Similarly encouraging results were noted in a study that compared 4-6/6-matched dCBT exclusively (using a MAC regimen of fludarabine 75 mg/m^2, cyclophosphamide 120 mg/kg, and TBI 1,200 to 1,320 cGy [Flu/Cy/TBI]) with 8/8-matched MRD or MUD transplants, or MMUD transplants mismatched at 1 allele (7). This study also noted a lower risk of relapse, higher TRM, less GVHD, and comparable DFS after CBT compared with the other groups. The risk of relapse was significantly lower after dCBT (15%; 95% CI, 9%-22%) than after MRD (43%; 95% CI, 35%-52%) or MUD (37%; 95% CI, 29%-46%) transplants. Higher NRM was noted after dCBT (34%; 95% CI, 25%-42%) than after MRD (24%; 95% CI, 17%-39%) or MUD (14%; 95% CI, 9%-20%) transplants, which resulted in comparable 5-year DFS among CB (51%; 95% CI, 41%-59%), MRD (33%; 95% CI, 26%-41%), and MUD (48%; 95% CI, 40%-56%) transplants. The cumulative incidence of grade II to IV acute GVHD at day 100 after dCBT, MRD, and MUD transplants was 60% (95% CI, 50%-70%), 65% (95% CI, 57%-73%), and 80% (95% CI, 70%-90%), respectively; the rate of chronic GVHD at 2 years was 26% (95% CI, 15%-35%), 47% (95% CI, 39%-55%), and 43% (95% CI, 34%-52%), respectively (7).
In adults with high-risk acute lymphoblastic leukemia (ALL), CBT leads to similar overall survival (OS), TRM, and relapse risk, with a significantly lower risk of acute and chronic GVHD, compared with PB or BM transplants. This was demonstrated in a recent registry study that compared outcomes after single or double 4-6/6-matched CB (n = 116) and 7-8/8-matched PB (n = 546) or BM (n = 140) transplants after MAC regimens (22). More than half of the patients in the CBT group received Flu/Cy/TBI as the conditioning regimen, while about 75% of the patients in the PB or BM groups received TBI/Cy-based regimens. There were no differences among the groups in 3-year OS rates (44%, 44%, and 43%, respectively), relapse risk (22%, 25%, and 28%, respectively), or TRM (42%, 31%, and 39%, respectively). However, the risk of grade II to IV acute GVHD (27%, 47%, and 41%, respectively) and of grade III or IV acute GVHD (9%, 16%, and 24%, respectively) was appreciably lower after CBT compared with 8/8-matched and 7/8-matched PB or BM transplants (22).
The advent of RIC regimens extended the utility of CBT to older patients and to those with comorbid conditions that otherwise preclude the use of MAC regimens. It is noteworthy that most trials of MRD or MUD transplantation used an arbitrary age cutoff of 55 to 60 years to define "older patients" as the threshold for using an RIC regimen, whereas age greater than 40 to 45 years is generally chosen as the threshold for RIC in the CBT setting.
Barker et al reported that RIC with fludarabine 200 mg/m^2, cyclophosphamide 50 mg/kg, and 2-Gy TBI (Flu/Cy/2Gy TBI) was well tolerated, with rapid neutrophil recovery, a sustained donor engraftment rate of 94%, and a low incidence of TRM (23). This regimen was associated with significantly better DFS than other RIC regimens (51% vs 28%; HR 0.53; P = .0002) (24). Multiple studies supporting the use of RIC CBT in patients who would not tolerate more intensive preparative regimens have subsequently been reported (6,18,19,25,26,27).
A retrospective single-center study compared outcomes in patients older than 55 years who underwent CB versus MRD SCT with RIC (primarily Flu/Cy/2Gy TBI). There were no differences between the groups in TRM at 180 days (28% [95% CI, 14%-41%] vs 23% [95% CI, 11%-36%]), 3-year DFS (34% [95% CI, 19%-48%] vs 30% [95% CI, 16%-44%]), or 3-year OS (34% [95% CI, 17%-50%] vs 43% [95% CI, 29%-58%]) (6).