Thalassemia: An Overview of 50 Years of Clinical Research




The thalassemias are attributable to the defective production of the α- and β-globin polypeptides of hemoglobin. Significant discoveries have illuminated the pathophysiology and enhanced the prevention and treatment of the thalassemias, and this article reviews many of the advances that have occurred in the past 50 years. However, the application of new approaches to the treatment of these disorders has been slow, particularly in the developing world where the diseases are common, but there is definite progress. This article emphasizes how the increasing knowledge of cellular and molecular biology is facilitating the development of more effective therapies for these patients.


The thalassemias are a group of disorders that are attributable to the defective production of hemoglobin (Hb). The mature Hb molecule is a tetramer composed of 2 α-globin and 2 β-globin polypeptides, which assemble, along with a heme prosthetic group, to form the complete molecule. In the α-thalassemias, sufficiently defective production of α-globin chains results in decreased red cell (erythrocyte) Hb content and free β-globin polypeptides, which can assemble to form a moderately unstable Hb known as HbH. This unstable Hb causes a mild to moderate hemolytic and hypochromic anemia. In the β-thalassemias, impaired production of β-globin chains results in unpaired α-globin chains, which are unstable in erythroid precursors, where they precipitate and cause membrane injury and unfolded protein responses and thereby lead to toxicity and death of these cells. This in turn causes ineffective erythropoiesis and the numerous clinical features of the disease.
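The arithmetic of chain imbalance described above can be sketched in a few lines. This is a toy illustration, not a physiological model; the synthesis rates are hypothetical numbers chosen only to show how a deficit of one chain leaves the other unpaired.

```python
# Toy sketch of globin chain pairing. All rates are hypothetical
# (arbitrary units); only the pairing logic reflects the text.

def unpaired_chains(alpha_rate: float, beta_rate: float) -> dict:
    """Pair alpha and beta chains 1:1 into HbA tetramers (2 alpha + 2 beta);
    whatever cannot pair is left as free chains."""
    paired = min(alpha_rate, beta_rate)
    return {
        "HbA_tetramers": paired / 2,        # each tetramer uses 2 of each chain
        "free_alpha": alpha_rate - paired,  # toxic excess in beta-thalassemia
        "free_beta": beta_rate - paired,    # assembles into HbH in alpha-thalassemia
    }

normal = unpaired_chains(alpha_rate=100, beta_rate=100)
beta_thal = unpaired_chains(alpha_rate=100, beta_rate=30)   # impaired beta synthesis
alpha_thal = unpaired_chains(alpha_rate=40, beta_rate=100)  # impaired alpha synthesis

print(normal)      # no free chains
print(beta_thal)   # free alpha chains -> precipitation, ineffective erythropoiesis
print(alpha_thal)  # free beta chains -> HbH
```

The same arithmetic underlies why raising fetal Hb or lowering α-globin output (both discussed later in this article) ameliorates β-thalassemia: each reduces the free α-chain pool.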


In the last 5 decades, a significant set of discoveries has illuminated the pathophysiology and enhanced the prevention and treatment of the thalassemias. This article briefly reviews many of the important advances in thalassemia that have occurred in this period. The study of the thalassemia syndromes has greatly enhanced the development and application of molecular biology to biomedical research. Patients with the thalassemia syndromes have contributed enormously to understanding of the molecular basis of health and disease. However, the application of these new approaches to the treatment of these disorders has been slow, particularly in the developing world where the diseases are common, but there is definite progress. The articles in this issue emphasize how ever-increasing knowledge of cellular and molecular biology is facilitating the development of more effective therapies for these patients.


The molecular pathophysiology of thalassemia


Although the first clinical descriptions of the thalassemia syndromes were published by Cooley, Rietti, Greppi, and Micheli in 1925, it took many more years before the pathophysiology of these diseases began to be elucidated. The first clues came from research in the 1950s that used protein chemistry to assess various Hb variants. Based on the extensive studies by numerous groups in this era, Ingram and Stretton suggested that there were 2 groups of thalassemias, α and β, which were caused by defects in the synthesis of α- and β-globin polypeptides, respectively. Several years later, Fessas suggested that the cause of the β-thalassemia syndromes could be attributable to unbalanced globin chain synthesis, with the disease manifestations resulting from the presence of intraerythroblastic inclusions of unpaired α-globin molecules. His prescient idea was confirmed by spectrographic observations with Thorell that the inclusions were indeed composed largely of α-globin chains. Several other workers, including Clegg, Weatherall, Marks, Weissman, and Nathan, came to similar conclusions about the pathophysiology of this disease and provided a great deal of experimental support for this hypothesis.


However, with only the tools of protein chemistry available, the exact molecular basis of these diseases remained an enigma and a great deal of speculation took place in the era following these observations. After much experimental work, an era of RNA analysis ushered in important new approaches to this problem and several seminal experimental observations followed. In the early 1970s, Benz, Forget, Kan, Nathan, Lodish, Marks, Bank, and Nienhuis showed that β-globin mRNA translation was reduced in patients with β-thalassemia, suggesting that this defect was caused by impaired or defective production of a functional mRNA. With the discovery and isolation of reverse transcriptase, newer approaches to this problem became available because cDNA could be synthesized, and this rapidly led to the elucidation by Housman, Benz, Forget, Bank, and numerous others that the thalassemias appeared to be generally attributable to decreased globin mRNA levels. Kan, Weatherall, and their colleagues used α-globin cDNA probes to identify the first genetic mutations in thalassemia by showing that the α-globin genes were deleted in certain forms of α-thalassemia.


Soon afterward, a highly productive period began with the identification of various thalassemia mutations and deletions. Numerous clinical scientists, including Kan, Forget, Weatherall, Orkin, Higgs, and Kazazian, were empowered by the tools and insight provided by basic scientists such as David Baltimore, Phil Sharp, Tom Maniatis, Phil Leder, Harvey Lodish, and Daniel Nathans. The work began with the use of Southern blotting to elucidate deletions resulting in thalassemia. Although this was highly successful in the α-thalassemias, it was of limited use in β-thalassemia, applying only in specific instances. This problem was solved with the use of gene cloning and DNA sequencing to identify point mutations that result in the thalassemias. In most instances, recurrent common mutations were found in many patients, but routine sequencing remained laborious and limited in its ability to identify new mutations. However, the finding that different β-globin gene mutations exist on haplotype blocks, at least at the β-globin locus, at first by Kan in sickle cell anemia and then by Antonarakis, Kazazian, and Orkin in β-thalassemia, helped to surmount this limitation. This resulted in the discovery of hundreds of mutations that cause the thalassemia syndromes. Not only was a large set of mutations identified that result in the thalassemias, but this also led to the elucidation of many mechanisms that can impair mRNA production or function in the cell. Insights were gained into processes as diverse as transcription, mRNA modification, splicing, and translation. These findings presaged the period that would follow with the identification of mutations in various genes that cause a diverse group of mendelian disorders.
Even in the past few years, new lessons on the molecular lesions that can cause human disease continue to be learned from the thalassemias, as exemplified by a Melanesian form of α-thalassemia that is attributable to the creation of a new promoter, which competes during transcription with the endogenous α-globin promoters.


Even before the insights from molecular biology began to be used to elucidate the pathophysiology of the thalassemia syndromes, an effort was initiated to use measurements of globin chain imbalance for prenatal diagnosis. The methods pioneered by several scientists, including Clegg, Weatherall, Huehns, Stamatoyannopoulos, and Kazazian, led directly to the first successful prenatal diagnosis by Kan and colleagues. Advances in these techniques were made by Chang and colleagues, leading to more feasible diagnostic methods. Cao, Loukopoulos, and others in the Mediterranean region helped to apply these methods in their respective populations, resulting in a large reduction in the incidence of new patients with thalassemia in those populations. With the discovery of the molecular basis for many of the common thalassemia mutations, DNA-based prenatal genetic diagnostic methods were soon applied and supplanted the older methods. This allowed earlier detection in the first trimester and, as a result, these methods were more widely adopted in a variety of countries. However, there is still controversy surrounding these approaches, limiting their use to populations in which prenatal genetic diagnosis or premarital genetic counseling is accepted.




Transfusion therapy and iron chelation


Although the elucidation of the molecular pathophysiology of the thalassemia syndromes proceeded rapidly, therapeutic advances were slower. Many patients with β-thalassemia, and occasional patients with severe forms of α-thalassemia, require regular transfusions to survive. In the case of β-thalassemia, this therapy had an important effect in reducing the massive ineffective erythropoiesis and attendant bone destruction and organ infiltration that characterized untreated β-thalassemia. Initial work, started by Wolman and later Piomelli and colleagues in the 1960s, suggested the importance of maintaining the Hb level of patients at a minimum of 6 to 8 g per deciliter. Subsequent studies suggested that considerably higher levels of Hb may be desirable, but newer evidence suggests that a target of 9 to 10 g per deciliter seems to be most effective. In the 1970s, several additional advances in transfusion techniques occurred. Filtration of blood to remove leukocytes began to be used and allowed for a reduction in severe febrile reactions and decreased alloimmunization to human leukocyte antigens. In addition, the concept of starting transfusions early in life and sustaining these transfusions without interruption appeared to be helpful in achieving immune tolerance.
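As a back-of-envelope illustration of the pre-transfusion target discussed above, the following sketch estimates a transfusion interval from a post-transfusion Hb and a decline rate. Both numbers are hypothetical values chosen for illustration; real regimens are individualized and depend on many clinical factors.

```python
# Illustrative only: estimate weeks until Hb falls to the pre-transfusion
# floor (the 9-10 g/dL target mentioned in the text). The post-transfusion
# Hb and decline rate below are invented for this example.

def weeks_until_transfusion(post_hb: float, decline_per_week: float,
                            target_floor: float = 9.0) -> float:
    """Weeks until Hb falls from post_hb to the pre-transfusion floor."""
    return max(0.0, (post_hb - target_floor) / decline_per_week)

# e.g. transfused to 12 g/dL and falling ~1 g/dL per week -> ~3-week interval
interval = weeks_until_transfusion(post_hb=12.0, decline_per_week=1.0)
print(interval)  # 3.0
```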


However, a major problem became evident in patients who were regularly transfused during the 1960s and 1970s. Although many of the patients survived past childhood as a result of these transfusions, the patients were often severely affected as a result of iron overload. Most commonly, these patients were dying as a result of heart failure. The extent of hepatic iron appeared to correlate most closely with the occurrence of heart failure, whereas cardiac iron levels did not seem to relate to the presence of heart failure. Such observations have been supported by studies of patients with thalassemia and also studies on hemochromatosis. This phenomenon seems to be attributable to free non–transferrin-bound iron being most toxic to cardiomyocytes. Therefore, the presence of low levels of cardiac iron does not prevent heart failure from free circulating iron.


There has been a great deal of work in the past few years to find newer and better methods to noninvasively measure the burden of iron in the heart, liver, and other tissues. Some work has suggested that newer magnetic resonance imaging (MRI)–based methods of cardiac iron measurement may be useful. However, as discussed earlier, cardiac iron does not necessarily provide a good surrogate for the iron burden that is likely to result in heart failure. Therefore, in spite of these developments, care needs to be taken to ensure that clinical practices are based on adequate evidence of efficacy. Other methods show promise, such as the superconducting quantum interference device (SQUID), which has the potential to provide an integrated measurement of liver iron. However, this technology is both cumbersome and expensive, and there are questions of reproducibility between centers. Hepatic MRI also provides an integrated measurement of liver iron burden, although this method too is expensive, and the signal is dependent on the chemical state of iron. Nonetheless, hepatic MRI is becoming the gold standard for measurement of total body iron and has supplanted liver biopsy because the latter is adversely affected by uneven iron distribution. Computed tomography has not been clinically developed for this purpose because of justifiable concerns about radiation dosage.


In the 1970s, a major advance occurred in treating the iron overload present in these chronically transfused patients. Propper and colleagues developed a clinically effective method for continuous subcutaneous infusion of deferoxamine, an effective but nonabsorbable iron chelator with a short plasma half-life, through the use of a portable pump. Pippard and colleagues then modified this regimen for increased compliance. In the following decades, 3 major independent clinical trials showed markedly improved cardiac disease–free survival in patients who followed modified versions of the protocols for the use of deferoxamine. However, in spite of these promising results with deferoxamine, many therapeutic obstacles remained for dealing with iron chelation. In the period immediately following the development of these protocols, it became evident that, for many patients, deferoxamine would only have limited efficacy, because these patients had already been iron overloaded for too long. Therefore, it took a couple of decades to see the results of this treatment in the patients who began these regimens from an early age. It was also evident that the use of an iron chelator that required continuous subcutaneous infusion was limited by compliance in many patients.


As a result of these limitations, a search for orally available iron chelators was initiated. The first such compound to become available was deferiprone, an absorbable and more cell-penetrable agent with a short plasma half-life. Although the history of this iron chelator, and the fallout from clinical trials to evaluate its efficacy, is a controversial and complicated area in the history of iron chelation, it is evident that there has never been a demonstration of superiority for this chelation method compared with the use of deferoxamine. Deferiprone has been shown to be unable to prevent accumulation or reduce the iron burden in most patients in whom it was tested. This drug has not been approved by the US Food and Drug Administration (FDA) for any use in the United States. Nonetheless, a potentially important observation was made by Giardina and Grady regarding this iron chelator. They suggested that deferiprone may be useful as a shuttle to assist in the removal of cardiac iron by transferring this iron to deferoxamine. Although this has been suggested as a potentially useful approach and clinical trials are being set up to properly test the efficacy of this regimen, it is not clear that this will solve the compliance issues facing the use of deferoxamine alone.


In the intervening years, Novartis developed deferasirox, an alternative orally available and rationally designed iron chelator with a long plasma half-life. This chelator showed clear efficacy in initial small clinical trials. Since that time, several larger trials have shown its efficacy and comparability with deferoxamine. This drug is now FDA approved for use in patients with β-thalassemia, as well as patients requiring blood transfusions for other diseases including sickle cell disease, myelodysplastic syndrome, and other chronic anemias. The drug is useful, but variations in bioavailability create important dosing considerations, and the drug is expensive. The topics of iron overload and chelation therapy are discussed in greater detail in the article by Porter and Shah elsewhere in this issue.


Although the development and clinical use of deferasirox has been promising, more work is needed. Iron overload is still a problem, even when the best and strictest iron chelation regimens are used. It is likely that highly effective methods will be developed from a more sophisticated understanding of the physiologic regulation of iron. This topic is reviewed in depth in the article by Gardenghi and colleagues elsewhere in this issue. In the past few years, understanding of this regulatory process has expanded enormously and therapeutically intervening in this regulation now seems possible. In 2000, the peptide hormone hepcidin was discovered by Ganz and colleagues. Hepcidin seems to be a master regulator of iron homeostasis by directly inhibiting the activity of the major iron transporter, ferroportin. Ferroportin is necessary for iron release from intestinal cells and macrophages into the blood. Miller and colleagues recently suggested that ineffective erythropoiesis in thalassemia causes an increase in the level of a protein known as GDF15, which inhibits the activity of hepcidin. If this observation is confirmed, it may, in part, explain why patients with thalassemia intermedia become iron overloaded so readily. This work, along with several other studies, suggests that hepcidin may be a key therapeutic target that could reduce iron overload in patients with thalassemia. There have also been recent efforts to use other strategies to reduce the toxicity of free iron, such as through the use of transferrin, in mouse models of thalassemia. Although this seems to be beneficial in these mouse models, it is unclear whether this will be a viable therapeutic strategy, given the need for regular expensive infusion treatments in patients.
Nonetheless, it is clear that recent work in this field will assist in increasing understanding of normal and pathologic iron metabolism, and this will lead to more effective strategies to control iron overload in chronically transfused patients.
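The regulatory chain described above (ineffective erythropoiesis raising GDF15, GDF15 suppressing hepcidin, and hepcidin inhibiting ferroportin) can be encoded qualitatively. The functional forms and numbers below are hypothetical; only the direction of each interaction reflects the text.

```python
# Qualitative sketch of the hepcidin-ferroportin axis. Units and
# functional forms are invented; only the signs of the effects
# (GDF15 -| hepcidin -| ferroportin) follow the text.

def hepcidin_level(gdf15: float, baseline: float = 1.0) -> float:
    """GDF15 suppresses hepcidin production (direction only)."""
    return baseline / (1.0 + gdf15)

def ferroportin_activity(hepcidin: float) -> float:
    """Hepcidin inhibits ferroportin: higher hepcidin -> less
    iron export from enterocytes and macrophages into the blood."""
    return 1.0 / (1.0 + hepcidin)

# Healthy state: little GDF15, normal hepcidin, restrained iron export.
healthy_export = ferroportin_activity(hepcidin_level(gdf15=0.0))

# Thalassemia intermedia: ineffective erythropoiesis raises GDF15,
# hepcidin falls, ferroportin stays active, and iron absorption
# continues despite existing overload.
thal_export = ferroportin_activity(hepcidin_level(gdf15=5.0))

assert thal_export > healthy_export
```

The sketch makes the therapeutic logic concrete: restoring hepcidin activity (pharmacologically or by reducing the GDF15 signal) would be expected to curb inappropriate iron absorption in these patients.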




Regulation of fetal Hb and its modulation in therapy


Higher levels of fetal Hb (HbF) are able to ameliorate the severity of β-thalassemia by improving the balance of α- and non–α-globin chains and thereby preventing inclusion body formation. This finding was particularly evident from careful clinical observations in rare patients with highly increased levels of HbF who had a significantly milder clinical course and from the observation of infants with β-thalassemia who begin showing symptoms only after the expression of HbF declines in the months following birth. Such findings have also been subsequently confirmed in larger epidemiologic studies of numerous thalassemia populations. As a result, several investigators have attempted to establish how the regulation of the fetal to adult Hb switch in humans occurs and how this can be therapeutically modulated.


The human globin genes were among the first to be cloned and the structure of the entire β-globin cluster was soon elucidated and sequenced. Around that time, it became evident that the silenced HbF genes in the β-globin cluster underwent DNA methylation in adult erythroid cells, whereas this was not the case in erythroid cells from embryonic and fetal life. As a result, in the early 1980s, DeSimone and colleagues tested whether a DNA hypomethylating agent, 5-azacytidine, might allow reactivation of the HbF genes in adult erythroid cells of monkeys. These experiments were successful and showed the usefulness of this approach to increase HbF levels. Ley and colleagues and Nienhuis then went on to test this approach in patients with β-thalassemia and sickle cell disease. These small trials were successful and experts in the field, such as Edward Benz, editorially remarked at that time that “molecular biology has come to the bedside.” However, there was concern about the long-term sequelae of a potentially mutagenic compound like 5-azacytidine. Nathan and Stamatoyannopoulos suggested that the effect of 5-azacytidine on HbF induction might relate to its ability to act as an S-phase cell cycle inhibitor, rather than through an epigenetic effect on the fetal globin gene promoters. With their colleagues, both Nathan and Stamatoyannopoulos went on to prove that numerous S-phase inhibitors, including hydroxyurea, were all highly effective as inducers of HbF in monkeys. Platt and colleagues then tested hydroxyurea, which had the safest side effect profile, in patients with sickle cell disease and showed a clear HbF response. The response does not have to be large because HbF is a powerful antisickling agent. The exploratory observations of Platt and colleagues were then examined in large clinical trials initially conducted by Charache and Dover and their associates. The results have led to FDA approval of hydroxyurea for use in patients with sickle cell disease. 
However, hydroxyurea has not provided the same effectiveness in patients with β-thalassemia, who require much more HbF to achieve globin chain balance. It may be more effective in some select populations with β-thalassemia, particularly those with an XmnI polymorphism at position −158 in the γ-globin gene promoter. Additionally, other S-phase inhibitors or safer derivatives of 5-azacytidine may also show promise for this purpose.


Around the time that these studies were being performed, in the mid-1980s, Perrine and colleagues reported another important observation. Along with Bard, they showed that infants of diabetic mothers have a delayed fetal to adult Hb switch. Because it was known that hydroxybutyrate is increased in mothers with diabetes, Perrine and colleagues went on to test the idea that butyrate, or other similar short-chain fatty acids, may be effective as inducers of HbF first in sheep and then in patients. Although initial trials showed promise, these therapies were not particularly effective. However, a subset of patients did show responses in these trials and in independent studies performed by Dover, Atweh, and their colleagues. Although hydroxybutyrate is increased in diabetic mothers, numerous other metabolites are also increased in this ketotic state and therefore further investigation of this phenomenon is worthwhile. It has been suggested that these short-chain fatty acids are likely to induce HbF through the inhibition of histone deacetylases (HDACs). Various HDAC inhibitors can be potent inducers of HbF in vitro and so these compounds may show clinical efficacy in vivo. Pharmacologic induction of HbF production is discussed in greater detail in the article by Atweh and Fatallah elsewhere in this issue.


Molecular regulators of the fetal to adult Hb switch have been sought with the idea that targeted approaches could be developed if such molecules were identified. Although numerous regulators of erythropoiesis and globin gene expression had been identified, none appeared to directly explain the fetal to adult switch that occurs in ontogeny. Recently, new insight into this process has occurred as a result of human genetic association studies. By examining genetic polymorphisms in the human genome that were associated with HbF levels in adults, 3 major loci associated with HbF levels were found. These loci included the β-globin locus on chromosome 11, a region between the genes HBS1L and MYB on chromosome 6, and the BCL11A gene on chromosome 2. The gene BCL11A had previously been studied for its role as a zinc-finger transcription factor necessary for B lymphocyte production. However, a role in erythroid progenitors had not previously been appreciated. The role of this transcription factor in regulating HbF levels was therefore examined, given the compelling genetic association observed. Its expression was found to be inversely correlated with that of the HbF genes in the course of human ontogeny, as well as in humans with different HbF-associated BCL11A genetic variants. This finding led to the hypothesis that this gene may function as a repressor of the HbF genes in humans. Using a proteomic approach, BCL11A was found to interact with the erythroid transcription factors GATA-1 and FOG-1, as well as with a repressor complex known as NuRD. The NuRD complex contains 2 HDACs, suggesting that these may be the targets of short-chain fatty acids and other HDAC inhibitors that induce HbF. The role of BCL11A in directly regulating the HbF genes was shown by knocking this gene down using siRNA and shRNA methods in primary human erythroid cells.
These results showed that this knockdown results in robust increases in the level of HbF without causing a major perturbation of erythroid differentiation. Consistent with a direct role in regulating the HbF genes, BCL11A was shown to interact directly with the β-globin cluster in human erythroid cells. Together, these findings identified BCL11A as a major regulator of the fetal to adult Hb switch in humans. Subsequently, it was shown that BCL11A also plays a critical role in the evolutionarily divergent globin gene switches that occur in mammals, suggesting a conserved role for this factor in this phenomenon. Collectively, these studies established BCL11A and its interacting partners as potentially important therapeutic targets to induce HbF levels. Moreover, BCL11A is unique in that human genetic association studies suggested that it plays a major role in vivo in the regulation of HbF levels in humans. Indeed, studies have shown that polymorphisms in BCL11A play an important role in ameliorating the severity of both sickle cell disease and β-thalassemia. It is likely that further study of this factor and others will yield similar insights and may assist in developing targeted therapeutic approaches to increase HbF in patients with β-thalassemia. Other genetic variants that have not been as well studied seem to ameliorate β-thalassemia to as great an extent as (or perhaps even more than) the polymorphisms in BCL11A, suggesting that further studies of these other factors could lead to fruitful new insight into how HbF could be therapeutically modulated.
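The inverse relationship between BCL11A expression and HbF output described above can be illustrated numerically. The developmental-stage values below are invented for illustration; only the sign of the correlation reflects the reported findings.

```python
# Hypothetical expression values across developmental stages, chosen
# only to illustrate the reported inverse BCL11A-HbF relationship.
stages = ["fetal", "newborn", "infant", "adult"]
bcl11a = [0.1, 0.4, 0.8, 1.0]   # rising with age (arbitrary units)
hbf = [0.9, 0.6, 0.2, 0.05]     # HbF fraction falling with age (invented)

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(bcl11a, hbf)
print(f"r = {r:.2f}")  # strongly negative, consistent with BCL11A
                       # acting as a repressor of the HbF genes
```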


The presence of concomitant α-globin gene mutations with β-thalassemia can result in a much milder clinical syndrome. This finding is attributable to a reduction in the extent of globin chain imbalance. Although not as much effort has been focused on the modulation of α-globin levels in β-thalassemia, this may represent another valuable therapeutic strategy, in addition to the efforts to stimulate HbF production. Studies of cohorts of patients with β-thalassemia confirm the ameliorating effects of α-globin gene deletions on the clinical course in these populations.

Mar 1, 2017 | Posted in HEMATOLOGY