Abstract
Background
The outcome of childhood acute lymphoblastic leukemia (ALL) in low- and middle-income countries lags behind that of high-income countries in many aspects, including diagnosis, risk stratification, and access to treatment and supportive care.
Objective
To report the outcome of childhood ALL at Ain Shams University Children’s Hospitals treated with risk-based protocols before the implementation of minimal residual disease technology, and to evaluate the use of double delayed intensification (DDI) in standard-risk patients.
Methods
Two hundred and twenty patients with ALL diagnosed between January 2005 and December 2014 were included in the study. Patients were treated according to modified CCG 1991 and CCG 1961 protocols for standard-risk and high-risk disease, respectively. Patients were stratified into three risk groups: standard risk (SR), high-risk standard arm (HR-SA), and high-risk augmented arm (HR-AA).
Results
Among the whole cohort, the 10-year event-free survival (EFS) and overall survival (OS) rates were 78.1% and 84.3%, respectively. Patients with Pre-B immunophenotype (IPT) had a significantly better outcome than those with T-cell IPT (EFS 82.0% versus 58.6%, p < 0.001; OS 86.9% versus 69%, p = 0.003, respectively). Among the SR group, patients treated with a single delayed intensification (SDI) had EFS and OS rates comparable to those treated with DDI (EFS 82.4% versus 87.5%, p = 0.825; OS 88.2% versus 93.5%, p = 0.638 for the SDI and DDI groups, respectively).
Conclusion
The use of a risk-based protocol with simple laboratory techniques resulted in acceptable survival outcomes in resource-limited settings. Double delayed intensification showed no survival advantage in standard-risk patients.
Highlights
•Capizzi methotrexate without leucovorin rescue is an alternative to high-dose methotrexate in high-risk ALL.
•A risk-based protocol using bone marrow morphology remains an acceptable alternative in resource-limited settings.
•There is no survival benefit from a second delayed intensification phase using the CCG protocol in standard-risk patients.
1. Introduction
Notable advances in the treatment of childhood acute lymphoblastic leukemia (ALL) remain confined primarily to developed countries, where patients have access to well-qualified, well-supported institutions, whereas in low- and middle-income countries (LMIC) hospital infrastructure is usually less well developed, the accuracy and promptness of diagnosis are limited, and diagnostic tools and modern therapies are less accessible.
Response to therapy in ALL has greater prognostic strength than any other biologic or clinical feature. Measuring response to therapy has traditionally been done through morphological assessment of the bone marrow. The concept of minimal residual disease (MRD), introduced in 1998, has revolutionized the assessment of response to therapy and risk stratification in the treatment of ALL. Patients who have a residual leukemia level of 0.01% or more after six weeks of remission-induction therapy are treated with intensified therapy, an approach that has improved outcomes in patients with poor early responses according to morphologic criteria. MRD-based stratification, however, may not be available in leukemia treatment centers in some areas of the world, particularly LMICs.
Many therapy modifications have been made to improve outcomes in ALL treatment. Therapy intensification through the addition of a delayed-intensification (DI) phase after standard induction/consolidation therapy was shown to improve outcome for patients with intermediate-risk ALL, and in higher-risk ALL.
We present a model for LMICs in which protocol-based childhood ALL therapy was given before the availability of MRD testing and without the use of high-dose methotrexate. We also evaluate the impact of adding a second delayed intensification phase to the treatment protocol.