Main

Record-based studies have reported higher childhood leukaemia incidence rates in relatively affluent communities within many different countries (Borugian et al, 2005; Poole et al, 2006; Adelman et al, 2007), and internationally (Feltbower et al, 2004). Within England and Wales, the leukaemia rate for the most deprived fifth of the child population was consistently about 90% of the rate for the most affluent fifth, in each of the decades centred on the census years 1981, 1991, and 2001 (Kroll et al, 2011b). The apparent socioeconomic gradient occurs both for childhood leukaemia as a whole, and for the lymphoid subtype (80% of childhood leukaemia in developed countries), which in children consists almost entirely of acute lymphoblastic leukaemia. The socioeconomic gradient might be related to either of the two well-known aetiological hypotheses linking risks of childhood leukaemia with unusual patterns of infection: ‘delayed infection’ (Greaves, 2006) and ‘population mixing’ (Kinlen, 2011). However, under-diagnosis of leukaemia in poorer children, as proposed by a ‘pre-emptive infection hypothesis’ (Stewart, 1961; Greaves et al, 1985; Doll, 1989), is another possible explanation.

Acute leukaemia in children is not always easy to diagnose, as the clinical signs can be ‘vague and non-specific’ (Mitchell et al, 2009a). Neutropenia frequently occurs in untreated leukaemia, and predisposes to bacterial and fungal infections, which can lead to death from pneumonia, septicaemia, or meningitis (Baehner, 1996). Bone marrow examination is necessary for definitive diagnosis of leukaemia, but is not routinely performed in cases of severe infection (Gillespie et al, 2004). Moreover, as neutropenia is itself a well-recognised sign of severe infection, a low neutrophil count may be attributed to the infection rather than prompting investigation for leukaemia; some children suffering from leukaemia might therefore die from infection without leukaemia ever being suspected. Within Great Britain, this might have happened more frequently in poorer communities, where primary health care provision has been less generous and childhood infection mortality rates have been higher (Reading, 1997; Health Protection Agency, 2005). If so, the recorded incidence of childhood leukaemia would be lower in poorer communities. For acute lymphoblastic leukaemia, the effect might be clearer in B-precursor than in T-precursor disease, because obvious enlargement of the lymph nodes is less frequent in B-precursor cases (Greaves et al, 1985).

Untreated acute lymphoblastic leukaemia can cause a variety of blood abnormalities, in almost any combination. Unlike neutropenia, low levels of haemoglobin and platelets produce obvious clinical signs that strongly suggest leukaemia: pallor and bleeding, respectively. We postulated that children who combined severe neutropenia with relatively normal haemoglobin and platelet counts would be at risk of dying from infection without leukaemia being suspected, and predicted that such children would be under-represented among patients from poorer communities.

Materials and methods

Pre-treatment haemoglobin, platelet, and (to September 1990) neutrophil counts were obtained from the records of four consecutive clinical trials sponsored by the Medical Research Council (Eden et al, 2000; Mitchell et al, 2009b), for children diagnosed with acute lymphoblastic leukaemia in the United Kingdom between September 1980 and November 2002. The study was restricted to trial participants who were resident in Great Britain and diagnosed between the first and the fourteenth birthday (ages 1–13 years), as trial participation rates were relatively low at ages <1 and 14 years, and addresses were not available for older recruits or for those resident in Northern Ireland.

Address at diagnosis was obtained by linking the trial data to the National Registry of Childhood Tumours, which records cancer diagnosed since 1962 in residents of Great Britain under 15 years old (Kroll et al, 2011a). Cases were grouped according to the deprivation category of the 1991 census ward containing the postcode of the address at diagnosis (1=affluent, … 5=deprived; Office for National Statistics, 2010), using national child-population-weighted quintiles of the 1991 Carstairs deprivation index (Carstairs and Morris, 1989). Cases were classified by immunophenotype, and the T-precursor and unknown subgroups were combined for analysis.
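Purely as an illustrative sketch (not the authors' code), the ward-level quintile assignment described above might be implemented along the following lines, assuming a table of 1991 census wards carrying their Carstairs scores and child populations; the DataFrame layout and all column names are hypothetical.

```python
# Illustrative sketch of national child-population-weighted quintiles of the
# 1991 Carstairs index; column names and data layout are assumptions.
import numpy as np
import pandas as pd

def carstairs_quintiles(wards: pd.DataFrame) -> pd.Series:
    """Return a deprivation quintile (1 = affluent ... 5 = deprived) per ward.

    `wards` is assumed to be indexed by ward identifier, with columns
    'carstairs' (1991 Carstairs score) and 'child_population'.
    """
    ordered = wards.sort_values("carstairs")  # least to most deprived
    # Cumulative share of the national child population, ward by ward.
    cum_share = ordered["child_population"].cumsum() / ordered["child_population"].sum()
    quintile = np.minimum(np.ceil(cum_share * 5), 5).astype(int)
    return quintile.reindex(wards.index)

# Each case then takes the quintile of the ward containing its postcode, e.g.:
# cases["deprivation_quintile"] = cases["ward"].map(carstairs_quintiles(wards))
```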

Using the pre-treatment blood count, each patient was classified as having been at risk, or not at risk, of pallor, of bleeding, and of sepsis when diagnosed with leukaemia. Following conventional clinical criteria, risk of bleeding and risk of sepsis were defined, respectively, as platelets <20 × 10⁹ l⁻¹ (Gaydos et al, 1962) and neutrophils <0.5 × 10⁹ l⁻¹ (Bodey et al, 1966); risk of pallor (very severe anaemia) was defined as haemoglobin <5 g dl⁻¹. Cases at risk of non-diagnosis were defined as those at risk of sepsis without pallor or bleeding (i.e., the combination of neutrophils <0.5 × 10⁹ l⁻¹, platelets ≥20 × 10⁹ l⁻¹, and haemoglobin ≥5 g dl⁻¹).
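A minimal sketch of these classification rules (not taken from the trial data systems), with the thresholds above; the function and variable names are assumptions for illustration:

```python
# Classify one pre-treatment blood count into the dichotomous risk variables
# defined in the text; units follow the thresholds above (g/dl and 10^9/l).
def classify_blood_count(haemoglobin_g_dl, platelets_1e9_l, neutrophils_1e9_l=None):
    risk_of_pallor = haemoglobin_g_dl < 5.0    # very severe anaemia
    risk_of_bleeding = platelets_1e9_l < 20.0  # Gaydos et al, 1962
    # Neutrophil counts were recorded only to September 1990, so may be missing.
    risk_of_sepsis = neutrophils_1e9_l is not None and neutrophils_1e9_l < 0.5  # Bodey et al, 1966
    # Risk of non-diagnosis: severe neutropenia without pallor or bleeding.
    risk_of_non_diagnosis = risk_of_sepsis and not risk_of_pallor and not risk_of_bleeding
    return {
        "pallor": risk_of_pallor,
        "bleeding": risk_of_bleeding,
        "sepsis": None if neutrophils_1e9_l is None else risk_of_sepsis,
        "non_diagnosis": None if neutrophils_1e9_l is None else risk_of_non_diagnosis,
    }

# Example: severe neutropenia with near-normal haemoglobin and platelets.
# classify_blood_count(haemoglobin_g_dl=10.2, platelets_1e9_l=150, neutrophils_1e9_l=0.3)
# -> {'pallor': False, 'bleeding': False, 'sepsis': True, 'non_diagnosis': True}
```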

Logistic regression was used to estimate associations between deprivation (in quintile categories) and the odds of having been at risk of pallor, bleeding, sepsis, or non-diagnosis (each treated as a dichotomous variable), in successive univariate analyses. Terms representing effects of individual trial periods, and their interactions with deprivation, were included in preliminary models but dropped because they were not statistically significant. Goodness of fit was assessed by Pearson's χ² test and found acceptable. Models were compared by the likelihood ratio test. All statistical tests were two-sided, at the 5% significance level.
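The models were presumably fitted with standard statistical software; purely as an illustrative sketch (not the authors' code), a univariate trend model of the kind described could be fitted as follows, assuming a pandas DataFrame `cases` with a 0/1 outcome column and a `deprivation_quintile` column scored 1–5.

```python
# Illustrative only: univariate logistic regression of one dichotomous risk
# indicator on deprivation; the assumed columns are 'non_diagnosis' (0/1) and
# 'deprivation_quintile' (1 = affluent ... 5 = deprived).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

def per_quintile_odds_ratio(cases: pd.DataFrame, outcome: str = "non_diagnosis"):
    # Trend model: the quintile is treated as a numerical score, giving an
    # odds ratio per quintile of deprivation and a two-sided trend test.
    fit = smf.logit(f"{outcome} ~ deprivation_quintile", data=cases).fit(disp=False)
    or_per_quintile = np.exp(fit.params["deprivation_quintile"])
    ci_low, ci_high = np.exp(fit.conf_int().loc["deprivation_quintile"])
    return or_per_quintile, (ci_low, ci_high), fit.pvalues["deprivation_quintile"]

# Treating the quintiles as categories instead (odds ratios relative to the
# most affluent fifth) only requires wrapping the term: 'C(deprivation_quintile)'.
```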

Results

After exclusion of 27 registered cases with incomplete address information, 5601 cases were eligible for analysis. These comprised 79% of children aged 1–13 years who were resident in Great Britain and diagnosed with acute lymphoblastic leukaemia during the study period. Pre-treatment haemoglobin and platelet counts were recorded in all four trials, whereas neutrophil counts were recorded in the two earlier trials only. Hence, classification according to risk of non-diagnosis of leukaemia was possible for 2009 cases. The distributions of cases by level of deprivation, age at diagnosis, cell type, and available blood counts were similar in all trials (Table 1).

Table 1 The distribution of children by deprivation quintile, age group, immunophenotype, and pre-treatment blood counts, within each clinical trial period and overall

As predicted, there was a deficit of children at risk of non-diagnosis among cases from more deprived communities (Table 2). The odds ratio for risk of non-diagnosis per quintile of deprivation was 0.90 (95% confidence interval 0.84–0.97; Ptrend=0.004; N=2009); comparing the most deprived fifth of the population with the most affluent fifth, the odds ratio was 0.68 (0.48–0.96). The patterns observed separately for pallor (P=0.045; N=5535), bleeding (P=0.036; N=5541), and sepsis (P=0.083; N=2027) were weaker, but each trend went in the direction expected: odds of pallor and bleeding increased with deprivation, whereas odds of sepsis decreased. Similar trends were evident for B-precursor cases separately (Ptrend=0.002; N=1728) but not for other/unspecified cases (Ptrend=0.894; N=281). Restricting the analyses for pallor and bleeding to the earlier trials reduced the statistical significance but did not change the directions of the associations (not shown).
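As a rough consistency check on these two estimates (an illustrative calculation, not part of the reported analysis), a per-quintile odds ratio of 0.90 compounded across the four quintile steps separating the most affluent from the most deprived fifth gives 0.90⁴ ≈ 0.66, in keeping with the directly estimated odds ratio of 0.68.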

Table 2 Odds ratio (95% confidence interval) for pre-treatment risk of pallor, bleeding, sepsis, and non-diagnosis, among children diagnosed with acute lymphoblastic leukaemia, for each quintile of deprivation relative to the most affluent, by immunophenotype and overall

Discussion

For acute lymphoblastic leukaemia, these findings are consistent with the ‘pre-emptive infection hypothesis’, which proposes that some children with leukaemia die from infection without leukaemia being suspected. Among diagnosed cases from poorer communities, there was an excess of children with blood counts implying obvious clinical signs of leukaemia (pallor and bleeding) and a deficit of children with blood counts implying risk of dying from infection without diagnosis of leukaemia (sepsis without pallor or bleeding). As predicted, similar results were obtained for the B-precursor subgroup alone, but not for the other/unspecified subgroup consisting mainly of T-precursor cases, in which obvious enlargement of the lymph nodes is typically more frequent.

The results must be interpreted with caution. As the study was restricted to children with acute lymphoblastic leukaemia, the findings cannot be generalised to those with myeloid or mature lymphoid leukaemia. The main results are limited to the two clinical trials that recruited patients during 1980–1990, because neutrophil counts were not available for patients enrolled in the later trials; consistent results were, however, obtained for clinical signs in patients from all four trials up to 2002. The blood count data were derived from paper forms that may not always have been filled in correctly, and there may have been inaccuracies in the immunophenotype data, particularly for cases in the earlier two trials; nevertheless, there is no reason to suspect systematic error in the blood counts, and the associations were evident when all cases were included, regardless of immunophenotype. Finally, the diagnostic process includes safeguards: blasts should have been recognised in the peripheral blood, or bone marrow abnormalities detected at autopsy; but, by definition, there would be no record of any cases that were missed.

Under-diagnosis may have contributed to the decreased leukaemia incidence in poorer children that has been reported within many different countries, and internationally. Other explanations might include rate calculation artefacts, registration bias, or a real increase in risk caused by some factor associated with higher socioeconomic status. For the cited comparison within England and Wales, artefact caused by numerator/denominator discrepancy seems unlikely, as rates were calculated from registry data for three separate decades, each centred on a census year, using appropriate census populations and census-specific deprivation indices (Kroll et al, 2011b); and a detailed study of cases diagnosed during 2003–2004 found very little evidence of socioeconomic variation in completeness of registration (Kroll et al, 2011a). The ‘delayed infection’ (Greaves, 2006) and ‘population mixing’ (Kinlen, 2011) hypotheses both propose that exposure to infection triggers leukaemia in susceptible children; in particular, the ‘delayed infection’ hypothesis suggests that protection from infection in infancy is a predisposing factor for common (B-precursor) acute lymphoblastic leukaemia. Either or both of these hypotheses could explain a reduction in childhood leukaemia rates in poorer communities, on the assumption of greater exposure of infants to infection and/or reduced population mobility in poorer communities (Dockerty et al, 2001; Stiller et al, 2008). Conversely, under-diagnosis caused by ‘pre-emptive infection’ in poorer communities might explain some of the existing epidemiological evidence for associations of higher childhood leukaemia risk with delayed infection, population mixing, or other factors linked to higher socioeconomic status. However, no explanation other than ‘pre-emptive infection’ seems likely to account for the specific patterns of blood abnormality observed in this study.

If confirmed with recent data, the findings would be relevant to future clinical practice. The apparent stability of the results over time is consistent with the persistence of the socioeconomic differential in England and Wales up to 1996–2005. However, the data relating to sepsis were available only for a limited time period (1980–1990), and the whole study period was relatively short (1980–2002); the analysis cannot be extended to more recent years, as the relevant national clinical trials have not collected pre-treatment blood counts in detail since 2002. Nevertheless, other approaches are possible. For example, previous studies from the United Kingdom have documented the presentation of childhood cancer in primary care by interviewing parents (Dixon-Woods et al, 2001), or examining the General Practice Research Database (Dommett et al, 2012); either of these sources, or Hospital Episode Statistics, might be used to investigate variation in response to signs of childhood leukaemia.

In conclusion, these results support the suggestion that childhood acute lymphoblastic leukaemia may have been under-diagnosed in poorer communities within Great Britain, and elsewhere. Potential implications for epidemiological studies and clinical practice should be considered.