More than 2 million new HIV-1 infections are reported each year. HIV-1 is characterized by an inherently high probability that residues will change from one generation to the next, and some of these mutations can be selected for under pressure from CTLs1, drugs2, antibodies3 or vaccines4. Better characterization of these evasion mechanisms, which enable the virus to sidestep anti-HIV-1 immunity, is necessary to develop pertinent vaccines that could ultimately end the epidemic. In a study in Nature Medicine, Carlson and colleagues5 analyzed large HIV-1 data sets using a statistical framework that encapsulates the impact of CTL escape mutations on markers of disease progression.

Two main hypotheses have dominated our understanding of the impact of CTL escape mutations within and between hosts. One view is that CTL escape mutations incur a cost to viral replicative capacity that is substantial enough to facilitate the maintenance of viremic control by the host1,6, which suggests that pressuring the virus into these adapted, less-fit forms could be a fruitful vaccine-development strategy. Alternatively, considering that CTL escape mutations drive HIV-1 evolution during the course of infection, and that these mutations are transmitted with each new infection7, the hypothesis is that the anti-HIV-1 CTL immune response is dampened as CTL escape mutations accumulate in the population, and that higher viral loads will therefore result. Indeed, CTL escape mutations are spreading globally, and most of them do not revert to the unmutated state upon transmission8,9.

The work of Carlson and colleagues5 lends credence to the idea that there is an evolutionary process of HIV-1 adaptation to the host's human leukocyte antigen class I (HLA) alleles that is accompanied by a deterioration of the CTL response against HIV-1 (Fig. 1). The researchers developed a probabilistic model that provides an adaptation score for a given HIV-1 sequence and HLA allelic combination. Adaptation scores reflect changes in a given HIV-1 sequence in response to the HLA type of the person who contracts the infection, and these scores can also quantify adaptation to the previous or next host. This, in turn, defines a framework in which to ask questions that are crucial to our understanding of HIV-1 escape from T cell immunity within and between infected hosts.
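To make the notion of an adaptation score concrete, the toy sketch below computes a per-site log-odds score for a viral sequence given one HLA allele. This is a deliberate simplification, not the authors' actual model (which is considerably more elaborate); the site names and probabilities are hypothetical placeholders.

```python
import math

# Hypothetical per-site parameters for one HLA allele:
# (P(escaped residue | host carries the allele),
#  P(escaped residue | host lacks the allele)).
# Site names and values are illustrative, not from the study.
SITE_PARAMS = {
    "gag_242": (0.70, 0.05),
    "gag_163": (0.40, 0.10),
    "nef_92":  (0.55, 0.20),
}

def adaptation_score(escaped_sites):
    """Sum of per-site log-odds contributions.

    `escaped_sites` maps site name -> True if the escape residue is present.
    A positive total suggests the sequence looks adapted to the allele;
    a negative total suggests it looks unadapted.
    """
    score = 0.0
    for site, (p_hla, p_no_hla) in SITE_PARAMS.items():
        if escaped_sites.get(site, False):
            # Escape residue observed: how much more likely under this allele?
            score += math.log(p_hla / p_no_hla)
        else:
            # Wild-type residue observed: contributes evidence against adaptation.
            score += math.log((1 - p_hla) / (1 - p_no_hla))
    return score

adapted = adaptation_score({"gag_242": True, "gag_163": True, "nef_92": True})
unadapted = adaptation_score({})
```

Under this simplified scheme, a sequence carrying all escape residues scores positive and a fully wild-type sequence scores negative; comparing a recipient's score against the transmitter's HLA profile versus the recipient's own is the kind of question the authors' framework makes tractable.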

Figure 1: HLA adaptation of HIV-1 leads to a deterioration of the CTL response against HIV-1.

Carlson and colleagues5 show that there is an increasing proportion of HLA-adapted HIV-1 sequences over time among individuals infected with the virus, as shown in pink on the phylogenetic tree. There are also sequences that remain invisible to the immune response (blue). At the center of the trees are the common, most invariant HIV-1 segments that would be included in a conserved-elements vaccine design (filled circle). The authors also show that recognition by CD8+ T cells (gray) of HLA-adapted peptides presented by infected cells (pink) is hampered relative to the recognition of unadapted peptides presented by infected cells (black). This leads to a loss of immune control and higher viral loads over time. Blue cells represent cells presenting peptides that are not recognized by T cell immunity.

For their analysis, Carlson and colleagues pooled data from multiple cohorts from Southern Africa and North America, including almost 5,000 individuals at different stages of infection with HIV-1 subtype C or B. For a given individual, the authors showed that the HIV-1 sequences reflected adaptation to the host's HLA type, and the extent of adaptation increased over the first two years of infection. Importantly, HIV-1 sequences from individuals with low viral loads (<50 copies/ml) showed significantly less adaptation than those from individuals with high viral loads (P < 0.0001). Furthermore, the reduction in viremia—previously shown to be associated with specific HLA alleles (such as HLA-B57 or HLA-B27)—was abrogated for sequences with more adaptation footprints corresponding to these alleles.

Given the impact of an individual's HLA type on the contracted virus and on his or her viral load, it is key to assess how the adaptation of HIV-1 to its previous host affects newly infected individuals. By analyzing a cohort of 129 Zambian transmission pairs with a subtype C infection, the team found that, when the transmitter and the recipient shared HLA alleles, recipients who were infected with a virus bearing hallmarks of adaptation had faster rates of CD4+ T cell decline and higher viral loads than individuals infected with less adapted viruses. Notably, the heritability of viral loads was more pronounced among pairs of individuals who shared HLA alleles. Furthermore, for three cities in Southern Africa, variation in viral loads among the cities was correlated with the level of HIV-1 adaptation to local HLA profiles.

If HLA-adapted HIV-1 sequences are associated with markers of worse disease progression than HLA-nonadapted sequences, it implies that CTL responses are less effective against adapted epitopes than against nonadapted epitopes. Consistent with this, CTL responses were less likely to be elicited against adapted epitopes than against nonadapted epitopes (on the basis of follow-up data from 11 individuals infected with subtype B). This was corroborated by a trend toward lower-magnitude CTL responses against adapted epitopes among vaccine recipients in the Step vaccine efficacy trial; most Step vaccine recipients mounted CTL responses after vaccination, although the vaccine failed to reduce viral loads in participants who contracted an infection4.

Although some questions remain—for example, this study had no data from the envelope protein, which is a major target of host immunity—the authors' conclusion that CTL escape mutations reduce the effectiveness of the CTL response has implications for vaccine design. Because of the extreme diversity of HIV-1, one can posit that a vaccine consisting of a centralized antigen (e.g., a consensus sequence) would provide better coverage than any single extant strain. To maximize the coverage of CTL variation, a vaccine should seek to include the most common HIV-1 variants in antigens of practical length, such as in mosaic approaches10. However, an alternative strategy is to avoid HIV-1 diversity altogether, and to focus instead on conserved elements of the HIV-1 genome. The findings of Carlson et al.5 suggest that strategies that focus on conserved elements11,12 will be the most suitable way forward (Fig. 1). Vaccines based on conserved elements seek to raise immune responses against invariant segments of the HIV-1 genome and to avoid variable epitopes that can act as immune decoys11. In the future, one can also envisage priming with vaccines that are based on HIV-1 conserved elements, followed by boosting with immunogens adjusted to specific epidemics. Although the findings of Carlson and colleagues vindicate conserved-elements vaccines, it is evident that the success of any scheme can be determined only through a vaccine efficacy trial.

The vaccine-design quandary of finding a balance between the need to increase the breadth of responses and the risk of eliciting responses against adapted peptides also serves to illustrate that there are 'holes' in the anti-HIV immune response (Fig. 1). These immune holes portend that a gradually deteriorating immune response against HIV would lead to higher mean viral loads as the epidemic progresses. The findings by Carlson et al.5 suggest that vaccine-induced elimination of circulating HIV-1 strains would mirror these holes in the anti-HIV-1 immune response, and could therefore lead to a decrease in vaccine efficacy over time. Although it is premature to worry about the consequences of an HIV-1 vaccine that is potent enough to further disrupt anti-HIV-1 immunity, the team's findings offer an argument for the continued monitoring of HIV-1 evolution dynamics.