The significance of fitness tradeoffs and compensatory mutations in the evolution of infectious disease

The last couple of decades have seen medical science infused with evolutionary reasoning. This has affected such disparate fields as vaccine design and medical genomics[1]. In this post, I will focus on the significance of compensatory mutations and fitness tradeoffs to understanding the evolution of infectious disease.

The imperfection of organisms is often cited as evidence for evolution. This imperfection is obvious from our genomes, which are littered with history – a “genetic book of the dead,” as Richard Dawkins put it[2]. However, even taking the historical explanations of those well-known features into account, organisms are still not ‘perfect,’ and they cannot be. This is because species must compromise, so to speak, when it comes to what traits they will excel in. On the fitness landscape, it is impractical for them to “climb every mountain”[3]. As a result, natural selection leads to specialization as tradeoffs are made between traits. These are known as “fitness tradeoffs.”

Generally, over the course of a population’s evolution, selection pressures on some characters are stronger than on others. As a result, mutations may become fixed that increase fitness with respect to a strongly selected character while decreasing it with respect to another, more weakly selected one. The effects of such mutations are described as “pleiotropic,” since they influence multiple characters. These changes will be favored by natural selection because their net fitness effect is positive. At this point, other mutations can occur that return the more weakly selected trait to approximately its prior value, compensating for the partially deleterious effect of the initial mutation. For this reason, such mutations are known as “compensatory mutations”[4]. Compensatory mutations can also be coupled with primary mutations that are fixed by genetic drift despite having a slightly negative net fitness effect[5].
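To make the arithmetic concrete, here is a toy sketch (all fitness values are invented for illustration, not taken from any study): fitness is treated as the product of performance on a strongly selected character and a weakly selected one, so a pleiotropic mutation that trades a small loss on the weak character for a large gain on the strong one is still favored, and a subsequent compensatory mutation recovers most of what was lost.

```python
# Toy illustration (all numbers invented): multiplicative fitness across two
# characters. A pleiotropic mutation that helps the strongly selected character
# but hurts the weakly selected one still has a positive net effect, and a
# compensatory mutation restores most of the fitness lost on the second character.

genotypes = {
    # (fitness on strongly selected character, fitness on weakly selected character)
    "wild type":               (1.00, 1.00),
    "pleiotropic mutation":    (1.30, 0.80),   # big gain on one trait, cost on the other
    "+ compensatory mutation": (1.30, 0.98),   # second mutation recovers most of the cost
}

for name, (w_strong, w_weak) in genotypes.items():
    print(f"{name:>24}: net fitness = {w_strong * w_weak:.2f}")
```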

When it comes to the evolution of infectious disease, compensatory mutations most often come into play in the context of evolved drug resistance. This is because antibiotics (and antivirals), along with the immune system, typically must recognize and bind to a particular configuration of molecules in order to work. Because of this, many mutations that confer ‘resistance,’ such as “escape mutations” in viruses like HIV, do so simply by altering the existing configuration of amino acids so that it can no longer be easily recognized by antiviral drugs or T-cell receptors. In some cases, however, such changes may negatively affect the protein’s structure, or impair the virus’s ability to replicate itself and infect a new host.

Such a case was documented in a 2007 paper by Arne Schneidewind and colleagues[6], who studied a mutation in the gag protein of HIV-1 known as R264K. HLA B*27 alleles, part of the set of immunity-related genes known as the Major Histocompatibility Complex (MHC), have been documented to be extraordinarily effective at controlling HIV infection and preventing it from progressing to full-blown AIDS[7]. This is known as “Long Term Non-Progression,” or LTNP. HLA B*27 is effective because it recognizes a very specific, conserved sequence in gag[7]. This recognition is disrupted by the R264K mutation, allowing the viral load to increase rapidly and the infection to progress to AIDS. What Schneidewind et al. found was that while R264K offered a viral strain reprieve from immune pressure, it came with a significant drawback: the mutation impaired the virus’s capacity to replicate. However, when they studied this in the lab, a compensatory mutation, S173A, occurred (albeit rarely) which restored the viral replication rate to approximately that of viruses lacking the primary mutation. The authors argue that the dramatic negative effect of R264K on replication, combined with the improbability of S173A occurring in conjunction with R264K, goes a long way toward explaining why B*27 is so effective at controlling AIDS progression. However, there are published papers looking at cases where progression does eventually occur; that is how these escape mutations are identified in the first place. I suspect that many of the cases where immune escape is sustained represent instances of the rare coincidence of a pleiotropic mutation like R264K with a compensatory mutation like S173A.
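The improbability argument can be illustrated with some back-of-the-envelope arithmetic. The sketch below is mine, not the authors’: the per-site mutation probability is a rough, order-of-magnitude figure, and the calculation simply treats the escape and compensatory changes as two independent, specific substitutions.

```python
# Back-of-the-envelope sketch (my numbers, not the paper's): treat the escape
# and compensatory changes as two specific single-site substitutions. The chance
# of both arising in the same genome in one replication cycle is the product of
# two small per-site probabilities.

mu = 3e-5  # assumed per-site, per-replication-cycle mutation probability (order of magnitude)

p_escape_alone = mu           # a specific substitution at Gag codon 264
p_escape_plus_comp = mu * mu  # that substitution plus a specific compensatory change

print(f"P(escape mutation alone)          ~ {p_escape_alone:.1e}")
print(f"P(escape + compensation together) ~ {p_escape_plus_comp:.1e}")
print(f"the pair is ~{p_escape_alone / p_escape_plus_comp:,.0f}x rarer per replication cycle")
```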

One team from Emory University published a paper in October 2006 arguing, through an epidemiological model, that the occurrence of compensatory mutations increases the rate and likelihood of the emergence of drug-resistant strains of infectious agents[8]. The authors were inspired by other papers taking a population genetic approach to compensatory mutations in infectious disease evolution, but noticed that there was little literature taking an epidemiological approach to the problem. Their model predicted, unsurprisingly, that allowing for compensatory mutations increased the probability that resistance would emerge, so long as the less fit but resistant strain was able to persist long enough to acquire compensatory mutations, allowing it to out-compete the wild type. The likelihood of the resistant strain persisting alongside the wild type increases with the level of treatment, since treatment imposes a greater fitness penalty on the wild-type strain than on the resistant strain[8].

This could have important implications for our approach to treating infectious diseases of this kind. For example, perhaps a regimen in which the intensity of treatment fluctuates would be more effective in the long term at controlling sustained viral infections such as HIV. A pervasive occurrence in HIV treatment is that drugs nearly wipe out the viral population, but leave behind a population dominated by the previously rare, resistant viruses. Under these conditions, as they repopulate, I imagine the infectious agents would be free to acquire mutations compensating for whatever fitness penalty accompanied the acquisition of resistance. Continued treatment with the same antiviral drug would be unfruitful, and the patient would seem to be worse off than before. However, my reasoning goes, if treatment were dramatically reduced immediately following the first population collapse, this might allow the few remaining non-resistant viruses to out-compete the resistant ones again. Then, once the viral load reached a certain threshold, treatment could be abruptly increased once more. I am not aware of this having been tried, at least not with the same evolutionary considerations in mind. While it is currently just conjecture, this idea illustrates the kinds of medical benefits that could be reaped through a greater understanding of how, when and why compensatory mutations arise and come to be fixed.
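A minimal simulation sketch of this dynamic is below. To be clear, this is not the Emory group’s published model; it is a simple competition model I am using to illustrate the logic, and every parameter value is invented. Three strains share one within-host niche: drug-sensitive (S), resistant (R), and resistant-plus-compensated (C). Treatment heavily penalizes only S, the resistance mutation carries a replication cost, and a rare compensatory mutation converts R into C, which pays no cost.

```python
# Minimal sketch of the idea in the paragraph above, NOT the published Emory
# model: three competing strains (drug-sensitive S, resistant R, and
# resistant-plus-compensated C) share one within-host niche. Treatment hits only
# the sensitive strain, resistance costs replication, and a rare compensatory
# mutation converts R into C, which pays no cost. All parameters are invented.

def simulate(mu, t_treat=50.0, t_off=150.0, dt=0.01):
    S, R, C = 0.699, 0.001, 0.0      # initial strain abundances (arbitrary units)
    K, d = 1.0, 0.3                  # shared carrying capacity and clearance rate
    for i in range(int((t_treat + t_off) / dt)):
        treated = i * dt < t_treat
        crowding = 1.0 - (S + R + C) / K
        r_S = (0.1 if treated else 1.0) * crowding - d   # treatment penalizes only S
        r_R = 0.7 * crowding - d                         # resistance costs replication
        r_C = 1.0 * crowding - d                         # compensation removes the cost
        S += dt * r_S * S
        R += dt * (r_R * R - mu * R)                     # R -> C by compensatory mutation
        C += dt * (r_C * C + mu * R)
    total = S + R + C
    return tuple(round(x / total, 3) for x in (S, R, C))

# With these made-up parameters, blocking compensation lets the sensitive strain
# rebound after treatment ends, while allowing it leaves the compensated
# resistant strain dominant.
print("no compensation   (S, R, C):", simulate(mu=0.0))
print("with compensation (S, R, C):", simulate(mu=1e-4))
```

One caveat of a deterministic sketch like this is that the sensitive strain never truly goes extinct during treatment, which quietly builds in the assumption, discussed below, that a small non-resistant reservoir survives.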

There is a catch, though. Another team from Emory has also studied the evolution of populations of drug-resistant infectious agents in the absence of drugs[9]. Their work sought to explain an observation made in other studies[10,11]: populations fixed for drug resistance often go on to fix compensatory mutations that partially make up for the fitness lost to the resistance mutation, rather than reverting to the higher-fitness ancestral state of susceptibility. The authors attribute this to two factors: mutation rate and population dynamics. They argue that compensatory mutations are simply more likely to occur than reversions, creating a bias toward that route of adaptation. Presumably, this is because there is only one ancestral type to revert to, whereas compensatory mutations at a number of different sites may lead to viable evolutionary trajectories. The authors claim that population bottlenecks, which occur when the infection is transmitted, exaggerate this effect. Even though revertants will spread more rapidly than compensatory mutants once they enter the population, the slow rate at which they arise by mutation makes it likely that, if any are present at all at the time of transmission, they will still be rare and thus unlikely to be included in the small sample that founds the next infection. The conclusion of this study means that, for my idea about fluctuating treatment levels to work, it is essential that some small fraction of the non-resistant viruses survive the onslaught of antiviral drugs. This is because even if treatment is immediately reduced or halted altogether, the resistant strain will still likely evolve to compensate for, rather than abandon, its drug resistance.
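The mutational-target and bottleneck arguments are easy to see in a toy stochastic simulation. The sketch below is my own illustration, not the model from the paper: a revertant is assumed to be fully fit but reachable by only one specific mutation, compensation is less fit but reachable through many sites (so its aggregate mutation rate is higher), and every transmission passes only a handful of founders to the next host. All numbers are invented.

```python
# Toy Wright-Fisher sketch (not the published model; all parameters invented):
# reversion has a single mutational target while compensation has many, and
# transmission bottlenecks make rare revertants likely to be lost even though
# they are fitter once present.

import random

def simulate(seed, n_hosts=10, pop_size=2000, gens_per_host=30, bottleneck=5):
    random.seed(seed)
    mu_site = 1e-4
    mu_revert = mu_site        # exactly one mutation restores the ancestral state
    mu_comp = 20 * mu_site     # assume ~20 different sites can compensate
    fitness = {"resistant": 0.80, "compensated": 0.95, "revertant": 1.00}
    pop = ["resistant"] * pop_size
    for _ in range(n_hosts):
        for _ in range(gens_per_host):
            # mutation: resistant genomes occasionally revert or gain compensation
            for i, g in enumerate(pop):
                if g == "resistant":
                    r = random.random()
                    if r < mu_revert:
                        pop[i] = "revertant"
                    elif r < mu_revert + mu_comp:
                        pop[i] = "compensated"
            # selection + reproduction: Wright-Fisher sampling weighted by fitness
            pop = random.choices(pop, weights=[fitness[g] for g in pop], k=pop_size)
        # transmission bottleneck: only a few virions found the next infection
        pop = random.choices(random.sample(pop, bottleneck), k=pop_size)
    return max(set(pop), key=pop.count)   # most common genotype at the end

winners = [simulate(seed) for seed in range(10)]
print({g: winners.count(g) for g in set(winners)})
```

With these made-up numbers the compensated genotype usually ends up most common, despite the revertant being fitter, because revertants rarely arise and are easily lost at transmission, which is the qualitative pattern the authors describe.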

Support for this can be found in a 2007 paper by Hayley Crawford et al., which characterizes a mutation in gag that led to immune escape in HLA B*5703+ patients[a]. HLA B*5703 is another allele, like B*27, associated with long-term non-progression of HIV-1 infection. It recognizes a specific sequence of amino acids in gag (162-172): KAFSPEVIPMF. The researchers found that mutations in this sequence – particularly at the second and fourth positions – led to immune escape, but at a cost to viral replication rate. Phylogenetic analysis showed that the escape mutation eventually reverted in most patients who were not carriers of B*5703, but that this reversion was significantly slowed or prevented by the occurrence of a compensatory mutation, S165N. These results are consistent with the second study out of Emory, suggesting that compensation is a more common route than reversion for viruses carrying costly escape or resistance mutations.

I have already illustrated the concept of a fitness tradeoff in this post with the example of the R264K mutation’s positive effect on the ability of HIV-1 to avoid immune detection, but negative effect on its replication rate[6]. Such a case, where a fitness tradeoff is determined by a single gene, is described as “antagonistic pleiotropy,” a term coined by the late evolutionary biologist George C. Williams as part of his explanation for the evolution of aging[12]. One of the clearest examples I know of involving direct, real-time study of natural selection optimizing a fitness tradeoff comes from John Endler’s work on guppies[13,14]. In his study system, guppies with spots that allowed them to blend into a fine-grained or rocky environment were less susceptible to predation. However, this also meant they were less visible to females. Guppies with spot patterns that made them stand out from their environment had greater raw reproductive success when predation was not an issue. Thus, natural selection and sexual selection act antagonistically, and the guppy population reflected a balance between being able to readily attract mates and being able to survive to mate another day. As one might expect, when Endler introduced predators, guppies possessing initially rare alleles for camouflaged spot patterns increased in frequency at a rate proportional to the level of predation. He also found that when the predator was removed, guppies with conspicuous spots regained their dominance in the same population via sexual selection.
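As a toy illustration of how such a tradeoff plays out at the level of allele frequencies, here is a simple one-locus selection sketch (my own, with invented fitness values, not Endler’s data): a male’s fitness is the product of his survival and his mating success, camouflage buys survival at the cost of attractiveness, and the strength of predation decides which way the balance tips and how fast.

```python
# Toy one-locus haploid selection sketch (all fitness values invented):
# fitness = survival x mating success. Camouflaged males always survive well but
# attract fewer mates; conspicuous males attract more mates but are eaten in
# proportion to predation pressure.

def camouflage_freq_after(predation, generations=20, p=0.05):
    w_camo = 1.00 * 0.80                 # high survival, lower mating success
    w_consp = (1.00 - predation) * 1.00  # mating advantage, predation-dependent survival
    for _ in range(generations):
        mean_w = p * w_camo + (1 - p) * w_consp
        p = p * w_camo / mean_w          # standard one-locus selection recursion
    return p

for predation in (0.0, 0.1, 0.3, 0.5):
    freq = camouflage_freq_after(predation)
    print(f"predation {predation:.1f}: camouflage frequency after 20 generations = {freq:.2f}")
```

With these numbers, camouflage spreads only once predation is heavy enough to outweigh the mating cost, and it spreads faster the heavier the predation; set predation to zero and the conspicuous pattern takes back over, echoing Endler’s observations.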

Another organism – one with greater medical relevance – which has to resolve a fitness tradeoff between survival and reproduction is the human malaria parasite, Plasmodium falciparum. These parasites reproduce asexually (clonally) within the bloodstream. However, to spread to a new host and ensure long-term propagation, a specialized stage known as a gametocyte must be produced. Gametocytes represent the sexual stage, and they are what is transmitted via mosquito vectors to a new host. Each organism has finite resources for its own development, so a fundamental tradeoff exists in how those resources are partitioned between these two stages of its life history. A relatively recent paper by Sarah Reece et al. investigated the conditions that prompt the parasites to invest more or less in producing gametocytes for between-host transmission versus in within-host survival[15]. They referred to previous studies showing that the parasites increase gametocyte production in response to ‘stresses’ such as anti-malarial drugs or changes in the host’s physiology, such as anemia or red blood cell (RBC) age. George C. Williams, introduced above, termed this kind of strategy a “terminal investment”: the parasites can be personified as cutting their losses and fleeing to a new host rather than digging in their heels to attempt survival in their current one. However, these researchers’ results indicated that drug-sensitive colonies of P. falciparum actually resolve this fitness tradeoff in favor of survival for future reproduction within the same host. The authors discuss several possible explanations for why their results ran contrary to previous studies. One is that they treated with lower doses of anti-malarial drugs than other studies did; to justify that condition, they explain that in areas where malaria is endemic, the parasites face similarly low doses due to treatment strategies. They also note that some other studies used Plasmodia obtained from rodents, and that the different developmental schedules of mice and men could affect which pattern of life-history investment is most effective. Which set of researchers is correct is not important within the scope of this post. What is important is that all of these researchers are explicitly employing Darwinian logic to anticipate the behavior of parasites[15].
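To make the shape of the tradeoff concrete, here is a toy allocation sketch (entirely my own construction, with invented functional forms and numbers, not anything from Reece et al.): a lineage splits its effort between asexual replication, which keeps the within-host infection going, and gametocytes, which are the only transmissible stage; cumulative transmission is approximated as gametocyte investment multiplied by how long the infection lasts, and the infection is cut short by drug-imposed mortality. Under these particular assumptions the optimum shifts toward gametocytes as mortality rises, i.e. the classic terminal-investment prediction; Reece et al.’s empirical results pointed the other way, which is exactly why the fine details of such models, and of the experiments, matter.

```python
# Toy allocation sketch (invented functional forms and numbers, not from Reece
# et al.): a parasite lineage divides effort between asexual replication, which
# prolongs the within-host infection, and gametocytes, the transmissible stage.

def transmission_payoff(gametocyte_fraction, drug_mortality):
    # cumulative transmission ~ gametocyte output per unit time * infection duration;
    # duration has a fixed baseline plus a term that grows with asexual investment
    # and shrinks with drug-imposed mortality (this functional form is an assumption)
    asexual_fraction = 1.0 - gametocyte_fraction
    duration = 1.0 + asexual_fraction / (0.1 + drug_mortality)
    return gametocyte_fraction * duration

for drug_mortality in (0.0, 0.2, 0.8):
    payoff, best_g = max(
        (transmission_payoff(g / 100, drug_mortality), g / 100) for g in range(0, 101)
    )
    print(f"drug mortality {drug_mortality:.1f}: optimal gametocyte investment {best_g:.2f}")
```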

References
