Human corneal graft outcomes

Corneal graft failure is common, particularly in patients who receive grafts for conditions other than keratoconus or stromal dystrophies. Survival of penetrating corneal grafts is no better than 73% at 5 years, falling to 62% at 10 years.1 This is lower than the survival of vascularised solid organ grafts: renal transplant survival, for example, exceeds 90% at 5 years.2 Not only is renal graft survival higher than corneal graft survival, it continues to improve; there is no evidence of a comparable improvement for corneas. Despite long experience (the first penetrating corneal graft was performed in 1906, whereas renal transplantation was not established until the 1950s and 1960s), corneal transplantation has failed to match outcomes in other areas of clinical transplantation.

Developments that have improved the outcomes of renal transplantation (better systemic immunosuppression, improved tissue matching, and the use of living related donors) have not been applicable to corneal transplantation. Alternative approaches are required that take into account the unique features of the corneal allograft response and the clinical context in which corneal transplantation is performed. The principal constraint of that clinical context is that patients with corneal disease are rarely suitable candidates for systemic immunosuppression: visual loss and blindness are profoundly disabling, but they do not justify the potentially life-threatening complications that accompany prolonged suppression of the immune system.

Regional immunosuppression for corneal transplantation

An unusual aspect of corneal allograft rejection is the accessibility of the cornea, which raises the possibility of regional rather than systemic immunosuppression. Although the allograft response is well conserved throughout evolution, differences occur from species to species and from one tissue or organ to another. Such differences in allograft rejection need to be taken into account when using animal models and when designing new interventions. Both the clinical context of corneal transplantation and the mechanisms of corneal allograft rejection point to the need to develop regional immunosuppression for clinical practice.

Erosion of corneal privilege by inflammation

Not all corneal grafts have the same tendency to reject. Grafts placed into recipient corneas with undisturbed histology, as in keratoconus, have a low propensity for rejection, whereas grafts into inflamed recipient corneas reject commonly and rapidly. Even a history of keratitis years before transplantation increases the risk of rejection. Immunological privilege is relative.

A number of changes in the inflamed cornea contribute to the erosion of privilege. Normal human corneas carry few bone marrow-derived cells capable of processing and presenting antigens and initiating the immune response.3, 4 With inflammation, such cells are matured and recruited into the cornea through the limbal (peripheral corneal) circulation, and even when the inflammation has resolved, they persist for months or years.5 The greater the number of bone marrow-derived cells in the host cornea at the time of surgery, the higher the rate of rejection. A cornea that has been inflamed is never the same again with regard to immune privilege.6

Inflammation also upregulates expression of transplantation antigens, making it easier for the host immune system to recognise the foreign graft.7 Other mechanisms contribute to the erosion of privilege that complicates corneal inflammation. Chronic inflammation induces the growth of blood vessels and lymphatics in the normally avascular cornea,8 connecting the allograft to the host immune system. Inflammation also causes vessels to leak, facilitating the ingress of cells and proteins into the cornea. In an inflamed cornea, macrophages produce factors such as VEGF-C, which in turn drives the growth of lymphatics. Pro-inflammatory cytokines gain access to the cornea and anterior chamber and encourage rejection, whereas the aqueous normally contains immunosuppressive proteins (eg, TGF-β) that contribute to privilege. Thus, inflammation can erode privilege by a number of mechanisms.9

Cellular and molecular events in corneal allograft rejection

The three essential requirements for the initiation of the allograft response are non-self transplantation antigens, antigen-presenting cells, and host immunocytes. All are present in rejecting corneal allografts. In the indirect pathway of presentation, which is the most important in corneal allograft rejection, host antigen-presenting cells process alloantigen and present it to host immunocytes. Antigen processing is likely to occur in the cornea.10 Where antigen presentation occurs is debated; almost certainly it can occur at a number of sites, including the cornea, the ocular environs, and the draining lymph nodes.11, 12, 13

Antigen trafficking is of considerable interest because the exact location of antigen presentation is a crucial consideration for the development of new regimens of immunosuppression. In laboratory experiments, antigens delivered to the cornea disseminate widely: they can be recovered from the conjunctiva and draining lymph nodes, and from as far away as the spleen and mesenteric lymph nodes. They travel in soluble form or are carried within antigen-presenting cells.14, 15

Antigen presentation is a critical point in the allograft response because it is the first step that is allospecific. Alloantigen is presented in association with MHC molecules to the T-cell receptor (TCR) (Figure 1). Many other interactions between the antigen-presenting cell and the T cell, including coreceptor and costimulatory interactions, can enhance or impede the effect of presentation. Although the cell surface molecules involved in TCR triggering are now largely defined, it remains unclear how the interaction of the TCR with its cognate peptide/MHC produces the appropriate T-cell response. A number of models of TCR triggering exist, and the real situation is probably a combination of these. Regardless of the mechanism of receptor triggering, the final outcome of T cell–APC interactions is likely to be influenced by (1) the temporal expression and density of costimulatory molecules, (2) the phosphorylation state of immunoreceptor intracellular tails, (3) the spatial organisation of phosphatases (eg, CD45) and kinases (eg, Lck), and (4) the cytokine milieu at the time of presentation.16

Figure 1

Schematic representation of the interaction between a CD4 T-cell and an antigen-presenting cell (APC). Alloantigen presented in the groove of host MHC Class II interacts with the T-cell receptor (TCR) and associated CD3 chains. Costimulatory molecules containing positive (CD28) or negative (CTLA-4, PD-1) signalling tyrosine motifs interact with their ligands on APC (CD80, CD86, PD-L1, PD-L2). Src family kinases including Lck, either associated with the coreceptor CD4 or free in the membrane, are important in phosphorylating tyrosine residues in immunoreceptor intracellular tails to facilitate the recruitment of intracellular messengers and signal transduction.

Complex as the process is, the interaction between antigen-presenting cells and host immunocytes offers the prospect of allospecific therapeutic intervention. This has not yet been achieved in clinical transplantation. Antibodies to key elements of antigen presentation, for example anti-CD4 and anti-CD3 antibodies and the CTLA4-Ig fusion protein, have been used clinically with some success,17, 18, 19 but because they are administered systemically these agents suppress immune responses in a non-specific way and cause systemic immunosuppression. This approach has not been widely employed in corneal transplantation because systemic immunosuppression cannot usually be justified in this clinical setting. Furthermore, the antibodies used are too large to cross into the cornea and therefore cannot be delivered topically.

Alternative therapeutic options

Because the therapeutic developments that have improved outcomes in essential organ transplantation are not applicable to corneal transplantation, new ways of abrogating the allograft response in the cornea are required. One approach is to interfere with antigen presentation using antibody fragments: antibodies directed at crucial elements of antigen presentation that are small enough to enter the eye when delivered topically. Whole antibodies, at approximately 150 kDa, are too large to cross the corneal epithelium and enter the stroma and anterior chamber. However, not all of the antibody structure is necessary for a given therapeutic function; for some responses, only the antigen-binding site is required for blockade. Using conventional molecular biology techniques, antibody fragments can be engineered that are little more than an antigen-binding site, with molecular weights of 10–25 kDa.

This approach was tested in a rat model of corneal transplantation. A mouse anti-rat CD4 single-chain variable fragment (scFv) was engineered and investigated as a potential therapy for the prevention of corneal allograft rejection. When the antibody fragment was administered systemically, by intraperitoneal injection or by adenoviral expression, a modest prolongation of corneal graft survival was observed. When the fragment was delivered topically, no prolongation of graft survival was observed, even though the fragment had been shown to enter the cornea readily.20 This suggests that the topically administered fragment was not reaching the critical location at which antigen presentation occurs. If antigen presentation occurs remote from the cornea, beyond the reach of locally applied immunosuppression, it will be necessary to seek therapeutic targets more proximal in the afferent limb of the allograft response. Antigen processing, rather than antigen presentation, may need to be the target. Processing is likely to occur in the cornea: both the antigen and the antigen-presenting cells are present there, and the tendency of a cornea to reject is related to the number of antigen-presenting cells in the host cornea at the time of surgery.

A second approach that we and others have employed is to modulate immunological privilege in the cornea by local over-production of anti-inflammatory cytokines. The lymphokine milieu is known to influence the establishment of immune responses and antigen processing, and it can be altered with a gene therapy approach. Increasing the secretion of anti-inflammatory lymphokines such as interleukin 10 (IL10) might be expected to protect against allograft rejection. Similarly, it might be possible to block the pro-inflammatory action of interleukin 12 (IL12) by over-producing the p40 subunit of IL12, which binds the IL12 receptor but does not trigger immunological activity.21 Indeed, an adenoviral gene therapy approach has been shown to increase graft survival in animal experiments in both cases. Although these results are encouraging, safer and more efficient gene therapy vectors still need to be developed and tested.

Conclusions

The lack of progress in improving the outcome of corneal transplantation, particularly in high-risk cases in which immunological privilege has been eroded, needs to be overcome. Achieving this demands the development of therapies that are acceptable in the clinical context of corneal transplantation. Fortunately, the cornea is readily accessible: therapeutic agents can be delivered locally by topical medication, and the tissue is particularly suitable for gene therapy approaches. Little progress is likely to be made in improving the outcome of corneal transplantation in high-risk patients until effective regional immunosuppression can be achieved.3