The benefit of HLA matching in renal transplantation has been acknowledged for many years.1,2 In the UK and elsewhere, kidneys are allocated using matching algorithms based on our understanding of the impact of HLA class I and class II matching on the recognition and rejection of renal allografts. With corneal transplantation, there is no such consensus. Not only are the separate influences of class I and class II matching not clearly defined,3 but the need for matching itself has been repeatedly questioned. There are perhaps two main reasons for this: first, the contradictory results from clinical studies on matching;4,5,6,7 and, second, the perception of immunological privilege in the anterior chamber. It is clear that this privilege is only relative, since the single main cause of graft failure in the first year is irreversible, cell-mediated rejection.4,8 Indeed, all grafts, even those considered to be at the least risk of rejection, receive immunosuppressive therapy. Studies have helped to define the risk factors for rejection, which include vascularization, active inflammation, and previous graft failure; however, therapeutic immunosuppression is still the most common approach to managing this risk. Notably, the failure of the Collaborative Corneal Transplantation Studies (CCTS)5 in the USA to show a beneficial effect of HLA matching was perhaps in part due to the use of what some would consider an overly aggressive immunosuppressive regimen. It should be borne in mind, however, that HLA matching in renal transplantation imparts a benefit that is additional to the considerable impact of systemic immunosuppression. It is clear that, hitherto, the benefits of HLA matching for corneal transplants have been rather less well defined than for renal transplants. When the uncertainties of the impact of HLA matching are added to the undoubted logistical difficulties and the potential delays in patients receiving matched grafts, it is not difficult to understand the reluctance to apply matching more extensively.

Even if the effects of matching corneal grafts are more subtle, there are still good reasons for pursuing efforts to reduce the risk of rejection through a greater understanding of corneal transplant immunology. These include the deleterious side effects, such as glaucoma and cataract, associated with local steroid therapy, and those associated with systemic immunosuppression. Moreover, each rejection episode, even if successfully reversed, results in a substantial loss of endothelial cells,9 hastening the relentless march towards the point where too few endothelial cells remain to maintain corneal transparency, and thus shortening the life of the graft.10

There is still, therefore, a need to pursue clinical and laboratory studies into corneal graft rejection. Many clinical studies in the past have been poorly designed and analysed using inappropriate statistical methods. There have been notable exceptions, but even these have suffered from other difficulties, such as the errors inherent in serological typing, especially for HLA class II antigens. At least this particular problem should no longer muddy the waters now that accurate DNA-based typing is routinely available. The ongoing Corneal Transplant Follow-up Study in the UK, which aims to increase our understanding of the role of class II matching against a background of class I matching, uses PCR–SSP (sequence-specific primer) typing exclusively for both donors and recipients.

In this issue of Eye, Reinhard and colleagues present data that would appear to go even further than the general view (at least among those inclined towards matching) that tissue matching would be of most benefit to patients in the accepted ‘high-risk’ categories, who are most likely to reject their grafts. Reinhard et al describe a single-centre study of matching in ‘low-risk’ grafts (ie, keratoconus, Fuchs' endothelial dystrophy, nonherpetic scars, and bullous keratopathy). They report that the incidence of rejection was lower in grafts with no more than two HLA mismatches overall. Not only was the matching in their study based on broad rather than split antigens, but no distinction was made between class I and class II matching. Their analysis therefore takes no account of the different biological functions of class I and class II antigens, nor of the finer specificities represented by the split antigens routinely used for renal matching. Nonetheless, this study does underline the point that the issue of HLA matching in corneal transplantation is still not fully resolved, and that matching may yet turn out to be an important strategy not just for high-risk grafts but also for low-risk grafts, where protecting and conserving the endothelium is essential to maximizing the lifetime of the graft.10
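To make concrete what an overall mismatch count at the broad-antigen level involves, the short sketch below tallies donor antigens not carried by the recipient after collapsing split antigens to their broad parents. It is purely illustrative: the antigen names, the small split-to-broad table, and the counting convention are assumptions made for the example, not details taken from Reinhard et al's typing protocol.

```python
from collections import Counter

# Hypothetical split-to-broad antigen table (these are real HLA splits,
# but only a tiny illustrative subset): A23/A24 are splits of A9,
# B51/B52 of B5, and DR15/DR16 of DR2.
BROAD_EQUIVALENT = {
    "A23": "A9", "A24": "A9",
    "B51": "B5", "B52": "B5",
    "DR15": "DR2", "DR16": "DR2",
}

def to_broad(antigen: str) -> str:
    """Collapse a split antigen to its broad equivalent, if it has one."""
    return BROAD_EQUIVALENT.get(antigen, antigen)

def mismatch_count(donor: list[str], recipient: list[str]) -> int:
    """Count donor broad antigens not carried by the recipient.

    Counted per copy: a donor homozygous for an antigen the recipient
    lacks contributes two mismatches.
    """
    donor_broad = Counter(to_broad(a) for a in donor)
    recipient_broad = Counter(to_broad(a) for a in recipient)
    # Counter subtraction keeps only positive counts, i.e. the surplus
    # donor antigens with no counterpart in the recipient.
    return sum((donor_broad - recipient_broad).values())

if __name__ == "__main__":
    donor = ["A1", "A24", "B8", "B51", "DR4", "DR7"]
    recipient = ["A1", "A23", "B8", "B44", "DR4", "DR2"]
    n = mismatch_count(donor, recipient)
    # A24 and A23 both collapse to broad A9, so only B5 and DR7 mismatch:
    # n == 2, within a 'no more than two mismatches' criterion.
    print(f"{n} broad-antigen mismatches")
```

The example captures the distinction at issue: a donor A24 paired with a recipient A23 scores as a match at the broad level (both are splits of A9), whereas split-level matching of the kind used for kidneys would count it as a mismatch.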

If matching could indeed be based simply on the strategy proposed by Reinhard et al, then the widespread application of tissue matching for corneal transplantation might well become feasible. On the other hand, a greater understanding of the way HLA responses are modulated in the anterior chamber may lead to more effective immunological strategies for preventing allograft recognition and rejection.11,12