What do you think have been the most important effects of the rule of five?

One effect is that it raised awareness of the importance of physicochemical properties in the development of drugs. It was developed in an era when people were focusing just on the potency of compounds and ignoring physicochemical properties such as cell permeability and solubility, to the detriment of the drug discovery process in general. It was also an era in which things were thrown over the fence from discovery to development: people in discovery would nominate a highly potent compound with terrible solubility, and the people in development would have to deal with it.

I think it has also had an impact on the quality of the screening libraries that are commercially available; most chemical vendors have applied structural filters such as the rule of five, which has got rid of some of the worst compounds in these libraries and improved the overall profile of what can be screened.
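
As a concrete illustration of what such a filter does, here is a minimal sketch of a rule-of-five check; it assumes the open-source RDKit toolkit and the standard published thresholds, and any real vendor filter will differ in detail.

```python
# Minimal sketch of a rule-of-five filter using the open-source RDKit toolkit.
# Thresholds are the standard published ones; vendor pipelines differ in detail.
from rdkit import Chem
from rdkit.Chem import Descriptors, Lipinski

def passes_rule_of_five(smiles: str) -> bool:
    """Return True if a molecule has at most one rule-of-five violation."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return False  # unparseable structure
    violations = sum([
        Descriptors.MolWt(mol) > 500,      # molecular weight over 500 Da
        Descriptors.MolLogP(mol) > 5,      # calculated logP over 5
        Lipinski.NumHDonors(mol) > 5,      # more than 5 hydrogen-bond donors
        Lipinski.NumHAcceptors(mol) > 10,  # more than 10 hydrogen-bond acceptors
    ])
    # The rule flags compounds with two or more violations; some
    # implementations are stricter and require none.
    return violations <= 1

print(passes_rule_of_five("CC(=O)Oc1ccccc1C(=O)O"))  # aspirin: True
```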

Do you think there have been any problems in the way that the rule of five has been used?

It has been criticized in various places, but I think most of the criticism arises from misuse of the rule of five, or of structural and property filters in general. For example, if you apply hard cut-offs on physicochemical properties, such as those in the rule of five, to a compound library, it doesn't take many cut-offs before you have little left to actually screen, even with a 90% pass rate for each individual cut-off.
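
To see how quickly this attrition compounds, here is a small numerical sketch; the 90% pass rate per cut-off is the figure from the example above, while the library size and filter counts are hypothetical, and it assumes the cut-offs act independently, which real, correlated properties do not.

```python
# Hypothetical illustration of how hard cut-offs compound multiplicatively.
# Assumes each filter independently passes 90% of the library (the figure
# from the example above); real pass rates are correlated and vary by filter.
initial_library = 1_000_000  # hypothetical library size
pass_rate = 0.90             # per-filter pass rate

for n_filters in (1, 4, 8, 12, 16):
    fraction_left = pass_rate ** n_filters
    print(f"{n_filters:2d} cut-offs -> {initial_library * fraction_left:9,.0f} "
          f"compounds ({fraction_left:.1%} of the library)")

# Sixteen 90% cut-offs leave only ~18.5% of the library to screen.
```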

It's important to emphasize that the rule of five was not intended to be a metric to distinguish drugs from non-drugs; rather, the aim was to help improve the probability of success. I thought the way that Pfizer used it in the time after its publication was very sensible. We had criteria for validating a clinical candidate, and the rule of five was one of them. It changed the view of compounds that broke the rules from 'innocent until proven guilty' to 'guilty until proven innocent': you could still nominate such a compound but you needed solid experimental evidence to show why it wasn't going to cause a problem. So, it prevented potentially problematic compounds from leaking through into development without being challenged. There was also a cultural aspect: when people proposed a clinical candidate, everyone knew what its 'rule of five' profile was, and it seems that project teams were embarrassed to propose compounds that were too problematic.

One thing I find interesting now is the cultural differences across organizations, as studies looking at chemical patents indicate that even for the same target, compounds from different organizations have different physicochemical profiles. Overall, though, I think people are now in broad agreement that property filters and structural filters have utility. Where you get into disagreements is around issues such as what tolerance to allow for functionalities that can potentially bind covalently to proteins, and reasonable people can come to different conclusions on that.

What are your views on approaches to identify drug candidates for traditionally intractable targets such as protein–protein interactions, which could lie outside the 'rule of five' space?

I've often been mistakenly characterized as a computational chemist because of the rule of five, but I think the solution to this problem is experimental. In medicinal chemistry, to make progress you need structure–activity relationships (SARs), and you can't have these if the activity is zero. Let's say that you are working on compounds that are really not 'rule of five'-compliant and you want them to be orally bioavailable; you're never going to make progress if every compound has zero bioavailability. The trick is to be able, for example with the drug metabolism department, to reliably pick up compounds with low oral bioavailability and to tell the difference between, say, 5% and 10% bioavailability (a factor of two) with accuracy and speed (for example, getting your data within a week). Then it's still difficult, but you are in a realm where you can make progress. Many people don't look at it this way; they see it as some kind of computational problem. This is actually an area where the medicinal chemistry community can make a big impact: for somebody in a drug metabolism department to be able to do this is going to take time and resources, and it's not going to happen unless they have support from the chemists.

With regard to the types of compounds, I think we're going to see growth in those with sizes between typical small-molecule drugs and proteins, such as stapled peptides and macrocycles, which can have quite decent bioavailability; macrocycles have been under-represented in screening libraries because they've been difficult to make, but this has changed recently. Natural products are the other obvious source, with a few caveats. They are attractive because they were developed through evolution to do something, but the downside is that they tend to have evolved as part of 'warfare' between organisms, going after the key nodes in signalling pathways that are easiest to perturb and most likely to cause major effects, which can be a problem if you're looking for something other than anti-infective or cytotoxic activity.

What are your views on the challenges for the recently launched initiative on drug repurposing by the US National Center for Advancing Translational Science (NCATS)?

Obviously, I am an advocate of efforts to repurpose drugs through my activities with Melior and elsewhere. However, with the NCATS initiative (see Nature Rev. Drug Discov. 11, 505–506; 2012) at the moment, there are 58 abandoned drugs on the list for project proposals but their structures haven't been publicly disclosed, and even with experts searching various databases we can't be sure of the chemical structures of about 20 of them. This frustrates people who are trying to take chemoinformatic approaches to drug repurposing, because without the structure you can't mine the literature to look at what similar compounds do.
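
As an illustration of the kind of chemoinformatic mining that an undisclosed structure blocks, here is a minimal similarity-search sketch; it assumes the open-source RDKit toolkit, and the two molecules (aspirin and salicylic acid) are arbitrary stand-ins rather than any of the NCATS compounds.

```python
# Minimal sketch of structure-based similarity mining with RDKit.
# Without a disclosed structure, none of this is possible for a compound.
from rdkit import Chem, DataStructs
from rdkit.Chem import AllChem

# Hypothetical query and reference structures; in practice the query
# would be the repurposing candidate whose structure is undisclosed.
query = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin
known = Chem.MolFromSmiles("Oc1ccccc1C(=O)O")        # salicylic acid

# Morgan (ECFP4-like) fingerprints, a standard similarity descriptor.
fp_query = AllChem.GetMorganFingerprintAsBitVect(query, 2, nBits=2048)
fp_known = AllChem.GetMorganFingerprintAsBitVect(known, 2, nBits=2048)

# Tanimoto similarity: 1.0 means identical fingerprints; higher values
# flag related chemistry whose literature may be worth mining.
print(DataStructs.TanimotoSimilarity(fp_query, fp_known))
```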

Another issue is that the information available is structured in a way that presupposes that the best strategy is the one of the past 20 years: identify a target and/or mechanism and then screen for it. So, at present, it doesn't accommodate phenotypic screening. One of the deficiencies of approaches driven by a single molecular target is that they tend to be highly focused on a single therapeutic area, and there are many indications and many drugs that do not fit well with this model. Evaluating such compounds using phenotypic screening in a range of disease models across different therapeutic areas could be the most effective approach to repurposing them.

What do you think are the merits of phenotypic screening compared with target-based lead discovery?

The situation with high clinical attrition is forcing people to step back and ask how they are doing drug discovery: have target-oriented strategies had too much emphasis, and have we been neglecting other approaches that do work? The major merit of phenotypic screening is the tremendous opportunity to find novelty in targets and mechanisms when you use mechanistically unbiased screening approaches. The main downside is that when you find a hit, you still want to know what the mechanism is, which is often not trivial to determine. From a drug discovery perspective, it's not a regulatory requirement, but you do need to be able to say something about safety, and knowing the mechanism and developing a rationale for safety based on it helps to back up the standard experimental toxicological data. The lower throughput and higher cost of the assays compared with some conventional screens have also been cited as challenges in the past.

Now, though, one area in which advances have really made phenotypic screening more of an option is cell-based screening in a high-throughput format, particularly imaging-based screening, with a wealth of reagents to label different components of the cell and get multiple read-outs from a single assay. It's still expensive and problematic in high-density screening formats (1,536-well plates), but as it starts to tie in more with the emergence of disease models based on induced pluripotent stem cells, this is going to be a really exciting area.