
On balance

One of the big challenges in drug discovery is that many factors must come together in the same compound for it to be a successful, safe and efficacious drug. It is well understood that the appropriate physical and chemical properties must be considered, such as ADME properties, metabolic stability and an absence of toxicity – at least at therapeutic concentrations. The difficulty is that these requirements compete with one another, and so they very rarely fall into place by chance.

Increasing the potency, for example, may require increasing the lipophilicity, which unfortunately has knock-on effects on the ADME properties and toxicity. As a result, not only are predictive modelling techniques being used very early in the drug discovery process, but a wide range of experimental end-points are now being measured, such as early in-vitro toxicity. This poses a new challenge: data overload. Given the amount of data and the range of properties that need to be balanced, the question becomes how to use that information to quickly target the high-quality chemistry that will deliver a successful drug. How do we optimise all of these different parameters and factors simultaneously? A number of approaches have been applied to answer these questions.

Many approaches, collectively described as multi-parameter optimisation (MPO), are not unique to drug discovery. This provides the opportunity to use techniques that have been tried and tested in other disciplines – aerospace engineering, for example. Within drug discovery, the application of rules of thumb such as Lipinski’s rule of five means that chemists are working within an established space and therefore not introducing significant additional risk. The problem is that these criteria are often treated as filters: compounds whose properties fall outside the established thresholds are simply discarded. The danger of drawing such a hard distinction between a compound with a logP of 5.01 and one with a logP of 4.99, for example, is exacerbated by the significant uncertainty in the data being used. Applying 10 filters, each with an accuracy of 90 per cent, the probability that a genuinely good compound survives them all is only about 35 per cent. The opportunity cost of throwing good molecules away due to predictive or experimental error is therefore enormous.
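As a rough illustration of this compounding effect, the short Python sketch below shows how quickly the chance of a good compound surviving a chain of hard filters falls away. Only the 90 per cent per-filter accuracy comes from the figure quoted above; the rest is a hypothetical example, not data from any real project.

```python
# Compounding effect of hard filters on a genuinely good compound.
# The 90 per cent per-filter accuracy is the figure quoted in the text;
# the number of filters is a hypothetical example.

n_filters = 10
accuracy = 0.90  # probability that each filter correctly passes a good compound

p_survives_all = accuracy ** n_filters
print(f"Chance a good compound survives all {n_filters} filters: {p_survives_all:.0%}")
# ~35%, i.e. roughly two in three good compounds are wrongly discarded
```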

These are simple approaches, however, and far more sophisticated methods have emerged, such as desirability functions, which map each property of interest onto a degree of desirability. Those values are then combined into an overall score that weighs all of the factors appropriately to suggest, on balance, how well a compound fits the objective. This approach offers a lot of flexibility, as the criteria can be defined as gentle trends rather than hard cut-offs. The uncertainty in the underlying data then needs to be considered explicitly, so that confidence limits can be placed around the decisions about which compounds to take forward.
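To make the idea concrete, the sketch below shows one way a desirability-based score could be assembled: each property is mapped through a smooth desirability curve and the results are combined as a weighted geometric mean, so a single very poor property drags the whole score down. The properties, curve shapes, thresholds and weights are illustrative assumptions only, not the scoring scheme of any particular tool.

```python
import math

def desirability_low_is_better(value, ideal, tolerance):
    """Smooth sigmoid: close to 1 well below `ideal`, falling towards 0 above it."""
    return 1.0 / (1.0 + math.exp((value - ideal) / tolerance))

def desirability_high_is_better(value, ideal, tolerance):
    return 1.0 - desirability_low_is_better(value, ideal, tolerance)

def score(compound):
    """Combine per-property desirabilities into one overall score
    (weighted geometric mean, so one very poor property pulls the score down)."""
    d = [
        desirability_high_is_better(compound["pIC50"], ideal=7.0, tolerance=0.5),
        desirability_low_is_better(compound["logP"], ideal=4.0, tolerance=0.5),
        desirability_low_is_better(compound["hERG_risk"], ideal=0.3, tolerance=0.1),
    ]
    weights = [2.0, 1.0, 1.0]  # illustrative: potency weighted more heavily
    total = sum(weights)
    return math.prod(di ** (wi / total) for di, wi in zip(d, weights))

compound = {"pIC50": 7.4, "logP": 4.2, "hERG_risk": 0.25}
print(f"Overall desirability: {score(compound):.2f}")
```

Note that under such a scheme a compound with a logP of 4.99 and one with 5.01 receive almost identical scores, rather than falling on opposite sides of a hard cut-off.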

There are times, of course, when what constitutes an appropriate balance is unknown. A commonly used technique in this case is Pareto optimisation. Using this approach, the optimality of an outcome – a compound in this case – is defined not in terms of one specific profile, but in terms of the range of possible optimal outcomes that represent different balances of the properties. By sampling that spectrum of options, chemists can explore what the best balance should be.
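A small sketch of this idea, again in Python with hypothetical compounds and just two objectives (maximise potency, minimise a predicted toxicity risk), is shown below: the Pareto front is simply the set of compounds not dominated by any other, and each member represents a different trade-off between the objectives.

```python
# Pareto optimisation over two objectives: maximise potency (pIC50) and
# minimise a predicted toxicity risk. Compound names and values are
# hypothetical examples.

compounds = {
    "A": {"pIC50": 7.8, "tox_risk": 0.6},
    "B": {"pIC50": 7.2, "tox_risk": 0.2},
    "C": {"pIC50": 6.9, "tox_risk": 0.3},  # dominated by B (worse on both)
    "D": {"pIC50": 8.1, "tox_risk": 0.7},
}

def dominates(x, y):
    """x dominates y if it is at least as good on both objectives
    and strictly better on at least one."""
    at_least_as_good = x["pIC50"] >= y["pIC50"] and x["tox_risk"] <= y["tox_risk"]
    strictly_better = x["pIC50"] > y["pIC50"] or x["tox_risk"] < y["tox_risk"]
    return at_least_as_good and strictly_better

pareto_front = [
    name for name, props in compounds.items()
    if not any(dominates(other, props)
               for other in compounds.values() if other is not props)
]
print("Non-dominated compounds:", pareto_front)  # ['A', 'B', 'D']
```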

All of these approaches are now being applied in drug discovery and are making a significant difference to the success of projects. The cost of taking an initial idea through to a drug on the market is enormous, and most of that cost is due to the fact that the vast majority of compounds that are synthesised and tested will never lead to a successful drug – typically, only one in 12 compounds entering development makes it to the market. The primary issue used to be a lack of appropriate ADME and pharmacokinetic properties, but that attrition seems to have been reduced by filtering poor compounds out earlier in the process. The issue now is toxicity – the specific problem has shifted while the overall success rate remains much the same. Being able to view all of these factors simultaneously, and very early on, means that chemists can target a chemistry or set of compounds with the potential to bring all of those factors together.

There are numerous examples of this. One project we worked on was a retrospective analysis of a drug discovery project that had been running for more than five years in a large pharmaceutical company. More than 3,000 compounds had been synthesised and tested, and we were given the resulting data as a blind test. What was really interesting was that in the first half of the project the company had focused a lot of effort on one particular area of chemistry but had not found the balance of properties needed to achieve in-vivo efficacy. Examining these compounds with probabilistic scoring, we saw that this area of chemistry was very high risk and that there was an alternative area more likely to yield potent compounds.

The company had eventually come to the same conclusion, but we were able to show that, by using predictive techniques to survey that entire area of chemistry, choosing a subset of compounds on which to gather experimental data, and then focusing narrowly on the ones worth progressing to in-vivo efficacy studies, the same area of chemistry could have been reached with only 10 per cent of the effort.

Chemists tend to focus on potent areas of chemistry and only then go back to determine what other properties should be present and whether any problems need to be fixed. The issue is that this ties them into very specific areas of chemistry, making it difficult to deal with any problems encountered without long iterations. It is important to consider, as early as possible, which areas of chemistry are most likely to yield potent compounds, alongside all of the other factors of concern. When dealing with large numbers of compounds it is still too expensive to measure all of these properties, which is where predictive techniques come in. While no approach is perfect – these techniques do carry uncertainties and statistical error – having the balance of probabilities on where the most promising chemistry for a successful candidate lies is invaluable.



www.optibrium.com
