
Pharma’s forward motion

Patent cliffs, consolidation, the end of the blockbuster era, dwindling R&D budgets and personalised medicine. All are phrases associated with any discussion of the pharma industry. But if there’s one word that underpins all of pharma’s activities and decision making, and that drives pharma’s direction moving forwards, it is ‘risk’, suggests Adel Laoui, managing director of life sciences at Accelrys.

‘It’s all about making sure every piece of relevant data that you have generated, or have access to from third-party or public sources, is usable and relevant. Historically there has been a great emphasis on making those risk-based decisions at the early discovery stage, where compounds have already been generated and it’s time to select those to move forward. Pharma is now increasingly tapping into the huge volumes of data emerging from external research, including patient studies, genomics (particularly genome sequencing), and omics in general (at the level of DNA, RNA, proteins and endogenous small molecules) to really understand the biological pathways that cause and impact on disease, in a patient-relevant setting, not just in test tubes.’

Commodity to cutting-edge

Know as much as you can about a disease before you start to look at small or biological molecules that might impact on that disease, Laoui stresses. Much of what pharma ‘does’ in-house, in terms of drug discovery and development, particularly when it comes to early-discovery processes such as high-throughput screening (HTS), is largely commoditised, he believes. ‘From an informatics perspective, the automation and data management/analytics capabilities are all in place to make this a push-button exercise. Pharma is using cutting-edge modelling and simulation systems, combined with business intelligence tools, to validate known biological pathways and targets and make downstream decisions on promising compounds identified through discovery workflows such as HTS.’

The industry needs to implement more predictive informatics capabilities that can marry the wealth of knowledge emerging from patient and genomic studies with in-house intellectual property on disease targets, compound families or biological molecules. ‘It needs to be a seamless process. Not so much looking at and evaluating the research data and then moving on to look at your compound data, but using that pre-discovery information as part of your discovery workflow, to improve your predictive power.’

Understand your disease

The approach works: understanding the genomic drivers of many cancers has enabled the development of targeted therapies, biomarkers and companion diagnostics that today translate to patients receiving the optimum drug, or combination of drugs, for their tumour type, Laoui points out. Understand your disease at its most basic level, and the prospect of personalised medicine for additional disease types can become a reality. ‘It’s actually a very tall order, because most diseases are underpinned by multiple genes and biological pathways. For many complicated diseases the goal of personalised medicine has still to be realised, at least in part because we don’t yet have the informatics capabilities that can really unpick the multifactorial basis of these diseases to identify the critical drivers, and their targets.’

This is where modelling and simulation tools will be essential, he says. ‘Pathway analytics, sequencing analytics – and, for some diseases, digital tissue visualisation – will be key. FDA already recognises that digital biomarkers, rather than molecular ones, can serve as an endpoint for clinical studies, and this is just another tier of data that can be added into the collection of information from disparate sources that all needs to be integrated, mined and turned into decision-relevant data to help pharma prioritise its development pipelines, and minimise risk of failure.’   

Cheek by jowl with doctors and patients

The drive towards personalised medicine has inevitably created new requirements for data management, analysis and transfer. Pharmaceutical companies delivering companion diagnostics, and the clinical diagnostics laboratories that run the tests, are directly servicing doctors and their patients, says Trish Meek, director of product strategy at Thermo Fisher Scientific.

‘Pharma is taking the biomarkers that provided evidence of drug safety and efficacy through the clinical development and approval process, and leveraging them as clinical diagnostic tools for diagnosis, patient stratification and therapeutic monitoring. That brings up a whole different challenge in terms of data and information flow, because the pharma company now needs an interface with the doctors who are administering the drug, and the clinical diagnostic laboratories that will be carrying out the tests.’

Many of Thermo Fisher’s molecular diagnostic customers perform omics testing for cancer therapy. ‘The typical scenario is one in which a doctor will send a patient sample off to the diagnostic lab, submit the test request through a web portal (or even a paper-based process), and then use the results from that test, which are sent back through the portal, to aid in therapeutic decision-making,’ Meek explains. ‘The LIMS does not create the final diagnostic answer, but it manages the laboratory process and the information exchange. Comprehensive information transfer, data security and a user-friendly interface are vital to ensure that the physician can easily request a test and submit all relevant information, so that the lab can perform the analysis and return the results to the physician as quickly as possible.’
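By way of illustration, the minimal Python sketch below models the request lifecycle Meek describes: submission through a portal, analysis in the lab, and results returned to the physician. All class, field and status names are hypothetical, invented for this example; they do not reflect Thermo Fisher’s software or any real clinical LIMS schema.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class RequestStatus(Enum):
    SUBMITTED = auto()    # physician files the request via the portal
    IN_ANALYSIS = auto()  # sample received and routed into the lab workflow
    REPORTED = auto()     # results posted back through the portal

@dataclass
class TestRequest:
    request_id: str
    patient_ref: str   # de-identified patient reference, never raw identifiers
    assay: str         # e.g. a tumour genotyping panel
    status: RequestStatus = RequestStatus.SUBMITTED
    result: dict = field(default_factory=dict)

    def start_analysis(self) -> None:
        assert self.status is RequestStatus.SUBMITTED
        self.status = RequestStatus.IN_ANALYSIS

    def report(self, result: dict) -> None:
        # The system manages the process and the exchange; clinical
        # interpretation of the result stays with the physician.
        assert self.status is RequestStatus.IN_ANALYSIS
        self.result = result
        self.status = RequestStatus.REPORTED
```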

Patient-centric

Clinical laboratories therefore require a clinical laboratory information management system (LIMS) that can integrate with their own laboratory information systems (LIS), manage samples and automated instrumentation, oversee and direct the laboratory workflow, offer a two-way interface with the physician’s office and manage the billing process.

‘The move towards personalised medicine also means that patients will be more involved in the decisions that are made about their treatment,’ Meek claims. ‘Pharma has become much more patient-centric, and focused on the needs of those patients. Many of my informatics-related discussions with pharma clients are centred on the patient, and how to deliver quality products quickly and effectively to those patients. Developing informatics systems that can seamlessly manage information flow between the drug companies, diagnostics laboratories, physicians and patients will help expedite treatment and treatment decision-making, and facilitate patient education by providing better information about their disease and how it uniquely interacts with their physiology, so the patient can be more involved in decisions about their therapy.’

Back to the future?

With personalised medicine on its horizon, the industry is in parallel embracing a high-throughput version of compound combination screening, a drug discovery concept first described in the late 1920s, states Dr Oliver Leven, who heads professional services for Genedata’s Screener business unit. And if the industry’s uptake of the Compound Synergy edition of the Genedata Screener platform is anything to go by, this high-throughput version is gaining ground rapidly, he suggests. The concept of compound combination screening addresses the well-recognised premise that a drug compound won’t necessarily exhibit the same biological effects when applied to cells and living systems as it exhibits in an enzymatic test. While a compound may be successfully designed to interact with and either block or activate a biological target, in living cells it frequently doesn’t show this effect. This may be due to alternative biochemical pathways that are capable of bypassing the target.

The goal of compound combination screening is to look, systematically, for compound combinations that exhibit the desired therapeutic effect in vivo. The experimental process involves screening every possible combination of a set of compounds. For each combination of, say, two compounds, 100 wells are measured, corresponding to a 10 × 10 matrix of dosages for the two compounds. Often, such a test scheme is then applied across different cell lines (e.g. 10 cell lines, which results in 1,000 measurements for a single pair). The number of measurements therefore grows combinatorially with the size of the compound set, and can quickly reach millions of data points per project.
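The arithmetic is easy to make concrete. The short Python sketch below (the function name and defaults are this article’s illustration, not Genedata’s software) counts the wells needed for a pairwise combination screen under the dosing scheme just described.

```python
from math import comb

def combination_screen_size(n_compounds: int,
                            doses_per_compound: int = 10,
                            cell_lines: int = 1) -> int:
    """Wells needed to screen every unordered pair of compounds,
    with a full dose matrix per pair, in each cell line."""
    pairs = comb(n_compounds, 2)              # every two-compound combination
    wells_per_pair = doses_per_compound ** 2  # e.g. 10 x 10 = 100 wells
    return pairs * wells_per_pair * cell_lines

# The scheme described above: one pair, 10 doses each, 10 cell lines
print(combination_screen_size(2, 10, 10))    # 1000 measurements
# A modest 200-compound set shows how quickly the numbers explode
print(combination_screen_size(200, 10, 10))  # 19,900,000 measurements
```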

Analysis issues

And herein lies the data-analysis problem, Leven points out. Laboratory informatics and compound management systems applied to traditional ‘one well, one compound’ screening workflows weren’t immediately capable of handling the selection, dispensing, screening, and data management/analysis of two different compounds in a single well in a test plate. 

‘It’s an issue that has now been addressed at the compound management level by at least one of the major vendors of compound storage and management software, and Genedata has been working with clients to provide a solution at the informatics level by introducing new capabilities within our Genedata Screener platform.’ The latest developments in this area were announced at the end of 2013 when Genedata reported on the successful completion of its collaboration with AstraZeneca, which focused on the development of a Genedata Screener software extension that can automate and standardise the analysis of combination compound screening workflows.
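The underlying data-model change is simple to picture: a well record must carry a list of compound–dose pairs rather than a single compound. A minimal sketch (hypothetical field names, Python 3.10+; not the schema of Genedata or any other named vendor):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class CompoundDose:
    compound_id: str
    concentration_um: float  # concentration in micromolar

@dataclass
class Well:
    plate_id: str
    position: str                        # e.g. "B07"
    contents: tuple[CompoundDose, ...]   # one entry per compound in the well
    readout: float | None = None

# Traditional 'one well, one compound' record:
single = Well("P-0001", "A01", (CompoundDose("CPD-123", 1.0),))

# Combination screening: the same record now carries two compounds
combo = Well("P-0001", "B07", (CompoundDose("CPD-123", 1.0),
                               CompoundDose("CPD-456", 0.3)))
```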

‘Since then, we’ve been inundated with interest from pharma companies looking for the informatics capabilities that will allow them to unlock this area of drug discovery’, Leven says.

‘And as with any analytical workflow in the pharma industry, the ultimate aim is to be able to increase throughput, which for combination screening requires informatics systems that can manage and analyse a squared helping of data per experiment. Coming from the issue of the number of wells screened, it’s not so much a case of HTS goes combination, as combination goes HTS.’

Hang on to your data

High-throughput experimentation in any field of drug discovery or development inevitably results in the generation of massive amounts of data.

A decade or so ago, pharmaceutical companies would erase data from discovery-stage compounds that didn’t have any immediate developmental future, but over recent years there has been a move towards keeping all the data, explains Terry Smallmon, director of life science sales at LabVantage Solutions.

‘Throwing 90 per cent of your data away doesn’t make sense in an era when new discoveries about disease pathways and biological processes are leading to new approaches to therapy,’ he suggests.

‘A compound that doesn’t exhibit the desired activity against one target may impact another target and, as biological discoveries emerge, display potential utility in a completely different therapeutic field. That therapeutic field may not be within the IP holder’s direct area of expertise, but it may open up potential avenues for out-licensing deals or co-development partnerships. Business intelligence and data visualisation tools are providing new insights and often valuable information from mined data, thus justifying the need for pharmaceutical companies to hang on to every piece of information they generate.’

Standardisation bugbear

Pharmaceutical companies haven’t traditionally been keen to share their proprietary data, but the industry’s increasing willingness to engage in risk/profit-sharing partnerships, coupled with the need to outsource portions of its discovery, preclinical and clinical workflows to the contract research sector, has led to an increased requirement for the transfer of information between organisations. This data-sharing requirement has been partially responsible for a call to standardise data from disparate sources. ‘Lack of data standardisation is still an enormous concern throughout the industry,’ Smallmon stresses. ‘Pharma itself has been pressing for data standardisation for years, through initiatives such as the FDA Data Standards Council (DSC) and organisations like the Allotrope Foundation. Vendors of informatics platforms, such as laboratory information management systems (LIMS) and compound/storage management software, have developed solutions that can cope with and convert many different data formats, but ultimately it would be much simpler if there were a common concept of what data should look like, so that third-party tools could much more easily query that data openly, across different data sources and datasets.’

Smallmon suggests that ongoing standardisation issues are at least in part due to instrumentation manufacturers, who generally provide their hardware with dedicated data models that aren’t compatible with other systems. ‘Things are starting to change, but the situation is far from perfect, and in the meantime, LIMS and other software vendors like LabVantage are having to ensure that systems can sit alongside and integrate with the dedicated instrumentation software.’
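In practice, coping with incompatible instrument exports usually means mapping each vendor’s columns onto one internal schema. A minimal Python sketch (the vendor names, column headings and schema are invented for illustration; no real instrument format is implied):

```python
import csv
from dataclasses import dataclass

@dataclass
class Measurement:
    sample_id: str
    analyte: str
    value: float
    unit: str

# Hypothetical per-vendor column mappings; real instruments each ship
# their own export schema, which is exactly the standardisation problem.
VENDOR_SCHEMAS = {
    "vendor_a": {"sample_id": "SampleID", "analyte": "Compound",
                 "value": "Result", "unit": "Units"},
    "vendor_b": {"sample_id": "sample", "analyte": "target",
                 "value": "measured_value", "unit": "uom"},
}

def normalise(path: str, vendor: str) -> list[Measurement]:
    """Read one vendor's CSV export into the common Measurement record."""
    cols = VENDOR_SCHEMAS[vendor]
    with open(path, newline="") as fh:
        return [Measurement(sample_id=row[cols["sample_id"]],
                            analyte=row[cols["analyte"]],
                            value=float(row[cols["value"]]),
                            unit=row[cols["unit"]])
                for row in csv.DictReader(fh)]
```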

Facilitating collaborative R&D

Personalised medicine, biologics, and the move towards externalised services and collaborative R&D models are the trends highlighted by Mariana Vaschetto, VP for marketing at Dotmatics. ‘In the pharma research arena, there is a shift towards personalised medicine and biologics, and another clear and emerging emphasis is collaborative research. Externalising services is just one aspect of this collaborative approach, but we are also now seeing increasing numbers of collaborations that span multiple groups, where every stakeholder contributes original ideas and results. This concept of collaboration is ideally suited to making optimum use of the vast amount of historical data in pharma’s archives.’

‘Informatics for R&D needs to adapt to these changes,’ Vaschetto continues. ‘The Dotmatics platform, for example, has been designed with this R&D setup in mind, as a true collaborative platform that enables research teams around the globe to collaborate in real time, not only sharing data but discussing ideas and testing hypotheses.’

The key requirement for any informatics system, independent of the specific vendor, has to be the ability to enable a collaborative research environment in a way that maximises each stakeholder’s contribution towards the common end. ‘From the technical point of view it is important that the informatics environment can be customised to each company’s workflow and deployed either in the cloud or in house. Because the members of a project may work with multiple technologies, it is important that access to the collaborative environment can be done from a variety of devices, including traditional workstations and mobile devices.’

Consolidation

Consolidation has played a major role in shaping the pharma industry, and it brings particular informatics challenges, comments Nick Townsend, director of life sciences at LabWare. ‘Merger and acquisition activity continues to create challenges for pharmaceutical clients who have embarked on global laboratory informatics projects. We have often seen how ongoing projects involving centralised, global laboratory information management systems (LIMS) and electronic laboratory notebooks (ELNs) have to be reshaped to accommodate newly acquired sites and new businesses. The project teams have to manage harmonisation of business processes as the new entities are brought on board, and then face the challenge of moving lab informatics systems and data over to the unified, global system.’

This typically involves designing processes for managing the transfer of static data, such as test methods and product specifications, and reworking coding/naming conventions. It might also involve extending functionality to ensure that it’s business as usual for the acquired sites, Townsend notes.  ‘It’s a complex, resource-intensive task that has to be very carefully controlled and documented in a pharmaceutical environment.’
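A tiny Python sketch of the static-data step Townsend mentions: migrating an acquired site’s test-method codes onto a harmonised naming convention. All codes here are invented, and a real migration would of course be reviewed, validated and documented under change control.

```python
# Hypothetical mapping from an acquired site's method codes to the
# global LIMS naming convention, applied during static-data migration.
CODE_MAP = {
    "ACQ-HPLC-001": "GLB-CHROM-HPLC-ASSAY",
    "ACQ-KF-003": "GLB-TITR-KF-WATER",
}

def migrate_method_code(old_code: str) -> str:
    try:
        return CODE_MAP[old_code]
    except KeyError:
        # In a regulated environment an unmapped code must be flagged
        # and resolved, never silently passed through.
        raise ValueError(f"No harmonised code for {old_code!r}; "
                         "add a reviewed mapping before migration")

migrated = [migrate_method_code(c) for c in ("ACQ-HPLC-001", "ACQ-KF-003")]
print(migrated)
```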

Merging informatics infrastructures

The biggest challenges are often associated with merging infrastructures. ‘Even in an ideal situation when both companies have the same enterprise resource planning (ERP), LIMS, chromatography data system (CDS), QA and ELN, it’s still tricky since these systems are tightly integrated. Our experience is that LIMS often forms a bridge during the transition and is often the first common, harmonised system which is then integrated with the two ERPs, CDSs, and multiple ELN systems, for example.’

Breadth of functionality is important here, Townsend maintains. ‘A LIMS with wide-ranging functionality can be more easily used by a newly merged organisation than one with more limited features. Many of the mergers and acquisitions we have been involved with take advantage of this capability of LIMS, and we have invested in creating software tools, techniques and services to help our customers work through these complex tasks and manage data migration.’

Increasing productivity

Pharma’s drive to develop personalised medicine, minimise risk and work more collaboratively is taking place in parallel with the ongoing need to increase productivity and reduce costs at the R&D level, comments Andrew Lemon, CEO at The Edge Software Consultancy. There’s much discussion about high-level strategies, but pharma still needs to consider implementing the informatics capabilities that will allow it to manage its basic R&D processes to save time and money, and reduce errors.

‘Pharma is constantly pushing to achieve more, and faster, but at lower cost. Companies need to optimise the use of their resources, from their laboratory personnel to their instrumentation and samples, and ensure that the project management teams and hands-on scientists can communicate effectively to drive a project forwards, cost-effectively and with maximum return. Take this concept down to the level of an individual workflow in, for example, the ADME, DMPK and toxicology space, and you need to be able to streamline and manage demand, particularly in the area of global assay requesting and assay planning, to ensure that communication, tracking and reporting don’t create bottlenecks. Time is money, and there’s little point having your scientists waiting at the bench because there isn’t an efficient, automated process for requesting those assays. Conversely, you have to make sure you have the personnel, laboratory space, and instrumentation capacity to meet demand.

‘Project tracking and optimisation, and assay planning, are all areas in which The Edge is actively working with its clients, to provide the tools that will allow pharma to minimise time, money and resource wastage.’
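A toy Python sketch of the demand-management idea Lemon describes: accept assay requests only while the week’s capacity holds, and defer the rest so that scientists are neither idle nor overbooked. The class names, capacity model and assay label are all hypothetical; this is not The Edge’s software.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class AssayRequest:
    request_id: str
    assay: str        # e.g. a DMPK assay such as metabolic stability
    n_samples: int

class AssayQueue:
    """Toy demand planner: book requests against a weekly sample capacity
    standing in for instrument and personnel availability."""

    def __init__(self, weekly_capacity: int):
        self.capacity = weekly_capacity
        self.scheduled: deque[AssayRequest] = deque()

    def submit(self, req: AssayRequest) -> bool:
        booked = sum(r.n_samples for r in self.scheduled)
        if booked + req.n_samples > self.capacity:
            return False  # defer to the next planning cycle
        self.scheduled.append(req)
        return True

queue = AssayQueue(weekly_capacity=960)  # e.g. ten 96-well plates per week
queue.submit(AssayRequest("REQ-001", "metabolic stability", 384))
```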

Tactical tools

Lemon concurs with Adel Laoui’s sentiment that discovery functions such as HTS and early development processes have become routine and are ideally suited to factory processes driven by automation and robotics. Move downstream to preclinical in vivo work and clinical studies, however, and there is still considerable room to improve efficiency.  Working with humans and animals can’t be automated.

‘Break down the 12-year drug development timeline into stages, and it’s evident that there are many opportunities to shorten the in vivo and clinical phases of development, not in terms of reducing the length of required studies, but with respect to improving efficiency through improved communication, and expedited planning, execution and reporting.

‘We believe that tactical tools are important in this area, to maximise the efficiency of both in-house workflows, and externalised processes that are carried out by contract research, academic or co-development partners. It’s a case of using smart software that can operate a networked R&D model, virtualised from the inside. Implement such tools, and shaving a not insignificant amount of time and cost off your development process is achievable.’


