Greg Blackman on the versatility of LIMS, ELNs and other data management solutions in the pharmaceutical industry
In 1928, while studying influenza at St Mary’s Hospital Medical School in London, Alexander Fleming noticed a ring of inhibited bacterial growth surrounding a mould growing on a Staphylococcus plate culture. He attributed the inhibition to an antibacterial substance produced by the mould, which he identified as Penicillium; the active substance he named penicillin. It wasn’t until 1939, however, that Howard Florey and Ernst Chain developed penicillin for use as a drug, for which all three won the Nobel Prize in Physiology or Medicine in 1945.
Penicillin is now a widely used antibiotic and subject to the various regulations governing the manufacture of pharmaceuticals, put in place to ensure what we’re prescribed for our health doesn’t end up causing us more harm than good. The heparin scare in February of this year, in which contaminated stocks of the blood-thinner heparin were linked to hundreds of severe allergic reactions and dozens of deaths in the US, illustrates just how important it is to monitor drug production.
Inherent in developing, testing and manufacturing novel drug candidates is the generation of huge amounts of data, and the pharmaceutical industry is one of the major users of data management software, such as laboratory information management systems (LIMS) and electronic laboratory notebooks (ELNs).
‘The pharmaceutical industry has established solid data management practices in specific areas, such as genomics, where, within that discipline, there is a relatively robust method for managing data. However, where there is a gap is within the overarching analytical data management framework,’ explains Andrew Anderson, director of strategic partnerships at Advanced Chemistry Development (ACD/Labs), a company based in Toronto, Canada that produces software designed to integrate chemical structures and analytical chemistry information.
When a compound is synthesised, tests are carried out at each step of the process to maintain a high level of quality assurance. The starting materials are supplied from a vendor and need to be analysed for purity, structural fidelity, and to make sure the material hasn’t degraded during transit or storage. The synthesised compounds are registered and tested to establish their structure-activity relationship (SAR), whereby a compound’s chemical structure is correlated with its ability to elicit a biological response. Pharmaceutical companies have libraries of SAR data from countless compounds, and when hits come back from SAR tests, accurate quality assurance data for that compound must be available.
‘There are literally thousands of individual data steps in providing quality assurance along the drug development process,’ says Anderson, ‘so what exists is a silo of analytical data that needs to be archived. The challenge for data management software is to be able to organise all that information into a workable system that maintains a high degree of accuracy.’
Further along the drug development pipeline, the final product undergoes analysis to assess its stability and ensure the biologically active ingredient retains its potency over time. All these quality assurance checks cost time and money, and their accuracy is vital to the production of novel pharmaceutical compounds. If analytical data is lost or mishandled, then this could prove extremely costly for the company, as compounds would have to be retested, reverting several steps in the development process.
Improvements in scientific instrumentation, such as shorter chromatographic separation times, have led to increases in the throughput of samples and the volume of analytical data being produced. In the opinion of Christoph Nickel, software marketing manager at Agilent Technologies: ‘The evolution of data review systems has not kept pace with the advances in instrumentation and further improvements are required for methods of reporting data and structuring data from experiments. The bottleneck has moved from data acquisition to data analysis and interpretation.’
According to Nickel there are two main directions that can be taken to eliminate the bottleneck: implementation of data management software to improve availability and management of information; and the use of standard formats to make information more exchangeable.
Standard data storage formats based on XML are being developed in cooperation with the pharmaceutical industry and the analytical equipment suppliers. ‘The standard formats are for general laboratory-based data management, but are being solely driven by the pharmaceutical industry. This is due to the legal requirements for long-term storage of data that drug companies are subjected to, which can be between 10 and 30 years, depending on the country and the regulations,’ says Nickel.
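The appeal of an XML-based standard is that the data becomes self-describing: decades from now, a record can be read without the instrument vendor’s proprietary software. The sketch below illustrates the idea with a deliberately simplified, hypothetical schema (the element names are invented for illustration and are not taken from any published standard such as AnIML); it writes an analytical trace to XML and reads it back using only the Python standard library.

```python
import xml.etree.ElementTree as ET

def build_record(sample_id, technique, points):
    """Serialise an analytical run as a minimal, self-describing XML record.

    The element names (AnalyticalRecord, SampleID, Trace, Point) are
    illustrative only -- a real standard defines far richer metadata."""
    root = ET.Element("AnalyticalRecord", version="1.0")
    ET.SubElement(root, "SampleID").text = sample_id
    ET.SubElement(root, "Technique").text = technique
    trace = ET.SubElement(root, "Trace", units="mAU")
    for t, v in points:
        ET.SubElement(trace, "Point", time=str(t)).text = str(v)
    return ET.tostring(root, encoding="unicode")

def read_trace(xml_text):
    """Recover the (time, value) pairs from the XML, vendor-independently."""
    root = ET.fromstring(xml_text)
    return [(float(p.get("time")), float(p.text))
            for p in root.find("Trace")]

record = build_record("LOT-042", "HPLC-UV", [(0.0, 0.1), (1.5, 12.7)])
print(read_trace(record))  # → [(0.0, 0.1), (1.5, 12.7)]
```

Because the schema, not the software, carries the meaning of each field, the same file remains legible whichever application opens it over a 10-to-30-year retention period.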
Agilent, a company with headquarters in Santa Clara, California, US, providing electronic and bio-analytic measurement solutions, has supplied its OpenLab (OL) Enterprise Content Manager (ECM) to Nycomed, a pharmaceutical company based in Zurich, Switzerland, that provides medicines for hospitals, specialists and general practitioners.
The team at Nycomed, formerly Altana Pharma, required a system that would comply with the regulatory requirements of the US Food and Drug Administration’s (FDA) 21 CFR Part 11, dealing with electronic records and signatures, and the European Union’s GMP Annex 11 for computerised systems, and which facilitated quick retrieval of data files and the ability to create fully readable reports. Agilent’s OL ECM was set up to manage laboratory data from more than 60 different software applications.
‘One of the criteria we decided right from the beginning was that we wanted to generate reports in a readable format that would be accessible to everyone,’ explains Britta Bassen, project manager at Nycomed. ‘Being able to collate raw data from numerous instruments, each with their own proprietary software, and convert this into a PDF document that is not only fully readable, but also searchable, was an important aspect when deciding what we wanted to achieve from the system.’
Prior to installing Agilent’s OL ECM, raw data was stored on CD-ROMs and reports were printed and archived. In the four years Agilent’s system has been in place, a sizeable amount of data has accumulated. ‘To get the most out of the system there has to be data inputted into it. It is only then that we start to benefit from having a system that stores the data in a structured way, as it’s easy to access old data or reports,’ says Bassen.
‘One of the biggest issues was the amount of different forms of analytical data,’ Bassen says. ‘As well as being able to transform this data into PDF documents, the ECM also allows the original data structure to be recreated so it is readable on the proprietary piece of software that generated it.’
Trish Meek, director of product strategy for Thermo Fisher Scientific, a global company with headquarters in Waltham, MA, US providing a range of scientific products, including LIMS and ELNs, gives an idea of the timescales involved in bringing a new drug into the marketplace. ‘It typically takes years to take a candidate from initial feasibility testing through in vitro metabolism and early safety data. It can take up to 15 years to get the compound to final approval,’ she says.
According to Meek, pharmaceutical companies are trying to compress the time it takes to test a compound from years down to a matter of weeks. By using the knowledge gained from more recent data, researchers can avoid the lengthy cycle of testing compounds that will ultimately fail, and they won’t synthesise years’ worth of compounds without taking into consideration the knowledge they gained from their prior efforts. LIMS can play a critical role in streamlining this testing and ultimately in helping researchers more effectively use the knowledge gained from the past.
The level of complexity that data management software must handle increases as the product moves along the drug discovery and development process. ‘It is relatively easy to keep track of samples passing through an individual laboratory, but when that is scaled up to the level at which large pharmaceutical companies operate, where there are millions of samples and tens of millions of data sets spread over a global network of sites, then data management becomes much more complex,’ explains Andy Vines, product manager at IDBS. IDBS, with European headquarters in Guildford, UK, provides data management solutions for research and development.
‘A research laboratory is generally a non-compliant environment, where regulations are less stringent. As products move from research into development, compliance with GxP (Good Practice regulations, such as Good Manufacturing or Good Laboratory Practice) adds another level of complexity to managing the flow of data. This is compounded further when the product reaches the clinical trial phase, where, as the information collected is personal to the individual undertaking the trial, data security becomes key,’ says Paul Denny-Gouldson, product manager at IDBS.
Symyx Technologies, a California-based provider of integrated software, hardware and consulting services supporting R&D, supplies its ELN to aid data collection and management within numerous pharmaceutical companies including Eli Lilly, AstraZeneca, and Bristol-Myers Squibb. The Symyx ELN supports different levels of regulatory enforcement depending on the type of work carried out, whether basic research or designing drug formulations for clinical trials.
John McCarthy, vice president, product management and strategy at Symyx Software states: ‘We don’t want to slow down scientists’ productivity in the early discovery phases by exposing them to regulations that are not required for the type of work they are doing. At the same time, the ELN provides a controlled regulatory framework for scientists involved in later development phases, such as scaling up to pilot plants where these regulations are required.
‘The electronic world enhances experiment capture by bringing all results together in fully versioned, shareable and searchable documents controlled by centralised signature and security protocols,’ he says. ‘The ability to sign and witness documentation electronically ensures that experimentation and the reporting of results are carried out to the standards required by the organisation, with accurate records of any changes made to electronic documents, including when the changes were made, by whom and why.’
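The audit-trail idea McCarthy describes, every change recorded with who, when and why, and protected against silent alteration, can be sketched in a few lines. The following is a minimal illustration of the general technique (an append-only log with hash-chained entries), not a description of the Symyx ELN’s actual implementation; all class and field names are invented for the example.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only change log: each entry records who, what, why and when,
    and is hash-chained to its predecessor so tampering is detectable.
    Illustrative sketch only -- not any vendor's real mechanism."""

    def __init__(self):
        self.entries = []

    def record(self, user, action, reason):
        prev = self.entries[-1]["hash"] if self.entries else ""
        entry = {
            "user": user,
            "action": action,
            "reason": reason,
            "time": datetime.now(timezone.utc).isoformat(),
            "prev": prev,
        }
        # Hash the entry contents plus the previous hash, forming a chain.
        payload = json.dumps(entry, sort_keys=True).encode()
        entry["hash"] = hashlib.sha256(payload).hexdigest()
        self.entries.append(entry)

    def verify(self):
        """Check that each entry still references its predecessor's hash."""
        for i, e in enumerate(self.entries):
            expected_prev = self.entries[i - 1]["hash"] if i else ""
            if e["prev"] != expected_prev:
                return False
        return True

trail = AuditTrail()
trail.record("jsmith", "sign", "results reviewed")
trail.record("akhan", "witness", "countersigned")
print(trail.verify())  # → True
```

A production e-signature system layers authentication and cryptographic signing on top of this, but the core record, user, timestamp, reason for change, and a tamper-evident link to history, is what 21 CFR Part 11-style audit trails require.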
Alain Meller, director of ELN solutions in the software and informatics division for life sciences and chemical analysis at Agilent Technologies, notes that: ‘ELNs evolved firstly to aid scientific discovery, but they have the added benefit of protecting intellectual property, and they provide a level of security for the potential holder of the invention.’
One of the problems facing pharmaceutical companies is the amount of data generated that is never used. Data stored on private hard drives or in complex formats, rather than on a searchable centralised database, is often left unused and unknown to the rest of the organisation. In addition, even if the results are revisited on future occasions, researchers lose confidence in those results, and tests often have to be reworked. ‘Managing the data and inputting it into searchable systems is a much more effective way of drawing together all the relevant strands of information. Connections between results, which could give vital clues on the nature of certain candidate pharmaceuticals, are much more easily explained when all the data is present and searchable,’ says Denny-Gouldson.
By gaining access to the results of experiments others have done, scientists can build upon previous successes rather than repeat prior failures. ‘Knowing what drugs failed in the preclinical or clinical phase last time is just as useful as knowing what succeeded,’ explains Meek of Thermo Fisher Scientific. ‘Our LIMS help researchers run in vitro and in vivo studies. They present the project team with the information needed to make a go/no-go decision on a compound, and generate the data required to ultimately create the IND (Investigational New Drug) and the NDA (New Drug Application) submissions for successful candidates. In a manufacturing environment, in contrast, LIMS helps to maintain the batch record and ensure product quality,’ she says.
‘The challenge for pharmaceutical companies is making the sheer mass of data generated readily available throughout the company to maximise the efficiency of decision-making processes,’ says Simon Wood, executive director, marketing and education at Starlims, a global company providing LIMS and Scientific Data Management Systems (SDMS) to laboratories in a wide variety of industry sectors, including pharmaceutical manufacturing and R&D.
‘The danger with using a paper-based system is that research teams will carry out work that has already been completed, or will follow leads that have been investigated and found to be useless,’ explains Wood. ‘With data management software, such as Starlims SDMS, researchers can avoid paths that are known to be blind alleys or where the work has already been carried out.’ McCarthy of Symyx notes that it has been estimated that 30 per cent of all experiments are repeated, simply because no one knows the work has already been done.
‘One of the most important trends in recent years is that pharmaceutical customers are choosing to move away from running multiple LIMS products and site specific systems, towards centralised implementations based on an enterprise-wide deployment on a common LIMS platform,’ explains Nick Townsend, director of life sciences at LabWare Europe. ‘A centralised LIMS from a single supplier is easier to support, easier to upgrade, easier to prepare and roll out training programmes, and most importantly it promotes standardisation and harmonisation of workflows across the company.’
There are important technical, functional and regulatory challenges to overcome for a LIMS to be successful in global deployments using a centralised architecture. The software must be able to handle concurrent use of multiple local languages, market-specific reporting requirements, and usage across different time zones. The latter is particularly important for pharmaceutical regulatory compliance, as correct handling of dates and time stamps, regardless of where in the world an operator is based, is crucial for accurately recording system audit trails.
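The time-zone point is worth making concrete. The usual approach, sketched below with Python’s standard library (the function names are illustrative, not from any LIMS product), is to store every audit-trail timestamp in UTC and convert to the operator’s local zone only for display, so entries from sites around the world sort into a single unambiguous sequence.

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

def stamp():
    """Record audit events in UTC so entries from any site order correctly."""
    return datetime.now(timezone.utc)

def render(ts, tz_name):
    """Display the same instant in a given operator's local time zone."""
    return ts.astimezone(ZoneInfo(tz_name)).isoformat()

# An operator in New York signs a record at 09:30 local time (June = EDT, UTC-4):
event = datetime(2008, 6, 2, 9, 30, tzinfo=ZoneInfo("America/New_York"))
utc_event = event.astimezone(timezone.utc)

print(utc_event.isoformat())               # → 2008-06-02T13:30:00+00:00
print(render(utc_event, "Europe/Zurich"))  # → 2008-06-02T15:30:00+02:00
```

Stored once in UTC, the same instant renders correctly for a reviewer in Zurich or an auditor anywhere else, which is exactly what a defensible audit trail needs.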
LabWare, a global supplier of LIMS products, provides a considerable range of standard ‘best practice’ pharmaceutical features, functions and modules through its LIMS, together with a comprehensive library of documentation and validation materials. Pharmaceutical customers can then choose to use the functionality as supplied or implement changes following a gap analysis of their requirements versus the standard capabilities.
‘It’s really not legitimate for a vendor to promote the dream of an out-of-the-box LIMS if it proves to be inflexible, fixed-format and cannot grow with the business or adapt to changes. LabWare provides a LIMS product which can, if necessary, be adapted with built-in LIMS configuration tools to implement customer- or site-specific features without compromising support or future upgrades,’ states Townsend.
As electronic forms of report creation and data management become utilised more and more in the pharmaceutical industry, laboratories are slowly moving away from paper-based data capture and archiving. However, the pharmaceutical industry has various separate organisations attached to it, such as raw material suppliers, regulatory bodies, end users, etc. ‘Each of these are disparate entities and will all have their own way of doing things, and implementing a system that accommodates everyone involved will be difficult,’ says Anderson of ACD/Labs. ‘There is a willingness to transform to a completely paperless system, but it will take time.’
Meller of Agilent expresses similar opinions: ‘Currently, many drug companies using an ELN-based system are still printing documents for long-term storage. This is because the entire regulatory architecture for document storage is based on paper. ELNs have the potential for full electronic data storage, but turning a paper-oriented organisation into a fully electronic document management organisation will take time. This will require not only a new kind of infrastructure for reporting and storing documents, but also an entire rethinking of the human processes around it.’
Agilent’s Nickel concludes thus: ‘Electronic methods of managing data are gradually taking away the need for working on paper. This, however, cannot be the final stage in developments. Once data is available in an electronic format, companies must put this to good use and do something with that data, whether that be redesigning procedures or automating processes to increase the efficiency of sample throughput. The tools are in place, but they aren’t effective if they’re not used to their full potential.’