
The right prescription

Are informatics solutions keeping up with trends in pharma? Beth Harlen speaks to industry figures

Trish Meek, director of product strategy for life sciences at Thermo Fisher Scientific

The implementation of quality-by-design (QbD) principles by the US Food and Drug Administration has opened the pharmaceutical industry up to the idea that there should be a full understanding of critical process parameters and critical quality attributes, and that process development should be iterative. An outcome of this shift has been growing interest in paperless lab initiatives.

Until recently, the perception has been that the cost of integrating a paperless lab outweighed the benefit. Integration technology has now improved to the point where direct web service connections between applications act as intelligent, bi-directional interfaces that move data forward once stage gates and checkpoints are reached. Pharma is realising not only that the benefits are there, but that connecting a LIMS to SAP QM, for example, is not an enormous undertaking.

The key question is how to take existing investments – whether that is in LIMS, ELN, SAP QM, etc. – and have an integrated process, from simple instruments through to chromatography and mass spectrometry instrumentation, which eliminates the manual steps that introduce risk and the potential for errors. Individual business processes do need to be considered but the fact that vendor-to-vendor solutions exist in the market, and that a custom project is no longer required to connect those interfaces, is making companies far more receptive. Validation is critical in pharma and when commercially available integration solutions that have been used and tested over multiple sites are an option, that risk goes down.
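The kind of vendor-to-vendor interface described above ultimately comes down to mapping a result out of one system into the structure the other expects, and only releasing it once a checkpoint is passed. The following is a minimal sketch of that mapping step; the field names and the valuation codes are illustrative assumptions, not SAP QM's actual schema.

```python
import json

def lims_result_to_qm_payload(sample_id, test_name, value, unit, status):
    """Map a LIMS test result to a hypothetical SAP QM inspection-lot
    record. Field names here are illustrative, not SAP's real schema."""
    return {
        "inspectionLot": sample_id,
        "characteristic": test_name,
        "result": {"value": value, "unit": unit},
        # "A" (accept) / "R" (reject) valuation codes are assumed
        "valuation": "A" if status == "pass" else "R",
    }

payload = lims_result_to_qm_payload("LOT-2024-001", "assay_purity", 99.2, "%", "pass")
print(json.dumps(payload))
# In a live integration this JSON would be POSTed to the ERP's
# web-service endpoint only after the sample clears its stage gate.
```

The point is that the translation logic is small and testable; it is the validation, error handling, and change control around it that commercial integration products package up, which is why a tested off-the-shelf interface lowers risk compared with a custom project.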

Of course, while this all sounds good in principle, the biggest stumbling block is that many companies looking at this from an informatics perspective don’t know where the rate-limiting steps in their processes are. What’s surprising is that even when a document exists in electronic format, it can often still be physically transferred between departments on paper. This is particularly prevalent in pharma because the industry has historically been rich in personnel, which meant there was always someone available to do the manual step and keep the process consistent.

Now, however, the industry has been consolidated through mergers and acquisitions, and the number of scientists has fallen significantly in the past 10 years. This is manifesting itself in quality issues but, again, informatics can have a fundamental impact. Depending on the company, we have seen an average improvement in performance of between 20 and 30 per cent, purely from eliminating manual processes and the review steps associated with them. The cost reduction immediately justifies the project, and the pharma industry is beginning to embrace this – albeit at a slow pace.

Dr Othmar Pfannes, CEO at Genedata

The automation of R&D processes in general, and data analysis workflows in particular, is a driving force within pharma as the time it takes to move research projects forward is critical. Frequently, raw data is collected from different sources over a long period of time and, understandably, once the data is there, researchers don’t want to spend another few months integrating it and waiting for the analysis results. Most large pharmaceutical companies are therefore recognising the need to standardise data analytics processes and are working with partners to do just that. In the past, these companies could afford to run internal software development projects in order to have customised data analytics workflows. Today, however, pharmaceutical companies need to look more closely at the return on investment and consider solutions that work out-of-the-box with little configuration effort. Not only is this a cheaper option, but the deployment process becomes significantly easier and shorter.

Companies are also beginning to embrace enterprise platforms. This is driven somewhat by necessity as desktop software solutions are no longer able to cope with the rising amount of data and do not support collaborative research. An enterprise approach also ensures that external partners can gain easier access to the same algorithms and types of analytic workflows. There needs to be a slightly different mind-set when investing in an informatics solution for an entire organisation rather than an individual laboratory, but the key considerations remain the same: security, to ensure that data is protected from manipulation and theft, and that any outside partners only see the data sets they are supposed to; and, of course, scalability, which gives researchers the flexibility needed to grow. I would recommend that any company that is in a position to evaluate its informatics strategy pay close attention to enterprise solutions.

Andrew Lemon, managing director at The Edge

Almost 10 years ago, the pharmaceutical industry went through a revolution with the push towards more outsourced and networked research models. When using this model it’s important that any outsourced activities are made as much a part of the internal workflow as possible. The same level of care needs to be given to preparation, understanding and timetables for delivery, and a robust informatics platform is necessary for managing that flow of information. Externalisation and the use of partners such as contract research organisations remains prevalent, but we are beginning to see a change as pharma companies now look to pull their assays back in-house. The cost advantages that drove the move to externalisation in the first place are becoming less obvious now and, although I don’t believe we’ll see a wholesale reversal, the balance is shifting towards in-house research.

This change is forcing pharma companies to look at their informatics deployments because one side of the business has essentially been virtualised, doing little more than loading up received data. Now that a higher level of control and functionality is needed, companies are investing in systems that can support the transitional model where some aspects remain outsourced while others are brought in-house. Progression of each project needs to be tracked and compared to all the others. This transparency can have a profound impact on the decision-making process. Ensuring that the informatics system adds quality to the data at the point of capture is also crucial because, without the metadata, it becomes more difficult to exploit that data further down the line. The importance of effective data management should not be underestimated, especially in research biology, and it’s important that companies invest in an informatics platform that can handle the entire life cycle.
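Adding quality at the point of capture often means refusing to accept a result that lacks the contextual metadata needed to interpret it later. A minimal sketch of that idea follows; the required field names (instrument, operator, protocol, timestamp) are hypothetical examples, not a prescribed standard.

```python
# Hypothetical minimum metadata required before a result is accepted.
REQUIRED_METADATA = {"instrument_id", "operator", "assay_protocol", "captured_at"}

def capture_result(value, metadata):
    """Accept an assay result only if all required metadata is present,
    so the record remains interpretable further down the line."""
    missing = REQUIRED_METADATA - metadata.keys()
    if missing:
        raise ValueError(f"missing metadata: {sorted(missing)}")
    return {"value": value, **metadata}

record = capture_result(
    0.87,
    {"instrument_id": "HPLC-07", "operator": "jdoe",
     "assay_protocol": "SOP-123", "captured_at": "2014-06-01T10:22:00Z"},
)
```

Rejecting incomplete records at capture, rather than patching them up at analysis time, is what makes the data exploitable across the whole life cycle.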

Kabir Chaturvedi, director of life science solutions portfolio marketing, Elsevier

Informatics tools in pharma have been keeping pace with the rising torrent of data, but there are several challenges. On average, there is a 10-year incubation period before a drug is ready to hit the market – and, as roughly just six per cent of candidates succeed, there is a considerable amount of risk involved. The cost implications are equally significant: each candidate taken through the pipeline beyond preclinical can cost several hundred million dollars at a minimum.

The informatics needs of pharma are complex, but one underlying point is that companies need a solution that can take the torrent of data streams and anchor them in fact. Taking next-generation sequencing as an example, there will be a time when every human will have their genome sequenced, and the amount of data being generated will simply be incredible. The features predicted within those genomes will need to be grounded in scientific fact. Providing well-indexed data, and the associations within it, will help inform predictive analyses, minimise risk and ensure that there is no need to reinvent the wheel each time.

Many companies have their own vast repositories of internal data and paper records waiting to be mined, and it’s vital that these are integrated with external bodies of information. When faced with a proliferation of data, scientific publications, and new methods of reporting, users need access to a structure that can navigate across the different standards and data formats. Interoperability and a common structure and format are priorities for pharma, and informatics solutions provide that stability. The analytical tools do need to be scalable and, in the future, I believe the cloud will definitely impact the pharma community.

Daniel Weiner, senior vice president and general manager at Certara

There has certainly been an increase in the number and complexity of informatics solutions in the pharmaceutical space, but the data remain heavily siloed. For example, solutions are being deployed in discovery or preclinical research, but the emphasis remains on these separate areas rather than accessing the information that spans them.

There is a growing recognition in pharma that these companies are not good software developers and, as large legacy systems reach the end of their life cycles, they are looking to bring in external systems. There is still a belief within some of the larger organisations, however, that they can build better than they can buy. I believe that if IT departments reported to R&D rather than finance, the situation would be very different, as the end users of such systems recognise the value of best-of-breed commercial solutions.

The reality in pharma is that failure rates haven’t really improved over the past few decades and too many drugs are failing late in development, with costs skyrocketing accordingly. Again, this all comes down to the need for a more translational view of development that would break these data silos down and provide better opportunities for organisational learning.

We have seen many instances where a drug fails in Phase II or III and the company doesn’t have the mechanisms to share the cause of the failure with other departments. As a result, that company could attempt to develop a molecule that fails for the exact same reason that a prior, structurally similar molecule failed. This is happening across the industry because effective informatics mechanisms are not in place for communicating successes and the reasons for failures. Without efficient workflows within each of the silos, as well as spanning them, pharma companies are crippling themselves.

Nick Townsend, director of Life Sciences at LabWare

As far as laboratory information technology goes, we are living in exciting times. Solutions are becoming incredibly powerful and companies are understandably eager to take advantage of the new levels of functionality being offered. The opportunities and capabilities provided by cloud computing and modern mobile devices are also significant, but for highly regulated industries like pharmaceuticals the management of change to introduce new capabilities can also present a challenge.

When attempting to adopt new software solutions or modify existing ones, pharma has to shoulder the burden of ensuring that the implementation is executed in a compliant manner. This means that any alteration to any IT system, no matter how small, must be carried out in a very controlled way, with thorough testing and an accompanying comprehensive ‘paper trail’ (now electronic) that documents the specifics of each change, how it was implemented and how it was tested. What may seem like an easy change on a technological level soon mounts up in terms of resource involvement and financial investment once all the change control procedures are taken into account.

This is further complicated when attempting to manage connectivity between multiple applications from different vendors if single-platform solutions are not available. The impact of how the update of one application will affect all the other interconnected applications must be fully evaluated. The change control aspects are compounded by any potential variations in the supporting infrastructure, and this must be taken into account when evaluating the overall cost.

The level of rigour required in the pharma industry is considerable, so the impact of change control must be factored into any decisions regarding informatics and infrastructure. It’s vital, therefore, that whatever type of solution is desired, companies work with vendors that can help them look beyond product features, focus on the overall implementation, and understand the full cost of introducing new solutions.