
Tried and tested

Beth Sharp explores the informatics landscape in pharmaceutical testing laboratories

Ask the question ‘what is laboratory informatics?’ and the answer you get back will vary greatly depending on who you’re speaking to, partly because of the broad spectrum of applications that fall under that heading and the range of industries in which they are deployed. Elliot Abreu, senior vice president at Xyntek, considers informatics to be any computing technology information system that collects, analyses and reports data, and aids decision-making. While Laboratory Information Management Systems (LIMS) and Electronic Laboratory Notebooks (ELNs) fall firmly within this category, Abreu has noted a definite move towards scientific information platforms, such as data warehouses and business intelligence systems, that deliver business value from all of the collected data.

This shift is a significant one for the pharmaceutical industry in particular. ‘Data warehouses have been uncommon in the pharmaceutical research and development environments,’ explains Abreu, ‘because of the underlying lack of data standards, technology platforms and unified business processes required to support usable informatics solutions in a highly-regulated environment.’

He continues by saying that IT managers, manufacturing managers and scientists are all beginning to look for solutions outside the typical ‘laboratory’ setting. For instance, integration standards such as S88 and S95, found in pharmaceutical manufacturing and batch processing, address how data and processes are categorised. ‘These standards are allowing companies to be more innovative in the way informatics solutions and data warehouses are applied and utilised,’ he adds.

When it comes to the deployment of solutions, discussions within the industry are increasingly turning to the topic of consolidation. The cost of maintaining multiple systems can be significant, not just in terms of ownership, but with regards to the IT resources necessary for their administration, validation and support.

There is no denying that the thought of reviewing a technology infrastructure and embarking on a consolidation project can be daunting; however, by focusing on one integrated platform, IT personnel can become proficient in its operation and potentially resolve issues faster and with fewer complications. It has been commonplace for individual companies to own and operate multiple LIMS and ELNs, but as industry figures such as LabWare’s Nick Townsend and Xyntek’s Elliot Abreu can attest, pharmaceutical organisations are now moving away from that scenario and looking to standardise their solutions.

Trish Meek, director of product strategy for life sciences at Thermo Fisher Scientific, adds that companies that have invested in a LIMS, and possibly in an ELN as well, are increasingly re-evaluating their existing deployments in order to determine how to get the most from those investments. The benefits of going down the road of standardisation may seem clear, but doing so within a complex and research-based sector such as the pharmaceutical industry is never straightforward.

Vendors are endeavouring to make this easier through the release of platform solutions. LabWare’s Enterprise Laboratory Platform (ELP), for example, combines powerful LIMS and ELN products within the same application platform, fully integrated and communicating bi-directionally on multiple levels, actively synchronising the data between the two components. Nick Townsend points out that the ELP also provides a rich library of application modules that can be utilised by both the LIMS and the ELN.

‘Integration within a laboratory environment has come a long way,’ says Trish Meek, from Thermo Fisher Scientific. She explains that vendors are embracing technologies such as web services that better enable integration, but warns that a level of fear remains about the time and cost of integration projects. This leads labs to prioritise the instruments they own in the greatest numbers and to leave the rest out of the integration. Solutions such as Thermo Fisher Scientific’s Integration Manager ease this process by transforming information from any number of instruments and pieces of lab equipment, or from an enterprise system such as an ERP or data warehouse, directly into the LIMS and back out again.

Having participated in a cross-industry white paper recently published on the subject[1], Joel Usansky, senior product manager at Thermo Fisher Scientific, agrees that there is a definite need for better instrument interfacing, improved automation and integration in terms of data standards. He explains that each company that produces instruments has its own software for controlling them and acquiring data. Unfortunately, that data is exported to the LIMS in a different format; in fact, an individual manufacturer can use multiple formats. Xyntek’s Elliot Abreu also believes that the lack of data standardisation and common definitions across the various platforms is compounding the technology integration challenges.
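To illustrate the kind of format heterogeneity being described, here is a minimal sketch of the usual remedy: one parser per export format, all normalising to a single canonical record before anything reaches the LIMS. The vendor layouts, field names and sample values below are invented for illustration; they are not taken from any real instrument software.

```python
import csv
import io

# Two hypothetical exports of the same measurement in different layouts --
# the heart of the instrument-integration problem described above.
VENDOR_A = "sample_id,analyte,conc_ng_ml\nS-001,drugX,12.5\n"
VENDOR_B = "S-001|drugX|12.5|ng/mL\n"

def parse_vendor_a(text):
    """Comma-separated with a header row; concentration already in ng/mL."""
    rows = csv.DictReader(io.StringIO(text))
    return [{"sample_id": r["sample_id"],
             "analyte": r["analyte"],
             "conc_ng_ml": float(r["conc_ng_ml"])} for r in rows]

def parse_vendor_b(text):
    """Pipe-separated, no header; the unit travels in the last field."""
    records = []
    for line in text.splitlines():
        sample_id, analyte, value, unit = line.split("|")
        assert unit == "ng/mL"  # a real adapter would convert units here
        records.append({"sample_id": sample_id,
                        "analyte": analyte,
                        "conc_ng_ml": float(value)})
    return records

# Every format gets its own adapter; the LIMS only ever sees one shape.
normalised = parse_vendor_a(VENDOR_A) + parse_vendor_b(VENDOR_B)
print(normalised)
```

A common data standard of the kind the white paper calls for would, in effect, make all but one of these adapters unnecessary.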

There are steps being taken, however, towards a universal standard. ‘For large molecule bioanalysis there has been a big push to get users and vendors together to work on common data standards so that file formats can be synced,’ says Usansky, who adds that another level of complexity in the pharmaceutical industry comes from the need to file, store and review the data transfer files themselves. A further concern, more specific to bioanalytical labs, is incurred sample reanalysis: a process in which companies analyse a subset of samples a second time in order to confirm the initial measurement. Usansky explains: ‘There’s a whole workflow associated with incurred sample reanalysis whereby it’s necessary to find the existing result, assay it a second time and then bring the results together to determine whether they confirm each other.’
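The comparison step of that workflow can be sketched in a few lines. The acceptance criterion below is not stated in this article; it is the one commonly cited in bioanalytical guidance for ligand-binding assays (original and repeat results within ±30 per cent of their mean for at least two-thirds of reanalysed samples), so treat the thresholds as assumptions rather than a fixed rule.

```python
def isr_passes(original, repeat, tolerance=0.30, required_fraction=2 / 3):
    """Sketch of an incurred sample reanalysis (ISR) check.

    For each sample, the original and repeat results must differ by no
    more than `tolerance` of their mean; the reanalysis confirms the
    original run if at least `required_fraction` of samples agree.
    The +/-30% and two-thirds defaults follow common bioanalytical
    guidance for ligand-binding assays, not this article.
    """
    agreeing = 0
    for a, b in zip(original, repeat):
        mean = (a + b) / 2
        if mean and abs(a - b) / mean <= tolerance:
            agreeing += 1
    return agreeing / len(original) >= required_fraction

# Three samples reanalysed: two of the three pairs agree within 30%,
# which just meets the two-thirds requirement.
print(isr_passes([10.0, 20.0, 5.0], [11.0, 22.0, 9.0]))  # → True
```

In a LIMS this logic would sit behind the workflow that locates the stored original result, schedules the repeat assay and brings the two values together, as Usansky describes.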

Thermo Fisher Scientific’s Watson LIMS has a feature called Incurred Sample Reanalysis (ISR), which supports precisely this workflow. Subjects can be identified by factors such as treatment, group and time points, and organised for reporting, selection and grouping analysis. Of course, as pharmaceutical companies recognise the benefits of entering into collaborations, comments Trish Meek, the issue becomes one of needing to integrate information not only from their internal labs, but also from CROs and academic partnerships. Data Manager, from Thermo Fisher Scientific, enables users to pull in all the original raw data, captured both internally and externally, and examine the baseline and all associated results.

‘This tightly integrated environment allows lab managers to look at the lab as a whole, so that if something comes up that’s out of specification the user doesn’t automatically assume it’s a failed batch; they can see the raw data and determine immediately that the sample should be re-run due to a failed injection or run,’ explains Meek. She adds that, in addition to this interactive data review, the solutions can automate the process, ensuring that at each step the software guides users through in a compliant and scientifically valid way.

During the past few years there has been a certain level of upheaval within the pharmaceutical industry, and the numerous mergers, acquisitions and redundancies mean that companies are now trying to do more with less. Trish Meek explains: ‘Where the industry used to have people who could fill in holes in the process and ensure everything ran smoothly, companies no longer have the staff to be able to deal with those inefficiencies. It’s of paramount importance that these businesses continue to manufacture quality products and do good science within the R&D phase, and so need to provide as automated a solution as possible. From an informatics perspective, that’s where we’re seeing a big push.’

Paperless is more

One topic that often goes hand-in-hand with discussion of informatics is the move away from traditional paper records and towards electronic environments: the so-called paperless laboratory. Trish Meek comments that the push towards a paperless lab is a major focus the company is seeing within the pharmaceutical and biotech industries. ‘At all the various stages, the industry is moving the science forward and we need to support all the new methods of testing, new instrumentation throughout the life cycle,’ she says. ‘That’s always been the case but this push towards a paperless lab, while it’s something that has come up every five years for the past two decades, is only now actually being realised. We’ve reached the point where the technology is there and the reality is that we can deliver a paperless solution.’

Elliot Abreu agrees that paperless environments have been a popular topic of conversation for many years, and he believes the approach remains one of the main drivers in boosting efficiencies; however, he isn’t as confident that the industry is at that stage. ‘If you look at some organisations within testing environments,’ he says, ‘they’re using forms and templates that are still being managed in Excel, or customised macros that have been developed in-house.’ Abreu continues by stating that while there is currently a major push to replace these platforms with electronic versions, the main reason paperless environments have yet to be fully achieved is the complexity and ambition of what ‘paperless lab’ actually means.

According to Abreu, a further and perhaps more fundamental challenge is educating users on the benefits of these systems, and getting buy-in across the user community, while developing the process and user requirements. ‘There is a great deal of hesitation when it comes to adopting new technology, and we witnessed this with the introduction of ELNs,’ he says, adding that with the implementation of successful systems and buy-in from across the user community, more and more integrated informatics technologies will be embraced, hopefully resulting in a cultural change.

Though, as with any major change, a certain level of caution is advisable. Nick Townsend, from LabWare, believes that anyone considering the implementation of these applications should tread carefully, as there are many mixed messages about the role each should play. ‘Combining LIMS and ELN can deliver many business advantages; the issue is getting the functional balance and integration right between the two,’ he comments. ‘It can be challenging to achieve a truly seamless, bi-directional integration of LIMS and ELN, but the benefits can be substantial.’ He adds that feedback from the market increasingly suggests it is unwise to try to deliver a comprehensive laboratory automation solution with only one of these applications, for example by ‘stretching’ the functionality of an ELN.

In conclusion, Trish Meek has this to say: ‘Looking to the long term, I believe that in the next few years we will get to the point where informatics will be viewed more holistically and in terms of what a solution delivers, rather than categorising down into LIMS or ELN functionality. It’s about looking at the daily workflows of scientists, technicians and laboratory managers, and identifying the best possible solutions to address them.’

References and Sources

[1] S. S. Leung, J. Usansky, R. Lynde, T. Thway, R. Hendricks and D. Rusnak, ‘Ligand-Binding Assays in the 21st Century Laboratory: Recommendations for an Automated Data Interchange Process’, AAPS Journal, 14(1), pp. 105–112, 2012.