
A standard approach

Meeting industry standards can be imperative for many laboratories, whether working in a regulated sector or not, points out Simon Wood, product manager at Autoscribe Informatics. However, working to these requisite standards is not necessarily a guarantee of product safety.

‘ISO/IEC 17025 compliance, for example, encompasses general requirements for the competence of testing and calibration laboratories. Meeting that standard means that you have a well-run laboratory, but it doesn’t demonstrate product safety. A key element of product safety is assurance that data used to make key decisions is a true reflection of the actual data, and has not been subject to manipulation or other human influence. Laboratories must be able to show that workflows have been configured for secure data acquisition, recording and management.’ In effect, a well-run laboratory is the starting point for product safety; accurate, unadulterated and complete data builds on that foundation.

The interaction between different sectors of industry and their governing bodies also varies, Wood notes. ‘Parts of the clothing industry, which is regulated to ensure that materials meet flammability standards and do not contain potentially toxic dyes or chemicals, for example, have adopted a more collaborative approach, based around the Oeko-Tex Standard 100. This is a worldwide, independent testing and certification system for raw, semi-finished and finished products and accessory materials.’

In effect, as long as all materials used to produce your garments are certified to Standard 100, then the final product may be accepted as Standard 100-compliant, and require less frequent or less in-depth regulatory scrutiny of the manufacturing process. ‘There are clear definitions of how to achieve Standard 100 registration, but it’s a more collaborative approach that benefits the entire supply chain.’

Meeting regulation

It’s obviously not possible to compare the degree of regulatory oversight relevant to the clothing industry with that required to ensure the safety of pharmaceutical development and manufacturing, Wood comments. Even so, regulators such as the US Food and Drug Administration (FDA) and the UK MHRA take a more ‘adversarial’ stance, he believes.

‘They provide guidelines, but they don’t tell the industry how to meet their compliance requirements. Take the FDA’s 21 CFR Part 11 guidance, which covers electronic records and electronic signatures. When the guidance was released it caused considerable confusion in the industry, because it didn’t explain what was meant by an electronic signature. It was assumed, by some, that a physical scan of a signature was required; this was not the case. Current data integrity guidelines raise similar questions.’

Enter the LIMS as an infrastructure for mainstream data management, and it all starts to make much more sense, Wood continues. Electronic signatures, audit trails and chain of data custody are embedded in your workflows. This baseline requirement for demonstrating unadulterated data capture and reporting holds true for any regulated industry.

‘The workflows generally adhere to a common format,’ Wood explains. ‘Whether you are a clothing manufacturer, or making a blockbuster drug, you still need to manage and test your samples against specifications, collect data from samples and controls, and present that data to your regulator, providing evidence that you are reaching the right standards. The need for data security and integrity, access levels etc. are essentially the same.’

The major differences between industries lie in what the LIMS will need to manage, including the type of information put in at one end, the complexity and volume of data that comes out of the other, and how you present that information to those who need to view it.

‘What these different industries must also take on board is that the LIMS itself is not compliant, but rather supports their regulatory compliance needs. The LIMS must be configurable to manage the laboratory operation so that capture, management and delivery of information meets the relevant compliance requirements and expectations.’

Choosing the right technology

The water industry, for example, will need to set up scheduling plans and complex run sheets for collecting, delivering and testing samples coming from the field, and will need fast turnaround from receipt to testing, Wood notes.

‘Water testing laboratories need a LIMS infrastructure with the ability to support that complexity of scheduling and provide the necessary analytics and trend-finding to prevent poor quality water entering the consumer pipeline. Laboratories in this sector, in particular, benefit from a LIMS to help them manage work scheduling, so that their staff aren’t under-deployed for long periods while they wait for samples to be delivered. Sophisticated scheduling will also maximise the use of expensive analytical equipment, ensuring a positive return on investment,’ says Wood.

‘It’s about building a holistic system where all the different pieces work together to accomplish all the different tasks and management functions,’ continues Stephen Hayward, product manager at Dassault Systèmes’ brand Biovia. ‘At Biovia, we offer an ELN that can handle more flexible, unstructured experimentation, but this still integrates with all the other pieces of the Biovia laboratory informatics offering.

‘Labs can add our Compose module for procedure authoring – drafting SOPs and publishing them on the system – and then layer that with our Capture functionality, which allows users to capture data on a tablet, interact with pH meters and balances, and send the data directly back to the ELN.’ Out-of-specification results or instances of manual override are flagged up in real time, and channelled into a review-by-exception process for authorisation.
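The mechanics of that review process are not spelled out here, but the review-by-exception pattern itself is easy to illustrate. Below is a minimal Python sketch, assuming illustrative sample IDs, analytes and specification limits; it is not Biovia’s implementation, only the general idea of routing exceptions to a human reviewer:

from dataclasses import dataclass

@dataclass
class Result:
    sample_id: str
    analyte: str
    value: float
    low: float    # lower specification limit (illustrative)
    high: float   # upper specification limit (illustrative)
    manual_override: bool = False

def needs_review(r: Result) -> bool:
    # Review-by-exception: only out-of-specification or manually
    # overridden results are routed to a human reviewer.
    return r.manual_override or not (r.low <= r.value <= r.high)

results = [
    Result("S-101", "pH", 6.9, 6.5, 7.5),
    Result("S-102", "pH", 8.1, 6.5, 7.5),                        # out of spec
    Result("S-103", "pH", 7.0, 6.5, 7.5, manual_override=True),  # overridden
]
review_queue = [r for r in results if needs_review(r)]
# Only S-102 and S-103 reach the review queue; S-101 passes straight through.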

‘Then, on top of that, we can layer predictive analytics, including laboratory trends so that managers can predict and schedule equipment maintenance, for example. All this functionality can then be married with inventory management capabilities, so that an inspector can view not just individual bits of data or workflows, but evaluate the lab from a more universal perspective.’

Thus it’s important not to end up with a patchwork of software solutions that makes it difficult to find data and the contextual information that is associated with that data, suggests Julien Alvo, president of France-based SoftNLabs. ‘Companies may end up implementing possibly hundreds of different software platforms and modules, and take on foreign systems through mergers and acquisitions.’ This represents an IT headache, adds to the complexity of trying to get different systems to talk with each other, and can make finding data a game of hide and seek.

‘That’s where we need to start thinking about keeping data in standardised formats, to make sure data isn’t lost because the software used to read it becomes obsolete, or isn’t available in the lab that needs to view the data,’ Alvo points out. Standard data formats also allow users and inspectors to view data from different types of instrument or experiment in parallel, either for comparison, or because one piece of data was dependent on another. 

Making use of data

Organising data and setting in place an adequate archiving infrastructure are thus critical if laboratories want to ensure that their data is secure and can easily be found again, perhaps decades later. ‘Often, people will say that they can remember doing something, but can’t remember when they did it or where they put the resulting information.’ It’s a problem that will only continue to grow as big data becomes part of every lab’s operation. ‘In an ideal world we would have a knowledge management solution for laboratories that is equivalent to Google,’ Alvo says, ‘and which can search across an organisation and platforms using just a few keywords.’
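Alvo’s ‘Google for the laboratory’ does not yet exist, but the core mechanism such a tool would rest on – an inverted index mapping keywords to records – can be sketched in a few lines. A toy Python example, with invented experiment IDs and summaries:

from collections import defaultdict

def build_index(documents):
    # Inverted index: each keyword maps to the set of record IDs
    # whose text contains it.
    index = defaultdict(set)
    for doc_id, text in documents.items():
        for token in text.lower().split():
            index[token].add(doc_id)
    return index

def search(index, query):
    # Return records matching every keyword in the query.
    hits = [index.get(tok, set()) for tok in query.lower().split()]
    return set.intersection(*hits) if hits else set()

records = {  # invented experiment IDs and summaries
    "EXP-2019-044": "stability study tablet batch 7 hplc assay",
    "EXP-2021-102": "dissolution test tablet batch 12",
}
print(search(build_index(records), "tablet hplc"))  # {'EXP-2019-044'}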

Two primary issues that must be addressed are the outsourcing of key analytical workflows, and long-term data retention, notes Burkhard Schaefer, president of BSSN Software. ‘At the good laboratory practice (GLP) level there is an increasing trend for industry – and in particular the pharmaceutical industry – to outsource analytical workflows. In some instances pharmaceutical companies are shutting down their in-house bioanalytical capabilities entirely, and outsourcing to contract research organisations (CROs).

‘This raises the question: how can I collaborate with an external partner and still be sure of the same quality and integrity of data that I know I can generate in-house to satisfy the regulator? Companies need to have infrastructure in place to demonstrate a chain of custody for a data record as it travels between organisations, and to create a distributed audit trail that allows you to do that,’ says Schaefer.
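Schaefer does not name a specific mechanism, but one common way to make an audit trail tamper-evident as records pass between organisations is a hash chain, in which each entry commits to the hash of its predecessor. A minimal Python sketch, with hypothetical field names and actors:

import hashlib
import json
from datetime import datetime, timezone

def append_audit_entry(chain, actor, action, payload):
    # Append a tamper-evident entry: each entry embeds the SHA-256 hash
    # of its predecessor, so any later modification breaks the chain.
    prev_hash = chain[-1]["entry_hash"] if chain else "0" * 64
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,    # e.g. "sponsor-lab" or "cro-bioanalysis"
        "action": action,  # e.g. "sample_shipped" or "result_reported"
        "payload_hash": hashlib.sha256(
            json.dumps(payload, sort_keys=True).encode()).hexdigest(),
        "prev_hash": prev_hash,
    }
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode()).hexdigest()
    chain.append(entry)
    return chain

def verify_chain(chain):
    # Recompute every link; True only if no entry was altered.
    for i, entry in enumerate(chain):
        expected_prev = chain[i - 1]["entry_hash"] if i else "0" * 64
        body = {k: v for k, v in entry.items() if k != "entry_hash"}
        recomputed = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if entry["prev_hash"] != expected_prev or entry["entry_hash"] != recomputed:
            return False
    return True

trail = append_audit_entry([], "sponsor-lab", "sample_shipped", {"sample": "S-0481"})
append_audit_entry(trail, "cro-bioanalysis", "result_reported", {"sample": "S-0481"})
assert verify_chain(trail)  # any edit to an earlier entry now fails verification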

Move downstream to GMP activities and QC workflows, and the issue becomes more one of data retention, he continues. ‘The regulator requires you to demonstrate that you have data safely archived, but more importantly, that it is retrievable for review, potentially 20 years or more,’ says Schaefer, concurring with Julien Alvo’s sentiments. This is when the topic of data formats becomes particularly interesting.

‘Archiving a proprietary data format doesn’t make sense, because it means the laboratory will have to retain the software, or the hardware on which it runs, even if it becomes obsolete,’ Schaefer notes.

Archiving in a binary data format is also less than ideal, because there is still a dependency on software to read it, and that represents a serious cost of ownership issue, he continues. ‘No-one will ever dare to throw away data, but if it’s kept in a proprietary format and you have to keep the software to read it, you will also need to keep and maintain the hardware to run it, and someone trained to operate it.’

Jim Brennan, technical specialist at LabWare, feels that bioanalytical laboratories, for example, will want to secure and archive all data that comprises the tracking, testing, and reporting of samples from pre-clinical and clinical drug studies. ‘It is common that instrument data is archived separately from the remainder of the study archive,’ he says.

‘As laboratory software is updated, the burden of maintaining legacy versions falls on the custodians of these systems,’ Brennan says, concurring with Schaefer. ‘To access legacy data from old software, they must, in some cases, maintain specific versions of operating systems and databases to support outdated software. For decades, data standards have held the promise of eliminating this scenario. Unfortunately, some of these standards support only a limited range of instrument formats and are not extensible. Others can be modified without traceability and do not address archival storage.’

Creating a standard

BSSN Software has been pioneering the development and adoption of the Analytical Information Markup Language (AnIML) standard data format. Recently, LabWare developed functionality to exchange data in AnIML.

‘AnIML is an extensible vendor neutral instrument data standard based on XML,’ Brennan explains. ‘AnIML uses Base64 encoding to prevent data from being modified while in transit through information systems. Conversion or export to AnIML allows generic viewers to access data after the proprietary software is no longer supported. Support for AnIML within LabWare provides immediate business benefit by eliminating the need to develop multiple interfaces for instruments that provide data in AnIML format.’
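By way of illustration only: the encoding Brennan refers to works by packing numeric series into bytes and Base64-encoding the result inside the XML document. The Python fragment below sketches the idea in simplified form; the element and attribute names are schematic, not a schema-valid AnIML file:

import base64
import struct
import xml.etree.ElementTree as ET

def encode_values(values):
    # Pack the floats as little-endian IEEE 754 doubles, then
    # Base64-encode the bytes so they pass intact through
    # text-based systems.
    raw = struct.pack(f"<{len(values)}d", *values)
    return base64.b64encode(raw).decode("ascii")

# Schematic only: simplified names, not a schema-valid AnIML document.
series = ET.Element("Series", name="Absorbance", seriesType="Float64")
ET.SubElement(series, "EncodedValueSet").text = encode_values([0.012, 0.034, 0.051])
print(ET.tostring(series, encoding="unicode"))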

The topic of maintaining data in standard formats, particularly for long-term storage and retrieval purposes, is something that Biovia’s Hayward and Jansen also bring into the conversation. While BSSN Software and LabWare are pioneering the use and adoption of AnIML, Biovia is working with the Allotrope data format for storage of experimental data. 

‘The idea is that whatever system you are using, at least the data from all your instrumentation is archived in a standard format,’ Hayward says. ‘Then, as long as you define the ontologies in the right way, you also won’t lose data in translation or introduce errors when connecting systems together.’

BSSN offers interface software that customers install, which on the one hand can organise their data flow to ensure a high degree of integrity between the instrument and the controlling system – typically a LIMS – and on the other enables bidirectional instrument communication. ‘We want the system to be able to tell the instrument what to do, as well as to retrieve information generated by the instrument,’ Schaefer notes. ‘Describing this interaction is also a key part of the complete data package that companies should aim to derive. We can do this using another standard, SiLA, which allows us to submit samples to instrument software, create instrument workflows, and then retrieve the data package back in AnIML.’
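The precise client API depends on the SiLA implementation in use (SiLA 2, for instance, is built on gRPC), so the Python sketch below illustrates only the submit-then-retrieve pattern Schaefer describes; the class, method names and endpoint are invented for illustration, not a real SiLA library:

class InstrumentClient:
    # Hypothetical wrapper; a real SiLA 2 implementation would speak
    # gRPC to the instrument's server. All names here are invented.
    def __init__(self, endpoint):
        self.endpoint = endpoint

    def submit_samples(self, sample_ids, method):
        # Tell the instrument which samples to run and how (stubbed here).
        return "RUN-0001"  # placeholder run identifier

    def wait_for_completion(self, run_id):
        # Poll or subscribe until the run finishes (stubbed here).
        pass

    def fetch_animl_result(self, run_id):
        # Retrieve the finished data package as an AnIML document.
        return "<AnIML>...</AnIML>"  # placeholder payload

client = InstrumentClient("hplc-01.lab.example:50052")
run_id = client.submit_samples(["S-0481", "S-0482"], method="assay_v3")
client.wait_for_completion(run_id)
animl_doc = client.fetch_animl_result(run_id)  # archive alongside the LIMS record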

It’s highly interoperable, he notes, and using SiLA as a communication protocol provides the means for standardising the language used to instruct the instruments.

For Schaefer, the ultimate goal is to be able to marry your instrument data with the package of contextual information, procedures, equipment and personnel data, and parcel it all up in one set of AnIML documents that can be archived independently of your LIMS.

‘Companies currently maintain all of their calibration data, study data, instrument and batch release data in the LIMS, but at the end of the day a LIMS is still part of that perishable construct called software, and from a business, as well as a regulatory, perspective you need to plan for the failure of such systems. Assuming that your laboratory will always have the same LIMS is a bit shortsighted when you think about the retention times that may be required for data,’ says Schaefer.

At the 2017 European Bioanalysis Forum (EBF), LabWare introduced the notion of a complete bioanalytical study archive that could contain all relevant data in AnIML format. ‘This could include all analytical instrument data, potentially from multiple analytical techniques, all sample tracking and storage records, sample freeze-thaw counts, rejected and reported results, incurred sample reanalysis, reassay history, etc.,’ says Schaefer.

‘An AnIML-based study archive could be stored in a LIMS or other secure centralised repository. In principle, development of such an archive would ensure that all study data would be readable at any time without the original instrument software, LIMS or ELN. This would be relevant not only in bioanalytical laboratories, but also in other regulated laboratories involved in preclinical and clinical drug studies,’ concludes Brennan.


