FEATURE

Regulating scientific discovery

Sophia Ktori explores the use of laboratory informatics software in regulated industries, in the first of two articles on the subject

Most of the issues for regulated industries are centred on how you acquire, manage and store your data, and knowing what depth and breadth of data may come under regulatory scrutiny from development to manufacturing QA/QC, explains John Gabathuler, director, industrial and environmental, at LabWare.

‘While all laboratories should comply with their own, national and international regulatory standards where applicable, laboratories operating in regulated sectors may also have to comply with standards such as the ISO 17025 quality management standard, regulatory data integrity guidances, and 21 CFR Part 11,’ said Gabathuler.

The ISO 17025 standard (broadly) encompasses technical competence, documentation, the control of records and data, and results reporting. Regulatory data integrity applies to systems and processes in the chain of custody of the data, while 21 CFR Part 11 details the FDA’s regulations on electronic records and e-signatures.

There are many requirements that must be met to achieve compliance, Gabathuler continues. ‘Labs must provide sufficient evidence of adherence to their regulatory procedures. And, critically, they have to demonstrate that the results being put into the system are the true results that were acquired at the point of testing.’

Do this electronically – by integrating your instruments with your laboratory management systems, whether a laboratory information management system (LIMS) or an electronic laboratory notebook (ELN) – and there is much less opportunity for error. Labs whose analysts and technicians input data manually will need more processes in place to ensure the integrity of their data, he notes.

While it may seem that the topic of regulatory data integrity has become unavoidable in any discussion on data management, its importance really can’t be overstated, Gabathuler suggests.

Capturing the right data

‘Historically it’s been the QA/QC laboratories’ LIMS that test products going out the door, or laboratories that test the raw materials coming into a manufacturing process, which have been most concerned with regulatory data integrity, but we are now seeing this whole concept driving down into other industries and also into research and development workflows. Audit trails, electronic signatures and laboratory execution functionality are now more common in electronic laboratory notebooks (ELNs) that serve structured and less structured laboratory workflows,’ comments Gabathuler.

Everything comes down to the ability to capture data at source, and to ensure that your data has not been modified, or if it has, that any modification has been assessed, qualified and approved by the relevant authorised staff, he continues. ‘From a compliance standpoint, this holds true whether you are in a discovery lab or a QA/QC lab, and whether you are capturing data automatically from an instrument that interfaces directly with your LIMS, or whether you are capturing or recording data on a mobile device such as a smartphone or tablet.’
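The principle of capturing data at source and making every later modification visible and reviewable can be sketched as an append-only audit trail, in which each entry is hash-chained to the one before it, so that any tampering with the recorded history is detectable. This is an illustrative sketch only; the class, field names and hashing scheme are assumptions, not any vendor's actual implementation.

```python
import hashlib
import json
from datetime import datetime, timezone

class AuditTrail:
    """Append-only record of result captures and modifications.

    Each entry stores who did what and when, plus a 'reason' for any
    change, and is chained by hash to the previous entry so that later
    tampering with the history can be detected.
    """

    def __init__(self):
        self.entries = []

    def record(self, sample_id, field, value, user, reason=""):
        prev_hash = self.entries[-1]["hash"] if self.entries else "0" * 64
        entry = {
            "sample_id": sample_id,
            "field": field,
            "value": value,
            "user": user,
            "reason": reason,  # why a value was changed, for reviewers
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "prev_hash": prev_hash,
        }
        payload = json.dumps(entry, sort_keys=True)
        entry["hash"] = hashlib.sha256(payload.encode()).hexdigest()
        self.entries.append(entry)
        return entry

    def verify(self):
        """Recompute every hash; returns False if any entry was altered."""
        prev = "0" * 64
        for e in self.entries:
            body = {k: v for k, v in e.items() if k != "hash"}
            if body["prev_hash"] != prev:
                return False
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if digest != e["hash"]:
                return False
            prev = e["hash"]
        return True
```

In this sketch a changed value does not overwrite the old one: a new entry is appended with a reason, and `verify()` exposes any edit made outside the `record()` path.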

Interestingly, most of the regulatory bodies do not provide companies with to-the-letter instructions on how to achieve compliance, Gabathuler notes. Rather, their guidelines and standards expect laboratories to demonstrate sufficient evidence that they have the processes and controls, the required level of validation, and the ongoing risk-based data reviews in place.

‘For a LIMS or ELN system, such as LabWare LIMS or LabWare ELN, that means building functionality into the system both for direct results capture, and for a full and complete audit trail that encompasses all data input,’ comments Graham Langrish, sales manager for life sciences at LabWare.

‘The idea is that we provide the standard functionality to help customers meet compliance and ensure regulatory data integrity, but at the same time offer the flexibility so that clients can adapt the system to meet their business processes as well,’ adds Gabathuler.

A software system designed to help achieve regulatory compliance in a laboratory should also be versatile enough to accommodate changes in existing regulations, or to encompass new compliance mandates, Gabathuler continues: ‘Key to that is building a versatile system that won’t become obsolete, or require customisation in the face of new or changing regulations, and that will also allow companies to adapt, as simply as possible, according to their interpretation of the regulations. Laboratories should never have to go out and buy a whole new system because new regulations have been introduced.’

Common sense regulation

Much of what regulators are asking laboratories to provide with respect to their data is more or less common sense, comments Daniela Jansen, director of product marketing at Dassault Systèmes’ brand Biovia. ‘Any laboratory or company should want to have good quality data, because decisions will be based on that data, whether we are talking about developing a new product, or releasing a product batch.’

The quality, completeness, reproducibility and integrity of data are at the top of the list of regulatory compliance must-haves, although the topic of data integrity hit celebrity status when the FDA released its latest guidelines in April 2016, in light of potential data integrity breaches identified during routine inspections. ‘But it’s not just relevant to regulated industries,’ Jansen notes. ‘Setting in place steps to ensure data integrity is something that all companies should strive for, whether they are in a regulated sector or not.’

One of the major differences between regulated and non-regulated industries is the level of documentation, she states. ‘From FDA’s perspective, if you haven’t documented it, you haven’t done it. This is adding a lot of additional work for organisations.’ Data review and approval is another ‘burden’ that is tied to data quality, and keenly reviewed by regulators, Jansen continues. ‘The timely and adequate appraisal of both the data and the process by which that data was derived is central to the issue of data quality.’

In today’s digitally driven labs there should perhaps be no excuse to claim ignorance, she believes. ‘Informatics systems such as ELNs can dramatically help to reduce the likelihood of intentional or accidental errors in data input and alteration, and ensure data integrity and quality.’ Direct acquisition of measurement and analytical results from instrumentation removes the need for manual data entry, while automatically flagging changes to data, highlighting anomalous results and enforcing step-by-step review and approval make reviews easier and ensure that procedures are followed.

This is data handling for good practice, adds Stephen Hayward, product marketing manager at Dassault Systèmes’ brand Biovia. ‘Laboratories are increasingly looking for digital solutions that reduce the potential for manual error, acquire data in real time, and provide chain of custody. Ultimately they want to make sure that their data is complete, can be reproduced and, importantly, re-examined, in context, at a later date, both for regulatory review and also to aid future decision making.’

For an R&D-driven organisation, the drive to fulfil data quality requirements shouldn’t just start at the point where the regulator may step in and want to look at data, Jansen believes. Rather, data quality, integrity and accessibility should become an integral consideration at any stage where those decisions may impact on development, manufacture or release. Ideally labs should have an infrastructure that can record data in combination with all contextual information, in formats that can be viewed and interrogated side-by-side. This doesn’t just satisfy the regulatory requirement for full and complete data, but also allows the lab to derive far greater value from that data over the short and longer term.

‘Critical to this is the ability to handle disparate forms of scientific data,’ Jansen notes. The context in which that data was derived is similarly critical, she adds, whether your result is a simple weight or pH measurement, or a more complex set of chromatographic and mass spectrometry data. ‘A result that is just a number is meaningless. But if you know that number is a pH, then it gets a value, and if you know what you measured the pH of and why, then you provide some initial context. Combine this fundamental ‘answer’ with information on which pH meter was used, when it was calibrated, under what conditions the measurement was made, the workflow in which the pH measurement was taken, and who performed the measurement, and then you can start to derive some meaning.’
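Jansen's pH example can be sketched as a data structure in which the bare result is stored together with the metadata that gives it meaning. The field names here are illustrative assumptions, not a real LIMS schema.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone

@dataclass
class Measurement:
    """A result bundled with the context that makes it meaningful."""
    value: float          # the bare number: 7.4 on its own means nothing
    quantity: str         # ...until you know it is a pH
    sample_id: str        # ...of a specific sample
    instrument: str       # which pH meter was used
    last_calibrated: str  # when that meter was calibrated
    conditions: dict      # e.g. temperature during the measurement
    workflow: str         # the SOP or workflow it belongs to
    operator: str         # who performed the measurement
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# A hypothetical contextualised pH result:
m = Measurement(
    value=7.4, quantity="pH", sample_id="S-001",
    instrument="pH-meter-03", last_calibrated="2024-01-15",
    conditions={"temperature_C": 25.0},
    workflow="SOP-112 buffer QC", operator="analyst1")
```

Stored this way, the result can later be re-examined in context: a reviewer sees not just 7.4, but which meter produced it, its calibration date, and the workflow it belongs to.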

Today’s laboratory informatics solutions are ideally suited to managing data acquisition and contextualisation, in parallel with providing complete audit trails. Layered on top of this, Jansen continues, laboratories also need to demonstrate that they have followed correct procedures throughout an analytical or experimental workflow. ‘Has your complete, accurate and unadulterated data been acquired by following the correct procedures?’

Whereas, historically, standard operating procedures (SOPs) and laboratory workflows have been written as paper-based instructions – with no way of verifying that the scientist or technician carries out every step according to specifications – smart laboratories are incorporating laboratory execution system (LES) software into their informatics infrastructure, whether as bolt-on modules, or directly embedded into their laboratory informatics environment.

‘The LES effectively leads staff through each stage of an SOP, whether for an experimental or an analytical workflow, and progression to the next stage is only permitted once the previous step has been completed and, if necessary, signed off,’ Jansen notes. ‘Combine procedural control and management with management of personnel authorisation – and confirmation that they have been trained to run a process or instrument – with management of instrumentation calibration and maintenance, consumables and other reagents, and you have a complete data package that should satisfy all regulatory requirements.’

Assisted compliance

SoftNLabs is a specialist laboratory IT consultancy that works with laboratories across industries, including pharmaceutical, food and contract labs, that are looking to select and implement software solutions that will help them achieve regulatory compliance as well as optimise laboratory operations and business processes.

‘We work with companies that are looking for a complete informatics infrastructure or LIMS, as well as clients who are looking to add new functionality or complementary systems to an existing platform, whether to help meet compliance or to expand the system into other departments,’ comments Julien Alvo, president of the France-based firm.

‘Adding new LES functionality, for example, may not involve a complete reorganisation of the informatics infrastructure. We can identify an LES solution that will work with the customer’s existing informatics systems, to digitise their protocols and procedures, and allow them to embed SOPs into their business rules,’ Alvo continued.

For regulated industries, lack of procedural oversight and validation can lead to potentially devastating consequences, Alvo comments. ‘Within the last five years or so one pharmaceutical company lost millions of euros when they were shut down by the regulator because they had not implemented adequate process validation before moving into full production.’

Encouragingly, Alvo notes, ‘in most cases, companies come to us because they have failed to find the right solution, not because they have failed from a regulatory perspective.’ Sometimes stepping back and stripping down needs from a business and regulatory perspective can help point to a more streamlined, efficient informatics solution.

Despite the warning stories that hit the headlines, there are, perhaps surprisingly, still companies operating in regulated sectors who continue to rely on paper-based records and spreadsheets to a lesser or greater degree. ‘They may use the software from their instruments but have no LIMS to underpin management of their samples and the resulting data. This makes it hugely problematic to ensure either a complete chain of custody for data, or to confirm its integrity, as there may not be any system for maintaining an audit trail. One of our roles is to help these companies transition to electronic systems so that they can comply with FDA regulation 21 CFR Part 11,’ Alvo notes.

So, what are the main criteria for companies working in regulated industries when they start to think about selecting a new or upgraded informatics system? Perhaps surprisingly, one of the first things that companies should consider is their user groups and user roles, he suggests.

‘Once we have a basic understanding of the field in which the client’s laboratory operates, say, quality control or bioanalysis, then we have a foundation on which to build. One of the first things we then try to define is the user groups and user roles, the types of data that the laboratory produces and handles, and in particular who will have control of the system.’

If user groups and roles aren’t defined early, then once the system has started to generate a large volume of data it becomes hard to keep tabs on security.

‘Say you have a laboratory with a chemistry and a bacteriology unit. If you don’t add your user roles and data groups at inception, then it very quickly becomes difficult to define where new data should be channelled,’ comments Richard Vaysse, technical director at SoftNLabs.

‘You can add new groups and individual users in at any time, but the rules must be established at the beginning of the project, and establishing user roles will make it possible to control who is authorised to add, modify or view data,’ Vaysse concludes.
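Vaysse's point about roles controlling who may add, modify or view data can be sketched as a simple role-to-permission mapping, using his chemistry and bacteriology example. The role names and permissions are illustrative assumptions.

```python
# Roles established at project start map each data group (here, the
# chemistry and bacteriology units) to the actions that role may take.
ROLES = {
    "chemistry_analyst":  {"chemistry": {"add", "view"}},
    "chemistry_reviewer": {"chemistry": {"add", "modify", "view"}},
    "bacteriology_tech":  {"bacteriology": {"add", "view"}},
    "lab_manager":        {"chemistry": {"add", "modify", "view"},
                           "bacteriology": {"add", "modify", "view"}},
}

def is_authorised(role, data_group, action):
    """True if the role may perform the action on that data group."""
    return action in ROLES.get(role, {}).get(data_group, set())
```

With the mapping in place from the start, new data is automatically channelled to the right unit and every add, modify or view request can be checked against the requester's role.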
