FEATURE

The science of diagnostics

Sophia Ktori highlights the role of laboratory software in medical diagnostics

Laboratory-based testing to diagnose a patient’s disease or understand the cause of specific symptoms has traditionally centred on two spheres. Clinical diagnostics is broadly concerned with the analysis of body fluids such as blood, while anatomic pathology is focused on the analysis of tissues, for example biopsies or cellular aspirates, commented Lisa-Jean Clifford, CEO at Psyche Systems. 

However, during the last couple of decades, the emergence of next-gen sequencing and other gene- and protein-based assays and technology platforms has led to the development of complex molecular diagnostics. These platforms don’t just identify pre-existing symptomatic disease, but are used to help predict disease risk or predisposition based on family history and genetic makeup, and to help clinicians select the most appropriate treatment for patients according to their genotype. 

Emerging diagnostics

‘The challenge from an informatics standpoint is really being able to support all of the core functionality around those analytical processes and the complex experimental and data workflows,’ Clifford suggested. ‘For predictive testing based on next-gen sequencing you need to have a database that can map patients’ histories, possibly in context with those of their families, including children, siblings, parents and grandparents. 

‘Pharmacogenomic testing to help select the best medicine for a single patient according to their genotype will involve the analysis of genetic makeup in combination with other patient data. 

‘Sophisticated algorithms are used to analyse all of this information and identify the best course of treatment going forward. It’s a huge ask for a laboratory information system (LIS) to be able to handle and coordinate experimental/analytical and complex data workflows and reporting requirements.’

The development of molecular diagnostic assays means that while some clinical and anatomic pathology laboratories do still focus on the more traditional types of testing, some laboratories are evolving to combine clinical and anatomic pathology diagnostics with a suite of molecular analyses, Clifford noted. ‘We are also seeing the emergence of specialty diagnostic laboratories that focus on just one, or a few, types of highly complex assays and technologies, such as next-gen sequencing, or predictive modelling.’ 

But whatever their specialisation, all diagnostic laboratories will have some key fundamental requirements in common, Clifford noted. ‘Underpinning every LIS will be a discrete database that lets you mine and compare disparate data so that you can derive maximum value from that data.

‘You also need seamless integration with instruments and other software, and a test ordering, scheduling and reporting system that can handle multiple types of workflow. A key requirement is the ability to integrate in a bidirectional manner between applications, and between instruments and applications, as well as output human-readable data. And on top of that your data flow must be managed and communicated in a compliant, automated fashion,’ Clifford stated.

Supporting diagnostics in the laboratory  

On top of these common requirements, every laboratory will have its own more specialised, higher-level requirements, even if it offers similar types of testing. ‘You need to be able to build and configure an LIS from the ground up to support the complexity of each laboratory’s operation.’ As a rule of thumb, clinical diagnostic testing is high volume but low complexity. Anatomic pathology is mid-volume and mid-complexity, while molecular diagnostic assays are generally low volume but highly complex. 

Compare any two LIS platforms and the front ends, which handle ordering and sample accessioning, are unlikely to differ hugely, irrespective of the type of laboratory, she continued. The middle section has to get the sample through the testing workflow, and that could involve just a single analysis on a single sample, or splitting the sample so that multiple additional tests can be carried out, each dependent on the results of the previous test. Set up the LIS to follow reflex rules and much of this can be automated, Clifford pointed out. 
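Reflex rules of the kind Clifford describes amount to condition and action pairs that the LIS evaluates each time a result is posted. A minimal sketch in Python (the test names, thresholds and rule format here are illustrative assumptions, not the behaviour of any particular LIS):

```python
# Minimal sketch of reflex-rule automation: when a posted result satisfies
# a rule's condition, the follow-up test is ordered automatically.
# Test names and thresholds are illustrative only.

REFLEX_RULES = [
    # (triggering test, condition on its result, follow-up test to order)
    ("TSH", lambda v: v > 4.5, "Free T4"),
    ("HBsAg", lambda v: v == "reactive", "HBsAg confirmation"),
]

def apply_reflex_rules(test_name, result):
    """Return the follow-up tests triggered by this result."""
    return [follow_up
            for trigger, condition, follow_up in REFLEX_RULES
            if trigger == test_name and condition(result)]

print(apply_reflex_rules("TSH", 6.2))  # ['Free T4']
print(apply_reflex_rules("TSH", 2.0))  # []
```

In a real LIS the triggered orders would feed back into scheduling and reporting; the point is that encoding the rules declaratively is what makes the reflex step automatable.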

It’s the back end, which handles potentially complex and diverse data to be interpreted and reported, where most of the complexity of an LIS resides. ‘In clinical and anatomic pathology your final result is likely to be a range-based value, or an observation, which comes off a single instrument. In molecular testing there will be a series of results that will need interpretation, either by software or by a physician.’
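The split Clifford describes, a single range-based value that can be flagged automatically versus a series of molecular results needing interpretation, can be mirrored in how a back end routes results. A hypothetical sketch, with reference ranges and queue names invented for illustration:

```python
# Sketch of back-end result routing: range-based clinical values are
# flagged against a reference range and auto-reported, while molecular
# result sets are queued for human or software interpretation.
# Ranges and routing labels are illustrative assumptions.

REFERENCE_RANGES = {"potassium": (3.5, 5.2)}  # mmol/L, illustrative

def route_result(test_name, result):
    if test_name in REFERENCE_RANGES:          # range-based clinical value
        low, high = REFERENCE_RANGES[test_name]
        flag = "normal" if low <= result <= high else "abnormal"
        return ("auto-report", flag)
    # molecular panels: a series of results needing review
    return ("interpretation-queue", result)

print(route_result("potassium", 4.1))   # ('auto-report', 'normal')
print(route_result("BRCA1 panel", ["c.68_69delAG"]))
```

A production system would layer units, critical-value alerts and audit trails on top, but the routing decision itself reduces to this dispatch.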

Psyche Systems has been developing and supplying dedicated LIS platforms for private and public hospitals and clinics since 1976. ‘We have developed separate systems for each type of laboratory; clinical, anatomic pathology, toxicology and drugs of abuse, microbiology, and molecular. All share the same database, so they can be integrated with each other. Laboratories that do multiple types of testing can use two or more of our platforms to get a fully informed and unified view of the patient.’

Psyche Systems believes LIS platforms tailored to different specialisations have advantages over out-of-the box systems that can require laboratories to modify their workflows to fit. ‘We configure each individual system in partnership with the customer so that it works in a real-world setting and provides all of the functionality required by physicians, technicians, laboratory managers and business managers.’ 

The Core LIMS tracks every sample from the moment it arrives in the lab, to identify, map and track progress at any timepoint and processing stage in the workflow

The Icahn Institute’s Mount Sinai Genetic Testing Laboratory in Connecticut carries out nucleic acid sequencing for clinical diagnostic and pharmacogenomic testing, and also for academic and pharmaceutical research and development. 

The unit works with investigators at the institute, and with external clinicians, academic researchers and the pharmaceutical industry. 

Selecting the right informatics platform

A laboratory that carries out such a diverse range of clinical and non-clinical testing using complex molecular genetics techniques requires a LIMS infrastructure that can handle multiple types of workflow, manage complex data, and also meet all regulatory requirements for clinical sample and data handling, process validation, data reporting and patient confidentiality, explained Todd E Arnold, PhD, managing director at the site. 

‘We started working with Core Informatics, a part of Thermo Fisher Scientific, three years ago to configure and deploy their Core LIMS system at the laboratory. The entire process took about 18 months. One of our key priorities was to be able to guarantee complete traceability of every sample from the moment that it arrives, so that we could identify, map and track progress at any time point and processing stage in the workflow,’ said Arnold.

‘Critically, the LIMS had to be easily adaptable, so that laboratory personnel could relatively easily add and validate functional elements for managing new workflows and instrumentation. We needed a platform that could be used on a day-to-day basis by technicians and scientists who aren’t experts in IT or LIMS,’ stressed Arnold.

Configuring the Core LIMS was a huge task, he admitted. ‘I’m not going to say that all of the key development work was easy – we were all down in the mud doing this, because some of the elements we needed weren’t quite there yet. There was a lot of tailoring, and we got a lot of support from Core Informatics to ensure that every part of the LIMS operation matched the exact requirements of the lab.’ 

The Mount Sinai facility carries out genetic testing for the clinical and research sectors, and both clinical and research samples are processed through similar workflows, Dr Arnold continued. ‘Every sample received at the Mount Sinai facility goes through the same clinically validated handling process, whether it is a research sample or has come from a cancer patient for diagnostic analysis that will then be reported back to a clinician. Samples are immediately anonymised, tagged and barcoded according to the designated workflow, so that none of the staff at the laboratory has access to any patient data. Accessioning triggers notification to relevant personnel in the lab that the samples are ready for processing, so the sequencing or other designated workflow can be started without delay.’
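The accessioning steps Dr Arnold outlines, anonymise, tag and barcode the sample, then notify the lab, can be sketched as a small function. The identifier format and notification payload below are assumptions for illustration, not Mount Sinai’s actual scheme:

```python
import uuid

def accession_sample(patient_record, workflow):
    """Sketch of sample accessioning: strip patient identity, assign a
    barcode, and return a notification for lab staff.
    The barcode format and notification text are illustrative only."""
    barcode = f"S-{uuid.uuid4().hex[:10].upper()}"
    anonymised = {
        "barcode": barcode,
        "workflow": workflow,
        # patient identity is held in a separate, access-controlled mapping,
        # so lab staff never see it alongside the sample
    }
    notification = f"Sample {barcode} ready for {workflow} workflow"
    return anonymised, notification

sample, note = accession_sample({"name": "..."}, "NGS sequencing")
print(note)
```

The key property, visible even in this toy version, is that the record handed to the lab contains no patient-identifying fields, only the barcode that links back to a controlled mapping.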

Each primary sample, say, a blood sample, has its own unique identifier, and sub-identifiers are given to sub-portions of that sample that are tested using different techniques or assays, Dr Arnold explained. Similarly, each patient has their own identifier, and sub-identifiers are used to track different samples from that same patient, whatever tests are undertaken, at any point in time. ‘This is particularly useful in instances where, for example, a patient’s response to a particular therapy is monitored over time.’ 
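One way to realise such a parent and child identifier scheme is with dotted sub-identifiers, so that a primary sample S-001 split into aliquots yields S-001.1, S-001.2 and so on. The convention and class below are hypothetical, not the laboratory’s actual implementation:

```python
# Sketch of hierarchical sample identifiers: each sub-portion (aliquot)
# of a primary sample gets a dotted sub-identifier, so every derived
# sample traces back to its parent. The ID convention is illustrative.

class Sample:
    def __init__(self, sample_id, parent=None):
        self.sample_id = sample_id
        self.parent = parent
        self.children = []

    def split(self):
        """Create an aliquot with the next dotted sub-identifier."""
        child = Sample(f"{self.sample_id}.{len(self.children) + 1}",
                       parent=self)
        self.children.append(child)
        return child

    def lineage(self):
        """Trace an aliquot back to its primary sample."""
        node, chain = self, []
        while node:
            chain.append(node.sample_id)
            node = node.parent
        return chain

blood = Sample("S-001")
dna = blood.split()      # S-001.1, e.g. for sequencing
plasma = blood.split()   # S-001.2, e.g. for a protein assay
print(dna.lineage())     # ['S-001.1', 'S-001']
```

The same pattern extends upwards: a patient identifier can sit above the primary sample, which is what makes longitudinal monitoring of one patient’s samples over time straightforward.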

All the Mount Sinai laboratory’s workflows, instrumentation and data handling and reporting processes comply with CLIA and CAP requirements and meet HIPAA and PHI confidentiality requirements. ‘We also have quality control elements throughout the entire workflow, which the LIMS oversees to ensure that no part of a workflow or process is compromised, and that the required data is collected, handled, reviewed and reported according to regulatory guidelines.’

But the Core LIMS isn’t just a workflow tracking system, Dr Arnold stressed. 

‘We utilise the LIMS to track inventory for reagents and consumables, and also to monitor and analyse key performance indicators as part of our ongoing quality management. The laboratory promotes a culture of continuous quality improvement, and we mine data that is captured by the Core Informatics system to help make sure that our team and laboratory function at the best possible level at all times.’

The Mount Sinai facility currently has the capacity to analyse 50,000 to 80,000 samples per year, and that number will only grow, Dr Arnold suggested. ‘As we grow and expand, throughput will increase, and we will undoubtedly be looking to incorporate new assays, panels and sample sources into our workflows. There will possibly be new nucleic acid sequencing platforms, and other technologies for related analytical methods including gene expression, protein analysis and nucleic acid amplification. 

‘We currently operate a range of instrumentation from different vendors, all connected to Core LIMS as part of a workflow. This range of equipment and vendors will expand, and will become more diverse as we witness increasing amounts of cross-disciplinary research, for example physicists working with biologists. Although instrument vendors are becoming aware that their equipment may get plugged into multidisciplinary workflows, having the Core Informatics platform makes equipment integration and interfacing between software and the LIMS seamless,’ Arnold continued.

The Core LIMS platform has been set up as a part locally and part cloud-hosted system, but the likelihood is that more and more of the data workflows will be migrated to the cloud. ‘This will provide greater capacity, but also make it easier for people outside of the site to access permitted data, and setting up a joint cloud environment is easier than setting up multiple VPNs,’ Dr Arnold pointed out. 

Nicole Whitney, senior application manager for genomics at Core Informatics, maintains that working with customers like the Mount Sinai laboratory provides key insights that direct continued evolution of Core Informatics’ offerings to meet industry needs and address bottlenecks and pain points. ‘We use industry best practices to create applications for specific processes,’ she noted. ‘We take the feedback that we get from customers about the applications, and use it to create new versions of the apps that can even better meet their needs. These applications are available in our Platform for Science Marketplace, and customers can configure and combine them to form the foundation for their workflows. If a customer changes or adds a new vendor’s instrument, they can access the most appropriate apps to help build and configure that new workflow.’ 
