FEATURE

Leaving a legacy

Robert Roe investigates the importance of upgrading legacy laboratory informatics systems and the benefits this provides to scientists

Organisations adopt informatics solutions in order to accelerate their capability for research, but once these tools can no longer provide sufficient value they must be replaced. However, choosing the right replacement requires an understanding of the role of the laboratory and how it might change in the future.

This is, in part, because there is a host of varied solutions available, both in terms of laboratory information management systems (LIMS) and electronic laboratory notebooks (ELN), as well as combined platforms that offer the functionality of both.

There is also a wide variety of deployment models and underlying technologies, such as hosted or cloud-based software packages, which further complicates purchasing decisions.

When is the right time for an upgrade?

Ed Ingalls, Thermo Fisher Scientific’s director of professional services for the Americas, explains the reasoning behind most users’ choice to upgrade their software: ‘Typically the systems tend to be aged and the technology has passed them by – they are not satisfied with the system anymore whether or not it is supported by the vendor. Oftentimes the customers themselves have neglected them and they just fall kind of dormant and they are not getting enough value out of them.’

Maarten den Boer, Thermo Fisher Scientific’s director of professional services, EMEA, agrees: ‘We also see a lot of home-grown systems that need to be replaced,’ adding that these projects ‘will sometimes require external developers or proprietary languages that are no longer available for further software development’.

Managing and continually updating an internally developed software package is an expensive and time-consuming task. Software development requires a company to support an IT infrastructure with sufficient expertise to maintain its software. Most laboratories do not want to carry this burden.

Trish Meek, director of product strategy at Thermo Fisher Scientific, states: ‘A lot of times when upgrading those internally developed systems we find that the resources are ready to be retired, they have left the company and they just cannot support them internally anymore.’

Customise vs configure

Rather than creating in-house software, many informatics vendors now offer tools to configure workflows without the addition of custom code.

John Boother, managing director at Autoscribe Informatics, says: ‘One of the strengths that we have is the flexibility of our software, our ability to genuinely configure the software to meet individual customers’ requirements. What I mean by genuine configuration is we have developed a set of tools that allow us to develop workflows and screens for the system.’

Boother gives an extreme example of a lottery management system that was created using the Matrix Gemini LIMS system ‘without writing a single line of code because we can do it all with configuration tools.’

Laboratory informatics software is developed to support scientists in carrying out laboratory operations. Traditionally this meant sample management and support for testing and experimentation – making it easier to store and analyse the data being created. However, as the role of the laboratory becomes more complex through new methods and more data- and compute-intensive technologies, upgrading a legacy system requires an understanding of how that role can change over time.

‘The system can be continually or periodically updated in terms of its user interface and its workflows and so on to match the needs of the user company. The LIMS that we supply have got to be designed for change because laboratories change,’ says Boother. ‘I am not talking about the need to add different tests but something a bit more fundamental than that.’

Boother says that, increasingly, it is not just the initial sample data that needs to be collected. A user may want to adjust the way the data from a certain test is stored, adding new data streams or adjusting existing parameters to allow better integration with other data sets. The Matrix Gemini system is designed to allow easy reconfiguration to tailor data acquisition and management in addition to setting up new tests and experiments.

Anthony Uzzo, Core Informatics co-founder, explains the importance of introducing workflows into informatics software: ‘This is the issue that has plagued many of the legacy LIMS and ELN products in the market. In order for them to be tailored to the unique needs of customers, they need to invest in a significant amount of custom code. That makes each installation of their product different from all of the rest of the systems that they are responsible for supporting.’

This is a major step towards extending the life of informatics systems, as users are not adding code to a software package but creating customised experiments that can help them adapt to changes in laboratory function.

As Uzzo states, these tools allow vendors to unify their software without custom code embedded into individual installations – significantly increasing a vendor’s ability to maintain and support users. All of the informatics software vendors interviewed for this feature include their own version of workflow creation tools to support user configuration.

‘These are accessing frameworks that we have built into the system to empower the customer to deal with aspects of their workflow that are unique or to give the customer the ability to deal with change in the future,’ says Uzzo.

Supporting change

When replacing a legacy system the focus must be on the requirements of the users, but this is not a simple task. In most cases, it requires a deep understanding not only of the industry that the laboratory serves, but also of how changes in that industry might affect the future development of the methods and experiments used.

As den Boer highlights, it is not the upgrade that is most important but the reasoning for the upgrade: ‘I think data is one of the big challenges. Before you start the process of replacing a legacy system you need to think about the intent. You need to ask yourself “did my business change in the meantime?” Normally the business has changed since the company first implemented the system. The lab has moved on but the system has not moved with it.’

The latest software products allow users to customise the functionality of a LIMS or ELN using workflow tools, but legacy systems lack this capability, so there inevitably comes a point where their functionality is no longer sufficient to support changes in the laboratory.

Den Boer explains that, to manage upgrades effectively, it is imperative that users ‘sit down and really think about what they want to achieve with this upgrade or replacement of this system’.

Failure to properly audit a lab’s function and requirements is a common pitfall in den Boer’s experience, and the result is a company that is on the back foot from the start of a new deployment or upgrade. The vendor and the customer end up learning this the hard way as they configure the system to meet the laboratory’s requirements. This can cause delays and unwanted complications, which further postpone the new system.

Beyond managing the requirements for an upgrade, Thermo Fisher’s Ingalls stresses that data migration can be the biggest challenge to successfully deploying a new informatics system: ‘Different LIMS systems’ data models can be so radically different that trying to do a full migration of data can get really challenging.’ Ingalls adds that this applies more to the sample, test, or lot relationship than to master data, but nevertheless combining disparate data sets from different platforms can be extremely difficult and time-consuming. This is particularly troublesome if proprietary data formats are involved.

‘Sometimes you may limit the new system to try and accommodate the old data, and that is a pitfall that you can get trapped in – then you are not getting the full value out of your new system because you are making compromises to support the old data,’ says Ingalls. ‘There are definitely some tough decisions to be made about why you want to move all your legacy data into that new database model – I would say that there need to be some pretty strong factors to want to get into that.’

Data migration can be a serious concern for organisations, particularly those using instrumentation with proprietary data formats or very old legacy systems, which make it more difficult to port data across to the new system.

In these cases, organisations must decide how much of this data needs to be recovered and stored. Some companies want to store everything, but den Boer stresses that this is not always the right choice – if the laboratory is moving into new methods and processes, then not all data may be necessary. ‘Data is one of the other challenges,’ says den Boer. ‘What data do you really want to port over? Do you want to do data cleansing as part of this exercise?’

Radical change?

However, upgrading legacy systems is not just a case of throwing out the old and getting the shiniest new technologies. As Core Informatics’ Uzzo explains, it is an opportunity to radically change the functionality and efficiency of the laboratory by adopting new methods to conduct research: ‘As businesses shift from small molecules to biologic therapies, they are increasingly looking to leverage next-generation sequencing (NGS) technology or CRISPR-Cas9 workflows. No organisation, even large pharma, possesses the requisite capabilities to develop these modalities and bring them to market.’

Uzzo highlights pharmaceutical companies as one industry in particular that is experiencing considerable change. ‘Pharma workflows have entirely changed over the course of the last five to 10 years,’ says Uzzo. ‘These businesses have the choice of continuing to invest in custom code for legacy client-server products, which only run on their networks, or they can use this as an opportunity to evolve to a modern informatics platform, in the cloud, that has the flexibility to suit any and all of their laboratory workflows.’

In addition to new modalities and research methods, pharma companies have also experienced change in externalisation and outsourcing of drug development processes. This provides an excellent opportunity to adopt cloud-based technologies which can further bolster a company’s capability for collaboration with external partners.

Core provides three methods to deploy its informatics software: a public, multi-tenant environment; hosted private installations using Amazon Web Services (AWS); and a traditional model of deploying servers on premises within a client’s facilities. While around one third of its customers currently use each deployment method, the AWS model is growing rapidly year-on-year.

‘Every year the proportion of our customers running in AWS increases; we expect it to be well over 80 per cent by next year,’ states Uzzo. ‘The vast majority of new customers that join Core are going straight into the cloud. They are first motivated by changes in the laboratory, but when evaluating new informatics systems they are using that as an opportunity to switch to vendors that have the most mature cloud competency. Pharmaceutical companies are increasing their adoption of externalised research partners, which makes a cloud-based system even more important to adequately share and exchange information with these collaborators,’ adds Uzzo.

Many of these new cloud customers want to take advantage of cloud-based systems to facilitate the sharing of data with both internal and external collaborators. Thermo Fisher also reported a large increase in collaborative approaches for drug development.

‘Collaboration is definitely something that we have seen over the last ten years with the disaggregation of pharma and outsourcing to CROs and CMOs and strategic partnerships with external organisations,’ says Meek. ‘Sharing that data outside of your four walls is more important than before.’

‘We are seeing a lot of customers looking at how they can get the most out of their data and that has really changed the way they look at the laboratory and subsequently the way they look at the laboratory systems that manage that process,’ states Meek.

Upgrading legacy systems requires an understanding of change but, if handled correctly, it not only provides an opportunity to adopt the latest infrastructure but can also deliver added flexibility, performance and functionality.

Uzzo explains that this is particularly true for companies using data-intensive technologies such as NGS, because cloud-based systems can provide additional resources on demand: ‘Customers running within our infrastructure can burst into available compute capacity, as loads dictate. Once that workload diminishes those application resources are automatically de-provisioned so that users are only paying for what they needed.’

He concludes: ‘Ultimately, this means that they do not have to worry about deploying and managing that infrastructure.’
