Cloud capabilities


Cloud computing helps facilitate migration to new software platforms and supports real-time data access, writes Sophia Ktori


A recurring issue for companies looking to migrate their LIMS and other lab software into the cloud is that much of that hardware or software may be very specialised. It may also have become embedded in the existing lab network, and so is difficult to disentangle, explained James Pena, product manager, digital science at Thermo Fisher Scientific. “Whether that software is a LIMS, LIS, or a more specific instrument software – say, chromatography, or mass spectrometry – it can be tied into the lab infrastructure in a very fixed, or static, way.”

“That existing lab infrastructure can be very entangled,” Pena continued. “It’s a daunting task to think about how to even start untangling that web, but a stepwise approach that focuses on ‘modernising’ and modularising the lab is a good starting point.”

Oftentimes organisations have reached the stage where they may have the financial go-ahead to start on a cloud migration journey, but they don’t know how to start. “We need to start thinking about how labs can become more universally modular,” noted Pena. “Is it possible to reach a point where each piece of software can be added, updated, and removed, without impacting interconnected systems, and/or generating a need to requalify and revalidate the whole environment to satisfy regulatory requirements?”

One approach is to break apart the typically “big, monolithic, and fixed” software infrastructure into smaller pieces. This makes it possible to reduce what Pena describes as the qualification and validation burden. “One of the key enablers of modernisation is this concept of implementing universal, modular pieces of software and hardware that can be switched in and out of the laboratory landscape. This can help to minimise regulatory impact, and also reduce any impact on the functionality of other pieces within that modular environment.”

Take a stepwise approach to untangling the existing environment and modularising the network, and the task may not seem so daunting. “We can start this process by implementing universal integration points between individual pieces of technology,” Pena suggested. “There’s been a huge shift towards the use of RESTful services and more universal APIs, and even some industry-led standards, such as Fast Healthcare Interoperability Resources (FHIR). These tools can make integration less of a trial.

“Taking this approach, we can consider breaking down the existing software monoliths into cloud-based microservices that can be integrated more simply. Internally at Thermo Fisher Scientific, we are working on ensuring that our LIMS software has the ability to communicate and integrate with our customers’ existing informatics landscape, using tools that include these gateways and enablers.”
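To make the integration pattern Pena describes concrete, the sketch below shows what a FHIR-style RESTful exchange can look like: building a standard search URL and pulling results out of the returned Bundle. The endpoint, patient identifier, and observation code are entirely hypothetical, and this is not any vendor’s actual API.

```python
import json
from urllib.parse import urlencode

def fhir_search_url(base_url: str, resource_type: str, **params) -> str:
    """Build a FHIR REST search URL, e.g. GET [base]/Observation?patient=123."""
    return f"{base_url.rstrip('/')}/{resource_type}?{urlencode(params)}"

def extract_values(bundle: dict) -> list:
    """Pull the numeric results out of a FHIR searchset Bundle of Observations."""
    return [
        entry["resource"]["valueQuantity"]["value"]
        for entry in bundle.get("entry", [])
        if "valueQuantity" in entry["resource"]
    ]

# Hypothetical LIMS endpoint and identifiers, for illustration only.
url = fhir_search_url("https://lims.example.com/fhir", "Observation",
                      patient="123", code="718-7")

# A minimal searchset Bundle, shaped the way a FHIR server returns one.
sample_bundle = json.loads("""
{"resourceType": "Bundle", "type": "searchset",
 "entry": [{"resource": {"resourceType": "Observation",
                         "valueQuantity": {"value": 13.5, "unit": "g/dL"}}}]}
""")

print(url)
print(extract_values(sample_bundle))
```

Because the request and response shapes are defined by the standard rather than by any one vendor, a module written against them can be swapped in and out of the lab landscape without bespoke point-to-point glue code.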

From both the vendor and the customer’s perspective, it’s important to first understand the customer’s overarching goals, from a scientific, business, and cost-related viewpoint.

Realistic goals

Whether an organisation is just starting on a journey of migrating existing software into the cloud, investing in new cloud-based products to deliver specific functionality, or building out a complete cloud-based infrastructure, having endpoints and goals in sight makes it much easier to consider how to work towards those goals in a realistic manner. “Is the customer looking to reduce costs in the short term, enable global expansion, or set up collaborative environments and data management infrastructure? From there we can work stepwise to reach short-term and longer-term achievements.”

In practice, labs may be faced with deciding whether to migrate as much as possible of the existing infrastructure into the cloud, or whether to start afresh with a completely new cloud-based software setup. There are cost implications associated with both strategies, Pena acknowledged. Change management is another major consideration. Modernising the existing software landscape and adopting agile microservices in the cloud may help to lessen the level of change management and culture shift, both for day-to-day working practices in the lab, and at an enterprise level. “Scientists don’t want to have to stop their daily working practices to have to learn how to use a new piece of software.”

There may also be significant time- and IT expense-related issues associated with configuring new software until it does the job that is required. And new products may not be 100% compatible with the data generated by legacy systems. “Starting from scratch, it’s possible that labs will lose some functionality, and existing data may not ‘fit’,” Pena noted. “From the perspective of the customer, modernising and migrating the functionality that they have spent 20 or 30 years developing has immediate benefits when compared with starting afresh and having to build up that functionality again.”

“But if the decision is made to invest anew, it falls to the vendors to configure their platforms to the user’s specifications,” he continued. “We then end up with something of a race, to see whether the incumbent can modernise their software so that it can be migrated into the cloud faster than a cloud-native newcomer can build out the desired functionality.”

Implementing a cloud-based landscape means that multiple sites, whether local or geographically distant, can be consolidated or unified into a single environment, Pena continued. What commonly happens for customers who manage point implementations of software is that each site will evolve its own local ecosystem. “And even if they all start at the same time, and all use the same SOPs and workflows, there will be significant drift,” and this can mean that equivalence is lost.

RESTful interfaces and microservice-based cloud infrastructure make it easier for users to define their structure, switch functionality in and out, and integrate new modules easily, Pena reaffirmed. “So from a customer’s perspective, or from an individual user’s perspective, a major benefit is agility, without the high investment in revalidation.”

“Our LIMS software is available as a cloud-hosted infrastructure, and has been built in a very methodical way to be customisable to the user’s needs, so that modernisation of lab software architecture to feed into that central LIMS then becomes feasible,” Pena continued. “We are spearheading a containerised approach to modernisation, and launching modernised hosted services that mirror cloud-native architecture.”

Cloud integration

Containerisation effectively makes it possible to standardise deployment of the LIMS software, Pena explained, whether customers are using Azure, or AWS, or another cloud service provider, or even deploying on premises. “What we can do is standardise that deployment, and from there we can start breaking off individual services into separate container-based microservices.”
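One common way a single container image stays deployable across Azure, AWS, or on-premises hardware is to keep all provider-specific settings out of the image and read them from the environment at start-up. The sketch below illustrates that pattern; the variable names and defaults are invented for illustration, not any product’s real configuration contract.

```python
import os
from dataclasses import dataclass

@dataclass
class DeploymentConfig:
    provider: str        # "azure", "aws", or "onprem"
    db_url: str          # database connection string for this deployment
    storage_bucket: str  # result-file storage location

def load_config(env=os.environ) -> DeploymentConfig:
    """Read deployment settings from environment variables, so the same
    container image runs unchanged on any cloud provider or on premises.
    All names here are illustrative, not a real product's interface."""
    return DeploymentConfig(
        provider=env.get("LIMS_PROVIDER", "onprem"),
        db_url=env.get("LIMS_DB_URL", "postgresql://localhost/lims"),
        storage_bucket=env.get("LIMS_STORAGE", "file:///var/lib/lims"),
    )

# Simulate the environment an AWS deployment might inject into the container.
cfg = load_config({"LIMS_PROVIDER": "aws",
                   "LIMS_DB_URL": "postgresql://rds.internal/lims",
                   "LIMS_STORAGE": "s3://lims-results"})
print(cfg.provider, cfg.storage_bucket)
```

Because the image itself never changes between targets, qualification effort shifts from “one install per site” towards “one image, many environments”, which is part of what makes the containerised approach attractive for validated lab software.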

“Vendors are also building in a diverse ecosystem of new capabilities and functionality in the cloud,” Pena pointed out. “So if you have the right integration-friendly plumbing in place, it’s possible to toggle, or bolt on, these new tools and functions, and make use of new pipelines that can bring additional business value.”

Working in the cloud then opens up access to an enormous reservoir of cloud-based data management and analytical tools and resources, and to collaborative environments. It also becomes easier to transfer relevant portions of data to or from customers, collaborators, and service providers. “You then can start to leverage data lakes for data management, and access data pipelines for machine learning and artificial intelligence-based applications. The cloud offers computing power that it’s unlikely most organisations will have access to locally.”

While integration remains one of the most obvious issues when migrating a LIMS into the cloud or when making a new web-based LIMS installation, this inherent need to connect and enable communication between disparate platforms also brings with it the requirement – on the vendor’s side – to ensure safety, commented Dave Dorsett, principal software architect at information technology consultancy Astrix.

Consider an organisation that’s looking to implement a cloud-based LIMS infrastructure, for example. “It’s likely that the vendor will be running a multi-tenant architecture,” Dorsett explained. “From the vendor’s side, there will be a defined service level agreement and a warranty of security, and this will impose fairly severe restrictions on the accessibility of that cloud-based infrastructure by equipment that is ‘on prem’ at the client’s site.”

Effectively, when the equipment that the client wants to integrate with the cloud-hosted LIMS remains physically in the client’s laboratory, and is not in the vendor space or in the vendor cloud, there is a security risk to the cloud network. The constraints that this can impose may even require the client to rethink their integration model in order to achieve what Dorsett calls the ‘balance spot,’ where the vendor is comfortable that it can meet its responsibilities under the SLA while still protecting against threats. “It’s a significant integration challenge,” Dorsett noted.

“Trust is a major issue – in fact, we operate now from a zero-trust perspective – and our networks have had to become increasingly sophisticated at managing that trust. When you then put a chunk of software under the management of an external vendor, the same issues apply. So as an external vendor with, say, five customers, not only do I have to be on the lookout for any threat that might transfer to me from those companies; their network people will have a zero-trust starting point with respect to my network as a vendor. Meanwhile, I also have to manage intercompany threats, to make sure that threats cannot move between clients.”

One approach to improving security is network segmentation – keeping lab networks and other enterprise networks separate, for example. This is a model that has been adopted by the manufacturing industry and is increasingly being implemented in R&D labs, Dorsett continued. “The simple way to do things is to put everything into one big network segment, so that every computer and system can see every other computer and system. But from a threat perspective, there’s then no effective way to seal any systems off, so if a virus or other malware reaches that environment, it can then access the entire hardware and software network. Merck is just one example of a major organisation that has in the past experienced a major virus problem in its manufacturing environment, forcing it to sustain major losses in output.”
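The segmentation logic Dorsett describes can be boiled down to a default-deny rule table: traffic between two segments is blocked unless a rule explicitly allows it. The sketch below illustrates the idea with invented segment names; real firewalls evaluate far richer rules (ports, protocols, identities), but the containment principle is the same.

```python
# Default-deny flow table: traffic passes only if explicitly allowlisted.
# Segment names are illustrative, not drawn from any real deployment.
ALLOWED_FLOWS = {
    ("lab", "lims-gateway"),         # instruments may push results to the gateway
    ("lims-gateway", "enterprise"),  # the gateway may forward to enterprise systems
    # Note: no ("lab", "enterprise") entry -- lab devices cannot reach the
    # enterprise network directly, so malware in the lab segment is contained.
}

def is_allowed(src_segment: str, dst_segment: str) -> bool:
    """Zero-trust check: deny unless the flow is explicitly permitted."""
    return (src_segment, dst_segment) in ALLOWED_FLOWS

print(is_allowed("lab", "lims-gateway"))  # True
print(is_allowed("lab", "enterprise"))    # False
```

With a flat network there is effectively a single segment and every flow is implicitly allowed; segmentation inverts that default, which is why a compromised instrument PC no longer gives an attacker a path to the whole enterprise.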

These sorts of events raised the profile of network security in a very relatable way, Dorsett stated. “People started to think that it may not be the best idea to let everything talk to everything else, but to take a segmented approach to building networks so that systems can be isolated if necessary. This is particularly relevant for some industries, for example, petroleum refining or chemicals, where there are very real safety issues. Here it may be that hardware and software on the plant floor can talk to each other in order to get the job done, but this plant floor IT environment is cordoned off from the enterprise network for safety.”

Communication is thus governed by rules and firewalls, along with reporting and tracking requirements, to ensure safety, Dorsett noted. “Negotiating these safety layers can be daunting. Data transfer becomes a matter of negotiating each of them in turn: you have to take data from an instrument that lives in a network segment buried in a customer’s environment, through a connection to the on-site network, across the internet, and then into a cloud-hosted version of the software run by the vendor. This can be hugely complicated to do seamlessly.” Layered on top of the need to integrate the laboratory scientific equipment and software, a cloud LIMS may also need to integrate with asset and inventory management, and other on-prem hardware and software architecture.

Software customisation is another significant obstacle to migration or implementation of LIMS in the cloud, Dorsett pointed out. “If you have an existing on-prem LIMS system, it’s likely to be highly customised. The industry has talked for decades about the need to configure, rather than customise, but the reality is that people do customise the heck out of their systems. They naturally write external code as workarounds. However, for a vendor-hosted model and SLA, there will be severe boundaries with respect to the degree of customisation that is feasible.” These restrictions can make migration of an existing, heavily customised system into the cloud unfeasible. “At the end of the day, migrating existing customisations can come up against a huge brick wall and tremendous costs, and it may be worth considering not so much a migration as a completely new implementation. The cost-benefits of migration versus new implementation will have to be calculated,” and with that, organisations then have to negotiate the minefields of functionality, features, and change management, Dorsett suggested.

Whichever route is taken – migration or starting afresh – organisations and users will have to expect that the resulting cloud-based LIMS platform will not be identical to their original system in every respect of functionality. “Even if you take the same vendor software and you move it from in-house to a hosted model, it’s going to be a radically different system.”

The problem here is that there is still a tremendous reluctance within the life science community to accept change, Dorsett said. “People want their blue icons and buttons to remain blue.” And this reluctance to change – even if the changes are positive or result in equivalence in function and operational ability - filters up from the operators, scientists, end users, and lab managers, to other stakeholders within the enterprise. 

Accept that change will happen, whether migrating or starting afresh, and cost analysis then becomes a particularly tough issue, Dorsett stated. “People still by and large don’t really understand the true cost of ownership when comparing cloud options to their existing on-prem system.” And this is for a variety of reasons, he suggested. When comparing an existing LIMS infrastructure to one that is moved into the cloud, it can be hard to objectively list and calculate the true cost of all the internal resources, including the physical assets like servers and machines, and also the ‘soft costs,’ including the people that run them. “So, when you attempt to figure out how much your on-prem system is really costing, it can be really tough to get actual numbers.” And if an organisation moves the LIMS up into the cloud, outside of its environment and direct control, then making these comparative cost analyses can be very cumbersome, “so cost-benefit analysis also becomes tricky, and the benefits, in terms of feature function, operational, all those types of things, may get vetoed by the hard financials.”

Particularly in today’s economic climate, there is intense pressure to dampen capital expenditure while operational budgets are being squeezed. Cloud hosting may be perceived to have the edge as it is paid for on a timed basis, with upgrades and service built in, Dorsett acknowledged. “However, there is a lot that’s hard to quantify, particularly in the LIMS space where systems must be validated, and there are additional costs of maintaining validation on top of that. In many instances, people are just not upgrading their systems because the costs of validations are so very high.”

FDA is changing its approach, however, he continued. “The agency’s intent is to move away from this document-oriented validation model to one that is more risk centred, which will encourage people to upgrade their systems and stay up to date.”

So, is it always best to migrate into the cloud? Not necessarily, Dorsett concluded. “It is not a given ‘better.’ There are evident benefits, such as agility, regular upgrades, and service-level support, for example. And for many organisations those benefits will overshadow the initial upheaval. But not all those benefits will necessarily be important to all users of a particular type of system, especially in the LIMS space.” The technical hurdles, expectation to fine-tune and customise functionality and features, coupled with the requirement to negotiate change management and issues of validation, as well as the cost of implementation, may all sway the cost-benefits back to retaining an on-prem system.