
Making the case for cloud

LIMS and ELN systems are increasingly deployed in the cloud to provide flexibility, added security and reduced costs. The advent of cloud-based systems and Software as a Service (SaaS) platforms is changing the way that laboratory management software is implemented.

For many users, the cloud and the SaaS model provide a platform that can support today’s workflows, while also offering the scalability needed by customers who are looking to use their data for machine learning (ML) and artificial intelligence (AI) workflows. It provides a software stack that can help reduce costs and increase productivity in the laboratory. The platform also provides an opportunity to aggregate data and add context, so that data stored today can be used in AI/ML workflows in the future.

Thermo Fisher, for example, offers LIMS, ELN, SDMS and instrument connection software across both on-premise and cloud deployments, in the form of either standard SaaS or enterprise-grade SaaS platforms. Users can pick which flavour of the technology is right for them and supplement the platform with additional apps that help to streamline the implementation of key workflows.

Ajay Shrestha, manager of technical operations, Digital Science at Thermo Fisher Scientific, said: ‘We provide software that can either run in the cloud or even on-premise. We also provide that as a hosted solution, for customers that want us to manage their data. Based on my experience in the past five years, I have seen more customers with a desire to move to the cloud as opposed to hosting these software systems on-premise.

‘The cloud provides a lot of benefits from a cost perspective, reliability perspective and in terms of security, contrary to what some people might believe. People think that because they are putting their data in the cloud, it might be less secure but it is, in fact, more secure and there have been studies which prove this,’ Shrestha added.

Cloud security

Today’s cloud systems are primarily hosted by a few key players, such as Amazon Web Services (AWS), Microsoft Azure and Google Cloud in the US, and Huawei or Alibaba Cloud in China. These companies provide the underlying infrastructure and, in many cases, provide much higher levels of security than private cloud environments. Using companies with specialised cloud security expertise provides a level of assurance that was not always possible with the private or in-house cloud systems developed by laboratories in the past.

Shrestha attributes the gap between the security actually provided and the perceived risk primarily to mindset. ‘Anything that you can have your hands around can make you feel more secure, so some people might have anxiety about having their data reside somewhere.

‘Cloud provides much more robust security but at the same time, there is a shared responsibility model so anything that is in the cloud is still the responsibility of the customer. In the case of managed services, that would be us as Digital Science. The security of the cloud is the responsibility of the cloud providers, and they address and take care of a lot of the low-level infrastructure and security so we, as customers, do not have to spend time and money managing security at the lower levels of the software stack,’ Shrestha added.

Trish Meek, director of marketing at Thermo Fisher Scientific, agrees that perception is the main stumbling block. ‘Having been in the LIMS industry for 20 years, there was a time when there was, particularly in the manufacturing space, this perception that “if I have a system behind my firewalls, that is the safest location”.

‘There have been so many breaches of people’s own infrastructures that there is now an understanding from the IT community that, due to economies of scale, a company like Amazon has the time and resources to invest in security in a way that a single petrochemical or pharmaceutical company cannot. They are now looking to providers like Amazon and saying “this is your core business, so we would like to leverage those capabilities, rather than building the same level of cybersecurity in-house”,’ added Meek.

She also noted that while Thermo chooses to deploy the managed systems using AWS, there is no technical limitation that prevents a customer from deploying their software on another cloud platform.

Shrestha said: ‘We are building our applications to be cloud-agnostic. They can take our software and deploy it on-premise, or using the cloud vendor of their choice. We have multiple tiers in our offering: we have the infrastructure at the very bottom, and on top of that we have the platform, which supports additional products. The platforms themselves can also support specific workflows, which are available in the form of apps that cater to particular workloads, such as life sciences, NGS or data analytics.’
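One common way to keep an application cloud-agnostic across those tiers is to have the upper layers depend on an abstract interface rather than on any one provider’s SDK. The sketch below is a hypothetical illustration of that pattern only; the class names and layout are invented for this example and do not describe Thermo Fisher’s actual architecture or APIs.

```python
# Minimal sketch of cloud-agnostic design: the workflow code depends on an
# abstract storage interface, so the backend (on-premise file system, AWS,
# Azure, GCP) can be swapped without touching application logic.
# Hypothetical example only.
from abc import ABC, abstractmethod
from pathlib import Path


class ResultStore(ABC):
    """Interface the application layer codes against, wherever it runs."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LocalFileStore(ResultStore):
    """On-premise backend: writes results to a local or network file system."""

    def __init__(self, root: str) -> None:
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def put(self, key: str, data: bytes) -> None:
        path = self.root / key
        path.parent.mkdir(parents=True, exist_ok=True)
        path.write_bytes(data)

    def get(self, key: str) -> bytes:
        return (self.root / key).read_bytes()


# A cloud backend (for example one wrapping boto3 for AWS S3, or the Azure or
# GCP SDKs) would implement the same two methods; archive_run never changes.
def archive_run(store: ResultStore, run_id: str, payload: bytes) -> None:
    store.put(f"runs/{run_id}.dat", payload)


if __name__ == "__main__":
    store = LocalFileStore("/tmp/lims-demo")  # swap in a cloud-backed store here
    archive_run(store, "run-0001", b"example instrument output")
    print(store.get("runs/run-0001.dat"))
```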

Developing a platform for science

Thermo offers a number of different software tools which are fine-tuned to the type of laboratory work that is going to be run on the system. Academic users or R&D organisations may choose Platform for Science, while users in manufacturing or other regulated industries would be more suited to Sample Manager.

‘We acquired Core Informatics in 2017, largely because of their advanced cloud capabilities. They had taken a cloud-native approach and built up the technical operations support and managed cloud deployments, so that was part of that acquisition strategy,’ said Meek.

‘We still have our traditional products in Sample Manager and Watson that lots of our customers are using in regulated spaces. I would say that the distinction is that Platform for Science is an R&D solution, so that applies to academia, as well as pharmaceutical industries, and Sample Manager continues to be our manufacturing solution designed for GLP environments – for those customers that are regulated,’ she continued.

This breaks down into several layers, with the cloud infrastructure at the bottom, then the Platform for Science software as the data management infrastructure designed to support scientific workflows. Users can then make use of apps targeted at specific workflows. These apps sit on top of the Platform for Science software.

‘The apps are pre-configured workflows, so the advantage is that these are implementation accelerators. If you think about genomics, for example, the workflow is pretty prescriptive in terms of sample preparation and handling those samples. Each of those steps in the process can be managed using protocols, whether that is our equipment or other vendors’ equipment,’ added Meek.

‘Building out those pre-configured steps helps to drive standardisation and also ensures that they behave in a way that is complementary with the protocol they are executing,’ said Meek. ‘There are also data analytics apps to help with the further analysis of data beyond just the step-by-step workflows. It is really about accelerating implementation and driving standardisation in that implementation.

‘They are a reflection of your business process, just as any LIMS or ELN deployment would be. You want to make sure you are mapping the workflow that is happening in the lab into the software, so that it is truly adding value to the customer and accelerating their laboratory workflow,’ said Meek.

Implementing machine learning

The applications provided by Thermo allow users to streamline processes such as sample preparation, or common steps in workflows such as defined polymerase chain reaction (PCR) experiments and gel electrophoresis. Many apps are targeted towards areas such as genomics or biopharmaceuticals, but there are also apps for data analytics, which can be used in conjunction with core Thermo software products such as LIMS or ELN.

‘Machine learning is huge in the life science and healthcare industry. It is able to sift through a lot of data, provide insight and make predictions based on trained models. It has also shown a lot of potential in areas such as disease diagnosis and even drug discovery. Just after the new year, Google AI announced that they were able to detect breast cancer far better than actual doctors, and last year they had a similar story looking at the diagnosis of prostate cancer,’ said Shrestha.

‘AI is definitely positioned to disrupt the life sciences industries to a great extent. That is more of a long-term impact, but we are also seeing the advent of microservices, which is the breakdown of large monolithic applications into smaller pieces so they can scale independently, provide better fault tolerance and greater reliability for the underlying application, and reduce the lead time to deliver features and functionality to customers,’ said Shrestha. ‘I would say microservices in the area of software development, and AI and ML in terms of addressing these more complex issues in healthcare and life sciences: those would be the two major trends that we are observing.’
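To make the microservices idea concrete, the sketch below shows a single, narrowly scoped service that could be deployed, scaled and allowed to fail independently of a larger application. The service, its endpoints and field names are hypothetical examples written for this article, not Thermo Fisher’s actual software.

```python
# Minimal sketch of a microservice: one small, self-contained piece of a larger
# laboratory application (here, hypothetical sample registration) exposed over
# HTTP so it can scale and be released independently. Illustrative only.
from uuid import uuid4

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="sample-registration")

# In-memory store stands in for the service's own database.
_samples: dict[str, dict] = {}


class SampleIn(BaseModel):
    barcode: str
    sample_type: str


@app.post("/samples")
def register_sample(sample: SampleIn) -> dict:
    """Register a sample and return its generated identifier."""
    sample_id = str(uuid4())
    _samples[sample_id] = {"barcode": sample.barcode, "sample_type": sample.sample_type}
    return {"id": sample_id, **_samples[sample_id]}


@app.get("/samples/{sample_id}")
def get_sample(sample_id: str) -> dict:
    """Fetch a previously registered sample."""
    if sample_id not in _samples:
        raise HTTPException(status_code=404, detail="sample not found")
    return {"id": sample_id, **_samples[sample_id]}

# Run with: uvicorn sample_service:app --port 8001
```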

For users who want to begin implementing AI and ML techniques with their own laboratory data, Meek notes that a good place to start is developing an infrastructure that can handle the necessary stream of data and the contextual metadata needed to support AI. Only by ensuring that the correct data is being stored can scientists hope to get the full value out of their data sources.

For example, metadata such as instrument parameters are vital if a computer model is to compare different experiments. ‘The way our customers are using data analytics today is that they are using the LIMS capabilities as a central point of aggregation for all of their data, the SDMS capabilities that help pull all of this together, and then using the APIs that we have to feed that data to ML and other AI applications,’ said Meek.
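In practice, that pattern amounts to pulling aggregated results and their contextual metadata out of the LIMS through an API and handing them to an ML library. The sketch below is a rough illustration under that assumption; the endpoint URL, field names and model choice are hypothetical placeholders, not a documented Thermo Fisher API.

```python
# Rough sketch: fetch results plus contextual metadata from a LIMS REST API,
# assemble them into a table, and train a simple model on them.
# Endpoint and field names are hypothetical.
import pandas as pd
import requests
from sklearn.ensemble import RandomForestRegressor

LIMS_URL = "https://lims.example.com/api/results"  # hypothetical endpoint

# Each record is assumed to carry instrument metadata alongside the measured
# value, so experiments run under different conditions remain comparable.
records = requests.get(LIMS_URL, timeout=30).json()
df = pd.DataFrame(records)

features = df[["instrument_temperature", "injection_volume", "column_age_days"]]
target = df["assay_result"]

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(features, target)

# Predict outcomes for new runs described by the same contextual metadata.
print(model.predict(features.head()))
```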

‘In the same way that we talked about cloud technologies and companies wanting to make their own decisions about their cloud technology and their infrastructure, we are seeing the same thing from an AI and ML perspective.

‘It is really about partnering with the right organisations out there that are delivering these technologies, and ensuring we are providing information that is accessible and available for predictive and prescriptive analytics,’ Meek continued. ‘It is not just about data availability but also the completeness of the data. By that, I mean all the associated metadata because ML needs that ancillary data to understand the data model. We are seeing customers capture far more data than they ever did and make sure that they are pulling that data together, so they can inform the data model.’
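As a small illustration of that point about completeness, compare a bare measurement with the same measurement stored alongside its contextual metadata; only the latter gives a model enough ancillary information to understand how and where the result was produced. The field names below are invented examples, not a prescribed schema.

```python
# A bare result is hard for a model to learn from.
bare_result = {"sample_id": "S-1042", "assay_result": 7.3}

# The same result with contextual metadata captured at acquisition time.
complete_result = {
    "sample_id": "S-1042",
    "assay_result": 7.3,
    "instrument_id": "HPLC-02",
    "instrument_temperature_c": 25.1,
    "injection_volume_ul": 10.0,
    "method_version": "v3.2",
    "analyst": "jdoe",
    "acquired_at": "2020-02-14T09:31:00Z",
}
```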

‘ML can really be complemented by cloud computing, because most customers cannot run those heavy processing workloads in on-premise environments. ML and cloud go together very well,’ concluded Shrestha.


