Advancing laboratory thinking with cloud technologies

Thermo Fisher’s James Pena gives his thoughts on how scientists can advance their research goals through the implementation of cloud technology in the laboratory

Organisations in the life science sector have been gradually progressing their digital transformation journeys over the past decade, but the COVID-19 pandemic sparked a period of rapid change as companies explored and implemented new digital technologies to future-proof their business. 

With a shift to more remote working practices, technologies like video conferencing, remote access and virtual private networks (VPNs) have leapt to the forefront of company strategies to ensure successful business practice. Indeed, a survey of 800 executives by McKinsey in May 2020 found that 48 per cent of companies accelerated the digitisation of their customer and supply chain interactions due to the impact of the pandemic, while 67 per cent invested in automation and artificial intelligence.

One key technology within many companies’ digitisation strategies is cloud-based data storage and access. These solutions promise improved scalability and flexibility in data management, allowing companies of all sizes and applications to tailor their cloud set-up to individual needs. However, implementing these systems in laboratories can be challenging, and deployments may fall short of their significant potential to connect the different software and hardware systems that scientists use daily.

So, what are the different cloud deployment strategies and services available to organisations, and how can companies future-proof the digital connectivity of laboratories by building well-architected cloud networks? Here, we discuss the factors companies should consider when choosing the correct set-up for their organisation.

The value of cloud infrastructures

When companies are looking to develop a solid cloud infrastructure, it’s important to focus on how the system can be integrated into their particular organisational structure and what they want to achieve. Essentially, cloud infrastructures provide flexible solutions to scale IT systems in a way that matches a company’s growth strategy, ultimately saving costs and providing greater operational efficiencies.

Cloud systems drive down costs primarily by shifting from a capital expenditure-based (CAPEX) IT infrastructure to an operating expense-based (OPEX) model. An OPEX strategy allows companies to avoid the high cost of buying expensive hardware and facilities that often become redundant within a few years as technology advances, and of retaining the specialist IT expertise needed to run them. OPEX-based leasing strategies let companies purchase only the storage and computing capacity they need, providing the flexibility to adapt to their specific requirements at a given time, without the added complexity of maintaining and replacing hardware every few years.
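As a rough illustration of the CAPEX-versus-OPEX trade-off described above, the sketch below compares a one-off hardware purchase sized for peak demand against pay-as-you-go leasing that tracks actual usage. All figures are invented for the example and do not reflect any vendor’s real pricing.

```python
# Illustrative only: invented figures, not real vendor pricing.

def capex_cost(hardware: float, annual_maintenance: float, years: int) -> float:
    """Up-front purchase plus fixed yearly maintenance, sized for peak demand."""
    return hardware + annual_maintenance * years

def opex_cost(monthly_fee_per_tb: float, tb_needed_each_year: list) -> float:
    """Pay only for the capacity actually used in each year."""
    return sum(12 * monthly_fee_per_tb * tb for tb in tb_needed_each_year)

# A lab that needs 10 TB today and grows to 25 TB by year three:
capex = capex_cost(hardware=60_000, annual_maintenance=5_000, years=3)
opex = opex_cost(monthly_fee_per_tb=50, tb_needed_each_year=[10, 18, 25])
print(capex, opex)
```

The gap narrows as usage approaches the capacity the hardware was sized for; the point of the OPEX model is that the lab never pays for headroom it is not yet using.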

In terms of cloud adoption, companies can avoid one-size-fits-all technologies and cherry-pick the exact services they need to progress their digital journey, whether that means connecting directly to a single cloud, working across multiple clouds, or taking advantage of a multitude of services, including laboratory integration, data analysis or business intelligence technologies. In essence, this allows companies to adapt their cloud systems in a manner that reflects their unique business model.

It’s all about choice: multiple cloud integration services

The flexibility of cloud systems comes from the availability of different services offering varying degrees of cloud-based infrastructure. There are three primary service models: Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS). These models involve progressively more of the stack being managed in the cloud, ranging from cloud-based servers and data storage combined with in-house application development under IaaS and PaaS, all the way to fully cloud-based packages under SaaS.

With IaaS, a customer has access to a catalogue of components, such as servers, routers and gateways, and can choose how their IT systems are configured. IaaS systems, for example, allow simple VPN access to secure, private data storage, but the customer remains responsible for selecting and managing the operating systems and software that run on the provisioned infrastructure. PaaS takes this one step further by pre-loading operating systems and additional middleware onto the computing systems supplied to the laboratory, facilitating the integration of any applications the customer needs.
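The division of responsibility implied by the three service models can be sketched as a simple lookup. The layer names below are one common, simplified way of slicing the stack, not an official taxonomy.

```python
# Who manages which layer under each service model (a common, simplified view).
STACK = ["hardware", "virtualisation", "operating_system",
         "middleware", "application", "data"]

# Layers the customer still manages under each model.
CUSTOMER_MANAGED = {
    "IaaS": {"operating_system", "middleware", "application", "data"},
    "PaaS": {"application", "data"},
    "SaaS": {"data"},  # the provider runs everything else and rolls out upgrades
}

def provider_managed(model: str) -> list:
    """Layers the cloud provider takes off the customer's hands."""
    return [layer for layer in STACK if layer not in CUSTOMER_MANAGED[model]]

print(provider_managed("PaaS"))
```

Reading down the table from IaaS to SaaS, the provider-managed list grows, which is exactly the “increasing cloud involvement” the models describe.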

With SaaS, the cloud is typically multi-tenant and upgrades are rolled out across all users by the cloud provider. However, SaaS models also allow varying degrees of customisation and configuration of cloud packages. For example, managed SaaS models are typically chosen by regulated customers in GxP environments who require validation testing for their software. Managed SaaS allows a high level of control over configurations, and adherence to strict security policies that encrypt and separate data.

Flexible cloud deployment strategies 

Along with different cloud service options, there are also different strategies for deploying these systems in the workplace. Public cloud infrastructures, such as AWS, Dropbox and Google Cloud, have been around for years and are often the default choice for many companies. They are accessible via the internet and shared between organisations, which typically makes them the most cost-effective option for data storage and access, but they can have limitations in customisation and in compliance with specific regulations on data security, particularly when dealing with sensitive data like patient health records. Providers also offer single-tenant virtual private clouds, which allow users to customise the security policies they require.

Private clouds are ‘on-premises’ solutions that are solely dedicated to one organisation, giving it full control over security and compliance, at the cost of owning and maintaining the infrastructure itself.

Many companies use hybrid strategies, keeping more sensitive datasets within their in-house systems and other datasets within a public or virtual private cloud. Increasingly, companies are not limited to a single public cloud provider, and more customers are leveraging different providers for different cloud services and integrating them all within a broader multi-cloud strategy. This is often the case for large businesses with sites in multiple countries, giving them the flexibility to retain data within a specific geographic zone to meet compliance requirements for that region, or to integrate across zones to meet the needs of their global offices.
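A region-pinned, multi-cloud set-up like the one described can be modelled as a small routing rule: datasets tagged with a residency requirement stay with the provider and region that satisfy it, while everything else goes to a default. The provider names, regions and tags below are hypothetical, invented purely for illustration.

```python
# Hypothetical residency rules: dataset tag -> (provider, region) it must stay in.
RESIDENCY_RULES = {
    "eu_patient_records": ("provider_a", "eu-west-1"),
    "us_trial_data": ("provider_b", "us-east-1"),
}
DEFAULT_PLACEMENT = ("provider_a", "global")

def place_dataset(tag: str) -> tuple:
    """Pin regulated data to its required region; route the rest to the default."""
    return RESIDENCY_RULES.get(tag, DEFAULT_PLACEMENT)

print(place_dataset("eu_patient_records"))  # pinned to the EU region
print(place_dataset("instrument_logs"))     # unregulated, uses the default
```

In practice such rules live in infrastructure policy rather than application code, but the logic is the same: the compliance requirement, not the application, decides where the data lands.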

Overall, these strategies give organisations further flexibility to tailor their cloud service packages for their company, depending on factors such as global reach or the need for compliance with strict data security and privacy laws.

The cloud adoption lifecycle

Technologies are always developing and being replaced, and this is especially true in the rapidly evolving world of cloud services. As such, companies must weigh the capabilities of cloud services against their present requirements, while also planning for the growth of the company and the renewal of IT systems in the future.

This also shapes how companies choose the right partner before implementing cloud services. At the foundational level, organisations must consider non-negotiable constraints, such as regulatory compliance. While security and privacy are important for all data, this is especially true for sensitive data such as healthcare records or data under IP protection. These fixed requirements determine how much control an organisation must maintain over its cloud environments.

Next, organisations must assess how they want to interact with the cloud environment, establish user access for different employees, and control remote or local access to data. This depends on the nature of the laboratory and whether items such as analytical instruments and ‘smart’ equipment, like refrigerators, can be connected to the cloud. Capacity is also a critical consideration, both for the volume of data that must be stored and for how many scientists, globally or in local groups, need to access it.

It benefits companies to enter cloud discussions with the future in mind. Whether they are looking to grow organically, seeking to acquire new companies or to be acquired themselves, considering these elements in advance can save significant effort later.

The benefits of a well-architected cloud 

When adopting a new cloud service, it pays to create a well-architected system that ensures data security, increases the efficiency and reliability of data solutions, and offers value for the investment. 

Protecting the security of information is perhaps the most important factor, and a well-architected cloud system offers an effective way to store data in compliance with regulatory requirements. Cloud services also offer reliable ways to prevent, and quickly recover from, data storage failures. This includes multiple backups with point-in-time restores, the ability to replace or renew faulty hardware without interrupting the cloud service, and measures to prevent downtime when issues arise. Together, these capabilities increase the performance efficiency of companies that use these cloud networks.
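The point-in-time restore capability mentioned above amounts to choosing the most recent snapshot taken at or before the requested moment. The backup history below is invented for illustration; real providers expose this through their own backup APIs.

```python
from datetime import datetime

# Invented backup history: one nightly snapshot per entry.
backups = [
    datetime(2021, 10, 1, 2, 0),
    datetime(2021, 10, 2, 2, 0),
    datetime(2021, 10, 3, 2, 0),
]

def restore_point(target: datetime) -> datetime:
    """Return the newest snapshot taken at or before the target time."""
    candidates = [b for b in backups if b <= target]
    if not candidates:
        raise ValueError("no backup exists before the requested time")
    return max(candidates)

print(restore_point(datetime(2021, 10, 2, 12, 0)))  # falls back to the 2 Oct snapshot
```

The practical consequence is that the restore granularity is bounded by the snapshot schedule, which is why backup frequency is itself a configuration decision in a well-architected system.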

Overall, cloud services can drive operational excellence in a rapidly changing, post-pandemic work environment. The flexibility of different cloud services enables companies to maximise their budget, while scaling into the future, and helps laboratories advance their digital transformation.

James Pena, product manager, Digital Science, will provide more details on how scientists can advance their research goals with a cloud-first strategy in a free upcoming webcast.

12 October 2021
