In her second article on cloud computing, Sophia Ktori looks at how both vendors and customers of informatics systems are having to change their IT culture
One of the most notable changes in the culture of pharmaceutical research and development is the move towards collaboration and outsourcing as the business model. This has been driven, at least in part, by a shift from the development of traditional small-molecule drugs to large-molecule biologics, biomarkers and personalised medicine.
‘Biologics discovery and development will generally require collaboration with CROs, small biotechs or academia that have the required IP, or are specialised in different segments of the biologics workflow, perhaps next-generation sequencing or biomarker discovery,’ according to Anthony Uzzo, president and cofounder of Core Informatics. ‘This means that collaboration and outsourcing will become an inevitable part of biologics and precision medicine initiatives.’
On the informatics front, he continued: ‘Cloud solutions can effectively and efficiently capture and secure data that has been generated by CROs and collaborators who are outside of the client’s infrastructure. Think about all the money that large pharmaceutical companies spend on outsourcing and on collaborative research. If the partners are not all using the same platforms for data collection and sharing, a huge amount of valuable information will lack any scientific context.’ In contrast, it is relatively painless and cost-effective for the company and each of its collaborators to adopt the same cloud platform, and thus communicate seamlessly.
A shift in IT culture
However, switching to SaaS from traditional on-premise informatics platforms does lead to a natural shift in the IT culture, according to J J Medina, senior director of product strategy at GoInformatics. ‘Companies adopting SaaS can focus less on hiring people to support their technology infrastructure, and more on hiring the data scientists, project managers and other project-related personnel who will accelerate discovery and drive development forward. Transitioning to the cloud does not happen overnight. Large enterprises still have to make the most out of their investment in legacy systems. And so, critically, SaaS solutions are being developed to work side by side with existing on-premises solutions. However, our cloud solutions are designed with an “open handshake” and can integrate with legacy systems through suitable application programming interfaces (APIs) and other types of integrations.’
Moving into the cloud doesn’t just require a change in how the customer perceives its data handling and management, but also requires the vendor to change its operational model. Nic Encina, VP, Innovation Lab, at PerkinElmer, comments: ‘This change will depend at least in part on whether you are entering the market with a new cloud product, or whether you are migrating a traditionally on-premises solution, such as an ELN, into the cloud. But in either case, large vendors, such as PerkinElmer, which have traditionally provided on-premises systems, need to change the way they think and interact with customers.’ This can be an evolving process, adds Jens Hoefkens, director of research, strategic marketing at PerkinElmer. ‘You can’t just flip a switch overnight or abandon a legacy on-premises solution. Change is gradual and, for large deployments, can take years. It involves a true collaboration with the customer rather than service provision only.’
Getting the right people on board at the customer’s end is also vital, suggests Brian Gilman, strategic marketing, elements at PerkinElmer. ‘It starts with the people, not the software. There has to be a willingness to take cloud technology on board, and you have to have a set of individuals within the customer organisation who will understand the benefits of that technology and take on the roles of first adopters. Delivery from our side is based first on dialogue and training, before the solution itself can be deployed.’
LIMS specialist Eusoft has significant experience helping companies to migrate from on-premises or web-based solutions to a fully cloud-based LIMS platform, claims Pasquale de Tullio, the firm’s international marketing specialist. ‘We start all our LIMS projects with a thorough understanding of every requirement that the client has. This allows us to configure the system to their exact needs and demonstrate each benefit, which ultimately helps them to move into the cloud with our SaaS LIMS.’ In some instances it is the vendor who has to point out to clients that moving a software system such as a LIMS into the cloud has benefits beyond finance (cost savings), support and computing power, de Tullio adds. ‘Unlike on-premises solutions that may have a lifespan of, say, five to seven years before the client decides to move to a new generation of software, cloud solutions are ideally suited to innovation in informatics. By implementing a SaaS solution, clients can be ready to adopt innovation, and help to inform the direction of new developments.’
Headquartered in Italy, and with a UK office, Eusoft released the first version of a re-engineered, cloud-based LIMS solution, EuSoft.Lab 10, about three years ago, de Tullio explains. Developed as part of a restructuring programme that started in 2009, the SaaS platform is designed for clients in different industries, including food and beverage, oil and gas, environment, chemicals, and different manufacturing sectors. ‘We have also continued to address industry needs with the launch at the 2015 Paperless Lab Academy of a mobile app that allows users to upload results and track the progress of tests from a smartphone or tablet. This is not only important for remote, or field-based activities, but also aids collaboration and flexibility that cannot generally be achieved by using on-premises solutions.’
A ‘mixed bag’
While the benefits of the cloud and SaaS are evident for certain R&D and manufacturing environments, moving an informatics infrastructure into the cloud doesn’t necessarily resonate with all companies for all situations, suggests Oliver Leven, head of Genedata Screener Professional Services at Genedata. ‘We don’t see across-the-board adoption of the cloud, but rather a “mixed bag”, if you will, with companies using it for specific types of research and collaboration projects.’ Switzerland-based Genedata offers a suite of enterprise solutions for data analytics to support large-scale, experimental processes in life science research. For example, the Genedata Screener software system is designed to capture, analyse and manage plate-based screening data across an entire pharmaceutical R&D organisation.
Genedata operates a typical three-tier architecture, comprising an application server and a database, on top of which sits the web-based client tier. The firm’s experience shows that while some customers use Genedata software systems as a hosted service, other clients elect to keep the whole implementation, and thus the data and all included IP, in house. ‘Client data may be stored in the Genedata system’s database, which can reside within their own data centres (i.e. within their firewall) or with an external hosting services provider,’ Leven notes. ‘From the user’s perspective, however, there is effectively no difference.’
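The point Leven makes follows directly from the three-tier pattern: because the client tier talks only to the application server, the database tier can move between an in-house data centre and a hosting provider without the user noticing. A minimal sketch (invented class and method names, not Genedata's actual code) of that separation:

```python
# Illustrative sketch of the three-tier pattern described above: the client
# tier talks only to the application server, which hides where the database
# tier actually lives. All names here are invented for illustration.

class Database:
    """Database tier: could run in-house or at an external hosting provider."""
    def __init__(self, location):
        self.location = location   # e.g. "in-house" or "hosted"
        self._plates = {}

    def save(self, plate_id, results):
        self._plates[plate_id] = results

    def load(self, plate_id):
        return self._plates[plate_id]


class AppServer:
    """Application tier: business logic; the only tier the client ever sees."""
    def __init__(self, db):
        self._db = db

    def store_screening_run(self, plate_id, results):
        self._db.save(plate_id, results)

    def fetch_screening_run(self, plate_id):
        return self._db.load(plate_id)


# From the user's perspective there is effectively no difference between
# deployments: the client-facing interface is identical in both cases.
for location in ("in-house", "hosted"):
    server = AppServer(Database(location))
    server.store_screening_run("P-001", [0.91, 0.12, 0.47])
    assert server.fetch_screening_run("P-001") == [0.91, 0.12, 0.47]
```

The client tier never holds a database connection of its own, which is what makes the hosted and in-house options interchangeable from the user's seat.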
Security concerns aside, the cloud is of particular value for clients who don’t want to run an informatics infrastructure, and in particular want to avoid the set-up and maintenance costs of hardware, Leven comments. ‘One of the cloud’s advantages is its elasticity. That is, for companies with their own informatics infrastructure, even temporarily spinning out intensive computing tasks to the cloud can be beneficial. Our solutions are available to them on-premises, in a private cloud, or in the public cloud. All three options can be set up to deliver the security and reliability that our customers – pharmas, CROs, and academic research institutions – demand.’
Confidentiality issues, the size of datasets, and the required internet bandwidth may also make cloud deployment impractical for some companies, points out BSSN Software president, Burkhard Schaefer. For these clients a hybrid deployment model can help. ‘Here, the application is delivered and managed through the cloud, but customer data resides locally. This gives organisations the agility of deploying applications quickly by leveraging cloud technology, while maintaining full control over the location of their data.’
BSSN Software uses Java Web Start to deliver applications to desktop PCs, Schaefer explains. ‘A deployment server hosts the application packages and delivers them to the client on demand. Updates can be deployed transparently using the same mechanism. Code is only executed on the client PC, and the deployment server does not have access to any customer scientific data.’ A similar mechanism works for server applications. ‘Docker container technology packages up the server applications into a repository, and a local agent installed in the customer data centre can effectively roll the applications out to local servers. Updates can be distributed the same way. Again, the Docker application containers are stateless, do not contain data, and are effectively disposable. They can be replaced, updated or migrated at any time, without affecting the data.’
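The key to the hybrid model Schaefer describes is that application containers carry code but never data: local customer data is mounted into a disposable container at run time, so updating means discarding the old container and starting a fresh one from the new image. A hedged sketch of the commands such a local agent might issue (image name, tag and data path are invented for illustration, and the commands are built as lists rather than executed):

```python
# Sketch of stateless, disposable application containers run against data
# that stays on local disk, as in the hybrid deployment described above.
# "registry.example.com/lab-app" and "/srv/lab-data" are invented examples.

def run_container_cmd(image, tag, data_dir):
    """Build a `docker run` command that bind-mounts local customer data
    into a stateless application container."""
    return [
        "docker", "run", "--rm",        # --rm: the container is disposable
        "-v", f"{data_dir}:/data:rw",   # customer data stays on local disk
        f"{image}:{tag}",
    ]

def update_cmds(image, new_tag, data_dir):
    """Updating means pulling the new image and starting a fresh container;
    since the container holds no data, the old one is simply discarded."""
    return [
        ["docker", "pull", f"{image}:{new_tag}"],
        run_container_cmd(image, new_tag, data_dir),
    ]

cmds = update_cmds("registry.example.com/lab-app", "2.4.1", "/srv/lab-data")
```

Because state lives entirely in the bind-mounted volume, replacing, updating or migrating the container leaves the data untouched, which is exactly the property Schaefer highlights.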
Ease of configuration
Using this model, a customer simply subscribes to the management tools and can configure both its server and client landscape through a convenient web management console, he adds. ‘Administrators can fully reproduce and deploy any configuration, and an audit trail of deployments and configuration sets is maintained.’ Using hybrid software provisioning combines the benefits of both worlds, Schaefer stresses. ‘You benefit from the agility and ease of deployment of cloud solutions, paired with the data confidentiality of a local system. The burden on IT groups that is typically associated with deployment and management is greatly reduced. Adding new users, clients and servers to the system is easy and convenient and does not require IT involvement. Creating testing or training configurations for new versions or new instrument integrations is similarly straightforward, and all deployments are fully audit-trailed, traceable and subject to change control.’
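The audit-trail requirement Schaefer mentions is straightforward to reason about: each deployment records what configuration set was applied, where, and when, so any configuration can be reproduced later. A minimal sketch under invented names (not BSSN's actual console), where a hash of the configuration set serves as the reproducible identifier:

```python
# Minimal sketch of an audit-trailed configuration deployment, as described
# above. Target and configuration names are invented for illustration.
import hashlib
import json
import time

audit_trail = []

def deploy(target, config):
    """Record target, configuration hash and timestamp for each deployment;
    the hash lets an administrator prove exactly which configuration set
    was applied, and re-deploy it unchanged."""
    entry = {
        "target": target,
        "config_hash": hashlib.sha256(
            json.dumps(config, sort_keys=True).encode()).hexdigest(),
        "timestamp": time.time(),
    }
    audit_trail.append(entry)
    return entry

deploy("lab-server-01", {"instrument": "hplc-3", "version": "2.4"})
```

Hashing the canonicalised (key-sorted) JSON means two deployments of the same configuration set produce the same identifier, which is what makes the trail useful for change control.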
PerkinElmer has been moving a number of its informatics platforms into the cloud, and has introduced cloud options for some of its well-established software. ‘We have a SaaS offering that enables clients to embed ChemDraw into their own web applications, for example,’ Hoefkens explains. ‘Many parts of our business are moving towards the cloud, including some of our instrumentation data management.’
Connecting analytical instrumentation directly into the cloud makes perfect sense, Encina adds. ‘PerkinElmer offers a significant portfolio of instrumentation for environment and human health applications. Environmental instrumentation may be sited out in the field, rather than in a laboratory setting; then connecting these devices to the cloud – where you can easily collect, manage and distribute the data that comes out of them from remote sites, globally – is compelling. And as the instrumentation becomes more sophisticated, the data it generates become richer, which in turn demands greater storage and computing capacity, again pointing to the benefits of moving directly into the cloud.’
New solutions for translational scientists
As part of its drive to offer new solutions in the life sciences sector, PerkinElmer has launched a cloud-based platform to support translational scientists in the pharmaceutical sector. PerkinElmer Signals for Translational, launched during late 2015, is designed to bridge the divide between discovery research and formal development, Hoefkens explains. ‘Discovery and development are two, often very separate processes, and crossing the chasm that divides them is challenging. Signals for Translational supports the cross-flow of data in a very agile and flexible way, bringing information from those two different processes and parts of the organisation to the translational scientist, facilitating collaboration, biomarker discovery and the development of personalised medicines.’
The Signals for Translational software is the first of a series of solutions that PerkinElmer aims to develop from the cloud-based Signals platform, which is aligned to big data. ‘It’s a platform for aggregation, integration and organisation that we can layer on top of all sorts of other services,’ Encina adds. ‘So as we continue to build this platform, we will be releasing new Signals solutions.’
Multi-tenant partnerships in the cloud
Core Informatics, meanwhile, is developing a solution that will enable businesses to create private, multi-tenant partnerships within its cloud infrastructure. ‘This capability means that clients will rapidly be able to create new accounts within the Core Informatics platform, for supporting individual research collaborations and outsourced activities, and then disassemble them as the project or collaboration ends,’ Uzzo notes. ‘Without this ability, every research project, collaboration or outsourced contract that a company embarks on likely requires the provision of new hardware, a virtual machine, and the installation of vendor software applications. This can all take months to install and deploy, at considerable cost, by which time the project or collaboration may already be over.’
The capabilities of cloud computing and SaaS do point to exciting future prospects, Medina claims. ‘Companies are collecting more data than ever before, and new visualisation and analytical tools are being developed to maximise IP within that data. Combining these new analytical tools with the power of cloud computing means that analytical capacity and contextualisation are increasing in breadth, depth and speed, which is propelling discovery research and accelerating scientific discoveries.’
Continued focus on integration
Cloud platform development aside, there are still some inevitable issues associated with integrating multiple systems – whether cloud, privately hosted or on-premises – and the ability to integrate will be a key factor in the evolution of cloud platforms, Uzzo believes. ‘Cloud providers really need to focus on integration and establishing an open API that will allow businesses to integrate disparate cloud solutions quickly and easily. No single vendor is going to provide every scientific, logistic and business solution that these businesses need, in one infrastructure. Integration solutions will need to be able to define how and when data is exchanged between applications, how data is mapped and structured, and how it is transformed when transferred from one application to another, whether from one cloud solution to another or between cloud and on-premises platforms.’
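Uzzo's last point, that integration solutions must define how data is mapped, structured and transformed as it moves between applications, is the classic field-mapping problem. A hedged illustration of what such a mapping layer does (all field names, schemas and units below are invented, not Core Informatics' API):

```python
# Illustration of the integration layer described above: field names are
# mapped and values transformed as a record moves from one application's
# schema to another's. All field names and units here are invented.

FIELD_MAP = {                 # source application field -> target field
    "sample_id":  "sampleIdentifier",
    "conc_mg_ml": "concentration",
    "assay_date": "measuredOn",
}

TRANSFORMS = {                # per-target-field value transformations
    "concentration": lambda mg_ml: mg_ml * 1000.0,   # mg/mL -> ug/mL
}

def translate(record):
    """Map a record from the source application's schema to the target's,
    applying any unit or format transformation along the way."""
    out = {}
    for src_field, value in record.items():
        tgt_field = FIELD_MAP.get(src_field)
        if tgt_field is None:
            continue          # drop fields the target schema doesn't know
        transform = TRANSFORMS.get(tgt_field, lambda v: v)
        out[tgt_field] = transform(value)
    return out

result = translate({"sample_id": "S-42", "conc_mg_ml": 0.5,
                    "assay_date": "2016-03-01"})
# result == {"sampleIdentifier": "S-42", "concentration": 500.0,
#            "measuredOn": "2016-03-01"}
```

An open API on each side reduces this layer to declarative mapping and transformation rules; without one, every pairing of applications needs bespoke connector code, which is the cost Uzzo is warning against.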