FEATURE

The future of laboratory informatics

Robert Roe interviews laboratory informatics software providers who discuss potentially disruptive technologies and their impact on the laboratory informatics market

What has changed in laboratory informatics software over the last 12 months?

Laurence Painell, vice president, product management and marketing at IDBS: The cloud is becoming the norm for laboratory informatics software. Informatics providers must now have a cloud and SaaS strategy or they will not be considered in selection processes, and nearly all of the new players in the market are providing their technology as SaaS only.

Integration is an area that has really moved on this year – and you can see why: integration is the foundation of automation and is the gateway to more efficiency gains and insights. Organisations can’t truly benefit from open APIs, data lakes and IoT without integration. So, from software platforms through to the management and monitoring of instruments themselves, businesses have become more aware than ever before about their need to integrate and leverage technology to do so.
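As a concrete illustration of the kind of integration Painell describes, the sketch below normalises a result payload from an instrument's open API into flat records a LIMS or ELN could ingest. This is purely hypothetical: the instrument ID, field names and JSON schema are invented for the example and do not correspond to any vendor's real API.

```python
import json

# Hypothetical payload, as might be returned by an instrument's open REST API.
# All field names here are illustrative assumptions, not a real schema.
raw = json.loads("""
{
  "instrument_id": "HPLC-07",
  "timestamp": "2018-06-01T09:30:00Z",
  "results": [
    {"analyte": "caffeine", "value": 51.2, "unit": "mg/L"},
    {"analyte": "theobromine", "value": 3.4, "unit": "mg/L"}
  ]
}
""")

def to_common_records(payload):
    """Flatten one instrument payload into per-analyte rows for downstream systems."""
    return [
        {
            "instrument": payload["instrument_id"],
            "measured_at": payload["timestamp"],
            "analyte": r["analyte"],
            "value": r["value"],
            "unit": r["unit"],
        }
        for r in payload["results"]
    ]

records = to_common_records(raw)
print(records[0]["analyte"])  # caffeine
```

The point of the sketch is the shape of the work, not the schema: once every instrument's output is mapped into one common record format, the automation, monitoring and data-lake scenarios mentioned above become tractable.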

We have also seen a big change in what big data really means in this industry. This is not just about velocity and volume – that is a given – it is about how to leverage all that data effectively. AI and semantic enrichment mean that, increasingly, analysis is becoming more about providing insight, context and exploration of data instead of just providing a single, specific and defined answer.

Daniela Jansen, director of product marketing at Dassault Systèmes: Software and hardware vendors are moving towards a platform offering, but in most cases it is a closed, proprietary platform. Platforms can provide data continuity and traceability, and support decision making across the product lifecycle. 

A platform allows for a substantially different architectural setup: small dedicated agile applications can be plugged into the platform in a modular fashion; services like reporting or instrument interfacing can be used by the different applications. This approach avoids overlap and redundancy of capabilities, improving the user experience and work efficiency. It also lowers the cost of ownership.

In addition, the enablement of cloud-based technologies and solutions is increasing. The growing adoption of cloud technology is driven both by the natural progression from legacy systems and by the need to transform laboratory operations. Many legacy systems are about to run out of support, so organisations are looking for replacements. Thanks to technological advancements, organisations have the opportunity not only to move to a newer system, but also to take a new approach to their laboratory informatics landscape.

They can move away from rigid monolithic systems, which often come with duplicated capabilities, to a platform-based approach with modular applications and common services, allowing them to transform the way labs work today. Many of the new solutions allow them to adapt the scale, cost and capabilities of their deployment to their current needs. In times of ongoing mergers, acquisitions and divestments, this is a compelling value proposition.

Klemen Zupancic, CEO of sciNote: Software solutions are becoming more and more user-friendly, data is moving to the cloud and the number of startups in the field of laboratory informatics is on the rise.

There is an emphasis on user friendliness – the ease of use of the software. User interfaces and the overall user experience are improving, with the aim of enabling users to understand the software and integrate it seamlessly into their work. The need for affordable software that adapts to a laboratory's way of working, and is compatible with the other platforms in use today, is being recognised. There are many great companies working towards that goal and we are excited to see what will happen in the following years, with the rise of the Internet of Things and similar concepts.

We see that cloud providers and cloud-based software are gaining recognition as well. The major benefits are encrypted data storage, a high level of data safety and security, and automatic upgrades with virtually zero downtime. The amount of valuable digital research data is growing, and the risk of losing data to lab accidents or technical issues can be mitigated by storing it in the cloud. We all rely on cloud solutions in our everyday lives, and this trend is entering the scientific world as well.

We are also glad to see many different startups and innovative platforms emerging and working to empower scientists in many different ways, whether in data management, team collaboration, compliance, data analysis, data visualisation, virtual and augmented reality, and more.

Since our main focus is electronic lab notebooks, the sciNote team conducted and published one of the largest studies on user perception of electronic lab notebook (ELN) software. The study focuses on user perception of ELNs, market trends and market barriers. The paper gives detailed insight not only into which direction the ELN market is moving, but also into why ELN adoption is so slow.

Looking to the future, what technologies or changes to workflow and process will cause the biggest change to the laboratory in the next year or two?

Painell: Although it has been a hot topic for a long time, we need to improve the availability of insight across the research, development, manufacturing and clinical cycle by integrating and moving data backwards and forwards in an automated and meaningful way – translational research. This is the area where all technologies converge, by creating, linking and providing data and insight at the right place and time to impact the decision making processes in each area.

Jansen: Systems consolidation and convergence will provide laboratory users with a new experience and transformation in efficiency. Traditional systems will start being replaced by more agile user-friendly cloud-based lab informatics applications, based on a holistic open platform approach.

Organisations will start leveraging the Internet of Things (IoT) in the laboratory. This will deliver more data and more insight, as well as more reliable data of higher quality and integrity.

IoT will allow users to work more efficiently in the lab, as it will remove many time-consuming, non-value-adding steps from workflows: the 'things' are not limited to lab equipment, but can also include wearable devices, smart glasses, biometric bracelets, motion sensors, location beacons and so on. At the same time, it will improve data integrity and quality, as data transfer to and from devices is automated. And through the introduction of (more) sensors, labs will be able to generate more data, faster, improving and accelerating decision making.

The adoption of IoT technology depends largely on the maturity of the organisation and the industry, as well as the kind of laboratory. While some companies are only now replacing paper in the lab and digitalising their processes, others have already evaluated and adopted IoT. Generally speaking, the research labs of innovative life science organisations are leading the field and leveraging the technology today, while other industries and regulated labs may still take years to step into the 'lab of the future'.

Zupancic: We definitely see a future benefit for scientists in integrations between the platforms used in labs today. This correlates with the concept of IoT. For example, Gilson Inc and sciNote are working on introducing an IoT platform that would enable scientists to set up experiments in the sciNote electronic lab notebook, add the lab instruments they use, connect their pipettes to track their progress, and so on.
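A minimal sketch of what such pipette-to-notebook tracking could look like is shown below. The class and method names are invented for illustration and do not reflect sciNote's or Gilson's actual APIs: a connected pipette reports each dispense event, and the notebook marks protocol steps complete as expected volumes are matched.

```python
# Hypothetical sketch of IoT-style progress tracking in an ELN.
# Names (Protocol, on_dispense) are illustrative assumptions only.

class Protocol:
    def __init__(self, steps):
        # steps: expected dispense volumes in microlitres, in order
        self.steps = list(steps)
        self.done = 0

    def on_dispense(self, volume_ul):
        """Called when the connected pipette reports a dispense event."""
        if self.done < len(self.steps):
            expected = self.steps[self.done]
            # accept the step if the reported volume is within tolerance
            if abs(volume_ul - expected) <= 0.5:
                self.done += 1

    @property
    def progress(self):
        return f"{self.done}/{len(self.steps)} steps"

p = Protocol(steps=[200.0, 200.0, 50.0])
p.on_dispense(200.1)
p.on_dispense(199.8)
print(p.progress)  # 2/3 steps
```

The same event-driven pattern generalises to any connected 'thing' in the lab: the device pushes events, and the informatics layer records them against the planned workflow, giving the traceability and reproducibility discussed below.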

It is not individual tools that will revolutionise science; it is all of these tools working together, integrating and talking to each other. sciNote is only a piece of the puzzle, and we need to step together as a community, promote collaboration and help each other towards better science. IoT is coming into labs, and with it come automation, traceability, transparency and reproducibility.

It is important that we, as a community, adopt this mindset of using technology to our advantage and recognise the benefits it can bring. Another interesting aspect is the introduction of artificial intelligence and machine learning – software that can understand the needs of scientists and act as a helpful tool.

In your opinion, what are the driving factors for change or innovation in the laboratory?

Painell: The need to increase efficiency and throughput in order to bring a step change in the time and cost of bringing new products and therapies to market is the primary driver. Linked to this is the diminishing returns of the old manual approaches and the intrinsic need to automate and innovate for a competitive advantage.

Finally, the sheer volume of data to wade through needs different automated approaches beyond what a human can manage, and this requires new technology.

Jansen: In conversations with our customers, we have identified time-to-market as the ultimate driver for change.

Personalised health and the desire for more precision therapies are changing the way therapies are developed. Knowledge capitalisation is fundamental to leveraging new and existing knowledge in the lab and in next-generation manufacturing, where the move from large batches towards continuous manufacturing has a deep impact on the analytical instruments and methods used, as well as on the related data analytics. And total quality management efforts are attempting to make compliance and quality an asset, instead of a cost.

Laboratory informatics needs to allow users not only to work in a more efficient and cost-effective way while remaining compliant; it also needs to provide the flexibility to adapt to completely new ways of working, and to deliver contextualised data in real time for faster decision making. Machine learning technology does not need to be adapted, it just needs to be used in the right way. It is more about identifying the right data and providing it in the right format to be leveraged. Data needs to be standardised and contextualised for meaningful outcomes. Dassault Systèmes provides the tools today and, to help our customers leverage their data, we are actively engaged in the data standardisation projects of consortia such as Allotrope and the Pistoia Alliance.

Zupancic: Labs need to cope with the ever-increasing amounts of digital data generated while conducting research. Keeping track of data, preventing data loss, and data management in general are the major factors that drive labs to adopt digital solutions. sciNote's aim is to help with that and contribute to more reproducible science.

There is also an increasing need to collaborate. Research that results in high-ranking scientific publications is in most cases conducted by various teams working together from different locations, and even in different languages. Platforms that enable seamless team collaboration within an institution, and with external partners, can play an important role as well.

sciNote electronic lab notebook, for example, besides helping labs organise work on projects, enables teams to collaborate and share comments and use the same protocols within protocol repositories.

Are there any additions or improvements requested by your users?

Painell: The main area our customers are interested in is the integrations space and the ability to provide data out of the system, then pull derivative data and analysis back in at the point of use. This is linked to all the cloud-based technologies such as IoT, AI, semantics and automation that can impact this process. To help, many of our legacy customers are deciding to move to our cloud platform, to ensure their technology is future-proofed, enabling them to move capital from maintaining systems to driving innovation.

Jansen: A focus for many organisations is to provide their users with intuitive, user-friendly interfaces. This increases user adoption and makes operations more efficient. We also see increasing demand for cloud-based technology and solutions. The ability to integrate is a key request, and an open platform is often considered the right approach to achieve this. It will replace costly bespoke point-to-point integrations and, at the same time, provide the ability to integrate existing and legacy systems into the lab informatics landscape.

Zupancic: Currently, the AI behind our Manuscript Writer works on open-access articles. The biggest desire expressed by our users, besides smaller improvements in the user experience, was to include non-open-access research articles in the literature it draws on. We are receiving valuable feedback from users and taking their opinions into account to improve the tool further, so it becomes an even more helpful tool for scientists.

The aim is to shorten the time needed to gather data and start writing a manuscript. We would really like to point out that we are open to discussion, opinions and feedback, and we invite all interested scientists to test the Manuscript Writer and share their feedback, ideas, concerns and comments with our team.
