Building a Smart Laboratory

Scientific Computing World has recently published a revised edition of its special supplement Building a Smart Laboratory. John Trigg sets out the changing demands that laboratories face and why it is only by becoming smart that they can respond effectively.

The laboratory informatics industry is facing new challenges. Simply increasing throughput while reducing costs and errors is no longer enough. In the long term, laboratory data and information need to be managed in an integrated way, so that the science can be advanced by extracting information and knowledge from the repositories of data.

It is only by becoming smart that laboratories can respond effectively to this new demand. And it is to set out these new challenges, and how they can be addressed, that we have revised Building a Smart Laboratory, our special supplement to Scientific Computing World.

Today’s laboratory needs to do more than just store data: it needs to make sense of the data, to uncover correlations, and to support evidence-based decision-making. Moreover, informatics systems need to be able to adapt to changing business and operational needs.

Here lies a paradox: the concept of ROI (i.e. productivity) is used to quantify the benefits and thus justify the purchase and deployment of informatics systems, but the long-term benefits may arise through non-quantifiable factors such as better understanding, better decisions, and better science. In other words, the emphasis needs to change from the elimination of waste (time, effort, errors) to providing greater capability, greater flexibility, and more predictive approaches to supporting science.

Unlike in some other industries, the move to digital in the laboratory has proceeded at a relatively leisurely pace. After four decades or so, a growing number of laboratories can now consider themselves to be ‘electronic’ or ‘paperless’.

The journey started with data capture, data processing, and laboratory automation. It went on to storing and managing digital data from disparate sources, with the evolution of systems such as LIMS and ELNs, which serve to collate and add context to laboratory data.

Throughout this period, the underlying business drivers were process efficiency, laboratory productivity, and error reduction. All purchases had to be justified against these criteria, as industries sought competitive advantage by reducing their costs and time-to-market. Informatics tools therefore focused primarily on eliminating waste (time, effort, errors) while providing management with the added bonus of a perspective on laboratory performance.

Different business issues are now having an impact on laboratory operations, and hence putting new demands on the ‘smart laboratory’. In recent years, the distribution of laboratory processes across geographic boundaries and third parties (externalisation) has become common. Businesses can take advantage of the cheapest source of commodity laboratory functions and, in some cases, tap into external sources of expertise to provide research and development. Laboratory informatics systems can meet this business need by providing capabilities for collaboration and the sharing of laboratory data and information, as long as there are infrastructures and levels of access control to ensure protection of IP.

In addition to externalisation, two other major business issues now challenge the laboratory informatics market. Firstly, there is a growing demand for more and better innovation, and secondly, the systems need to be adaptable and agile so they can cope with relentless market, business, and process changes.

Historically, a considerable amount of scientific innovation came about through serendipity and the investigation of unexpected outcomes of planned experiments – where the primary objective was to advance scientific knowledge and understanding.

Nowadays, innovation is a systematic, industrial, and time-pressured process, dependent to a large extent on making sense of existing data, prior knowledge, and evidence-based decision-making. At the same time as they cope with the demand for and changing nature of innovation, organisations have to accommodate the changes driven by externalisation, mergers and acquisitions, as well as other market forces. Frequently, such business changes entail the need to consolidate systems, processes, and workforces.

Most laboratories already depend on an informatics hub comprising one or more of the major tools: laboratory information management systems (LIMS); electronic laboratory notebooks (ELN); scientific data management systems (SDMS); and laboratory execution systems (LES). The trend over recent years has been towards convergence. Originally these tools served distinct market sectors, and were provided by quite separate vendor communities. That has changed, so that many laboratory data and information functions – from data acquisition to data usage – can now be accommodated within a single solution, of ever-extending scope, from a single vendor.

In the past few years, mergers and acquisitions amongst the major players in the laboratory informatics market have led to a shift in their product portfolios, such as the provision of tools to support data analysis and visualisation. This is an important change of emphasis, and one that makes better provision for supporting science.

The design and infrastructure of informatics tools will influence their ability to adapt to changing business circumstances. Over the years, ELNs, for example, have evolved in a typical software fashion, with extended and more detailed functionality, progressively adding to their complexity.

But the simpler ‘paper on glass’ style of ELN has recently been gaining market share, with faster deployments and greater user acceptance. If the trend continues, this may encourage greater modularity across the market, opening up the options for ‘best of breed’ products to meet the specific demands of different types of laboratory. In some respects this approach – modularity and simplicity – reflects what has been happening with consumer technologies. The combination of mobile devices, ‘apps’, and sharing/collaboration through social media has set a precedent for ‘user experience’ and raises some interesting questions for the laboratory informatics industry.

With respect to mobile devices, there is a delicate balancing act between their desirability and their practical value in the laboratory. Limited screen sizes, gesture/touch navigation, virtual keyboards, and their physical vulnerability in a laboratory all conspire to challenge the business case for their use.

But there is potential in mobility itself, and in ‘apps’ that offer dedicated functionality tailored to small screen sizes. The social media argument is an interesting one. Although in the consumer world social media serve an entire spectrum of good, bad, and ugly usage, the underlying principles of communication, sharing, and collaboration are highly relevant to the modern ‘externalised’ laboratory environment. For this reason, they do have significant potential for the laboratory, particularly if they can be incorporated into the controlled environment of the informatics portfolio.

Of course, ‘cultural’ issues remain with respect to user adoption, but using ‘push’ principles to inform users, rather than the traditional ‘pull’ approach – the information is there, but you have to find it – would seem to take distinct advantage of the benefits of social media.

All of this adds up to some interesting times for laboratory informatics. With evolving business needs and rapid changes in consumer technologies that influence expectations in the laboratory, the gauntlet has been thrown down, and there is good evidence that the market is moving to address these demands.
