Why are consumers better at technology than lab scientists?

Tom Wilkie looks to the Laboratory Informatics Guide 2014, published by Europa Science, to understand the drivers for change in contemporary laboratory informatics

By the end of last year, the number of Apple devices being sold around the world exceeded the number of Windows PCs. Yet step into any analytical laboratory and there is scant evidence of this consumer revolution being reflected in either the hardware or software of laboratory informatics.

It is difficult not to feel sympathy for today’s laboratory managers for they face a paradox, as explained in the current Laboratory Informatics Guide 2014 (LIG2014). On the one hand, software apps and mobile hardware being developed for the consumer market can help drive down costs and improve the efficiency of the way in which analytical laboratories acquire, store, and manage their data. On the other hand, LIG2014 carries two articles about the difficulties of standardising data formats and about how the sort of information-sharing that consumers take for granted – Flickr or Facebook – is currently impossible in laboratory informatics.

It is precisely to help laboratory managers inform their decisions on purchasing or specifying laboratory informatics products that the publishers of Scientific Computing World felt there was a need for the Laboratory Informatics Guide 2014. It delivers not only a comprehensive listing of products and suppliers, which provides the best starting point when renewing or upgrading existing systems, but also reports first-hand on the experiences of other lab managers who faced the same problems in implementing new informatics systems or trying to integrate different systems in one laboratory.

Change has to be slow in laboratory informatics. Many labs have to conform to regulations issued by the Food and Drug Administration in the USA or its counterparts in other countries – and the FDA is by no means the only regulatory body with an interest in ensuring that the results emerging from an analytical lab conform to certain standards. Even labs that operate in unregulated industries may have to ensure that they ‘capture’ their results using software that would withstand legal challenge – for example, in patent litigation. Neither the law nor government is known to favour a fast rate of change.

But there are two sources of pressure for change, and both are inexorable. One is technology ‘push’, as developments made in entirely different spheres for entirely different purposes seep into the laboratory. Mobile phones and tablets started outselling PCs about three years ago, and the laboratory cannot be immune from the mobile trend and, with it, the demand for easy-to-use ‘apps’.

Not surprisingly, the vendors of laboratory informatics systems have diverging views as to how the future will unfold. As reported on page 26 of the Laboratory Informatics Guide 2014, Stuart Ward, product manager for IDBS’s ELN product E-WorkBook, sees mobile devices ‘as having specific apps for very discrete types of transactions, rather than some sort of universal application like you can deliver on a desktop or laptop.’ In contrast, Seamus Mac Conaoniagh, director of technology at Thermo Fisher, said: ‘Our approach is really to extend our applications to allow mobile use, rather than to develop single-use mobile applications.’

Mobile devices can capture some types of data, such as QR codes or bar codes, better than conventional techniques. But, as Ward remarked: ‘You’re not going to have someone on a mobile device typing in 10,000 data-points. It’s just not going to work.’ On the other hand, apps on mobile devices can be useful for reviewing data after analysis. Stephen Gallagher, CEO of Dotmatics, pointed out that the company had implemented a feature in its chemical drawing app, Elemental, that allows a scientist to draw or annotate a compound or reaction on their phone or mobile device and then upload it to the ELN.

But as Peter Boogaard, of Industrial Lab Automation, points out on page 4 of LIG2014, it is still more difficult than it should be to get useful information out of scientific databases, largely because of a lack of data standards. He draws the analogy with a music industry in which ‘each label has its own proprietary music file format. How would you be able to share music? By default, standards make it easier to create, share and integrate data.’ And he continues: ‘As of today, there is no unified scientific data standard in place to support heterogeneous and multi-discipline analytical technologies. Since in today’s world, consumers of laboratory data can be found across the entire product lifecycle, and may include external organisations such as CROs, a different mind-set is required if a truly integrated laboratory is to be achieved.’

And an integrated laboratory is one that will accommodate the second driver of change – market ‘pull’, as the laboratory’s customers, predominantly management, seek to increase the value of what the lab produces to the enterprise as a whole while decreasing its costs. John Trigg, of phaseFour Informatics, continues the theme by noting that if and when laboratory data interchange standards emerge, laboratory integration would become easier, cheaper, and more effective. But there is an undertone of pessimism to his account: ‘The long history of incremental adoption of laboratory technologies has left a legacy of proprietary approaches to interfacing laboratory and business systems. A data standard would be dependent on industry-wide agreement, approval by various regulatory bodies and other interested parties, and the willingness of the vendor community to cooperate. As strong as the business case may be, the task is therefore far from straightforward and has no guarantee of success.’

Trigg does see a path to salvation in those same external technological developments that are being driven by the wider consumer market and with which this article started – if every piece of laboratory equipment has its own IP address and can link to the internet (whether private or public networks), he believes, then a truly integrated business ecosystem that incorporates laboratory data and information management may come into being.

Until that happy day arrives, the guide to the perplexed that is offered by the Laboratory Informatics Guide 2014 remains indispensable.

 Dr Tom Wilkie is editor-in-chief of Scientific Computing World

