Paperless Lab Academy

At the Paperless Lab Academy (PLA), which took place in
Amsterdam on 13 and 14 May, managing change and the
limits to data sharing were key topics. Tom Wilkie reports

Gayle Dagnell, of the contract research organisation Evotec, summed up a central message of the Paperless Lab Academy (PLA), when she observed: ‘People initially did not want to change.’ But after implementation of a new informatics system within the organisation: ‘100 per cent said they did not want to go back to the old way of doing things’.

As the PLA’s chairman Peter Boogaard remarked, the comment encapsulated the theme of the meeting – that in moving from a paper-based to a paperless laboratory, successful implementation depends as much on successful ‘change management’ among the staff affected by the move as on the details of the technology itself.

The point was endorsed in the keynote address by Lawrence Barrett, programme director of Digital R&D at Unilever. But he also had some unexpected messages. When introducing a new informatics system, it is as important to get ‘buy-in’ from senior managers as from the scientists at the laboratory bench, he told the meeting.

During the initial phase, people naturally find that an unfamiliar system takes longer to work with than the old way of working which they know well. So lab scientists, faced with the need to complete assays quickly to meet production deadlines, may well suggest short-cuts to their managers. Managers, too, will be under pressure to ensure that deadlines are met, and it is an understandable reaction to resolve the dilemma by agreeing that the scientists can work around the new system and revert to the old methods. But the unintended consequence of a decision taken in good faith to meet the short-term priority can be slippage in the longer-term one – even though the new system will eventually save the company money and improve efficiency.

So part of the successful implementation of a new electronic lab notebook (ELN) at Unilever, Barrett said, was a set of meetings and briefings for senior managers to bring out the potential for conflict between short-term and longer-term goals, and the need to balance the two, especially during the early stages of implementation, before the improvements become apparent to all.

Barrett’s job is to help ‘connect people to a world of data’ and to capture and manage data consistently across the company’s R&D sites. So far, about 20 per cent of experiments produce data that is re-used later elsewhere in the organisation.

But, Barrett said: ‘There were many more projects than we had expected where the data had to be protected and not shared.’ Some of the data might be important in future for filing patents. Other projects represented long-term trials where it was necessary to prevent the premature disclosure of data to avoid the wrong conclusions being drawn from an incomplete data set.

Despite this, the need for better ways to share data is fundamental to the future, a point emphasised by Gerhard Noelken, business IT lead for Pfizer. At present, he said, laboratory scientists often claim that it is quicker and easier to repeat an experiment than to look for the data from an earlier one. Moreover, between 10 per cent and 25 per cent of data records contain errors or missing values. A three per cent error rate could push costs up by 30 per cent, he said.

So several pharmaceutical companies have come together in the Allotrope Foundation to build an open framework for laboratory data using common information standards. On the technical side, Allotrope is working with the German software company Osthus to create a commercial-quality framework with demonstration software to acquire, record, and automatically archive HPLC-UV data. But as Wolfgang Colsman, chief technology officer of Osthus, told the meeting, it is not enough to acquire, record, and archive data: ‘Finding and searching, to get the data back, is critical to archiving.’ The project was being guided by the ‘4Rs’ – data had to be retrievable, readable, reprocessable, and re-usable.

The sheer monetary value of being able to ‘breathe new life into dead data’ was starkly illustrated by Nick Nugent from ACD/Labs, who pointed out that searching for data was now taking longer, simply because so much data was being generated. Creating laboratory reports was taking too much time and leading to ‘death by cut and paste,’ he said.

A dark Cloud for traditional lab informatics vendors?

Just as in consumer computing, where Microsoft’s Office365 represents a shift to subscribing to software rather than buying a package outright, so the Paperless Lab Academy (PLA) in Amsterdam in May offered a glimpse of a future in which no one sold expensive stand-alone systems anymore but where laboratory informatics was delivered by software as a service (SaaS).

One trend driving the move to SaaS is that, across all industries, according to Steve Yemm of Core Informatics, research and specialised capabilities are being outsourced and externalised. Such externalising trends are not confined to research either, but are pushing through into development work and even into manufacturing.

But there is a problem, he pointed out: how do you capture and reuse your data across that chain of external contributors? Core Informatics’s solution is to offer its ‘Platform for Science’ as a unified technology incorporating the features of a LIMS, ELN and scientific data management system (SDMS), and to do so in the form of software as a service (SaaS). It is 100 per cent web-based, Yemm said, and can run on a company’s internal servers – a private cloud – on Core Informatics’s own servers, or even on Amazon Web Services. Companies could build applications on the platform and it was scalable to thousands of users if need be.

The pharmaceutical industry has concerns with distributed computing. Its lifeblood is patenting, and it is also a highly-regulated industry. But, as Paul Denny-Gouldson from IDBS pointed out, the banking industry has even greater concerns about the integrity of its data, yet it has successfully rolled out internet and mobile banking to its customers. ‘If the bankers can make this happen,’ he asked, ‘why can’t we?’ But, he counselled, it had taken the banks 15 years to build up their electronic systems. He warned that organisations cannot absorb too much change too quickly. It may be different, he continued, in 10 years’ time when all laboratory instruments are connected to the network, but ‘today you have to be pragmatic’.

Is the end of the QC lab and its LIMS in sight?

Lab informatics software faces a challenge from the rise of web and cloud-based services, but it may also be under threat from a different direction – not on-line but in-line: the growth of process-analytical quality assurance in place of end-product quality control.

Many manufacturers rely on end-of-line quality control (QC), or ‘quality by testing’. In-line quality assurance (QA), on the other hand, focuses on what happens throughout the manufacturing process rather than taking a snapshot of the output at the end. It thus offers a way of feeding back information while the process is still running, preventing errors before they arise.

Jan Verelst, from Siemens, summed up the trend in one question to the Paperless Lab Academy held in Amsterdam in May: ‘Do we need to stay with the traditional QC laboratory?’

Quality by design (QbD) can be a way of improving productivity and reducing manufacturing costs. One critical underpinning technique is ‘process analytical technology’ (PAT) – instruments and software that make it possible to measure critical quality attributes and process parameters in-line, allowing running processes to be adjusted in real time. As Verelst put it: ‘While we are making it, we are predicting the quality of the final product. The lab is no longer needed to measure end-product quality, because it was done in-process.’
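The PAT idea of steering a running process can be sketched in a few lines of code. This is a minimal, purely illustrative model – a proportional correction nudging a process parameter whenever an in-line reading of a critical quality attribute drifts from its target; all names, setpoints, and gains are invented and do not represent any real PAT system.

```python
# Illustrative sketch of in-line process analytical technology (PAT):
# an in-line measurement of a critical quality attribute (CQA) feeds
# back into a process parameter while the batch is still running.
# All names, setpoints, and gains are invented for illustration.

def adjust_parameter(parameter: float, cqa_reading: float,
                     setpoint: float, gain: float = 0.5) -> float:
    """Simple proportional correction: move the process parameter
    in proportion to the deviation of the measured CQA from target."""
    error = setpoint - cqa_reading
    return parameter + gain * error

def run_batch(readings, setpoint=7.0, parameter=1.0):
    """Apply a correction after each in-line reading and record the
    parameter trajectory - quality is steered during the run, not
    checked only at the end."""
    trajectory = []
    for reading in readings:
        parameter = adjust_parameter(parameter, reading, setpoint)
        trajectory.append(parameter)
    return trajectory

# Readings drifting below target cause the parameter to be raised.
print(run_batch([6.5, 6.8, 7.1], setpoint=7.0))
```

The point of the sketch is the contrast with quality by testing: the correction happens while the batch is being made, rather than a pass/fail verdict arriving after it is finished.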

The penalties of waiting for results from the QC lab were graphically illustrated by another contributor to the Paperless Lab Academy who recounted how one pharmaceutical manufacturer lost a batch of product worth up to $20 million. It was a short-lived product which expired on the quayside while waiting to be loaded on a ship for export. Unfortunately, the certificate of analysis from the laboratory did not arrive until after the batch had expired, so it all had to be thrown away.

Jan Verelst believes the traditional QC lab might still have a future at the input end of the process – verifying that raw materials coming from suppliers are in accordance with the specification. However, the meeting heard how the medical device company Medtronic was moving in the opposite direction: according to the company’s Iraida Quinn, a ‘supplier-owned quality programme’ at Medtronic’s premises in Galway, Ireland, was pushing quality control back to the suppliers themselves.

Medtronic has opened a web portal using the PerkinElmer iLab and a SAP QM connector. When Medtronic places an order through its SAP system, the system requests samples that have to be tested by the supplier to check conformity with Medtronic’s specifications, and the supplier enters the results through the iLab portal. Provided the test results meet the specification, the system generates the authorisation for the supplier to ship to Medtronic. Thus the workload on Medtronic’s own QC laboratory is cut back.
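The logic of that supplier-owned quality gate can be sketched as a simple conformity check: an order carries specifications, the supplier submits test results, and shipment is authorised only if every result falls within range. The class and field names below are hypothetical; they do not reflect the actual iLab or SAP QM interfaces.

```python
# Hypothetical sketch of a supplier-owned quality gate: shipment is
# authorised only when the supplier's test results conform to the
# buyer's specifications. Names are invented for illustration and do
# not represent the real iLab portal or SAP QM connector.

from dataclasses import dataclass

@dataclass
class Specification:
    attribute: str
    minimum: float
    maximum: float

def authorise_shipment(specs, results):
    """Return True only if the supplier's result for every specified
    attribute is present and within its allowed range."""
    for spec in specs:
        value = results.get(spec.attribute)
        if value is None or not (spec.minimum <= value <= spec.maximum):
            return False
    return True

specs = [Specification("tensile_strength", 50.0, 60.0),
         Specification("purity", 99.0, 100.0)]

# Conforming results: the system would generate the authorisation to ship.
print(authorise_shipment(specs, {"tensile_strength": 55.2, "purity": 99.5}))   # True

# Out-of-spec purity: shipment is withheld.
print(authorise_shipment(specs, {"tensile_strength": 55.2, "purity": 98.0}))   # False
```

The effect, as described at the meeting, is that the conformity decision is made at the supplier’s end before goods move, rather than in the buyer’s QC laboratory after they arrive.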

Peter Boogaard, chairman of the Paperless Lab Academy, believes changes in the role of the traditional QC laboratory are inevitable and that there are profound implications for its associated informatics software. Quality assurance may indeed be done in-line, he believes, to avoid waiting for samples to be sent for analysis and results to be returned to the production plant.

Such a change in the role of the laboratory may well bring changes for laboratory software in its wake. According to some observers, it could mean the end of LIMS as a stand-alone piece of software and its incorporation into product lifecycle management systems.

But Boogaard believes that there are ways to prevent the lab from going out of business. One option, he believes, could be for it to switch to problem-solving. If something has gone wrong on the production line, then the lab can create a new and valued role for itself by identifying what has gone wrong and coming up with a solution – moving from routine testing of product to the appliance of science to the process itself.

