Paperless laboratories do better science

In the first of his three reports from the Paperless Lab Academy (PLA), which took place in Amsterdam last week, Tom Wilkie considers the importance of managing change and the limits to data sharing

Gayle Dagnell, of the contract research organisation Evotec, summed up a central message of the Paperless Lab Academy (PLA) in Amsterdam when she observed: ‘People initially did not want to change.’ But after implementation of a new informatics system within the organisation: ‘100 per cent said they did not want to go back to the old way of doing things’.

As the PLA’s chairman, Peter Boogaard, remarked, the comment encapsulated the theme of the meeting: in moving from a paper-based to a paperless laboratory, successful implementation depends as much on ‘change management’ among the staff affected by the move as on the details of the technology itself.

The point was powerfully endorsed in the keynote address by Lawrence Barrett, programme director of Digital R&D at Unilever. But he also had some unexpected messages. When introducing a new informatics system, it is as important to get ‘buy-in’ from senior managers as from the scientists at the laboratory bench, he told the meeting. During the initial phase, people naturally find that an unfamiliar system takes longer to work with than the old way of working, which they know well. So lab scientists, faced with a need to complete assays quickly to meet production deadlines, might well suggest short-cuts to their managers. Managers too will be under pressure to ensure that deadlines are met. It is an understandable reaction to resolve the dilemma by agreeing that the scientists can work around the new system and revert to the old methods. But the unintended consequence of a decision taken in good faith to meet the short-term priority can be slippage on the longer-term one, even though the new system will eventually save the company money and improve efficiency. So part of the successful implementation of a new electronic lab notebook (ELN) at Unilever, Barrett said, was a set of meetings and briefings for senior managers to bring out the potential for conflict between short-term and longer-term goals, and the need to balance the two, especially during the early stages of implementation before the improvements become apparent to all.

Unilever’s corporate strategy is to double the size of its business while decreasing its environmental footprint. Research and development is critical to achieving this goal, Barrett said, and it was his job to help deliver ‘digitally enabled R&D’ through the company’s e-Science programme. Its mission, he continued, was ‘to connect people to a world of data’ and to capture and manage data into the future in a way that was consistent across the company’s R&D sites. And it appears to be succeeding: so far, about 20 per cent of experiments have produced data that is re-used later elsewhere in the organisation.

At this point came a second unexpected message: ‘There were many more projects than we had expected where the data had to be protected and not shared,’ he said. Some of the data might be important in future for filing patents, and so needed to be protected as the company’s intellectual property. Other projects represented long-term trials where it was necessary to prevent the premature disclosure of data to avoid the wrong conclusions being drawn from an incomplete data set.

The company also has students and temporary staff working in its laboratories, so sensitive information needed to be protected from those who did not have the right to see it.

Given that one of the bases of the case for introducing an ELN was the expected benefit of increased data-sharing, Barrett remarked that they had been surprised by the amount of data where sharing was not in the company’s interests. The move towards a paperless system was teaching them about their own business, he said.

The ELN ‘was the right tool for discovery’ but there was still discussion internally as to whether it would be useful in other areas, he said. Some 260 of the 300 people who have been trained in the ELN are still using it, and there are plans to roll it out to a further 700. Nonetheless, he pointed out: ‘Unilever is a couple of years into a journey. It will take 10 years to drive change in thinking. The aim is to change people’s relationship to data and the ELN has been a great way to start.’

The need for better ways to share data was emphasised by Gerhard Noelken, Business IT Lead for the pharmaceutical company Pfizer. At present, he said, laboratory scientists often claim that it is quicker and easier to repeat an experiment than to look for the data from an earlier one. Moreover, between 10 and 25 per cent of data records contain errors or missing values; a 3 per cent error rate could push costs up by 30 per cent, he said. It is to remove such issues that several pharmaceutical companies have come together in the Allotrope Foundation to build an open framework for laboratory data using common information standards. Allotrope is already delivering proof-of-concept applications, he pointed out, and in June it will host an open meeting in Aachen to discuss its work with other organisations, outside the pharmaceutical industry, that have similar needs for standardised laboratory data.

There are some surprising obstacles to the pharmaceutical industry’s efforts to deliver common data formats, not least being US anti-trust legislation. Thus the secretariat of Allotrope is provided, not by scientists or IT professionals, but by a firm of Washington lawyers, to ensure that there is no hint of commercial collusion. On the technical side, Allotrope is working with the German software company Osthus to create a commercial-quality framework with demonstration software to acquire, record, and automatically archive HPLC-UV data. It has three aspects: a common non-proprietary file format based on AnIML; a meta-data repository; and freely distributable class libraries – software tools – that instrument vendors can use to ensure common information standards.

As Wolfgang Colsman, Chief Technology Officer of Osthus, told the meeting later, it is not enough to acquire, record, and archive data: ‘Finding and searching, to get the data back, is critical to archiving.’ The project was being guided by the ‘4Rs’: data had to be retrievable, readable, re-processable, and re-usable.

The sheer monetary value of being able to ‘breathe new life into dead data’ was starkly illustrated by Nick Nugent from ACD/Labs, who pointed out that searching for data was now taking longer simply because so much data was being generated. Creating laboratory reports was taking too much time and leading to ‘death by cut and paste,’ he said. If improved tools for searching data could shave just one month off the time to bring a new chemical entity to market in the pharmaceutical industry, that would represent more than $10.5 million in net present value.

