
Technology drives a new era in the laboratory

Robert Roe reports from the Paperless Lab Academy event on the potential impact of disruptive technologies, such as blockchain and IoT, in the laboratory.

The Paperless Lab Academy conference celebrated its sixth year in 2018. Held in a hotel on the bank of Lake Maggiore, near Milan, Italy, this was the first time the event had taken place in Italy, after three years in Barcelona.

The event kicked off with the usual introduction from organisers Isabel Munoz and Roberto Castelnovo, who welcomed attendees and highlighted some key facts, including a 10 per cent increase in visitor numbers.

The first keynote was delivered by Pat Pijanowski, managing director at Accenture Scientific Informatics Services (ASIS), and Dr Matt Ellis, senior manager, ASIS Europe, on the potential for blockchain technology to be used in the laboratory.

Pijanowski opened with quotes from various sources highlighting the hype around this technology in recent years, quoting The Economist, which called blockchain ‘a catalyst for global prosperity’, and Gartner, which predicted that ‘10 per cent of global GDP will be on blockchain in 10 years’.

While these claims refer to the use of blockchain as a technology underpinning the development of cryptocurrencies, the technology has other potential use cases that can help to increase the security and traceability of data. In the same way that blockchain can help to produce a secure and distributed record of transactions, the technology can also be applied to laboratory systems.

Blockchain allows for the creation of a chain consisting of blocks of transactions, with the order of transactions represented by their position in the chain. Transactions cannot be modified once recorded, and each participant, or node, keeps an identical copy of all transactions, so new data can only be added by consensus between the nodes.
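As a rough illustration of these mechanics, the sketch below shows how blocks of laboratory records could be chained together by including each block’s hash in the next, so that editing any earlier record invalidates every later hash. This is a minimal sketch, not taken from the presentation; the LabLedger class and all field names are illustrative, and a real deployment would add networking and consensus between nodes.

```python
import hashlib
import json
import time

def block_hash(body):
    """Deterministically hash a block's contents."""
    payload = json.dumps(body, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

class LabLedger:
    """A minimal hash-linked chain of laboratory data transactions."""

    def __init__(self):
        self.chain = []

    def add_record(self, record):
        body = {
            "index": len(self.chain),
            "timestamp": time.time(),
            "record": record,  # e.g. an instrument reading or result
            # Link this block to the previous one via its hash.
            "previous_hash": self.chain[-1]["hash"] if self.chain else "0" * 64,
        }
        body["hash"] = block_hash(body)  # computed before 'hash' is added
        self.chain.append(body)

    def verify(self):
        """Recompute every hash; an edit to any earlier block breaks the chain."""
        for i, block in enumerate(self.chain):
            body = {k: v for k, v in block.items() if k != "hash"}
            if block["hash"] != block_hash(body):
                return False
            if i > 0 and block["previous_hash"] != self.chain[i - 1]["hash"]:
                return False
        return True
```

Any node holding a copy of the chain, including a regulator, can then call verify() to confirm that no transaction has been altered since it was recorded.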

What does this mean for the laboratory? In simple terms, blockchain can be used to create a distributed database of data transactions that is secure and cannot be tampered with or edited after creation. This provides regulatory authorities such as the FDA with a full, chronologically ordered record of all experiments and data recorded in a laboratory for a given project.  

Pijanowski did note that this was not necessary for all scenarios, but highlighted two potential use cases where the technology could be applied: externalisation and increased data integrity.

In the case of data integrity, Pijanowski stated that ‘replication of data across all nodes provides a reliable source of truth.’ While a network of nodes can self-verify all transactions, an organisation could also introduce ‘smart contracts that could be used to apply business logic and process transactions in near to real-time.’
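To make the smart-contract idea concrete, a hypothetical extension of the ledger sketch above could run business rules over each transaction before it is accepted. The calibration-date check below is purely illustrative and not drawn from the presentation.

```python
from datetime import date

def calibration_current(record):
    """Illustrative business rule: reject readings taken on an
    instrument whose calibration has lapsed."""
    return date.fromisoformat(record["calibrated_until"]) >= date.today()

def add_validated_record(ledger, record, rules=(calibration_current,)):
    """Apply every rule before committing the record to the chain,
    in the spirit of a smart contract validating each transaction."""
    if all(rule(record) for rule in rules):
        ledger.add_record(record)
        return True
    return False  # rejected: business logic failed
```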

The cost of data

The following presentation, from Eric Little, chief data officer at Osthus, helped to clarify the importance of recording and storing data correctly in order to maximise its use, and therefore its value, to an organisation.

Little presented a strategy for reference master data management, titled ‘Meeting the Challenge to support analytics in the e-data lifecycle’.

Little noted that, in a world of increasingly costly and difficult research and development, pharma companies must reduce costs through better use of the data generated within an organisation.

Little stated that companies should ‘use the data you have before you generate more.’ This can be accomplished by reducing the number of re-run experiments, and through system integration and standardisation.

Little also highlighted the importance of automating ‘as much as possible’ once data is available. This process can start with simple recurring tasks, such as workflows, models and query patterns, before expanding from there.

‘We need to move from big data to big analysis,’ stated Little. ‘It’s about making big data small data; it’s about making big data usable data.’

The importance of maintaining data integrity

If costs and productivity alone do not make the case for drastic new approaches to the way in which data is recorded and stored, then these changes will be enforced through regulation.

One example of this can be found in the way that the new GDPR regulation will affect patient and medical data. Looking further into the future, Mark Newton, of Heartland QA, suggested that fraud and data integrity issues may result in significant changes to the way research data is collected, processed and retained.

Newton opened his presentation with some facts regarding fraud in scientific research, highlighting several examples, including a 2017 article reporting that the scale of the problem may be much larger than originally thought. He then informed the audience that, from 2015 to 2017, the FDA issued 130 warning letters, each representing findings of multiple serious or critical infractions.

Newton also highlighted a 2013 report published by the National Center for Biotechnology Information, which stated that ‘nearly 40 per cent of researchers knew of fraud but did not report it.’

During the talk, Newton noted that most loss of data integrity ‘happens at the point of collection’, but ‘any place where humans interact is an integrity risk point’.

As with the other speakers, Newton highlighted several ways to reduce the risk of incorrect or fraudulently entered data, much of which revolves around automating the process and removing the potential for human error or fraud.

This is another example of how blockchain could help to increase data integrity in the laboratory. Without systems integration, data or transactions created by manual entry still leave a possibility for fraud, but fraud is much easier to detect and prove in a system that records a full, unedited history of all data transactions in sequence.

The inherent transaction history and traceability provide a complete trail that facilitates regulatory oversight and increases an organisation’s auditability.

Pijanowski also noted that ‘distributed consensus’ can, in some cases, ‘reduce the costs associated with manual data transcription and reconciliation during the data lifecycle.’
This is a technology that is already starting to appear in the laboratory, particularly in highly regulated industries, and Pijanowski expects this growth to continue, citing a prediction that ‘in 2018, approximately 35 per cent of life sciences companies will deploy blockchain into their organisations.’

Preparing for the lab of the future

However, it is not just blockchain technology that threatens to disrupt traditional laboratory operations. Gerhard Noelken, business development at the Pistoia Alliance, highlighted this in his talk, which focused on using technologies such as IoT, machine learning and blockchain to create the ‘Lab of the Future’ (LotF), one of the Pistoia Alliance’s key themes for its work in 2018.

Noelken suggested that while technology may be adopted quickly by consumer markets, it takes significantly longer in the laboratory. To help accelerate this process, LotF was chosen as one of the strategic themes for 2018, in order to provide pre-competitive support for more rapid implementation of value-adding components in today’s laboratory environment.

Another point noted by Noelken was that the topics chosen by the Pistoia Alliance reflect problems or challenges that Pistoia encounters ‘again and again.’

There are several short and long-term areas of focus for the LotF project. Blockchain, IoT, augmented reality and the automation of intelligent systems are immediate goals, while the focus will later shift towards AI and, eventually, quantum computing and virtual research.

As an example of how this might drive new paradigms of research, the use of IoT and the huge amounts of genetic data it creates, coupled with AI and more powerful computing systems, will be fundamental pillars in the development of precision medicine. If these technologies are not adopted, however, this research will stall, so it is important for organisations such as the Pistoia Alliance to help drive the adoption of new technology.

In the immediate future, the plan is to help drive IoT technology into the lab. ‘We are evaluating the maturity and usability of IoT in an end-to-end use case, sharing actual data across the industry and analysing the data using artificial intelligence tools,’ commented Noelken.

‘The second project deals with the role of analytical method documentation in the lab. Translating the human-readable protocol into an electronic instruction set in a standard way would hugely improve reproducibility and data quality for experiments, from discovery to manufacturing,’ added Noelken.
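As a sketch of what such an electronic instruction set might look like, the example below expresses an analytical method as structured data rather than free text, so that each step can be executed or audited without human interpretation. The format and field names are hypothetical, and are not drawn from any real standard or from the Pistoia Alliance project itself.

```python
# A purely illustrative, machine-readable analytical method.
method = {
    "name": "Assay of compound X by HPLC",
    "steps": [
        {"action": "dilute", "sample": "X-001", "solvent": "methanol",
         "target_concentration_mg_ml": 0.5},
        {"action": "inject", "volume_ul": 10, "column": "C18"},
        {"action": "record", "detector": "UV", "wavelength_nm": 254,
         "duration_min": 15},
    ],
}

def to_instructions(method):
    """Flatten the structured method into an ordered instruction list
    that an execution system (or a human) can follow verbatim."""
    for i, step in enumerate(method["steps"], start=1):
        params = ", ".join(f"{k}={v}" for k, v in step.items() if k != "action")
        yield f"{i}. {step['action']}({params})"

for line in to_instructions(method):
    print(line)
```

Because every parameter is explicit, the same method file can be versioned, compared and replayed, which is the kind of reproducibility gain Noelken describes.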
