Designing the lab of the future

Scientists are beginning to use new technologies such as the internet of things (IoT), artificial intelligence (AI) and machine learning (ML) in their daily workflows. What was once a niche option for a particular challenge is becoming commonplace in many lab environments. In response, informatics software providers are designing tools that help integrate these technologies into the workflows and software platforms scientists use to run their laboratory operations.

A trend that applies to almost all lab-based industries is the increasing amount of data – whether generated in the laboratory, drawn from previous work by the organisation and its partners, or taken from public data repositories. This growth in data helps to drive innovations such as AI. Deep learning (DL), for example, requires huge datasets to train a neural network properly, so that it can make predictions with the accuracy and precision required for scientific research.

IoT is one source of this increase, as sensors and other IoT devices provide constant streams of data that need to be analysed and interpreted. Even when these devices are not generating scientific data directly – when they support maintenance, for example – their output must still be stored and managed, so it can provide contextual or maintenance information alongside the raw data from laboratory experiments.
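As an illustration of what 'stored and managed' might mean in practice, the following minimal Python sketch captures sensor readings together with a context tag and persists them locally. The device names, field names and SQLite storage are assumptions made for the example, not any particular vendor's interface.

from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import sqlite3

@dataclass
class SensorReading:
    device_id: str     # hypothetical identifier, e.g. 'freezer-03'
    quantity: str      # what is measured, e.g. 'temperature_C'
    value: float
    timestamp: str     # ISO 8601, UTC
    context: str       # why the reading matters: 'maintenance', 'experiment', ...

def store(readings, db_path='lab_iot.sqlite'):
    # Persist readings so they remain available later as contextual information.
    con = sqlite3.connect(db_path)
    con.execute('CREATE TABLE IF NOT EXISTS readings '
                '(device_id TEXT, quantity TEXT, value REAL, timestamp TEXT, context TEXT)')
    con.executemany('INSERT INTO readings VALUES '
                    '(:device_id, :quantity, :value, :timestamp, :context)',
                    [asdict(r) for r in readings])
    con.commit()
    con.close()

if __name__ == '__main__':
    now = datetime.now(timezone.utc).isoformat()
    store([SensorReading('freezer-03', 'temperature_C', -79.6, now, 'maintenance'),
           SensorReading('incubator-01', 'co2_percent', 5.02, now, 'experiment')])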

Data is also increasing from other sources. Previous shifts in laboratory operations, designed to meet regulation or to extract more insight from available data sources, mean that many laboratories have begun to store more data. It comes from a variety of sources: experimental data, including 'failed' experiments; data from organisation partners and collaborations; metadata and contextual data from laboratory instruments; and AI or DL architecture parameters. All of it must be saved so that the science remains reproducible and valuable when needed in the future.
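A minimal sketch of what keeping such data reproducible can look like is shown below; the field names are illustrative assumptions rather than any standard schema, but the point is that raw results, instrument metadata and model parameters are kept together in a single record.

from dataclasses import dataclass, field
from typing import Any, Dict, List

@dataclass
class ExperimentRecord:
    # One self-contained, reproducible unit of work (field names are illustrative).
    experiment_id: str
    outcome: str                                        # 'success', 'failed', 'inconclusive'
    raw_data_uri: str                                   # where the measurements live
    instrument_metadata: Dict[str, Any] = field(default_factory=dict)
    model_parameters: Dict[str, Any] = field(default_factory=dict)  # e.g. DL hyperparameters
    collaborators: List[str] = field(default_factory=list)

record = ExperimentRecord(
    experiment_id='exp-2018-0421',
    outcome='failed',                                   # 'failed' runs are kept too
    raw_data_uri='s3://lab-archive/exp-2018-0421/plate_reads.csv',
    instrument_metadata={'plate_reader': 'model-X', 'firmware': '2.1'},
    model_parameters={'layers': 4, 'learning_rate': 1e-3},
    collaborators=['partner-org-a'],
)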

To address the challenge of growing data volumes, and to explore the potential of integrating new technology into the lab, the Pistoia Alliance launched Lab of the Future as one of its primary strategic themes in 2018. The project aims to explore and accelerate the use of technologies that could be key to unlocking new capabilities in the laboratory.

In Scientific Computing World earlier this year Nick Lynch, external advisor and liaison at the Pistoia Alliance, explained the vision for Pistoia’s Lab of the Future: ‘The purpose of the Laboratory of the Future (LotF) is to try and envision what a laboratory might look like in the years to come – partly because there are so many potential technologies that are beginning to mature. Equally, we see other ways that people want to work, because of collaborations or other technologies like tablet and mobile.’

‘It is a chance to put our thinking caps on and imagine what it might be like in five years’ time, but also to put some practical steps down as to what we should be doing to try out technologies and turn the vision into reality in a number of areas. That is why it is laid out more as a roadmap: we might not get there in one step, but we want to explore the range of things, as well as doing some practical activities in the short to medium term,’ added Lynch.

Creating the lab of the future

Several firms are already embracing technologies that could shape the future of laboratories. DNAnexus, a Pistoia Alliance member, focuses on streamlining and accelerating genomic science with its cloud-based informatics and data management platform. It aims to help companies increase their sequencing capacity through a cloud-based system that allows users to build and configure custom pipelines and facilitates regulatory compliance.

‘We feel that DNAnexus allows the problem of large scale, reliable, and accurate application of genomics to be solved through our services,’ said DNAnexus CEO Richard Daly. ‘The next generation of innovations involves the combination of that genomic data with phenotypic and clinical data with the same production-grade reliability and automation that is possible in the generation of the genomic data.

‘Key ingredients in that development are innovative new ML and AI technologies. These methods – such as deep learning – allow models to discover underlying concepts in raw data, without requiring significant feature engineering, while presenting more extensibility. These methods change the problem of driving insight from one that focuses mostly on engineering to one that focuses mostly on acquiring the highest quality, most diverse, best labelled and most representative datasets for the training of the best models,’ said Daly.

‘As a result, this will shift the leverage to drive discovery more toward the domain experts – genomicists, clinicians and biologists who best understand their data and will be able to leverage the engineering frameworks to train and understand models without needing to be engineers themselves,’ added Daly.
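A rough, deliberately simplified illustration of the shift Daly describes is sketched below: a classifier is trained directly on one-hot encoded sequences rather than hand-crafted features. The model is a toy stand-in (scikit-learn logistic regression rather than a deep network), and the sequences, labels and labelling rule are invented, but the effort visibly sits in assembling well-labelled data, not in engineering features.

import numpy as np
from sklearn.linear_model import LogisticRegression

BASES = 'ACGT'

def one_hot(seq):
    # Encode a DNA sequence as a flat one-hot vector: raw data, no hand-built features.
    m = np.zeros((len(seq), len(BASES)))
    for i, base in enumerate(seq):
        m[i, BASES.index(base)] = 1.0
    return m.ravel()

# Invented toy dataset: label is 1 when the sequence starts with 'A'.
sequences = ['ACGTACGT', 'AGGGCCCC', 'ATTTTTTT', 'TTTTACGT', 'GGGGCCCC', 'CCCCGGGG']
labels    = [1,          1,          1,          0,          0,          0]

X = np.array([one_hot(s) for s in sequences])
y = np.array(labels)

model = LogisticRegression().fit(X, y)   # the model learns the pattern from raw encodings
print(model.predict(np.array([one_hot('AAACGTAA'), one_hot('GCCCGGGG')])))  # expected: [1 0]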

Transcriptic, another Pistoia Alliance member, has designed the Transcriptic Common Lab Environment (TCLE), a cloud-based platform that acts as a laboratory operating system. It allows protocols to be shared, and connects instruments and robotic arms so they can be managed remotely.

Yvonne Linney, CEO of Transcriptic, explained that the company grew out of the founder’s frustration at having to repeat routine experiments during an internship in a professor’s lab. ‘He was doing bio-engineering and part of the internship involved taking readings from various analytical instruments at whatever time of day or night,’ said Linney.

‘He thought “there has got to be a better way of doing this,” and so he started to think about some type of remote access labs and instruments, so the initial idea was about connecting devices and robotic arms to a cloud-based infrastructure,’ said Linney.

Certara is also developing technology to address changing workloads in the laboratory. While the company is best known for drug development consultancy, it also produces software that aims to accelerate the entire drug discovery process.

The company is now focusing on the development of modelling and simulation solutions which can help further enhance its users’ capabilities to take new drugs to market. Thomas Kerbusch, PhD, president of Certara strategic consulting services, said: ‘The development cycle is being expedited and positively impacted by new technology. At Certara, we are focused on the use of new technologies, especially modelling and simulation (M&S). The use of M&S has been encouraged by global regulators, to the point it is now essential to modern drug development.

‘M&S, to some extent, is used in almost every new drug approval today. It is used across the entire development cycle – from assessing drug safety and efficacy, determining dose selection, and addressing the needs of special populations – to making go/no-go R&D and commercial decisions – and assessing alternative formulations and opportunities for successful new drug indications. While this increased use is well underway, we see the opportunity for M&S to grow by three- or four-fold during the next decade,’ added Kerbusch.
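As a concrete, if heavily simplified, example of the kind of model behind such dose-selection work, the sketch below simulates a one-compartment pharmacokinetic model with first-order absorption. The parameter values are invented for illustration and do not describe any real compound or Certara product.

import numpy as np

def one_compartment_oral(dose_mg, F, ka, ke, V_L, t_h):
    # Concentration-time profile for a one-compartment model with first-order absorption:
    # C(t) = (F * Dose * ka) / (V * (ka - ke)) * (exp(-ke*t) - exp(-ka*t))
    return (F * dose_mg * ka) / (V_L * (ka - ke)) * (np.exp(-ke * t_h) - np.exp(-ka * t_h))

t = np.linspace(0, 24, 97)   # hours, on a 15-minute grid
conc = one_compartment_oral(dose_mg=100, F=0.8, ka=1.2, ke=0.15, V_L=40, t_h=t)

print(f'Cmax ~ {conc.max():.2f} mg/L at t ~ {t[conc.argmax()]:.1f} h')
print(f'C(24 h) ~ {conc[-1]:.2f} mg/L')

Population-scale versions of models like this, with variability between patients, feed the dose-selection and special-population questions Kerbusch mentions.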

‘Software today needs to be more modular and interconnectable than ever. That need for a holistic ecosystem underpins our development programs, allowing users to incorporate multiple types of software in their work, while providing an infrastructure that enables them to best organise, share, communicate and leverage their work. For example, we recently purchased a product called Pirana, a flexible, extendible pharmacometrics workbench that provides modellers with structure, tools and a graphical user interface to facilitate the iterative processes used to create models and perform simulations. That product leverages tools that exist today and those that will emerge in the future,’ said Kerbusch.

‘The diversity of our users’ needs drives our development strategy, which ensures that we provide them with robust products. Developing a fully-supported, maintained and validated software product requires a sustained commitment, investment and supporting infrastructure, including resources dedicated to software training, education around coding language and use-case proficiency, customer software and license support, ongoing maintenance, and validation for compliance with 21 CFR Part 11 requirements,’ said Kerbusch.

Rise of the machines

At Transcriptic, the vision for the future of the laboratory is based on automating menial processes to free scientists to focus on more important tasks. However, that is not to say all experiments should be carried out by robotic arms; the system uses cloud-based informatics software to pass protocols to humans and robotic systems. The added benefit is that processes and protocols can be meticulously maintained, so an organisation can ensure scientific procedure is carried out correctly across multiple users, labs or even different sites around the world.

‘The vision that we have got is really the development of a programmatic laboratory. You could call this a robotic lab or a cloud lab, but basically it is about converting all those important instructions into a format, i.e. code, that can be easily transferred – removing the human interpretation, so that either the human or the machine is doing things in exactly the same way,’ said Linney. ‘You don’t necessarily need a full robotic arm to use the Transcriptic system; in fact we have a combination of robotic interfaces and human interfaces.’

Transcriptic uses its own software system, TCLE, to control the Robotic Cloud Lab. ‘It is the overlying process that is involved in running that lab. We really think about it as an operating system, in the same way that you would have a similar system for your computer. All instructions and data are fed into TCLE, which is then fed into the cloud and passed down to either the instruments or the scientist.

‘We have developed these work cells which are a collection of devices and, like other automated systems, some might be connected to a robotic arm. All that is run through TCLE, which is the programmatic interface over the top,’ added Linney. ‘Everything is connected through that lab, whether it is connected to a robotic arm or through a human interface. All the instruments are connected through very small computer interfaces, then those instruments are tied into the operating system.’
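One way to picture that ‘programmatic laboratory’ idea is sketched below: a protocol expressed as plain data, with each step dispatched either to an instrument driver or to a human-readable work instruction. This is a hypothetical illustration of the general approach, not TCLE’s actual schema or API.

# Hypothetical protocol-as-data illustration; not TCLE's real format.
protocol = [
    {'step': 'dispense', 'target': 'plate_1', 'reagent': 'buffer_A', 'volume_ul': 50,
     'executor': 'robot'},
    {'step': 'incubate', 'target': 'plate_1', 'temp_c': 37, 'duration_min': 30,
     'executor': 'robot'},
    {'step': 'visual_check', 'target': 'plate_1', 'instruction': 'Confirm no wells are dry',
     'executor': 'human'},
]

def run_on_robot(step):
    # In a real system this would call an instrument driver in the work cell.
    params = {k: v for k, v in step.items() if k not in ('step', 'target', 'executor')}
    print(f"[ROBOT] {step['step']} on {step['target']}: {params}")

def queue_for_human(step):
    # In a real system this would appear as a task in the scientist's interface.
    print(f"[HUMAN TASK] {step['step']} on {step['target']}: {step.get('instruction', '')}")

for step in protocol:
    # Because every step is explicit data, the same protocol runs identically
    # whether a robot arm or a person carries it out, in any lab or at any site.
    (run_on_robot if step['executor'] == 'robot' else queue_for_human)(step)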

While the idea of automation has been around for some time, Linney suggests it was previously confined to specialist applications, with a system typically suited to only a single instrument or experiment. ‘What is different – and I used to run the automation division at Agilent – is that even there we were finding that people didn’t have the resources and the funding anymore. This was large pharma, as well as smaller companies. These organisations did not want to go out and buy a set of equipment to complete a single process over and over again, because those processes were likely to change. This led to redundant instrumentation.

‘The way things are going is to think more about how we can produce a more generic automated lab system that can be used for a lot of different experiments, rather than a single type of protocol over and over again,’ added Linney.

‘The whole idea of the LotF, where humans and instruments are connected together, is an area of great interest to everybody. We are thinking about how to ensure that everybody is working from the same interface, from the sharing of information and the sharing of data, so it really is consistent with where it has been produced,’ concluded Linney.

AI in the lab

AI and related disciplines, such as machine learning and deep learning, are making an impact on how scientific research is conducted.

‘We foresee that AI-enabled data extraction and structuring will start to enhance this process [drug development]. But not until the onset of actionable deep-learning methodologies can we expect to reduce the human interpretation element of creating databases. Until that time, AI will facilitate repetitive tasks, like extraction, to speed up the creation and update of databases,’ said Kerbusch.

Certara has already begun preparing scientific data for this: the business is curating and organising publicly available trial data so that it can be used in quantitative model-based assessments for drug development, market access/formulary and patient care decisions. Certara is currently working on databases across more than 40 therapeutic areas.

‘Additionally, we have an AI-technology called ClinGenuity that is used for creating clinical study reports and building narratives, along with the redaction of patient-protected information in clinical documents in advance of publishing,’ added Kerbusch.
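As a much simpler stand-in for the kind of repetitive task such tooling automates, the sketch below redacts a few obvious identifier patterns with regular expressions. It is purely illustrative: the patterns and document text are invented, and ClinGenuity’s actual AI-based approach is far more sophisticated than this.

import re

# Illustrative patterns only; real de-identification covers many more categories.
PATTERNS = {
    'DATE':       re.compile(r'\b\d{1,2}/\d{1,2}/\d{2,4}\b'),
    'PATIENT_ID': re.compile(r'\bPT-\d{4,}\b'),                   # hypothetical ID format
    'PHONE':      re.compile(r'\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b'),
}

def redact(text):
    # Replace matched identifiers with bracketed placeholders.
    for label, pattern in PATTERNS.items():
        text = pattern.sub(f'[{label} REDACTED]', text)
    return text

narrative = 'Patient PT-10432 was enrolled on 04/12/2017 and contacted at 555-867-5309.'
print(redact(narrative))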

Not every laboratory needs to be fully automated, based in the cloud and run on extensive AI workflows. But the maturation of these technologies for laboratory users is crucial to creating the labs of the future. By adopting technology carefully, based on the needs of lab users and their workflows, laboratory managers can streamline and accelerate the way scientists carry out research.


