Digital technologies drive laboratory transformation

‘Choosing a system that offers flexible configuration to match specific teams’ needs is key to digital transformation,’ says Sharon Williams, Interactive Software’s product director. The UK-based firm works with universities and pharma to configure its Achiever Medical LIMS, designed for managing complex workflows.

It’s important to look at the operational requirements of any potential user, to understand what they already have in place to manage their workflows, and to identify where their deficits lie, Williams noted. ‘We go into labs, see how they operate, and try to understand their end-to-end processes. We need to see what they are struggling with – they may still be relying on spreadsheets that get emailed between people with little or no security, for example – and then we can highlight those deficits.’

It’s ‘gap analysis’ at a high level, she continued, and it is about more than reducing or removing the need for manual data input. ‘Once we understand where they are now, and what their aims are, we can then demonstrate how we can fill the gaps and build in value, security and efficiency.’ Not infrequently, implementing a new platform also brings to light processes that had been partially buried. ‘We’ve had customers come to us and say that they need to establish an SOP for a process that they only realised they were carrying out when they automated the workflow,’ said Williams.

Understand the ultimate endgame and you can work backwards from there, suggests Williams. ‘People tend to get tied up with looking at the software platforms that are on the market, rather than starting with what they want to achieve and how they want to get there. Realistically we can’t change the world in one go. Our aim is to try to break the problem down and look at what is going to have the greatest impact. Then we can start from there and work outwards.’

Is a LIMS always necessary?

Organisations may also try to shoehorn their existing processes into mismatched legacy software, either through configuration, customisation or upgrades. ‘This may seem like the logical option, but if the software isn’t designed to do the job, then the outcome may be disappointing, and costly,’ Williams stated.

‘You have to be realistic about what your existing software can do, and acknowledge when it may be better all round to look for something that is more fit for that purpose.’

In fact, companies are quick to assume that they will need an expensive, complex LIMS platform, but a LIMS is not always necessary, she continued.

‘If you are only dealing with a couple of hundred samples, and there is no complexity associated with those samples’ data, or with what you are doing with them, then commercial sample tracking software may be sufficient to help you manage samples. Such tools are reasonably priced, and would do the job for you.’

Interactive Software works with organisations that do have the scale or complexity of workflow to require an informatics infrastructure founded on a LIMS for sample management. ‘For organisations working in areas that involve accessing patient data and samples, the implications for security, and permissions for individual users, are uppermost, and so the concept of digital transformation may be highly reliant on security,’ she noted. The ability to set authorisations and access permissions at the individual user level is therefore critical.

Journey from any starting point

Of course, every laboratory or lab-based organisation will have its own vision for short- and long-term digital transformation, according to where they already are on their digital journey.

Some organisations’ labs and businesses have taken steps to become paperless but, according to Gabi Koberg, senior regional sales manager, EMEA, at Abbott Informatics, it is not uncommon for organisations to remain reliant on paper and/or Excel, especially when it comes to performing lab procedures such as sample preparation.

‘We see that even in 2020 there are still companies that haven’t yet implemented fundamental IT software solutions, such as an enterprise resource planning (ERP) platform.’

For these organisations, in particular, but also for those that might be much further down the digital road, it is critical to be realistic about immediate goals. A good starting point, according to Andreas Schüler, technical solution design manager EMEA at Abbott Informatics, could be looking for opportunities to implement software that will offer quick and immediate benefits.

‘For example, using mobile technologies and associated software for functions – such as environmental sampling or sample preparation for testing – where there is no ready access to a PC, can have immediate time and error-saving benefits.’

The ubiquitous nature of barcoding has now made this a relatively simple form of digitalisation to implement, suggested Schüler. ‘It won’t necessitate high overheads, and the relevant software and equipment – everyone these days is familiar with a touchscreen-enabled tablet, or mobile – is easily integrated into daily working practices.’

RFID tagging technology also offers an opportunity to completely automate inventory. ‘Stock management can become a self-documenting process, with the benefit that you are always aware of where any given piece of inventory is, and how much of it you still have. Not only that, but feasibly the system will automatically generate a purchase order to restock an item once a threshold level has been reached,’ added Schüler.
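The threshold-triggered reordering Schüler describes can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation; the item names, field names and quantities are invented:

```python
from dataclasses import dataclass

@dataclass
class StockItem:
    tag_id: str            # RFID tag identifier read by the system
    name: str
    quantity: int          # current count from the last inventory scan
    reorder_threshold: int # trigger level for a new purchase order
    reorder_amount: int    # how much to order when triggered

def check_reorder(items):
    """Return purchase orders for every item at or below its threshold."""
    orders = []
    for item in items:
        if item.quantity <= item.reorder_threshold:
            orders.append({"item": item.name, "amount": item.reorder_amount})
    return orders

inventory = [
    StockItem("RFID-001", "pipette tips 200ul", quantity=3,
              reorder_threshold=5, reorder_amount=20),
    StockItem("RFID-002", "PCR plates", quantity=40,
              reorder_threshold=10, reorder_amount=50),
]
print(check_reorder(inventory))  # only the pipette tips are below threshold
```

In a real deployment the quantities would be updated automatically from RFID reads, making the check a side effect of normal stock movement rather than a manual stocktake.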

Consider user acceptance

‘For any organisation considering advanced technologies, the human element is a major consideration,’ Schüler continued. If we think about implementing relatively well-known digital systems, whether that be barcode scanning, or transitioning from other paper-based workflows to tablet-based systems, it’s likely that the end-user will only have to learn how to use a new touchscreen digital interface for data recording and access. ‘For this type of digital system, the threshold for user acceptance probably isn’t that great,’ Schüler suggested. ‘Plus, users can see the immediate benefits of not having to record their data manually, scan it into a system, and then archive that paper original. Instead, they immediately have their data digitally stored and easily accessible.

‘Progress to the concept of using more complex technologies, whether that be sophisticated voice control, or augmented reality, and there may well be more human resistance, both with respect to learning how to use the technology itself, but also with the potential that such systems may watch, listen, record and report on their every move throughout the day.’

In this environment, end-to-end automation is just more complicated, Schüler noted. ‘It will always be more difficult to implement for R&D labs, simply because the work being done is typically less routine. The more routine your work is, the easier it is to optimise and make it more efficient digitally.’

Digitisation and automation are also key goals for the R&D lab, Koberg continued. ‘Implementing advanced technologies, such as predictive modelling, could allow labs to reduce the number of experiments they are carrying out, and save on both resources and time. Such technologies can feasibly aid in the design of experiments that provide more insight, which would reduce development time, and ultimately speed time-to-market for new products.’

The ability to reuse data is really important for maximising insight from historical experiments. ‘One key priority is to reduce the amount of unstructured data, and so put as much data and metadata as possible into a structured format that can be easily accessed and mined in the future,’ added Koberg.
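As an illustration of what ‘structured’ can mean in practice, here is a minimal sketch (the field names and values are invented for this example) that captures a result together with its metadata as a JSON-serialisable record, rather than as free text:

```python
import json
from datetime import datetime, timezone

def record_measurement(sample_id, assay, value, unit, **metadata):
    """Store a result with its metadata as a structured, mineable record."""
    return {
        "sample_id": sample_id,
        "assay": assay,
        "result": {"value": value, "unit": unit},
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "metadata": metadata,  # instrument, operator, batch, etc.
    }

rec = record_measurement("S-0042", "protein_conc", 1.8, "mg/ml",
                         instrument="NanoDrop-2000", operator="jdoe",
                         batch="B7")
print(json.dumps(rec, indent=2))
```

Because every field is named and typed, queries such as ‘all protein concentration results from batch B7’ become trivial filters, where mining the same information out of free-text notebook entries would not be.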

Data standards

Thinking about data utility naturally leads to the concept of data standards. Schüler said: ‘You want to make sure that the data you are capturing will still be accessible for data mining long after the system that created it has been retired.’

One focus of product development at Abbott Informatics is to help labs progress their digital transformation towards that end-goal of seamless, end-to-end automation and digital data handling, Koberg noted. The firm maintains that its StarLIMS integrated solution has long been at the forefront of LIMS-based platforms designed to facilitate interfacing and integration of typical laboratory and other enterprise hardware and software platforms.

Speeding drug development

Ultimately, digitisation in the pharma or biotech lab will help to speed the identification of new targets and the development of drugs, vaccines or gene therapies, suggested Markus Gershater, co-founder and CSO at UK-based Synthace. Embracing smart software that can automate the complexity associated with experimental design and execution will take the benefits of digitisation in the lab beyond automating how (and what) data can be directly obtained from analytical instrumentation, and its management and analysis.

Synthace is developing a new generation of software tools to speed up how scientists design and carry out experiments that address highly complex biological questions. By doing this it should be possible to maximise the depth and utility of the data derived from those experiments, and of the associated metadata that can map every step of the experimental execution. ‘It’s a concept of digital augmentation that we call “computer-aided biology”,’ Gershater noted.

The aim is to harness software that can close the digitisation and automation gaps in the R&D cycle. ‘This cycle typically starts with a hypothesis or question, against which an experiment is designed,’ he explained. ‘The experiment is then executed, the resulting data is collated, recorded and reported, and data analysis can then be carried out. That analysis subsequently informs the design of the next experiment.’
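The cycle Gershater describes (design, execute, collate, analyse, feed back into the next design) can be sketched as a simple feedback loop. The toy design, execute and analyse functions below are purely illustrative stand-ins:

```python
def rd_cycle(design, execute, analyse, iterations=3):
    """Iterate the design-execute-analyse loop, feeding each round's
    analysis back into the next experimental design."""
    insights = None
    history = []
    for _ in range(iterations):
        experiment = design(insights)  # design informed by prior analysis
        data = execute(experiment)     # run the experiment, collate results
        insights = analyse(data)       # analysis informs the next design
        history.append((experiment, insights))
    return history

# Toy example: each round's 'analysis' halves the dose for the next design.
runs = rd_cycle(
    design=lambda ins: {"dose": 8.0 if ins is None else ins["best_dose"]},
    execute=lambda exp: {"response": exp["dose"] * 0.5},
    analyse=lambda data: {"best_dose": data["response"]},
)
print(runs[-1][0])  # the design used in the final iteration
```

The point of closing the loop digitally is that `design` need not be a human with a spreadsheet: once execution and collation are automated, the analysis step can drive the next design directly.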

There has been a lot of focus on automating how results data are handled, managed and in what formats, with a view to maximising value from that data, which will then aid in its analysis to help make next-step decisions, he explained.

However, the design, planning and execution of those experiments is still very much a human process. ‘Scientists today will commonly use tools such as Excel to help them work out their experimental plans,’ while electronic lab notebooks (ELNs) record their processes, SOPs and results, Gershater said. ‘These tools are digital aids to design,’ but they do not automate the detailed experimental planning.

In reality, planning experiments is a hugely complicated task, he suggested. ‘Powerful experimentation is a requirement for trying to address biological complexity, but it can also be very difficult to carry out by hand. Each experiment can take a huge amount of very meticulous planning, and running experiments may involve 14-hour days of complex manual pipetting steps.

‘So, if experiments are limited at the planning or execution stage, it doesn’t matter how much of the rest of the lab is digitised, there will always be limits in terms of experimental feasibility.’

Taking the limits off experimental design

The vision for the future is to enable scientists to design and run whatever experiments are required to address biological complexity, without these human limits, he continued. Without such constraints, scientists should be able to generate more sophisticated datasets, because the design of the experiment has been set up to optimise the value of each run, and automation can carry out these complex designs.

‘If you are then able to generate datasets that are very well structured for addressing the hypotheses you set out to address, you can run very sophisticated analysis on top of those sophisticated datasets. And that’s when it becomes more feasible to layer on things such as machine learning, because you have laid the foundations of really robust, rich datasets with incredibly detailed metadata. This depth of metadata completes the experimental map, maximising context, and provides the foundation for building machine learning on top,’ stated Gershater.

Synthace’s flagship cloud-hosted platform Antha allows scientists to plan complex experimental protocols, execute the workflows – through direct interface with liquid handling robotics – and then associate and integrate resulting data and every point of metadata back to the experimental source.

‘Our software maps out exactly what has to be done to fulfil a particular experimental design,’ Gershater said. ‘It works out every single liquid handling action that’s needed, and then drives the robot to carry out the experimental steps as defined. So that means you know exactly what’s happened throughout the entire experiment.’ In effect, the software generates a kind of ‘experiment digital twin,’ which is then executed in the lab, he suggested. ‘Integrate this capability with complex data handling and management, and you get much closer to the vision of seamless digitisation. With this as an ultimate goal, Antha can be interfaced with ELNs, LIMS and other emerging digital tools to generate a potentially gapless end-to-end workflow.’
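The idea of expanding a high-level design into every individual liquid-handling action can be sketched as below. This is not Antha's actual API; the function, field names and dilution-series design are invented for illustration:

```python
def plan_transfers(design):
    """Expand a high-level dilution-series design into explicit
    liquid-handling actions, recording every individual transfer."""
    actions = []
    for i, vol in enumerate(design["volumes_ul"]):
        well = f"{design['row']}{i + 1}"  # e.g. A1, A2, A3 ...
        actions.append({"op": "transfer",
                        "source": design["stock_well"],
                        "dest": well,
                        "volume_ul": vol})
    return actions

design = {"row": "A", "stock_well": "RES1", "volumes_ul": [100, 50, 25]}
steps = plan_transfers(design)
for step in steps:
    print(step)
```

Because the plan itself is data, the same structure can both drive the robot and be archived as the ‘digital twin’ record of exactly what was done.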

Having started life as a biology company focused on addressing biological complexity in process development, Synthace initially developed the Antha software to help its own labs execute the type of science that otherwise wouldn’t have been possible. ‘We realised that the platform would have much more impact if we pushed it out across all of biology, instead of keeping it in house,’ Gershater explained.

The firm started marketing the first iteration of the Antha software in 2017, and the most recent version can interface with most state-of-the-art laboratory robotic systems. ‘Importantly, we have our own laboratories,’ Gershater noted, and so the platform continues to be developed through experience with real-world experimental scenarios and setups.

The firm has close working relationships with its clients, which include major pharma and biotech companies – such as Merck & Co in the US, and Oxford BioMedica in the UK – and with some of the major liquid handling robotics and other hardware providers. Synthace also continues close research links with Microsoft Research.

Completing the circle

Synthace clients are demonstrating the utility of the Antha platform to improve how complex lab experiments are designed, set up and executed, Gershater noted. ‘Oxford BioMedica, for example, has really embraced the concept of digitising that complete experimental loop, and is using our platform for execution and for collation of data, and is collaborating with Microsoft Research to add artificial intelligence on top of that collated data, in order to loop back round to experimental design.’

Ultimately the aim is to enable a completely interconnected digital laboratory ecosystem, and while Synthace is currently working with clients on specific applications, the aim is to roll out comprehensive digital solutions for broader applications.

Citing Oxford BioMedica as an example, Gershater explained how the Synthace software can have a tangible impact on the value of experiments. ‘What’s really encouraging is when you see signs that scientists are using our tool to do the kind of science that they otherwise just wouldn’t be able to consider. Computer-aided biology is how people could work in the future, and represents an ecosystem of different tools we think will be required for truly digitised lab working.’

 

