A digital transformation journey in the laboratory

Determining the best approach to a digital transformation effort can be a confusing and complex task for most organisations, suggests Dave Dorsett, principal software architect at information technology consultancy Astrix. 'No single approach exists to transport you from a current state of paper processes, siloed technologies and inconsistent data sets to the digitally transformed laboratory of the future. The reality is the digitisation journey will be different for everyone, based upon the unique set of circumstances surrounding your business.' However, three pillars build the foundation for all successful digital transformation initiatives: people, process and technology.

Think of the example of a three-legged stool, he continues. All three legs must be present and solid for the stool to provide support and serve its intended function for long-term use. 'The most important 'leg' (or pillar) for the support and sustainability of your digital transformation strategy is people. Having the right leadership in place is absolutely critical to providing a culture of innovation to foster the dramatic change that will take place in your organisation. The people within your organisation are essential not only to carry out the effort itself, but also to embrace the changes that will continue to drive your digital transformation strategy.'

The second pillar, process, is the single biggest tool necessary for digital transformation. Achieving practical process analysis to support digital transformation means keeping your eye on the goal, not on the process itself, Dorsett maintains. 'Basic operational process improvement opportunities are likely to be found across the organisation, and include eliminating the manual entry of data and automating what people are doing repeatedly.

'Before embarking down the path to directly eliminate these practices, take the time to determine what people are doing before and after. The key to creating a truly transformative environment is to develop a roadmap of your 'as is' and 'to be' future-state scientific processes, and then to map out the data from end to end.'

The third and final pillar of a successful digital transformation strategy is technology. Perhaps one of the most commonly misunderstood elements of digital transformation programmes is the tendency to focus on the operational systems within the laboratory: LIMS, ELN, CDS and SDMS, to name a few. 'The goal is to address the data created and consumed by the organisation, not to implement systems per se,' Dorsett stresses. 'It is not a 'buy and deploy it' project: no vendor sells what is required to operate as a data-driven organisation.' Accessibility of data is, Dorsett continues, 'everything!' Achieving a state of 'analytics-ready' is about R&D data flow throughout every aspect of your operations. 'Siloed technologies and manual processes impede the flow and accessibility of data. Achieving seamless R&D data flow also has cultural requirements (data governance) and technical requirements (semantic tooling).'

Exponential growth in R&D data has, in addition, generated immense interest in whether and how R&D data may be used more effectively. ‘Efforts to become data-centric to enable ‘running with algorithms’ and machine learning (ML) for predictive purposes have become commonplace.’ 

Iterative progress 

Digital transformation isn't a linear journey but rather something that will be continuously iterative as people, processes and technology change. 'Just moving our systems out of paper is not enough to call ourselves 'digitised'. At the same time, the biggest trap for the future is letting 'good enough' be the enemy of perfect during the journey,' Dorsett maintains. 'Small iterative changes can be more effective in moving the organisation forward along the path to digitalisation than large, burdensome implementation projects. The seamless flow of high-quality data from producers to consumers across the organisation is the real point of a holistic digital transformation strategy that will drive innovation and power your lab of the future.'

For many companies and organisations, the concept of digital transformation has progressed well beyond the practicalities of reducing paper and manual data entry. The goal now is to achieve true seamless lab connectivity and data harmonisation, to maximise data context, utilisation and longevity. But achieving this goal of potentially enterprise-wide digitalisation will hinge on realising plug-and-play integration of all lab informatics systems, instruments, devices, people and organisations, suggests Geoff Gerhardt, PhD, chief technology officer at Scitara.

Established two years ago, Scitara specialises in the development of laboratory-specific, cloud-based software solutions and tools that facilitate digital transformation by enabling connectivity for life science and other industries. 'The Scitara DLX platform enables plug-and-play connectivity between any device, instrument, application, informatics system, web service or lab resource, on a vendor-neutral basis,' Gerhardt notes. 'On top of this interconnectivity, the DLX platform facilitates multidirectional data exchange and offers the tools and libraries that allow the transformation of data into required formats, in-flight.'

Cohesive integration 

Scitara's aim is thus to move away from the concept of creating bespoke integrations between individual pieces of hardware and software. 'Our approach is based on the formation of a cohesive data exchange, so you don't have to reinvent the wheel every time you want a new integration for your LIMS, data lake or ELN.' Another benefit of this cohesive strategy is that the lab becomes more easily 'expandable' and can diversify, Gerhardt suggests, as new instruments or informatics systems can be added and integrated using a relevant connector. And importantly, the Scitara platform is founded on an open framework, so third parties can also develop and add new plug-and-play connectors for instruments and software.
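The article does not describe Scitara's actual connector API, but the general pattern of an open, vendor-neutral connector framework can be sketched as follows. All of the names here (LabConnector, DataExchange, BalanceConnector) are illustrative assumptions, not Scitara's interfaces.

```python
# Minimal sketch of a plug-and-play connector framework. Every connector,
# whether written by the platform vendor or a third party, implements the
# same small contract, so new instruments or applications can be registered
# without bespoke point-to-point integration work.
from abc import ABC, abstractmethod
from typing import Any, Dict


class LabConnector(ABC):
    """Common contract implemented by every instrument or application connector."""

    @abstractmethod
    def read(self) -> Dict[str, Any]:
        """Pull a result from the underlying device or system."""

    @abstractmethod
    def write(self, payload: Dict[str, Any]) -> None:
        """Push data into the underlying device or system."""


class BalanceConnector(LabConnector):
    """Example third-party connector for a laboratory balance."""

    def read(self) -> Dict[str, Any]:
        # In practice this would call the instrument driver or its API.
        return {"instrument": "balance-01", "weight_mg": 102.4, "unit": "mg"}

    def write(self, payload: Dict[str, Any]) -> None:
        raise NotImplementedError("A balance only produces data")


class DataExchange:
    """Registry that lets any registered connector exchange data with any other."""

    def __init__(self) -> None:
        self._connectors: Dict[str, LabConnector] = {}

    def register(self, name: str, connector: LabConnector) -> None:
        self._connectors[name] = connector

    def transfer(self, source: str, destination: str) -> None:
        self._connectors[destination].write(self._connectors[source].read())
```

The point of the sketch is the design choice, not the detail: once every system speaks through the same narrow interface, adding an instrument means adding one connector rather than one integration per destination system.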

Digital transformation should enable better use of data, without losing or skewing context, he suggests. And this means more than just implementing standards for data or communication between lab systems. 'Scitara is developing the mechanisms that will make it relatively straightforward to harness the full value of data, and that means finding, analysing, extracting and, if necessary, transforming data into a format required by the platforms to which it is transferred. Complete connectivity and retention of context will mean labs can push contextualised insight into, and retrieve it out of, data lakes, which can then be better mined and interrogated.'

The laboratory has lagged behind other industries in its drive to generate this ecosystem of connectivity, Gerhardt suggests. 'We enjoy all manner of integration in the consumer world. PayPal integrates seamlessly with my bank and, in our homes, digital light switches, thermostats and door locks from different vendors can be made to work together.'

So why has the laboratory dragged its heels behind other industrial sectors? Historically, lab digitalisation efforts tended to be vendor-specific, and developing point-to-point integrations for individual instruments was perhaps the norm, Gerhardt says. 'This may have been partly due to regulatory pressures. For regulated industries, there may not be a huge incentive to modernise because, when changing your architecture, you must then revalidate your instruments and software for regulatory compliance.'

Scitara is working with software and hardware manufacturers to establish a framework for pan-laboratory connectivity between instruments, key platforms such as electronic laboratory notebooks, web services and analytical tools, including AI and ML. 'We work with these vendors so their products can participate in the exchange. This opens the way to lab interconnectivity, communication and the exchange of data and instructions.'

In addition to the development of its integration platform, Scitara has generated an orchestration layer that allows users, 'in a user-friendly, drag-and-drop interface type of way', to create automated, event-driven lab workflows, Gerhardt explains. So, when an ELN, for example, requests a balance reading, this triggers a cascade of events as an automation. 'The user will be notified to take the balance reading which, once taken, will be published as a new event, and the resulting data and metadata then move back into the ELN,' he explains. 'And then it becomes feasible to execute this workflow seamlessly.'
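Scitara's orchestration layer is configured through a drag-and-drop interface rather than code, but the underlying event-driven pattern Gerhardt describes can be roughed out in a few lines. The event names and handlers below are hypothetical, not the DLX API.

```python
# Rough sketch of the event-driven workflow described above: an ELN request
# for a balance reading triggers a chain of events that ends with the result,
# plus its metadata, flowing back into the requesting ELN entry.
from collections import defaultdict
from typing import Callable, Dict, List


class EventBus:
    """Tiny in-memory publish/subscribe bus standing in for the orchestration layer."""

    def __init__(self) -> None:
        self._handlers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, event: str, handler: Callable[[dict], None]) -> None:
        self._handlers[event].append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self._handlers[event]:
            handler(payload)


bus = EventBus()

# ELN requests a reading -> notify the user at the balance.
bus.subscribe("eln.reading_requested",
              lambda p: print(f"Please weigh sample {p['sample_id']}"))

# Reading taken -> route the data and metadata back to the requesting ELN entry.
bus.subscribe("balance.reading_taken",
              lambda p: print(f"Writing {p} back to ELN entry {p['eln_entry']}"))

# Simulate the workflow end to end.
bus.publish("eln.reading_requested", {"sample_id": "S-123", "eln_entry": "EXP-42"})
bus.publish("balance.reading_taken",
            {"sample_id": "S-123", "weight_mg": 102.4, "eln_entry": "EXP-42"})
```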

Transforming data 

Keeping data in an inherently adaptable format, such as JSON, also makes it possible to transform data, agnostic to the original format, Gerhardt notes. 'Data can be transformed, in flight, to the format required by its destination, whether that be an ELN, or an artificial intelligence and machine-learning tool. Rather than imposing standardisation on everything, our approach allows this flexible data transformation, which is enabled by taking the data out of its native format, putting it into a friendly format, such as JSON, and then transforming that data into the shape that your destination application, such as an AI tool, can ingest.'
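As a concrete illustration of this in-flight reshaping, the sketch below takes a generic instrument reading held as JSON and emits two destination-specific shapes. The field names and the target ELN schema are assumptions made for the example, not a published format.

```python
# Minimal sketch of 'in-flight' transformation: one format-agnostic JSON
# reading is reshaped into whatever each destination expects, rather than
# forcing a single standard on every producer and consumer.
import json

raw = json.loads(
    '{"instrument": "balance-01", "sample": "S-123",'
    ' "weight_mg": 102.4, "taken_at": "2022-05-10T09:14:00Z"}'
)


def to_eln_record(reading: dict) -> dict:
    """Reshape a generic reading into a hypothetical ELN ingestion schema."""
    return {
        "sampleId": reading["sample"],
        "measurement": {"value": reading["weight_mg"] / 1000.0, "unit": "g"},
        "source": reading["instrument"],
        "timestamp": reading["taken_at"],
    }


def to_ml_feature_row(reading: dict) -> list:
    """Flatten the same reading into a numeric row an ML tool could ingest."""
    return [reading["weight_mg"]]


print(json.dumps(to_eln_record(raw), indent=2))
print(to_ml_feature_row(raw))
```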

One of the major problems with applying data-driven R&D is that you cannot go straight to unsupervised learning models, comments Max Petersen, AVP of chemicals and materials marketing at Dotmatics. 'You need to have some kind of supervision to train your algorithms to explain and demonstrate the data that contributes to a positive outcome, and that relies on complete, clean data.' The ability to derive end-to-end, clean and insightful data for that learning model by interrogating a complete ecosystem of data is hampered when your informatics infrastructure is founded on different point solutions, Petersen notes. 'It's a foundational issue that Dotmatics has addressed through the development of its unified platform. We are driving the technology to effectively integrate all the different data types that might populate an experimental ecosystem, and so connect all the dots.'
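The supervised step Petersen alludes to can be sketched in a few lines: a model can only learn what drives a positive outcome if it is given complete, labelled experiment records. The column names, toy values and model choice below are illustrative assumptions, not Dotmatics functionality.

```python
# Minimal sketch of supervised learning on labelled experiment records.
# Missing or inconsistent values in this table would break (or silently
# bias) the training step, hence the emphasis on complete, clean data.
import pandas as pd
from sklearn.linear_model import LogisticRegression

experiments = pd.DataFrame({
    "temperature_c": [25, 40, 60, 80, 95],
    "catalyst_loading_pct": [1.0, 2.5, 2.5, 5.0, 5.0],
    "outcome_positive": [0, 0, 1, 1, 1],  # the supervision: labelled outcomes
})

X = experiments[["temperature_c", "catalyst_loading_pct"]]
y = experiments["outcome_positive"]
model = LogisticRegression().fit(X, y)

# Predict the outcome for a hypothetical new experiment.
new_run = pd.DataFrame({"temperature_c": [70], "catalyst_loading_pct": [3.0]})
print(model.predict(new_run))
```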

A major challenge for data-driven R&D is the lack of high-quality data, he continues. 'While accurate sample, analytical and physical characterisation data may exist, they are generally not helpful when analysing why or how a specific experiment contributed to the overall success of an innovation project. This can only be achieved with a unified platform approach that links data from all disciplines together and contextualises experiments by implementing workflows and roles. This provides data insights that are otherwise impossible and helps our customers to innovate faster.'

What you can achieve using the unified platform is a more holistic, data-centric view across disciplines, so it's possible to cross-reference disparate data from chemistry, biology, process, physical characterisation, formulation and analytical perspectives. 'In the synthetic chemistry lab, for example, this could give the scientist a way to map a complete synthetic route, pulling out all the experimental data that would then facilitate data modelling, because you can generate a complete data framework for instructing an algorithm,' says Petersen. The unified platform also gives labs flexibility in the configuration of workflows, and a free hand to query all experimental data in the context of that workflow and then visualise analyses in multiple ways.
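The cross-discipline linking Petersen describes amounts, in data terms, to joining records from different domains on a shared identifier so the full context travels with each experiment. The table and column names in this sketch are hypothetical, chosen only to illustrate the idea.

```python
# Sketch of a cross-discipline view: synthesis, analytical and formulation
# records are linked on a shared batch identifier so one complete,
# model-ready frame can be assembled for downstream algorithms.
import pandas as pd

synthesis = pd.DataFrame({
    "batch_id": ["B1", "B2"],
    "route_step": ["Suzuki coupling", "amide formation"],
    "yield_pct": [72.0, 88.5],
})
analytical = pd.DataFrame({
    "batch_id": ["B1", "B2"],
    "purity_pct": [97.1, 99.3],
})
formulation = pd.DataFrame({
    "batch_id": ["B1", "B2"],
    "solubility_mg_ml": [0.8, 1.4],
})

# One joined frame carries the end-to-end context an algorithm can learn from.
unified = synthesis.merge(analytical, on="batch_id").merge(formulation, on="batch_id")
print(unified)
```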

UK firm Interactive Software has developed its Achiever Medical LIMS as a web-based platform for labs and biobanks. The software solution has been specially designed to aid lab digitalisation and regulatory compliance, from the central point of a LIMS outwards into all areas of lab and business operation. The Achiever Medical LIMS offers the flexibility to configure and share data from labs on-site as well as from external platforms, share data between collaborators and automate sample workflows, explains Interactive Software product director Sharon Williams.

Graphical tools make it possible to interact with data, and to manage and query disease and patient information according to regulatory permissions. 'When you have multiple types of data, including patient, sample and disease-related data, some of the key issues and barriers to improving that seamless connectivity through digitalisation centre on having data in formats that can be understood by these different platforms,' says Williams. This runs in parallel with the need to retain data value as information flows through a workflow to analytical endpoints. 'Disconnect can result in the loss of data context or utility, which then impacts how you can model from that data,' Williams says. 'And while it may not necessarily be possible to standardise or integrate your systems at the outset, at least in the shorter term, finding an approach to linking that data in a meaningful way – through an appropriate LIMS, or potentially multiple different LIMS systems – is still important.'

Technological challenges to digitalisation are, understandably, a key consideration, but change management should also be high on the list, Williams suggests. 'Scientists carrying out everyday processes and procedures will have their standard working practices, and implementation of any new platform may, in the shorter term, cause issues at the level of everyday practice and efficiency.' This issue can be compounded when labs need to interact with each other, and with possibly different technological setups and practices. 'It's a trade-off between the importance of getting the day job done effectively and efficiently under the umbrella of compliance constraints, and integrating the systems and platforms that will help to connect the various parts of that workflow.'

So, any drive to effect new approaches to enabling more seamless digitalisation must involve the end-user, as well as the IT department, decision-makers and purse-string holders, Williams says. 'Just as there can be a disconnect between software and hardware systems, so there is also often a disconnect between the scientists, managers and IT teams. Making sure that everyone is on board and engaged with change is critical.' Before embarking on new software/hardware acquisition, it's important to have a thorough understanding of what you want to achieve, as well as what this achievement will look like with respect to change for the user, data flow and compliance.

Despite these caveats, the ability to digitalise a lab can have important implications for the healthcare sector, as well as for research, manufacturing and quality control settings, Williams adds. 'We recently worked with a university Covid testing lab to integrate its LIMS and other platforms with qPCR instruments, rack scanners, other equipment and a registration portal for its testing kits. This was a great demonstration of the flexibility of our platform and allowed us to [show] how to connect a process from beginning to end.

'In this case, we achieved automation from the point of self-registration, to scanning received samples, testing and results generation, and then automation of necessary communication out of the system, dependent on the results. For example, the system automatically sends out emails to individuals who have a negative SARS-CoV-2 PCR test result, and alerts the contact centre to call those who have tested positive.'
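The result-driven routing Williams describes is, in essence, a small decision rule applied to each result record. The sketch below is a hedged illustration of that logic only; the function name, message text and notification channels are assumptions, not the Achiever Medical implementation.

```python
# Sketch of result-driven communication routing: negative results trigger an
# automatic email, positives raise an alert for the contact centre to call,
# and anything else is flagged for laboratory review.
def route_result(sample_id: str, result: str) -> str:
    if result == "negative":
        return f"email: SARS-CoV-2 PCR result for {sample_id} is negative"
    if result == "positive":
        return f"contact-centre alert: call participant for sample {sample_id}"
    return f"lab review: inconclusive result for {sample_id}, retest required"


for sample, result in [("S-001", "negative"), ("S-002", "positive"), ("S-003", "invalid")]:
    print(route_result(sample, result))
```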


